Step size of Steepest Descent method


Hamide Zebardast

Apr 18, 2023, 1:46:04 PM
to Manopt
Hi,
I hope all is well.
I am using the steepest descent method to solve my problem. The stepsize shrinks very quickly, so I get this message: "Last stepsize smaller than minimum allowed; options.minstepsize". Which parameter should I change to solve this problem?
Best,
Hamideh
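
For reference, a minimal sketch of how such stopping-related options are passed to Manopt's steepestdescent (the problem structure here is assumed to be already defined; the values shown are the solver's defaults):

    % stopping-related options for steepestdescent
    options.minstepsize = 1e-10;   % stop if the line search returns a step
                                   % shorter than this (the default)
    options.tolgradnorm = 1e-6;    % stop if the gradient norm drops below
                                   % this tolerance (also the default)
    [x, xcost, info] = steepestdescent(problem, [], options);  % [] = random x0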

Nicolas Boumal

Apr 19, 2023, 1:19:07 AM
to Manopt
Hello Hamideh,
It might be an issue with the gradient or the retraction.
Which factory are you using? Can you show us the output of checkgradient (picture + text output)?
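
For readers unfamiliar with checkgradient: it compares the cost along a retraction curve against its first-order Taylor expansion and plots the approximation error on a log-log scale, where the slope should be 2. A minimal sketch on a toy problem (a quadratic cost on the sphere, standing in for the actual problem):

    % toy problem: minimize -x'*A*x over the unit sphere in R^n
    n = 100;
    A = randn(n); A = .5*(A + A');     % random symmetric matrix
    problem.M = spherefactory(n);
    problem.cost  = @(x) -x'*(A*x);
    problem.egrad = @(x) -2*A*x;       % Euclidean gradient; Manopt converts it
                                       % to the Riemannian gradient internally
    checkgradient(problem);            % picks a random point and direction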

Hamideh Zebardast

Apr 20, 2023, 4:07:25 PM
to Nicolas Boumal, Manopt

Hi,

The checkgradient figure is attached, and its text output is as follows:

The slope should be 2. It appears to be: 2.0002.
If it is far from 2, then directional derivatives might be erroneous.
The residual should be 0, or very close. Residual: 3.3749e-14.
If it is far from 0, then the gradient is not in the tangent space.
In certain cases (e.g., hyperbolicfactory), the tangency test is inconclusive.

The steepestdescent algorithm stops at the following point, even though I have set options.tolgradnorm = 1e-6:

iter :  369     cost : +7.9783923792445523e+02   gradnorm : 1.91234496e+01     stepsize : 8.81232e-11

Last stepsize smaller than minimum allowed; options.minstepsize = 1e-10.
Total time is 12.382918 [s] (excludes statsfun)

Best


[Attachment: 1.png (checkgradient plot)]

Nicolas Boumal

Apr 21, 2023, 1:52:10 AM
to Manopt
Thanks -- the gradient does indeed seem fine. Can you also specify which manifold factory you are using?

Hamideh Zebardast

Apr 21, 2023, 11:05:58 AM
to Nicolas Boumal, Manopt
Thanks, I am using the complexcircle factory.
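
For context, a minimal sketch of a problem set up with that factory (the Hermitian quadratic cost below is only a stand-in for the actual cost):

    n = 50;
    A = randn(n) + 1i*randn(n); A = .5*(A + A');  % random Hermitian matrix
    problem.M = complexcirclefactory(n);   % z in C^n with |z_k| = 1 for all k
    problem.cost  = @(z) -real(z'*A*z);
    problem.egrad = @(z) -2*A*z;           % Euclidean gradient in C^n
    checkgradient(problem);
    [z, zcost, info] = steepestdescent(problem);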

Nicolas Boumal

Apr 21, 2023, 11:09:23 AM
to Manopt
Is the cost function differentiable everywhere?

If not, then one thing that might happen is that the gradient check passes because it is executed at a random point along a random direction, and oftentimes nonsmooth functions are smooth almost everywhere, so the check would succeed. But just as often, nonsmooth functions are nonsmooth specifically at minimizers, so it would be quite natural for an optimization algorithm to converge toward a point where the function is not differentiable. Close to convergence, the gradient would be large, but it is impossible to take a large gradient descent step because the function unexpectedly goes "up" almost immediately, revealing the nonsmoothness.
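
One way to probe for this, assuming the problem structure is already in hand: repeat the gradient check at several independently drawn points and directions and see whether the slope-2 segment appears every time.

    % run checkgradient at several random points/directions; with a
    % nonsmooth cost, some trials may pass while others fail
    for trial = 1:5
        x = problem.M.rand();        % random point on the manifold
        d = problem.M.randvec(x);    % random tangent direction at x
        figure;
        checkgradient(problem, x, d);
    end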

Hamide Zebardast

May 2, 2023, 2:56:00 PM
to Manopt
Actually, the cost function is polynomial and I can compute its gradient easily. The initial point is random, and I have used it as the initial point for checkgradient. The gradient is OK for some random points, but not for others. Does this mean that my gradient is incorrect?

Nicolas Boumal

May 3, 2023, 2:31:07 AM
to Manopt
If checkgradient fails at some points, that is indeed a sign that something is off.
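
One way to narrow down the mismatch, sketched under two assumptions: the gradient is supplied as problem.egrad, and the cost is polynomial, so it is also defined slightly off the manifold. Compare the analytic Euclidean gradient against a centered finite difference along an ambient direction, at a point where checkgradient failed:

    z0 = problem.M.rand();                       % or a saved failing point
    u  = randn(size(z0)) + 1i*randn(size(z0));   % ambient direction in C^n
    t  = 1e-6;
    fd = (problem.cost(z0 + t*u) - problem.cost(z0 - t*u)) / (2*t);
    an = real(problem.egrad(z0)' * u);   % <egrad, u> in the real inner
                                         % product real(v'*w) on C^n
    fprintf('finite difference: %g   analytic: %g\n', fd, an);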