Why does checkgradient fail except on the first iteration?


grandowife

May 25, 2019, 10:52:58 PM
to Manopt
Hi,

Lately I have run into a lot of problems with the MADMM + Manopt optimization scheme.
I uploaded my Matlab code to https://github.com/grandowife1234/madmm-manopt/blob/master/testRealMADMM4NSSPOC.m (both optNSSPOC and optNSSPOC1 run into the same problem).

I ran this code and found that checkgradient shows:


[attachment: 1.png (checkgradient output)]



I am confused about why checkgradient fails at every iteration except the first. Can anyone help me?

Best wishes,
Qiuying Shi

Nicolas Boumal

May 28, 2019, 11:47:59 AM
to Manopt
Hello,

Did you check that Z and Y (which are updated at each outer iteration of MADMM) are also updated in the function handles that you create for cost and gradient? This might be a variable scope issue.
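To make this concrete, here is a minimal sketch of the pattern that avoids the pitfall, with hypothetical helper functions costU and egradU standing in for your actual cost and gradient (this is not your code; the point is only where the handles are created):

```matlab
% Sketch, assuming hypothetical helpers costU/egradU; not the original code.
problem.M = grassmannfactory(n, p);
for k = 1:maxiter
    % Anonymous functions capture Z and Y by value at the moment they are
    % created, so the handles must be rebuilt after every update of Z and Y:
    problem.cost  = @(U) costU(U, Z, Y);
    problem.egrad = @(U) egradU(U, Z, Y);
    checkgradient(problem);          % checks against the *current* Z and Y
    U = trustregions(problem, U);    % U-step of MADMM
    % ... update Z (proximal step) and Y (dual step) here ...
end
```

If the handles are created once, outside the loop, they keep seeing the original Z and Y even after those variables change in the workspace.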

Best,
Nicolas

grandowife

May 28, 2019, 8:40:19 PM
to Manopt
Dear Nicolas,

Variables Z and Y are not updated by Manopt, so I have no idea how to check their gradient.
Also, may I ask: what is a variable scope issue?

Have you ever tried the MADMM algorithm? Did it reach fairly good performance in your problem?

And just now I found the reason why checkgradient failed: I had a typo in my gradient calculation! It is a little bit awkward... :(

Thank you very much!

Best wishes
Qiuying Shi

On Tuesday, May 28, 2019 at 11:47:59 PM UTC+8, Nicolas Boumal wrote:

Nicolas Boumal

May 29, 2019, 9:19:41 AM
to Manopt
Hello,

I understand that Z, Y are not optimization variables. They are, however, used to define the cost function and its gradient. Hence, when they change, it is important that the cost and gradient function "see" that change (this is what I meant by variable scope: this is a general programming concept).
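A tiny, self-contained Matlab snippet (unrelated to your code) illustrates the point: an anonymous function freezes the values of the workspace variables it uses at the moment it is created.

```matlab
Z = 1;
f = @(x) x + Z;   % f captures the current value of Z
Z = 100;          % changing Z afterwards has no effect on f
f(0)              % still returns 1, not 100
```

So if your cost and gradient handles were built before Z and Y were updated, they silently keep using the old values.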

You mention a typo: does that mean that now checkgradient is fine?

About MADMM: I haven't used it myself, but the authors of the paper report good results. Then again, ADMM (even without manifolds) is known to converge rather slowly in some contexts, so perhaps this is what's going on here.

Best,
Nicolas

grandowife

May 29, 2019, 10:25:25 PM
to Manopt
Dear Nicolas,

Yes, I ran checkgradient during the iterations of MADMM, and it looks fine now.
I also checked that the cost and gradient of the variable U 'see' the changes in Z and Y.

These are really brilliant suggestions, thank you very much!

I tried to use MADMM to solve a problem of the form:

min F(U) + G(Z), 
s.t. Z = U'

where F(U) = f_1(U) + f_2(U) is smooth and non-convex with U constrained to the Grassmann manifold, and G(Z) is non-smooth, convex, and unconstrained.
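For reference, the scheme I follow is roughly the following sketch (with the scaled dual variable Y; F, egradF and proxG are hypothetical placeholders, not my actual functions):

```matlab
% Sketch of MADMM for min F(U) + G(Z) s.t. Z = U', with U on the
% Grassmann manifold; F, egradF and proxG are hypothetical placeholders.
rho = 1;                     % penalty parameter
Y   = zeros(size(Z));        % scaled dual variable, same shape as Z = U'
problem.M = grassmannfactory(n, p);
for k = 1:maxiter
    % U-step: smooth Riemannian subproblem, solved with Manopt
    problem.cost  = @(U) F(U) + (rho/2)*norm(Z - U' + Y/rho, 'fro')^2;
    problem.egrad = @(U) egradF(U) - rho*(Z - U' + Y/rho)';
    U = trustregions(problem, U);
    % Z-step: proximal map of G (often in closed form when G is simple)
    Z = proxG(U' - Y/rho, 1/rho);
    % dual update
    Y = Y + rho*(Z - U');
end
```

One caveat: the penalty term depends on the specific representative U, so strictly speaking this subproblem lives on the Stiefel manifold; whether the Grassmann quotient is appropriate depends on the invariances of F.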

It seems to fit the general form of the MADMM algorithm nicely, but it shows relatively poor recognition performance on my problem.
I am trying to find the reasons. Do you have any good suggestions?

Best,
Qiuying Shi



On Wednesday, May 29, 2019 at 9:19:41 PM UTC+8, Nicolas Boumal wrote:

Nicolas Boumal

Jun 4, 2019, 7:35:38 AM
to Manopt
Hello again,

I have little personal experience with MADMM, so I'm not sure what to expect for practical performance: unfortunately, I have no prior as to whether what you observe is to be expected, or if it is the sign of an issue with the code.

Best of luck with it, and do let us know if you find a satisfactory way to resolve this, as it will surely be interesting for other users as well.
Nicolas