A question on symfixedrankYYfactory(n, k) [set of fixed-rank symmetric positive semidefinite matrices]


ujjal2...@gmail.com

Mar 8, 2018, 3:43:13 PM
to Manopt
Hi

Pardon the simple question, as I am new to Riemannian optimization theory and to the tool. I have a loss function L(M), where M is a parametric d x d matrix that can be factorized as M = AA^T. I want to constrain M to lie on the manifold of symmetric positive semidefinite matrices of fixed rank (say k). Initially, I formulated my loss function in terms of M, i.e. L(M), and computed its Euclidean gradient egrad(L(M)). But from the description on the page for this manifold (http://www.manopt.org/manifold_documentation_symfixedrank.html), it seems we have to compute and pass the Euclidean gradient in terms of A instead.

My question is: do I need to reformulate my loss function in terms of A as L(A) (which would make the formulation non-convex, something I wanted to avoid) and then compute its Euclidean gradient to pass to the tool? Am I right?

Thanks
Ujjal

Nicolas Boumal

Mar 8, 2018, 5:55:06 PM
to Manopt
Hello,

You are correct: you need to give it the gradient with respect to the factor A, and indeed that makes the problem nonconvex. But sometimes that is fine (there is some recent research on optimizing over the factors showing that, for many applications, this works well -- though not always, of course).

Notice that if g(A) = f(AA^T), then grad g(A) = 2 grad f(AA^T) A, so you have already done most of the work (this formula assumes grad f(AA^T) is symmetric, which it should be).
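The chain-rule identity grad g(A) = 2 grad f(AA^T) A can be checked numerically. Below is a minimal, self-contained sketch (not from the thread): it uses an illustrative loss f(M) = 0.5 * ||M - C||_F^2, for which grad f(M) = M - C is symmetric whenever C is, and a tiny 2x2 / rank-1 example. All names and the specific matrices are assumptions made for the demonstration.

```python
# Numerical check of the identity grad g(A) = 2 * grad f(A A^T) * A,
# where g(A) = f(A A^T). Illustrative loss: f(M) = 0.5 * ||M - C||_F^2,
# so grad f(M) = M - C (symmetric when C is symmetric).

def matmul(X, Y):
    """Plain nested-list matrix product."""
    return [[sum(X[i][t] * Y[t][j] for t in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def transpose(X):
    return [list(row) for row in zip(*X)]

d, k = 2, 1
C = [[2.0, 1.0], [1.0, 3.0]]   # symmetric target matrix
A = [[0.5], [1.5]]             # d x k factor, so M = A A^T has rank k

def f(M):
    return 0.5 * sum((M[i][j] - C[i][j]) ** 2 for i in range(d) for j in range(d))

def g(A):
    return f(matmul(A, transpose(A)))

# Analytic gradient of g via the chain rule from the thread.
M = matmul(A, transpose(A))
gradf = [[M[i][j] - C[i][j] for j in range(d)] for i in range(d)]  # grad f(A A^T)
grad_g = [[2.0 * x for x in row] for row in matmul(gradf, A)]      # 2 grad f(A A^T) A

# Central finite-difference check of each entry of grad g(A).
h = 1e-6
for i in range(d):
    for j in range(k):
        Ap = [row[:] for row in A]; Ap[i][j] += h
        Am = [row[:] for row in A]; Am[i][j] -= h
        fd = (g(Ap) - g(Am)) / (2 * h)
        assert abs(fd - grad_g[i][j]) < 1e-4, (fd, grad_g[i][j])

print("chain-rule gradient matches finite differences")
```

In Manopt itself the same object would be supplied as `problem.egrad`, expressed as a function of the factor rather than of M.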

Best,
Nicolas

ujjal2...@gmail.com

Mar 9, 2018, 2:40:50 PM
to Manopt
Thanks a lot for the reply Nicolas.

One more question: do you recommend using Riemannian conjugate gradient descent (RCGD) on this manifold? Though I have not verified it personally, I heard from a colleague that the trust-region approach is empirically very slow, while RCGD works just fine. Or is it better to stick with trust regions (given that my problem will now be non-convex)?

P.S. The paper [JBAS10] M. Journée, F. Bach, P.-A. Absil and R. Sepulchre, "Low-rank optimization on the cone of positive semidefinite matrices," SIAM Journal on Optimization, 2010, also uses trust regions.

Thanks again.
Ujjal

Nicolas Boumal

Mar 9, 2018, 4:17:18 PM
to Manopt
Hello Ujjal,

You can easily try both CG and TR in Manopt: the problem definition is the same; you simply call either conjugategradient or trustregions on it, then see which one works best for you. In my experience, TR is usually the best solver, but that is not always the case. The fact that the problem is non-convex does not play a role in this decision.

Best,
Nicolas

ujjal2...@gmail.com

Mar 10, 2018, 3:03:45 PM
to Manopt

Hi, thanks for all the help.

Can you point me to some recent papers showing that dropping convexity is not a problem for PSD optimization? I know of one (https://arxiv.org/abs/1509.03917), but any more would be really helpful.

Thanks
Ujjal

Nicolas Boumal

Mar 10, 2018, 4:50:29 PM
to Manopt