Complex circle manifold and orthogonality

khalhuj...@gmail.com

Mar 7, 2018, 1:41:33 AM
to Manopt
Hi everyone,

I am trying to solve the following optimization problem:
min_{x,X} x^H A x
s.t. x in the complex circle manifold (x is a complex vector with n*m entries),
     X in the Stiefel manifold (X is an n-by-m complex matrix, and x = vec(X)).

I asked Nicolas whether there is a way to deal with these two manifolds simultaneously, and he suggested optimizing over one manifold while encouraging the second constraint by adding a penalty term to the cost function. With this idea, the problem can be rewritten as:

min_{x} x^H A x + alpha*||X^H X - n*I||^2_F
s.t. x in the complex circle manifold (x is a complex vector with n*m entries).

I started by applying steepest descent on the complex circle manifold with the following gradient:

grad = A*x + 2*alpha*vec(X X^H X - n*X)

I have obtained reasonable results so far.

I have the following questions:
1- I need help with, or a reference for, a proof that the algorithm converges to a local minimum. Alternatively, I need a condition on the step size that ensures a monotone decrease.

2- Is it possible to implement this idea using Manopt? I ask because I tried to use the above gradient, but working with both x and X complicated the task.

Regards,
khaled

khalhuj...@gmail.com

Mar 7, 2018, 8:50:46 AM
to Manopt

I forgot to mention that the matrix A is an nm-by-nm symmetric matrix, and that the norm can be simplified to ||X^H X||^2_F - n^2*m due to the constant-modulus constraint on the entries of the vector x.
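
Indeed, expanding the squared norm gives ||X^H X - n*I||^2_F = ||X^H X||^2_F - 2*n*trace(X^H X) + n^2*m, and trace(X^H X) = ||x||^2 = n*m because every entry of x has unit modulus, so the last two terms collapse to -n^2*m.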

Nicolas Boumal

Mar 7, 2018, 9:42:02 AM
to Manopt
Hello Khaled,

Did you check that the gradient is correct using the tool checkgradient(problem) in Manopt? (This tool is explained in the first example of the tutorial on manopt.org.)

About your questions:

In general, GD is not guaranteed to converge to a local optimum, though that is the behavior one expects in practice. Something you can certainly ensure is that the norm of the gradient converges to 0 at a certain global rate.
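
(To make the rate concrete, with my own notation for the constant: after K iterations, min over k < K of ||grad f(x_k)|| <= C/sqrt(K), where C depends on the initial cost gap f(x_0) - inf f and on a Lipschitz-type constant of the gradient.)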

About implementing this in Manopt: by default, steepestdescent(problem) runs GD with a backtracking line-search, which is covered in the paper linked above. And your manifold is a compact submanifold of a Euclidean space, so you just need to check local Lipschitz continuity of the (Euclidean) gradient in the ambient space, which is fine here since your cost is a polynomial in x and its conjugate.
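
To make this concrete, here is a minimal sketch of your penalized problem in Manopt (my illustration with placeholder sizes and a random Hermitian A, not code from this thread; note that the constants 2 and 4*alpha below are what checkgradient expects, i.e., twice the gradient written in the first message, which only differs by an overall scaling):

    n = 8; m = 4; alpha = 1;                          % placeholder sizes and weight
    A = randn(n*m) + 1i*randn(n*m); A = (A + A')/2;   % placeholder Hermitian A

    problem.M = complexcirclefactory(n*m);   % unit-modulus complex vectors of length n*m

    % Cost: x^H A x + alpha * ||X^H X - n I||_F^2, with X = reshape(x, n, m).
    problem.cost = @(x) real(x'*A*x) + ...
        alpha * norm(reshape(x, n, m)'*reshape(x, n, m) - n*eye(m), 'fro')^2;

    % Euclidean gradient in Manopt's convention (real inner product).
    problem.egrad = @(x) 2*A*x + 4*alpha * reshape( ...
        reshape(x, n, m) * (reshape(x, n, m)'*reshape(x, n, m) - n*eye(m)), [], 1);

    checkgradient(problem);          % slope 2 in the plot indicates a correct gradient
    x = steepestdescent(problem);    % GD with backtracking line-search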


You could also optimize ADMM-style: optimize over both x and X, and incorporate a penalty for the distance between vec(X) and x. One possible reference for this is the MADMM paper.
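
As a rough illustration of the joint-variable idea (a simple quadratic coupling on the product manifold, which is my own sketch and not MADMM itself; n, m, A as in the sketch above, rho a placeholder weight):

    elements.x = complexcirclefactory(n*m);      % unit-modulus complex vector
    elements.X = stiefelcomplexfactory(n, m);    % complex Stiefel: X^H X = I
    problem2.M = productmanifold(elements);

    % Couple x with sqrt(n)*vec(X): unit-modulus columns have norm sqrt(n),
    % so X^H X = I here corresponds to X^H X = n I in the original problem.
    rho = 1;
    problem2.cost = @(S) real(S.x'*A*S.x) + rho*norm(S.x - sqrt(n)*S.X(:))^2;
    problem2.egrad = @(S) struct( ...
        'x', 2*A*S.x + 2*rho*(S.x - sqrt(n)*S.X(:)), ...
        'X', -2*rho*sqrt(n)*reshape(S.x - sqrt(n)*S.X(:), n, m));

    checkgradient(problem2);
    S = steepestdescent(problem2);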


Best,
Nicolas

khalhuj...@gmail.com

Mar 8, 2018, 2:25:14 PM
to Manopt
Many thanks, Nicolas.

I am working on your suggestions.
