Non-Convex Optimization Using Manifold Optimization

ujjwal jha

Mar 27, 2022, 6:20:02 PM
to Manopt
Hello,
I have been working on solving a non-convex optimization problem; it is attached as a screenshot. I have a rough idea of how to solve it using manifold optimization, but I am still not sure whether it applies. Could anyone familiar with this kind of problem help me find a solution to this optimization problem, where the feasible region is the area outside the unit circle and m, h, x are vectors?
Hoping for your response!

Sincerely,
Ujjwal
Screenshot (1575).png

Nicolas Boumal

Mar 28, 2022, 4:23:33 AM
to Manopt
Hello Ujjwal,

The short answer is that the inequality constraints do not define a manifold, hence manopt does not offer a direct way to handle them. This being said, manopt may still be helpful to address some reformulations.

I'm assuming m and x are vectors, not necessarily of the same size (I imagine x has fewer entries than m).
Are they complex or real?

If they are real, then I would split m into two components: one that is orthogonal to all h_k, and one that is a linear combination of them. Let us write:

m = m_1 + m_2

with m_1 orthogonal to all h_k, and m_2 = sum_k alpha_k h_k  where the alpha_k are unknown coefficients.

Since ||m||^2 = ||m_1||^2 + ||m_2||^2 and since the constraints are independent of m_1, we should just take m_1 to be the shortest possible vector that is orthogonal to all h_k, that is, we should set m_1 = 0.

Thus, we can rewrite the problem as: we are looking for m which is a linear combination of the h_k, whose norm is minimal, and which satisfies certain constraints. The whole problem can then be expressed in terms of the coefficients alpha_k, and it may take a cleaner form. In particular, I expect the h_k will appear only through their Gram matrix.
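Concretely, writing H = [h_1 ... h_K] for the matrix with the h_k as columns and G = H^T H for their Gram matrix (sticking to the real case here), m = H alpha gives ||m||^2 = alpha^T G alpha, and h_k^T m is just the k-th entry of G alpha, so each constraint becomes a condition on (G alpha)_k. Both the cost and the constraints then involve the h_k only through G.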

Maybe something like that still works for complex vectors; I didn't think about it.

Also, this problem reminds me of SVM (support vector machines) -- there are all sorts of standard tricks there to reformulate the problem in effective ways.

Best,
Nicolas

ujjwal jha

Mar 29, 2022, 2:59:06 AM
to Manopt
Hello,
m and h are complex vectors: h_k is a member of C^{N x 1}, while m is a member of C^N.
Also, after reading a few other sources, I realised that this is a non-convex QCQP problem, but I am not sure how to formulate it as such. Thank you for your idea, but I could not completely follow the approach you described. Would you mind explaining it to me again?


Sincerely,
Ujjwal


Nicolas Boumal

Mar 29, 2022, 3:13:30 AM
to Manopt
Hello,

My recommendation is to try the following: let the unknown m = a_1 h_1 + ... + a_K h_K be a linear combination of the given h_1, ..., h_K, where the (complex) coefficients a_1, ..., a_K are unknown. Collect the coefficients in a complex vector $a$.

Now, express the cost function ||m||^2 as a function of $a$ directly, and likewise express the constraints $|h_k^H m|^2 >= 1$ in terms of $a$ directly. (There is no need for $x$, really). Note that I squared |h_k^H m|: this might be helpful to get clean expressions without square roots when you take the complex modulus.

Overall, you should get a quadratic cost function in $a$, and quadratic constraints in $a$. You could then resort to typical constrained optimization algorithms such as Augmented Lagrangian Methods and the like: see Nocedal and Wright's book for example, but there are also more modern references.
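For instance, here is a rough sketch of that reformulation in code (a sketch only, with fmincon standing in for a generic constrained solver, and with H assumed to collect h_1, ..., h_K as its columns):

    % Sketch only: the QCQP in the complex coefficient vector a, handed to
    % fmincon (one example of a generic constrained solver) by stacking the
    % real and imaginary parts of a into a real vector z.
    K = size(H, 2);
    G = H' * H;                                % K-by-K Gram matrix (Hermitian)
    toA  = @(z) z(1:K) + 1i * z(K+1:end);      % real 2K-vector -> complex a
    cost = @(z) real(toA(z)' * G * toA(z));    % ||m||^2 = a^H G a
    % constraints: |h_k^H m|^2 = |(G a)_k|^2 >= 1, i.e. 1 - |(G a)_k|^2 <= 0
    nonlcon = @(z) deal(1 - abs(G * toA(z)).^2, []);
    z0 = randn(2*K, 1);
    z  = fmincon(cost, z0, [], [], [], [], [], [], nonlcon);
    a  = toA(z);
    m  = H * a;   % recover m; note this is a local solution of a non-convex problem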

Best,
Nicolas

ujjwal jha

Apr 3, 2022, 7:32:09 AM
to Manopt
Hello,
For the problem I had, the feasible region was outside the sphere, i.e. |x_k| > 1. What if I want to optimise on the sphere instead, i.e. |x_k| = 1? Then we can use manifold optimisation, right? If so, how can I solve the optimisation problem for my case? I am actually not sure what steps I need to follow to solve an optimisation problem on a manifold. Would you mind providing me some assistance on how to solve a manifold optimisation problem for my case?

Sincerely,
Ujjwal

Nicolas Boumal

Apr 5, 2022, 3:06:29 AM
to Manopt
Hello Ujjwal,

The steps would be to write down the cost function and its gradient. You can work out the Euclidean gradient, and Manopt will take care of computing the Riemannian gradient from it.

You can take a look at the tutorial page on manopt.org: it explains how to go through the steps of selecting a manifold (there is a list too), defining your cost function and gradient, and putting all of that into a problem structure that you can pass to a solver. You can also pass that problem structure to checkgradient, to check that the gradient is correct.

If you have a very recent version of Matlab, you can even just define problem.M = ... (your manifold)...; and problem.cost = @(x) ... (your cost function) ...; then call problem = manoptAD(problem); to use Automatic Differentiation to get the gradient and Hessian automatically. There are some limitations (you may need to express your cost function in simpler terms), but it works rather well overall.
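
For instance, a minimal sketch of that workflow could look like the following (with a purely illustrative cost, not your exact problem, and with H assumed to collect the h_k as its columns):

    % Minimal Manopt sketch with an illustrative cost: maximize sum_k |h_k^H x|^2,
    % i.e. minimize -x^H (H H^H) x, over complex vectors with unit-modulus entries.
    N = size(H, 1);
    A = H * H';                              % Hermitian positive semidefinite
    problem.M = complexcirclefactory(N);     % {x in C^N : |x_k| = 1 for all k}
    % (use spherecomplexfactory(N) instead if the constraint is ||x|| = 1)
    problem.cost  = @(x) -real(x' * A * x);
    problem.egrad = @(x) -2 * A * x;         % Euclidean gradient; Manopt converts it
                                             % to the Riemannian gradient
    checkgradient(problem);                  % numerical check of the gradient
    x = trustregions(problem);               % or another solver from the list
    % With a recent Matlab, you can instead omit problem.egrad and call
    % problem = manoptAD(problem); to get the gradient (and Hessian) automatically.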

Best,
Nicolas

ujjwal jha

Apr 5, 2022, 3:20:39 PM
to Manopt
Hello, 
I have used the Lagrangian method suggested by Nicolas Boumal and transformed my original optimization problem into a Lagrangian (dual) form. The resulting formulation seems to call for the semidefinite relaxation (SDR) method, but I am not sure how to apply it properly. Can you please guide me? Thanks!

Screenshot (1616).png

Nicolas Boumal

Apr 7, 2022, 3:44:46 PM
to Manopt
Hello,
To solve a semidefinite program, your best bet would be to use CVX: http://cvxr.com/cvx/doc/quickstart.html
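For instance, assuming the problem to relax is the QCQP discussed earlier (min ||m||^2 subject to |h_k^H m|^2 >= 1, with H collecting the h_k as its columns; your screenshot may differ), a CVX sketch could look like this:

    % Sketch only: semidefinite relaxation obtained by setting X = m m^H
    % and dropping the rank-one constraint on X.
    [N, K] = size(H);
    cvx_begin sdp
        variable X(N, N) hermitian semidefinite
        minimize( real(trace(X)) )
        subject to
            for k = 1:K
                real(H(:, k)' * X * H(:, k)) >= 1;
            end
    cvx_end
    % If X is (numerically) rank one, recover m from its leading eigenvector;
    % otherwise a randomization / rounding step is needed.
    [v, lambda] = eigs(X, 1);
    m = sqrt(lambda) * v;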
Best,
Nicolas

ujjwal jha

Apr 9, 2022, 7:32:48 PM
to Manopt

Hi, 

Can any of you suggest CVX code to implement this optimization problem? I am actually confused about how to implement it.

Thank you!



Screenshot (1620)_LI.jpg