Conjugate Gradient Opt with Complex Circle Manifold Not Iterating


Askin Altinoklu

Apr 12, 2024, 7:32:26 AM
to Manopt
Hello,
First of all, many thanks for your great contribution with the Manopt toolbox.

I am trying to solve the following problem, but the optimization is not iterating: it stays stuck at the initial guess vector.

A = A_w1; % complex, Hermitian, PSD matrix
n = size(A_w1, 1);
manifold = complexcirclefactory(n,1);
problem.M = manifold;
problem.cost = @(x) -real(x'*(A*x));
problem.egrad = @(x) -(2*A*x); % notice the 'e' in 'egrad' for Euclidean
checkgradient(problem);
[x, xcost, info, options] = conjugategradient(problem);

A is N-by-N, complex, Hermitian and PSD; x is an N-by-1 complex vector with abs(x_i) = 1 for each element of x. Is there any particular condition I need to check on my A matrix to make sure the problem is well posed? For a randomly generated A matrix it works, but for my particular A matrix, which comes from evaluating some physical equations, it does not.



The command window output is as follows.
The linear model appears to be exact (within numerical precision),
hence the slope computation is irrelevant.
The residual should be 0, or very close. Residual: 2.7203e-24.
If it is far from 0, then the gradient is not in the tangent space.
In certain cases (e.g., hyperbolicfactory), the tangency test is inconclusive.
 iter                cost val     grad. norm
    0 -5.4058595235481914e-09 1.03394503e-08
Gradient norm tolerance reached; options.tolgradnorm = 1e-06.
Total time is 0.399878 [s] (excludes statsfun)


Best Regards,
Askin


Nicolas Boumal

Apr 13, 2024, 6:17:31 AM
to Manopt
Hello Askin,

Thank you for posting your question with all the helpful details.

Your code looks fine to me. I ran it with a random Hermitian matrix A generated as follows:

n = 10;
A = randn(n) + 1i*randn(n);
A = (A+A')/2;

And the method behaves fine. Running your code on this matrix A, I get this output:

manifold = complexcirclefactory(n,1);
problem.M = manifold;
problem.cost  = @(x) -real(x'*(A*x));
problem.egrad = @(x) -(2*A*x);      % notice the 'e' in 'egrad' for Euclidean
checkgradient(problem);
[x, xcost, info, options] = conjugategradient(problem);


The slope should be 2. It appears to be: 1.99998.
If it is far from 2, then directional derivatives might be erroneous.
The residual should be 0, or very close. Residual: 1.6291e-15.
If it is far from 0, then the gradient is not in the tangent space.
In certain cases (e.g., hyperbolicfactory), the tangency test is inconclusive.
 iter                cost val     grad. norm
    0 +8.9517469513430434e+00 1.26779023e+01
    1 -2.2137500517750137e+00 1.16026939e+01
    2 -1.0079667313835236e+01 8.21149715e+00
    3 -1.4255934292774189e+01 5.62922575e+00
    4 -1.7862462648699253e+01 3.75557074e+00
    5 -2.0476114928862774e+01 3.50936007e+00
...
   48 -2.9353165326766366e+01 7.44938409e-06
   49 -2.9353165326772114e+01 4.28791094e-06
   50 -2.9353165326774622e+01 2.45076314e-06
   51 -2.9353165326775756e+01 1.63378774e-06
   52 -2.9353165326775919e+01 7.75928002e-07

Gradient norm tolerance reached; options.tolgradnorm = 1e-06.
Total time is 0.503722 [s] (excludes statsfun)

Perhaps there is something special about the matrix A in your code?
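If it helps, here is a quick sanity check you could run on your matrix before optimizing. This is just a sketch; it assumes your matrix A is already in the workspace, as in your snippet:

```matlab
% Diagnostics for the matrix A: symmetry, positive semidefiniteness, and scale.
fprintf('Hermitian error: %g\n', norm(A - A', 'fro'));      % should be ~0
fprintf('Min eigenvalue:  %g\n', min(eig((A + A')/2)));     % >= 0 (up to round-off) if PSD
fprintf('Frobenius norm:  %g\n', norm(A, 'fro'));           % overall scale of A
```

If the Frobenius norm is tiny (say, around 1e-8, like the gradient norm in your output), the gradient at the initial point is already below the default tolerance, which would explain why the solver stops at iteration 0.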

In your run, the gradient norm tolerance is reached immediately. Perhaps A should be scaled up (made bigger), or options.tolgradnorm should be reduced?
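Concretely, either fix could look like this (a sketch; the specific scale factor and tolerance are illustrative, and rescaling A by a positive constant does not change the minimizers):

```matlab
% Option 1: rescale A so cost and gradient are not at numerical noise level.
A = A / norm(A, 'fro');

% Option 2: request a tighter gradient tolerance instead (default is 1e-6).
options.tolgradnorm = 1e-12;
[x, xcost, info, options] = conjugategradient(problem, [], options);
```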

Nicolas

Askin Altinoklu

Apr 14, 2024, 7:43:29 AM
to Manopt
Hi Nicolas,

Thank you for the quick response. You were right: the entries of my matrix A are very small in magnitude. Your suggestion solved my problem. Thank you for your contribution.

Best Regards,
Askin

On Saturday, 13 April 2024 at 11:17:31 UTC+1, Nicolas Boumal wrote: