Proximal Point Method on Manifold

Moin Uddin

Mar 19, 2024, 8:46:14 AM
to Manopt
Dear Manopt Team,

I am writing MATLAB code with Manopt for the following problem: minimize g(x) - h(x) over the manifold of symmetric positive definite matrices, where g(x) = (ln(det(x)))^2 and h(x) = 2*ln(det(x)).
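In case it helps: assuming the affine-invariant metric that sympositivedefinitefactory uses, where grad f(x) = x*egrad f(x)*x for symmetric positive definite x, the gradients work out to

    egrad g(x) = 2*log(det(x))*inv(x)   =>   grad g(x) = 2*log(det(x))*x
    egrad h(x) = 2*inv(x)               =>   grad h(x) = 2*x

and the Riemannian expressions on the right are what gradg and gradh in the code below compute.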
Here is my code:

function Proximal_Point_Algorithm()
    n = 10;

    % Create the problem structure.
    problem.M = sympositivedefinitefactory(n);
    M = sympositivedefinitefactory(n);

    % Define the problem cost function and its gradient.
    g = @(x) (log(det(x)))^2;
    gradg = @(x) 2*log(det(x))*x;
    h = @(x) 2*log(det(x));
    gradh = @(x) 2*x;

    % Build the proximal subproblem around the point x.
    A = randn(n);
    x = .5*(A + A');
    w = gradh(x);
    z = M.exp(x, w, 0.5);

    problem.cost  = @(x) g(x) + 0.25*M.dist(x, z)^2;
    % problem.egrad = @(x) 2*log(det(x)) - 0.25*M.logarithm(x, z);
    problem.egrad = @(x) 2*log(det(x)) - 0.5*x*log(x*inv(z));
    % problem.egrad = @(x) 2*log(det(x)) - 0.5*x^0.5*log(x^-0.5*z*x^(-0.5))*x^0.5;

    checkgradient(problem);
    checkhessian(problem);

    % Solve.
    [x, xcost, info] = steepestdescent(problem);

    % Display some statistics.
    figure;
    semilogy([info.iter], [info.gradnorm], '.-');
    xlabel('Iteration #');
    ylabel('Gradient norm');
    title('Convergence of the proximal point algorithm on positive definite matrices')
end

I am using the algorithm from the attached PDF, but I encountered an error. I would be very thankful for any help and suggestions.



2020_COA_A modified proximal point method for DC functions on Hadamard manifold.pdf

Nicolas Boumal

Mar 19, 2024, 8:56:15 AM
to Manopt
Hello,

checkgradient prints out the following text:

The slope should be 2. It appears to be: 1.
If it is far from 2, then directional derivatives might be erroneous.
The residual should be 0, or very close. Residual: 2.50622e-16.
If it is far from 0, then the gradient is not in the tangent space.
In certain cases (e.g., hyperbolicfactory), the tangency test is inconclusive.

This indicates that the gradient is incorrect. The first step is to fix that.
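For what it's worth, here is a sketch of what the Riemannian gradient of your cost could look like (assuming z is fixed and the affine-invariant metric of sympositivedefinitefactory; please check it against your paper). Note that it would go into problem.grad (a Riemannian gradient), not problem.egrad:

    % grad of g(x) = log(det(x))^2 is 2*log(det(x))*x in this metric,
    % and grad of 0.5*dist(x, z)^2 is -M.log(x, z), hence:
    problem.grad = @(x) 2*log(det(x))*x - 0.5*M.log(x, z);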

Before that output is printed, there are also a few warnings:

Warning: Principal matrix logarithm is not defined for A with nonpositive real
eigenvalues. A non-principal matrix logarithm is returned.
> In logm (line 91)
In sympositivedefinitefactory>@(X,Y)real(trAA(real(logm(X\Y)))) (line 101)
In @(x)g(x)+0.25*M.dist(x,z)^2
In getCost (line 59)


These warnings might offer a clue as to what is going on. It seems logm (matrix logarithm) is applied to matrices that are not positive definite. 

Maybe part of the issue is this bit in your code:

     A = randn(n);
     x= .5*(A+A');


There is no reason for x to be positive definite here. Don't know if that's needed, but thought I'd mention it.
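If you do want a positive definite starting point there, one way (just a sketch) would be:

    x = M.rand();          % random symmetric positive definite matrix from the factory
    % or, more explicitly:
    A = randn(n);
    x = A*A' + eye(n);     % A*A' is positive semidefinite; adding the identity makes it definite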

Best,
Nicolas

Ronny Bergmann

Mar 20, 2024, 10:15:19 AM
to Manopt
Hi,
maybe it would also be easier to directly work with the Riemannian gradients here?

A Julia version of this problem (with slightly different exponents) passes these checks fine. So besides the initial point, I would also first check your gradients.

Besides that, once the first step works, you of course still have to iterate the gradient descent calls (which solve the inner proximal maps) to get the actual DCPPA.
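Schematically, that outer loop could look roughly like this (only a sketch reusing g, gradg, gradh, M and problem.M from your code; the proximal parameter, the step in the exponential map and the stopping criterion should be taken from the paper):

    x = M.rand();                          % positive definite starting point
    for k = 1:20
        w = gradh(x);                      % (sub)gradient of h at the current iterate
        z = M.exp(x, w, 0.5);              % shifted point defining the proximal term
        problem.cost = @(y) g(y) + 0.25*M.dist(y, z)^2;
        problem.grad = @(y) gradg(y) - 0.5*M.log(y, z);   % Riemannian gradient of the model
        x = steepestdescent(problem, x);   % inner solve of the proximal map
    end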

Finally, “I encountered an error” is a bit vague; if you could provide a few more details about the error, that would also be helpful.

Best
Ronny