In a previous post on this forum, you kindly provided the code for handling optimization problems on symmetric positive definite (SPD) matrices, and it works pretty well for various cost functions on this manifold.
I am currently trying to optimize a cost function of the form:
F(X) = Trace(log(X^0.5 A X^0.5) log(X^0.5 B X^0.5)), where X, A, and B are SPD matrices and X^0.5 is the matrix square root of X.
It turns out that computing the gradient of F (with respect to X) is a rather tricky question. Indeed, when checking my gradient numerically with Manopt, it seems that I made a mistake in my computation.
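For concreteness, the cost above can be evaluated and probed with finite differences as in the following sketch (Python with SciPy here, purely as a reference implementation of the formula; the matrices `A`, `B`, `X` and the direction `E` are placeholders):

```python
import numpy as np
from scipy.linalg import logm, sqrtm

def cost(X, A, B):
    """F(X) = trace( log(X^0.5 A X^0.5) log(X^0.5 B X^0.5) )."""
    Xh = sqrtm(X)  # matrix square root of the SPD matrix X
    return np.trace(logm(Xh @ A @ Xh) @ logm(Xh @ B @ Xh)).real

def dir_derivative(X, A, B, E, h=1e-6):
    """Central finite-difference derivative of F at X along a symmetric direction E.

    A candidate Euclidean gradient G can be checked against this value,
    which should match trace(G @ E) for symmetric G and E.
    """
    return (cost(X + h * E, A, B) - cost(X - h * E, A, B)) / (2 * h)
```

One sanity check worth noting: at X = A^{-1} we get X^0.5 A X^0.5 = I, so log(...) vanishes and F = 0.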
Even if it is not directly related to the Manopt toolbox, I was wondering whether you might have some hints about the calculation of this gradient?
Thanks again for sharing all this work under an open-source licence!
Florian
Thank you for your answers.
Indeed, my cost function is very similar to the function g.
I will implement the gradient to check this out (thanks for your source code) and I will let you know.
To answer your question, I am looking for the Riemannian gradient (and I intend to use the affine invariant metric).
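For the affine-invariant metric, once a Euclidean gradient is in hand, the conversion to the Riemannian gradient follows the standard rule rgrad = X sym(egrad) X (this is, if I recall correctly, also what the `egrad2rgrad` of Manopt's `sympositivedefinitefactory` implements). A minimal Python sketch, where `egrad` is a placeholder for the Euclidean gradient:

```python
import numpy as np

def sym(M):
    # Symmetric part of a matrix.
    return 0.5 * (M + M.T)

def egrad2rgrad(X, egrad):
    # Riemannian gradient under the affine-invariant metric
    # <U, V>_X = trace(X^-1 U X^-1 V): rgrad = X sym(egrad) X.
    return X @ sym(egrad) @ X
```

The defining property is that the affine-invariant inner product of `rgrad` with any symmetric direction V equals the Euclidean directional derivative trace(sym(egrad) V).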
For now, after reading those two papers, the details of how to compute the gradient are not totally clear to me. Anyway, thanks for your help!
This is indeed very helpful!
Thanks again for your help.