Thanks very much for your time and kind help.
I have worked out the gradient and the Hessian of my objective function.
Now I would like to solve this minimization problem with the ARC algorithm in the 'manopt' toolbox.
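For concreteness, here is a minimal sketch of how I plan to call the solver. It assumes the `arc` solver shipped with Manopt; the Rayleigh-quotient cost on the sphere and the matrix `A` are just placeholders standing in for my actual problem:

```matlab
% Minimal sketch: a placeholder Rayleigh-quotient cost on the sphere,
% standing in for my actual objective; A is a made-up symmetric matrix.
n = 100;
A = randn(n);
A = (A + A')/2;                   % symmetrize the random test matrix

problem.M = spherefactory(n);     % optimize over the unit sphere in R^n
problem.cost  = @(x) -x'*(A*x);   % objective f(x) = -x'*A*x
problem.egrad = @(x) -2*(A*x);    % Euclidean gradient of f
problem.ehess = @(x, u) -2*(A*u); % Euclidean Hessian of f along u

% Manopt converts egrad/ehess to their Riemannian counterparts.
[x, xcost, info] = arc(problem);  % run adaptive regularization with cubics
```

If my actual setup should be structured differently, please let me know.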
On page 26 of your paper "Adaptive regularization with cubics on manifolds", condition (3) is not checked when solving the subproblem for ARC. As far as I know, to avoid the saddle points of an optimization problem, the gradient of the cost function should be zero and the Hessian matrix should be positive definite. If only the first-order condition is checked, can it be guaranteed that saddle points are avoided?
Moreover, in conditions (2) and (3) of the paper, the gradient is not required to be exactly zero, only to have norm below a tolerance, and the Hessian is not required to be strictly positive definite, only to have its smallest eigenvalue bounded from below. What is the reason for using these relaxed conditions? To make sure we are talking about the same thing, I have written out the conditions as I understand them below.
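This is my reading of the approximate first- and second-order conditions; the tolerances and the sign of the eigenvalue bound are my own notation and may not match the paper exactly:

```latex
% Conditions (2) and (3) as I read them; \varepsilon_g, \varepsilon_H > 0
% are small tolerances (my notation, possibly different from the paper's):
\|\operatorname{grad} f(x)\| \le \varepsilon_g
\qquad\text{and}\qquad
\lambda_{\min}\!\big(\operatorname{Hess} f(x)\big) \ge -\varepsilon_H
```

Please correct me if I have misread either condition.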
Thanks for your time again.