Hello, Mr. Boumal
I am sorry for the confusion.
My problem is to jointly optimize W and U using gradient-based algorithms (e.g., SGD).
Matrix U is required to be orthogonal, while I place no constraint on matrix W.
To this end, I first need to compute the derivatives of the upper-level objective function f() w.r.t. W and U.
Then, I iteratively update these variables using their gradients until convergence.
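To make the intended loop concrete, here is a minimal NumPy sketch of what I have in mind. Everything specific in it is a placeholder for illustration, not my actual problem: the toy cost f(W, U) = ||A - U W||_F^2, the matrix A, the step size, and the QR-based retraction are all assumptions. The point is only the structure: a plain gradient step on W, and a projected (Riemannian) gradient step plus retraction to keep U orthonormal.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy cost: f(W, U) = ||A - U @ W||_F^2,
# with U (n x p) constrained to have orthonormal columns (Stiefel manifold)
# and W (p x m) unconstrained. A, n, p, m are made up for this demo.
n, p, m = 10, 3, 5
A = rng.standard_normal((n, m))

def cost(W, U):
    R = A - U @ W
    return float(np.sum(R * R))

def grads(W, U):
    R = A - U @ W              # residual
    gW = -2.0 * U.T @ R        # Euclidean gradient w.r.t. W
    gU = -2.0 * R @ W.T        # Euclidean gradient w.r.t. U
    return gW, gU

def riemannian_grad(U, gU):
    # Project the Euclidean gradient onto the tangent space of the
    # Stiefel manifold at U: gU - U * sym(U.T @ gU).
    S = U.T @ gU
    return gU - U @ ((S + S.T) / 2.0)

def retract(U, step):
    # QR-based retraction: map U + step back onto the manifold.
    Q, R = np.linalg.qr(U + step)
    return Q * np.sign(np.diag(R))  # sign fix so the retraction is canonical

# Initialize U with orthonormal columns, W arbitrary.
U, _ = np.linalg.qr(rng.standard_normal((n, p)))
W = rng.standard_normal((p, m))
c0 = cost(W, U)

lr = 0.01
for _ in range(500):
    gW, gU = grads(W, U)
    W = W - lr * gW                    # plain gradient step on W
    rgU = riemannian_grad(U, gU)
    U = retract(U, -lr * rgU)          # Riemannian step keeps U.T @ U = I

# Orthonormality is preserved up to machine precision.
orth_err = np.linalg.norm(U.T @ U - np.eye(p))
```

My question is essentially how to express this kind of mixed update (one unconstrained variable, one manifold-constrained variable, two coupled objectives) within Manopt's solver interface.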
However, I found it is not easy to carry out this process on the Grassmann manifold, since there are two different objective functions (i.e., f and g).
I also do not know exactly how to define the 'cost' when using the solvers in Manopt.
As a result, I have failed to ensure the orthogonality of U.