Hi Chen Shuai,
this seems to be a continuation of the previous topic. Sorry for the delay: you first sent the PDF only to me, and (due to the Christmas break) I only saw your answer today; before I could check your link, you had already posted it here again.
In that PDF the authors themselves write (p. 4, right column) that they obtain an optimization problem over an embedded Riemannian manifold once they include the last constraint (which is more than what I first wrote). Their Section 3 is even devoted to algorithms like gradient descent and Newton's method, both of which are part of Manopt in exactly that form, though their manifold of oriented spheres is not.
The trick here is that the phrasing as an optimization problem on a manifold turns the original constrained problem (16) from the paper into an unconstrained problem on the manifold, so one can, for example, use plain quasi-Newton instead of more advanced constrained-optimization algorithms. The small cost is of course that instead of + (in R^n) you have to use a retraction (informally speaking).
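To make that last point concrete, here is a minimal NumPy sketch (not using Manopt itself, and not the problem from the paper) of gradient descent on the unit sphere: the Euclidean gradient is projected onto the tangent space, and the "+ step" in R^n is followed by a retraction (here simply renormalizing) back onto the manifold. The cost function (a Rayleigh quotient) and all names are just illustrative.

```python
import numpy as np

def sphere_gradient_descent(A, x0, step=0.1, iters=2000):
    """Minimize f(x) = x^T A x over the unit sphere (illustrative example)."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(iters):
        egrad = 2 * A @ x                 # Euclidean gradient of x^T A x
        rgrad = egrad - (x @ egrad) * x   # project onto the tangent space at x
        y = x - step * rgrad              # ordinary "+" step in R^n ...
        x = y / np.linalg.norm(y)         # ... followed by a retraction onto the sphere
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
A = (A + A.T) / 2                         # symmetric test matrix
x = sphere_gradient_descent(A, rng.standard_normal(5))
# x approximates an eigenvector for the smallest eigenvalue of A
```

The constraint ||x|| = 1 never appears explicitly: the retraction keeps every iterate on the manifold, which is exactly why an unconstrained method like quasi-Newton can be applied.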
Best,
Ronny
PS: “Dear Professor” is nice, but you are writing to a whole mailing list; some of the readers are professors (like Nicolas or myself), but some are not.