Hello Hiro!
I think one of the advantages of using the orthonormal representation is that it naturally takes care of some of the invariance and, more importantly, that it avoids ill-conditioning: orthonormal matrices are perfectly conditioned (all their singular values equal 1), and that's a nice property to have.
The price you pay for this is that you need to re-orthonormalize the matrices at each iteration. That's quite okay for most applications, all the more so considering that in the non-orthonormal version you have to invert small matrices anyway, but for some applications it might be a reason not to work with orthonormal matrices and to try something else instead.
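To make the two points above concrete, here is a small NumPy sketch (sizes are hypothetical): an orthonormal representative of a subspace has condition number 1, and after a step along a tangent direction one can re-orthonormalize with a thin QR factorization, which is one standard way to retract back to orthonormal matrices.

```python
import numpy as np

# Hypothetical sizes for illustration: a p-dimensional subspace of R^n.
n, p = 50, 3
rng = np.random.default_rng(0)

# An orthonormal representative of a subspace: Q from a thin QR factorization.
Y, _ = np.linalg.qr(rng.standard_normal((n, p)))

# Orthonormal matrices are perfectly conditioned: all singular values are 1.
print(np.linalg.cond(Y))  # -> 1.0 (up to rounding)

# After a step along a tangent direction, the columns are no longer exactly
# orthonormal, so we re-orthonormalize; a QR-based retraction does both at once.
V = rng.standard_normal((n, p))
V -= Y @ (Y.T @ V)                # horizontal tangent direction: Y^T V = 0
Y_new, _ = np.linalg.qr(Y + 0.1 * V)
print(np.allclose(Y_new.T @ Y_new, np.eye(p)))  # -> True
```

The cost of the QR step is O(np^2), which is typically negligible next to the cost function and gradient evaluations.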
I hope this made sense. Two further comments:
1) regarding the Grassmann geometry in Edelman et al., be careful with the metric: I remember that at least one geometry proposed in their paper uses a different metric than the one used in the Manopt implementation.
2) in general, Manopt will work with any proper representation of the Grassmannian, with any proper metric; Manopt ships with one ready-to-use factory, but if you feel up to it, you could write a different factory for a different representation: all the solvers would still work, and it would be easy to compare the merits of the different factories.
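To illustrate the idea behind point 2: a Manopt factory is essentially a bundle of the few geometric operations a solver needs (inner product, projection, retraction, random point), and any solver written against that interface works with any representation. Here is a toy Python analogue of that design, not Manopt's actual MATLAB API, with a few steps of Riemannian gradient descent on the hypothetical cost f(Y) = -trace(Y'AY):

```python
import numpy as np

def grassmann_factory(n, p):
    """Toy analogue of a Manopt factory (a sketch, not Manopt's actual API):
    a bundle of the operations a generic solver needs."""
    def rand():
        # Fixed seed so the sketch is reproducible; a real factory would not do this.
        Q, _ = np.linalg.qr(np.random.default_rng(1).standard_normal((n, p)))
        return Q
    return {
        "inner": lambda Y, U, V: np.sum(U * V),       # canonical metric
        "proj":  lambda Y, U: U - Y @ (Y.T @ U),      # project to horizontal space
        "retr":  lambda Y, U: np.linalg.qr(Y + U)[0], # QR-based retraction
        "rand":  rand,
    }

# A "solver" that only touches the factory's interface: gradient descent on
# f(Y) = -trace(Y' A Y), whose minimizers span the dominant subspace of A.
n, p = 20, 2
M = grassmann_factory(n, p)
A = np.diag(np.arange(n, dtype=float))
Y = M["rand"]()
for _ in range(200):
    egrad = -2 * A @ Y              # Euclidean gradient of f
    rgrad = M["proj"](Y, egrad)     # Riemannian gradient
    Y = M["retr"](Y, -0.01 * rgrad)
```

Swapping in a different representation (or metric) then only means writing another factory with the same four fields; the solver loop does not change, which is what makes side-by-side comparisons easy.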
Cheers,
Nicolas