In the dominant invariant subspace example in the package, the optimization
problem is max tr(X' A X), where X \in \Gr(n,k) and A is symmetric. The maximum value should equal the sum of the k largest eigenvalues of A. However, I find that in most cases the program converges to a local optimum whose value is quite far from the global one. It works very well when A is diagonal. I wonder what the reason behind this is. Could it be related to the stopping criteria of this particular implementation of the trust-region method?
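For concreteness, here is a small NumPy sanity check of the claimed optimum (my own sketch, not the package's example code): for symmetric A, the sum of the k largest eigenvalues is attained by tr(X' A X) when the columns of X span the top-k eigenspace.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 10, 3

# Random symmetric A (symmetrizing matters for the theory to apply).
M = rng.standard_normal((n, n))
A = (M + M.T) / 2

# np.linalg.eigh returns eigenvalues in ascending order,
# so the last k entries are the k largest.
eigvals, eigvecs = np.linalg.eigh(A)
optimum = eigvals[-k:].sum()

# Evaluate tr(X' A X) at the dominant invariant subspace
# (orthonormal basis of the top-k eigenvectors).
X = eigvecs[:, -k:]
value = np.trace(X.T @ A @ X)

print(optimum, value)  # these two agree
```

The gap I am asking about is between this `optimum` and the value the solver actually converges to.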
Thanks a million!
Best,
Ken