Hi there
I have a problem: maximise the sum over i of trace(A' * Ki * A * A' * Ki * A), where each Ki is a given symmetric D-by-D matrix for i = 1, ..., n and A is D-by-d. The constraint is A' * A = I. Since the objective is invariant to right-multiplication of A by an orthogonal matrix, this is an optimisation over the Grassmann manifold. However, when I implement this with conjugate gradient (or even steepest descent), it does not converge, which suggests my gradient calculation is incorrect. The gradient-check output is as follows:
The slope should be 2. It appears to be: 2.00001.
If it is far from 2, then directional derivatives might be erroneous.
The residual should be 0, or very close. Residual: 8.50465e-18.
If it is far from 0, then the gradient is not in the tangent space.
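For reference, here is a sketch of the reasoning behind the Euclidean gradient I use, written in the same notation as above and assuming each Ki is symmetric:

fi(A)       = -trace(A' * Ki * A * A' * Ki * A)
egrad fi(A) = -4 * Ki * A * (A' * Ki * A)

Summing over i gives the full Euclidean gradient, which I then project with egrad2rgrad.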
Here is my cost function and gradient. Thanks to anyone who helps me with this!!
function [f, g] = mycostgrad(K, manifold, A)
% Cost f(A) = -sum_i trace((A'*Ki*A)^2) and its Riemannian gradient.
    [~, ~, N] = size(K);
    [D, d] = size(A);
    f = 0;
    A_K_A = zeros(d, d, N);
    for i = 1:N
        A_Ki_A = A' * K(:,:,i) * A;          % d-by-d projection of Ki
        A_K_A(:,:,i) = A_Ki_A;
        f = f - trace(A_Ki_A * A_Ki_A);      % minus sign: maximise via minimisation
    end
    if nargout == 2
        g = zeros(D, d);
        for i = 1:N
            % Euclidean gradient of -trace((A'*Ki*A)^2), Ki symmetric
            g = g - 4 * K(:,:,i) * A * A_K_A(:,:,i);
        end
        g = manifold.egrad2rgrad(A, g);      % project onto the tangent space
    end
end
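In case it helps, here is a minimal sketch of how I set up and run the problem. The sizes and the random symmetric Ki below are just placeholders, and the calls are the standard Manopt ones (grassmannfactory, checkgradient, conjugategradient):

% Placeholder problem data (random symmetric Ki), just for illustration.
D = 10; d = 3; N = 5;
K = randn(D, D, N);
for i = 1:N
    K(:,:,i) = (K(:,:,i) + K(:,:,i)') / 2;   % symmetrise each Ki
end

% Grassmann manifold of d-dimensional subspaces of R^D.
manifold = grassmannfactory(D, d);
problem.M = manifold;
problem.costgrad = @(A) mycostgrad(K, manifold, A);

checkgradient(problem);                       % prints the slope/residual diagnostics
[A_opt, f_opt] = conjugategradient(problem);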