I have a real-valued loss function as follows:
$$L(Q) = \left| h\, Q\, Q^T g \right|^2,$$
where $h \in \mathbb{C}^{1 \times K}$, $g \in \mathbb{C}^{K \times 1}$, and $Q \in \mathbb{C}^{K \times K}$. I obtained the gradient of $L$ with respect to $Q^*$ as follows:
$$dL = 2\, h Q Q^T g \left( h^H g^H + g^* h^* \right) Q^* : dQ^*.$$
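For completeness, here is the intermediate step behind this expression (writing $A : X$ for $\operatorname{tr}(A^T X)$): with the scalar $s = h Q Q^T g$ we have $L = s s^*$ and $dL = s\, ds^* + s^*\, ds = 2 \operatorname{Re}\{ s\, ds^* \}$, and only $s^* = h^* Q^* Q^H g^*$ involves $Q^*$, so (using $dQ^H = (dQ^*)^T$)
$$s\, ds^* = h Q Q^T g \left( h^*\, dQ^*\, Q^H g^* + h^* Q^*\, dQ^H g^* \right) = h Q Q^T g \left( h^H g^H + g^* h^* \right) Q^* : dQ^*,$$
which, together with the factor 2 from the real part (the $\operatorname{Re}\{\cdot\}$ is left implicit), is the expression above.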
Moreover, the matrix $Q$ itself can be written as $Q = I_K - 2 v v^H$, where $v \in \mathbb{C}^{K \times 1}$ has unit norm ($\|v\| = 1$), i.e., it lives on the complex unit sphere handled by Manopt's spherecomplexfactory. I calculated the gradient of $L$ with respect to $v^*$ as follows, treating $v$ as constant while differentiating with respect to $v^*$, but the optimization problem is not converging:
$$dL = -4\, h \left( I_K - 2 v v^H \right) \left( I_K - 2 v v^H \right)^T g \left( h^H g^H + g^* h^* \right) \left( I_K - 2 v v^H \right)^* v : dv^*.$$
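As a sanity check on this expression, independently of Manopt, one can compare it against a finite-difference directional derivative of the cost. Below is a minimal sketch of such a check (the size K, the random test data, the step t, and the handle Lfun are placeholders I introduce here; real(G'*d) is the directional derivative the formula above predicts under the usual $\operatorname{Re}\{\cdot\}$ pairing of the gradient with the perturbation):

K  = 4;                                            % small placeholder size
hr = randn(1,K) + 1i*randn(1,K);                   % plays the role of h
ht = randn(K,1) + 1i*randn(K,1);                   % plays the role of g
v  = randn(K,1) + 1i*randn(K,1);  v = v/norm(v);   % random unit-norm point
d  = randn(K,1) + 1i*randn(K,1);                   % arbitrary perturbation direction
t  = 1e-7;                                         % finite-difference step

Lfun = @(v) abs(hr*((eye(K)-2*(v*v'))*(eye(K)-2*(v*v')).')*ht)^2;

Q = eye(K) - 2*(v*v');
s = hr*(Q*Q.')*ht;
G = -4*s*((conj(ht)*conj(hr)).' + conj(ht)*conj(hr))*conj(Q)*v;   % gradient from the expression above

fd   = (Lfun(v + t*d) - Lfun(v))/t;                % numerical directional derivative
pred = real(G'*d);                                 % what the hand-derived gradient predicts
fprintf('finite difference: %g, predicted: %g\n', fd, pred);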
Does anybody have an idea what the problem is? My code is as follows:
M = spherecomplexfactory(K, 1);            % complex unit-norm vectors v in C^K
P.M = M;
P.cost = @(v) -costFun(hr, ht, v);         % negated: the solver minimizes
P.egrad = @(v) -gradFun(hr, ht, v);        % Euclidean gradient of the negated cost
[v, ~] = conjugategradient(P, [], []);
checkgradient(P)
function c = costFun(hr, ht, v)
I = eye(length(hr));
Q = I - 2*(v*v');              % Q = I_K - 2*v*v^H
Theta = Q*Q.';                 % Q*Q^T
c = abs(hr*Theta*ht)^2;        % |h*Q*Q^T*g|^2
end
function g = gradFun(hr, ht, v)
I = eye(length(hr));
Q = I - 2*(v*v');
Theta = Q*Q.';
% matches 2*h*Q*Q^T*g*(h^H g^H + g^* h^*)*Q^* from the derivation above
G = 2*hr*Theta*ht*((conj(ht)*conj(hr)).' + (conj(ht)*conj(hr)))*conj(Q);
g = -2*G*v;                    % gradient with respect to v^*
end
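For reference, a driver around these same functions might look like the sketch below (the size K, the random data, and the solver options are placeholders); running checkgradient before the solver makes it easy to inspect Manopt's log-log error plot, which should show a slope of 2 over a range of step sizes when the cost and the Euclidean gradient are consistent:

K  = 4;                                    % placeholder size
hr = randn(1,K) + 1i*randn(1,K);
ht = randn(K,1) + 1i*randn(K,1);

P.M     = spherecomplexfactory(K, 1);      % unit-norm complex K x 1 vectors
P.cost  = @(v) -costFun(hr, ht, v);
P.egrad = @(v) -gradFun(hr, ht, v);

figure;
checkgradient(P);                          % inspect the slope before optimizing

opts.maxiter = 500;                        % placeholder solver options
[v, vcost, info] = conjugategradient(P, [], opts);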