kama...@gmail.com
Feb 20, 2015, 7:35:34 AM
to manopt...@googlegroups.com
Sorry for the ambiguity of my question and my English.
Here is a clearer version.
I am trying to optimize the cost
F(alpha) = sum_{i=1}^{N} trace( logm(A_i*X)' * logm(A_i*X) )
alpha = input vector of size 5x1
logm = matrix logarithm
A_i = my data, in the form of 4x4 matrices (so I have A_1, A_2, ..., A_N)
X = 4x4 matrix, which is a function of alpha
Thus, I am trying to find the best "alpha" that optimizes the cost F.
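
To make the setup concrete, here is a minimal MATLAB sketch of my cost. Xfun is just a placeholder handle for my (unspecified here) map from alpha to X, and A is assumed to be a 4x4xN array holding A_1, ..., A_N:

    % Minimal sketch of the cost F(alpha); Xfun (alpha -> X) is a placeholder.
    function f = costF(alpha, A, Xfun)
        X = Xfun(alpha);              % 4x4 matrix, function of alpha
        f = 0;
        for i = 1:size(A, 3)          % A is a 4x4xN array of data matrices
            L = logm(A(:, :, i) * X); % matrix logarithm of A_i * X
            f = f + trace(L' * L);    % i-th trace term
        end
    end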
Here is my attempt. I rewrite F(alpha) as the sum g'*g, where g is the vector [g_1, g_2, ..., g_N]' and each g_i is the square root of the i-th trace term. Therefore, my gradient would be 2*J'*g, where J is the Jacobian of g with respect to alpha.
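
Since sqrt(trace(L'*L)) is just the Frobenius norm of L, a sketch of the residual vector (same placeholder Xfun as above) is:

    % Residual vector g, with g_i = sqrt of the i-th trace term,
    % i.e. the Frobenius norm of logm(A_i * X).
    X = Xfun(alpha);
    g = zeros(N, 1);
    for i = 1:N
        g(i) = norm(logm(A(:, :, i) * X), 'fro');
    end
    % With the N x 5 Jacobian J = dg/d(alpha), the gradient of F = g'*g is:
    % gradF = 2 * J' * g;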
dg/d(alpha) = dg(epsilon*alpha)/d(epsilon) as epsilon -> 0
Thus, we can obtain the derivative using the chain rule:
dg(epsilon*alpha)/d(epsilon) = dg/d(trace term) * d(trace term)/dX * dX/d(epsilon) as epsilon -> 0
The first factor of this product is a scalar, the second is a 4x4 matrix, and the third is also a 4x4 matrix.
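
As a sanity check on one such directional derivative, here is a central-difference sketch (the step t and direction d are placeholders of mine, not part of the derivation):

    % Central-difference check of the directional derivative of g_i along
    % a 5x1 direction d; t is a small step. This approximates J(i,:)*d.
    t  = 1e-6;
    gi = @(a) norm(logm(A(:, :, i) * Xfun(a)), 'fro');
    dderiv = (gi(alpha + t*d) - gi(alpha - t*d)) / (2*t);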
I am not sure whether I am computing the derivative correctly, because in my understanding the Jacobian J should be an Nx5 matrix, while the factors above are a scalar and two 4x4 matrices. Do I need another step to get from these factors to J?
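
For reference, this is the shape I expect, filled here by finite differences as a stand-in for the analytic chain rule (again with the placeholder Xfun):

    % N x 5 Jacobian: entry J(i,k) is the directional derivative of g_i
    % along the k-th coordinate direction e_k.
    J = zeros(N, 5);
    for k = 1:5
        d = zeros(5, 1); d(k) = 1;   % k-th coordinate direction
        for i = 1:N
            gi = @(a) norm(logm(A(:, :, i) * Xfun(a)), 'fro');
            J(i, k) = (gi(alpha + t*d) - gi(alpha - t*d)) / (2*t);
        end
    end
    gradF = 2 * J' * g;              % gradient of F = g'*g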
Thank you very much for your patience.