Do you know whether it is possible to do joint Euclidean and rotation optimization with manifold optimization, i.e. on the SE(3) manifold? Thanks!
I am solving an optimization problem on the SE(2) group. The code looks as follows:
=========
function x = se_2_main(E, X, P)
manifold = specialeuclideanfactory(2, 1);
problem.M = manifold;
% Homogeneous matrix T of the pose (R, t), and its inverse Ti.
T  = @(M) [M.R, M.t; zeros(1, 2), 1];
Ti = @(M) [M.R', -M.R'*M.t; zeros(1, 2), 1];
problem.cost = @(M) trace(X'*T(M)'*E*Ti(M)'*Ti(M)*E*T(M)*X) ...
    - 2*trace(P'*X'*Ti(M)*E*T(M)*X);
problem.egrad = @(M) 2*E*Ti(M)'*Ti(M)*E*T(M)*X*X' ...
    - 2*Ti(M)'*Ti(M)*E*T(M)*X*X'*T(M)'*E*Ti(M)' ...
    + 2*Ti(M)'*X*P*X'*T(M)'*E*Ti(M)' ...
    - 2*E*Ti(M)'*X*P*X';
x = trustregions(problem);
end
============
And I am getting the following error:
============
Attempt to reference field of non-structure array.
Error in productmanifold/egrad2rgrad (line 123)
g.(elems{i}) = elements.(elems{i}).egrad2rgrad(...
=============
Here, X is a 3xn matrix, P is an nxn matrix, and E is a 3x3 matrix. Can you please suggest a solution?
thank you
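From the stack trace, it seems the gradient I return may need to be a struct with fields R and t, matching the point representation of the product manifold, rather than a plain 3x3 matrix (in MATLAB, something like g.R = G(1:2,1:2); g.t = G(1:2,3);). A minimal numpy sketch of the split I have in mind (the helper name split_se2_egrad is illustrative):

```python
import numpy as np

def split_se2_egrad(G):
    """Split a 3x3 matrix gradient with respect to the homogeneous pose
    [R t; 0 1] into the (R, t) parts that a product-manifold
    representation of SE(2) expects (analogous to a struct with fields
    R and t in Manopt)."""
    return {"R": G[:2, :2], "t": G[:2, 2:3]}

G = np.arange(9.0).reshape(3, 3)  # stand-in for the 3x3 matrix gradient
parts = split_se2_egrad(G)        # parts["R"] is 2x2, parts["t"] is 2x1
```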
I'm a Python user interested in performing optimization on SE(3) and SE(2). Since neither is yet available in pymanopt, I reviewed Manopt to see how I could implement both manifolds, along with the required gradients, in pymanopt.
I suspect that the current implementation of SE(3) in Manopt is incorrect. The root of the issue is that SE(3) is not the direct product SO(3) x R^3. Were you hinting at this when you noted that "the geometry given here to SE(n) is not necessarily appropriate for all applications"?
One implication of this difference is that the exponential map of SE(3) cannot be composed from the exponential maps of SO(3) and R^3. That is, given H = [ S & v \\ 0, 0 ], an element of the Lie algebra se(3) (with S skew-symmetric and v in R^3):
Exp_SE3(H) \neq [ Exp_SO3(S) & Exp_R3(v) \\ 0, 1 ]
where Exp_SE3(), Exp_SO3() and Exp_R3() are the exponential maps of SE(3), SO(3) and R^3, respectively.
Since SE(3) is a subgroup of the general linear group, one can easily verify this by computing its exponential (inefficiently) using the matrix exponential:
Exp_SE3(H) = Exp[ S & v \\ 0, 0 ] \neq [ Exp_SO3(S) & Exp_R3(v) \\ 0, 1 ]
More efficient computations exist (see e.g. http://ethaneade.com/lie.pdf).
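This is easy to check numerically; a small Python sketch, using scipy's matrix exponential as the (inefficient) group exponential:

```python
import numpy as np
from scipy.linalg import expm

# A generic element of se(3): S skew-symmetric, v a translation direction.
S = np.array([[0.0, -0.3, 0.2],
              [0.3, 0.0, -0.1],
              [-0.2, 0.1, 0.0]])
v = np.array([[1.0], [0.0], [0.0]])
H = np.block([[S, v], [np.zeros((1, 4))]])

# True group exponential: SE(3) sits inside GL(4).
exp_se3 = expm(H)

# Factor-wise exponential: Exp_SO3(S) and Exp_R3(v) = v, stacked in a
# homogeneous matrix.
exp_prod = np.block([[expm(S), v], [np.zeros((1, 3)), np.ones((1, 1))]])

# The rotation blocks agree, but the translation columns differ.
print(np.allclose(exp_se3, exp_prod))  # False
```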
Do you agree with my findings?
I'm not sure what implications it holds for the gradient computation, do you have any idea about this?
Kind regards,
Martijn Zeestraten
Thank you for the quick reply!
See my response below.
Martijn
>
> Thank you for your detailed message. I do agree with your findings; this does not necessarily make the code in Manopt wrong (as I will argue), but at the very least I agree that the current documentation for the file specialeuclideanfactory is misleading and I'll change it.
>
>
The term 'incorrect' might have been too strong: the implementation of SO(3) x R^3 itself is completely correct.
> You are completely right that the group product of SO(3) with R^3 to form SE(3) is not a direct product; hence, the geometry in Manopt, obtained as a product of the geometries of SO(3) and R^3, is not the Lie geometry of SE(3). It is still a "correct" geometry as far as I can tell, but it is certainly not the geometry people would expect a priori, and this is what I need to clarify in the documentation.
>
>
> The reason this does not necessarily matter is that optimization is over a set: the optimization problem truly has nothing to do with the geometry we put on that set. So, in that respect, as long as we equip the set SE(3) with some geometry that Manopt can understand, we are good to go.
>
>
> This being said, picking the right geometry for a given problem can be very important, because it has direct implications for the conditioning of the problem. You can see this clearly by taking a look at the folder that contains the factories for low-rank matrices: they all describe the same set (matrices of a certain size and rank), yet they all have different geometries (different metrics), and this leads to varying performance when the time comes to optimize, because the condition number of the Hessian of the cost function can vary greatly as a function of the geometry.
>
>
> I hope this makes sense.
>
>
I think I've got the point:
* Since the sets SE(3) and SO(3) x R^3 are equal, one can perform optimization on SE(3) through SO(3) x R^3: i.e., if H is a (local) optimum in SE(3), it is also one in SO(3) x R^3.
* The optimization on SO(3) x R^3 might be less efficient than the optimization on SE(3).
Even though the optimization might yield the same result, I find the factory name SE3 misleading for the manifold SO(3) x R^3: the sets are the same, but the geometry is different. As a result, the exponential and logarithmic maps are completely different. A naive user will not realize this difference.
> By the way, I'll keep a note of the link to the PDF file you gave about computations with the Lie structure of SE(3), because it would be good to have a "proper" implementation of that in Manopt at some point. I understand that you are a Python user, but if you get a good factory in Python for this, I'll be happy to know about it to see if we can get that in Manopt as well.
>
I just finished the SO(3) manifold for pymanopt, and verified that I can do optimization on SO(3) x R^3. Currently, I'm trying to implement a correct version of SE(3). However, I'm not sure how to obtain the egrad2rgrad() and ehess2rhess() functions. Do you have any hints on how to derive them?
> For now, you can see the latest version on the master branch has some added comments:
> https://github.com/NicolasBoumal/manopt/blob/master/manopt/manifolds/specialeuclidean/specialeuclideanfactory.m
Ok, I fully understand this reasoning. I hope to contribute to the proper implementation of SE(3).
>
> About deriving egrad2rgrad and ehess2rhess: it is going to depend on your choice of representation (how do you represent points and tangent vectors), and also on your choice of metric (inner product; which I suppose isn't much of a choice since we want the Lie geometry.) Can you tell me more about that?
Indeed, I would like to implement the Lie geometry. The inner product should in that case be the 'standard' inner product, if you represent elements of se(3)---lower case indicates the Lie algebra---as 6d vectors.
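Concretely, with the usual hat map identifying 6d vectors with matrices in se(3), the inner product is just the Euclidean one on the coordinate vectors; a minimal numpy sketch (the helper name se3_hat is illustrative):

```python
import numpy as np

def se3_hat(xi):
    """Map a 6d vector xi = (omega, v) to its matrix form in se(3):
    [hat(omega), v; 0, 0], with hat(omega) skew-symmetric."""
    w, v = xi[:3], xi[3:]
    W = np.array([[0.0, -w[2], w[1]],
                  [w[2], 0.0, -w[0]],
                  [-w[1], w[0], 0.0]])
    return np.block([[W, v.reshape(3, 1)], [np.zeros((1, 4))]])

# The 'standard' inner product on se(3) is then the Euclidean inner
# product of the 6d coordinate vectors.
xi, eta = np.ones(6), np.arange(6.0)
inner = float(xi @ eta)  # 15.0
```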
> (About representation of tangent vectors: the implementation in Manopt for SO(n) represents tangent vectors in the Lie algebra, that is, a tangent vector RS at R in SO(n) for some S skew-symmetric is stored simply as S. This choice presumably makes sense for SE(n) as well, although I must admit it has been the source of some confusion, because vectors in the embedding Euclidean space are still represented as RS ... we can discuss this more if need be.)
>
The representation of tangent space elements as embedded in the Euclidean space also puzzled me a bit when I first looked at (py)manopt. I'm fairly new to this field, but to my understanding the key idea behind the intrinsic approach to Riemannian geometry is to move away from the Euclidean embedding.
For SO(3), the tangent space at the identity consists of skew-symmetric matrices; the matrix logarithm, logm(), of R in SO(3) is skew-symmetric. This is not the case for SE(3): here, logm(H) = [S, v; 0, 0], which is not skew-symmetric.
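This is easy to verify numerically with scipy's logm; a small sketch:

```python
import numpy as np
from scipy.linalg import expm, logm

# A rotation R in SO(3), built from a skew-symmetric S in so(3).
S = np.array([[0.0, -0.3, 0.2],
              [0.3, 0.0, -0.1],
              [-0.2, 0.1, 0.0]])
R = expm(S)

# logm(R) is skew-symmetric: it lies in so(3).
L = np.real(logm(R))
print(np.allclose(L, -L.T))  # True

# For H = [R, t; 0, 1] in SE(3), logm(H) = [S', v'; 0, 0]: only the
# top-left 3x3 block is skew-symmetric, not the whole matrix.
t = np.array([[1.0], [2.0], [3.0]])
H = np.block([[R, t], [np.zeros((1, 3)), np.ones((1, 1))]])
LH = np.real(logm(H))
print(np.allclose(LH, -LH.T))  # False
```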
What is the purpose of the egrad2rgrad and ehess2rhess functions? Are they meant to 'project' a Euclidean gradient/Hessian onto a 'proper' tangent space element?
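My guess is that, for a submanifold with the metric inherited from the embedding space, egrad2rgrad orthogonally projects the Euclidean gradient onto the tangent space (and ehess2rhess additionally accounts for the curvature of the manifold). A minimal numpy sketch of my understanding for SO(3), where tangent vectors at R have the form R @ Omega with Omega skew-symmetric:

```python
import numpy as np

def skew_part(A):
    """Orthogonal projection of a square matrix onto the
    skew-symmetric matrices."""
    return (A - A.T) / 2.0

def so3_egrad2rgrad(R, G):
    """Project a Euclidean gradient G (3x3) onto the tangent space of
    SO(3) at R, for the metric inherited from the embedding space."""
    return R @ skew_part(R.T @ G)

R = np.eye(3)
G = np.arange(9.0).reshape(3, 3)
rg = so3_egrad2rgrad(R, G)

# rg is a valid tangent vector at R: R.T @ rg is skew-symmetric.
Omega = R.T @ rg
print(np.allclose(Omega, -Omega.T))  # True
```

Projecting twice changes nothing, as one would expect of an orthogonal projection.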
Cheers,
Martijn
Thanks for the additional information.
I will share the final implementation.
Martijn