Using Eigen matrices and arrays with Autodiff


Chris Sweeney

Feb 10, 2017, 4:16:31 PM2/10/17
to Ceres Solver
Hi everyone,

If you use Eigen matrices or arrays inside your templated autodiff cost functions, then this email is for you.

Ceres allows you to use Jets with Eigen matrices and arrays so that you can specify templated Eigen::Matrix objects inside of your templated cost function that you want to autodiff. This is hugely convenient if you're doing any linear algebra operations. Let's consider a simple function that will optimize the position of a homogeneous 3D point to minimize reprojection error given a known and constant projection matrix:

class ReprojectionError {
 public:
  ReprojectionError(
      const Eigen::Matrix<double, 3, 4>& projection_matrix,
      const Eigen::Vector2d& feature)
      : projection_matrix_(projection_matrix), feature_(feature) {}

  template <typename T>
  bool operator()(const T* input_point, T* reprojection_error) const {
    Eigen::Map<const Eigen::Matrix<T, 4, 1> > point(input_point);

    // Multiply the point with the projection matrix, then perform homogeneous
    // normalization to obtain the 2D pixel location of the reprojection.
    const Eigen::Matrix<T, 2, 1> reprojected_pixel =
        (projection_matrix_.cast<T>() * point).hnormalized();

    // Reprojection error is the distance from the reprojection to the observed
    // feature location.
    reprojection_error[0] = feature_[0] - reprojected_pixel[0];
    reprojection_error[1] = feature_[1] - reprojected_pixel[1];
    return true;
  }

 private:
  const Eigen::Matrix<double, 3, 4> projection_matrix_;
  const Eigen::Vector2d feature_;
};


The convenience of Eigen's map, matrix multiply, and hnormalized function is really great! However, Jet binary operations (in particular multiplication and division) are much more expensive than their scalar (double) counterparts. Casting the projection matrix to type T forces the matrix multiplication to be performed with Jet-to-Jet multiplications and additions during autodiff. This is unnecessary overhead, because operations between a Jet and a scalar are optimized in Ceres to be faster than Jet-to-Jet operations. Until now, however, you were required to cast the projection matrix to T, since Ceres only supported Jet-to-Jet matrix operations.

I just pushed a change that allows you to use binary operators (e.g., addition, subtraction, multiplication, division) between an Eigen::Matrix or Eigen::Array of Jets and one of doubles. This means we can change the projection step above to:

const Eigen::Matrix<T, 2, 1> reprojected_pixel = (projection_matrix_ * point).hnormalized();

While this doesn't look like a huge change, it yields a nice performance increase because Ceres will use its optimized Jet-double operations instead of Jet-Jet operations throughout. The speedup you'll see depends on your cost function, but for some of my cost functions I saw a 30% increase in efficiency. This also lets you use Eigen methods like dot and cross products without having to cast your Eigen::Vector types to Jet.

So the take-home message is that you can now freely mix Eigen::Matrix and Eigen::Array operations between Jets and doubles inside your templated autodiff cost functions. Further, as a general rule, avoid Jet-to-Jet operations and prefer double-only or Jet-to-double operations whenever possible to speed up the evaluation of your cost function. Don't make autodiff evaluate the parts of your expression that are constant!

Chris

Chris Sweeney

Feb 10, 2017, 4:19:06 PM2/10/17
to Ceres Solver
"input_point" should be "point" in the matrix multiplication in the code.

--
You received this message because you are subscribed to the Google Groups "Ceres Solver" group.
To unsubscribe from this group and stop receiving emails from it, send an email to ceres-solver...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/ceres-solver/ca9f2230-732a-43cb-997a-9d200e4c0095%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.

Sameer Agarwal

Feb 10, 2017, 4:22:13 PM2/10/17
to Ceres Solver
Thanks Chris. I have also just updated the docs and example code to remove redundant double to jet conversion. That CL had been sitting around for a while and this conversation and the email thread earlier today pushed me to get it done.
Sameer


Juraj Oršulić

Feb 14, 2018, 1:03:04 AM2/14/18
to Ceres Solver
On Friday, February 10, 2017 at 10:16:31 PM UTC+1, Chris Sweeney wrote:
Hi everyone,

If you use Eigen matrices or arrays inside your templated autodiff cost functions, then this email is for you.

Is it possible to use this with Eigen quaternions? If I have a static Quaterniond and a Quaternion<T>, trying to multiply them together without casting the Quaterniond fails the following assert in Eigen's Quaternion.h:

  EIGEN_STATIC_ASSERT((internal::is_same<typename Derived::Scalar, typename OtherDerived::Scalar>::value),
 YOU_MIXED_DIFFERENT_NUMERIC_TYPES__YOU_NEED_TO_USE_THE_CAST_METHOD_OF_MATRIXBASE_TO_CAST_NUMERIC_TYPES_EXPLICITLY)

Thanks, Juraj

Juraj Oršulić

Mar 16, 2018, 7:49:53 AM3/16/18
to Ceres Solver
Anyone? :)

Sameer Agarwal

Mar 16, 2018, 12:38:36 PM3/16/18
to ceres-...@googlegroups.com
I doubt it will work; what's wrong with casting to T?
