Manopt 7.0 released with automatic differentiation!


Nicolas Boumal

Sep 6, 2021, 1:50:03 AM
to Manopt
Dear Manopt users,

We are excited to announce the Manopt 7.0 release, with numerous improvements.

The headline new feature is automatic differentiation (AD), implemented by Xiaowen Jiang during a summer internship -- many thanks to him!

See the change log on the Manopt website and upgrade your code.
(Don't forget to execute importmanopt.m at the Matlab prompt.)

To give AD a spin, make sure you have Matlab R2021a (or later) and an up-to-date Deep Learning Toolbox, then run:
n = 5;
A = randn(n); A = A + A';        % random symmetric matrix
problem.M = spherefactory(n);    % unit sphere in R^n
problem.cost = @(x) x'*A*x;      % only the cost is supplied, no derivatives
problem = manoptAD(problem);     % figures out gradient and Hessian via AD
x = trustregions(problem);
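To see the kind of quantities AD has to produce here, the following is a minimal numerical sketch (in Python with NumPy rather than Manopt's Matlab, purely for illustration, not Manopt code): for the cost f(x) = x'Ax with A symmetric, the Euclidean gradient is 2Ax, and the Riemannian gradient on the unit sphere is its orthogonal projection onto the tangent space at x.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
A = A + A.T                      # symmetric matrix, as in the Matlab snippet

x = rng.standard_normal(n)
x /= np.linalg.norm(x)           # a point on the unit sphere

egrad = 2 * A @ x                # Euclidean gradient of f(x) = x' A x
rgrad = egrad - (x @ egrad) * x  # project onto the tangent space at x

# Sanity checks: the Riemannian gradient is tangent (orthogonal to x),
# and the Euclidean gradient matches a central finite-difference estimate.
assert abs(x @ rgrad) < 1e-10

h = 1e-6
fd = np.array([((x + h * e) @ A @ (x + h * e)
                - (x - h * e) @ A @ (x - h * e)) / (2 * h)
               for e in np.eye(n)])
assert np.allclose(fd, egrad, atol=1e-4)
print("tangency and finite-difference checks passed")
```

This is exactly the bookkeeping manoptAD spares you: you state the cost, and the toolbox derives the derivatives consistent with the manifold.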

On the tutorial page, you can now also find links to books on Riemannian optimization and to videos introducing the necessary basics of geometry.

Don't hesitate to pass the news around,
The Manopt team

PS: Manopt is also on GitHub, open to your input.
