IPOPT: how to supply gradients, Jacobian, Hessian?


myx...@gmail.com

Jun 13, 2013, 3:42:52 PM
to casadi...@googlegroups.com
Hello everybody!

For my task, I have to supply the CasADi IPOPT solver with my own gradient of the objective function, Jacobian of the constraints, and Hessian matrix.
Can anybody help me with this and perhaps give a code example?

According to the user guide, I can only supply the IPOPT solver with the objective function and the constraint function.

This is probably a misunderstanding on my part. I hope for your support!

Thank you very much for your help!

Best regards,
Jenya


Joel Andersson

Jun 13, 2013, 4:20:51 PM
to casadi...@googlegroups.com
Hello Jenya!

CasADi will automatically calculate the gradient of the objective function, the Jacobian of the constraint function and the Hessian of the Lagrangian. This calculation uses algorithmic differentiation (see http://en.wikipedia.org/wiki/Automatic_differentiation) and will normally be much more efficient than anything you would be able to code yourself. Why do you not want to use the automatically generated derivative information?
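
As a minimal sketch (using today's casadi.nlpsol Python API, which postdates this thread; the problem itself is just a toy example), nothing beyond the problem functions needs to be supplied:

    import casadi

    # Toy problem: CasADi derives gradient, Jacobian and Hessian by AD.
    x = casadi.SX.sym('x', 2)
    nlp = {'x': x,
           'f': (1 - x[0])**2 + (x[1] - x[0]**2)**2,  # objective
           'g': x[0] + x[1]}                          # constraint
    solver = casadi.nlpsol('solver', 'ipopt', nlp)    # no derivatives supplied
    sol = solver(x0=[0.0, 0.0], lbg=0, ubg=0)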

There are ways to provide the derivative information manually; see for example "examples/cplusplus/nlp_codegen.cpp". But you might also use IPOPT directly, instead of via CasADi.

Greetings!
Joel

myx...@gmail.com

Jun 13, 2013, 5:00:21 PM
to casadi...@googlegroups.com
Hi Joel,

thank you for your answer!

I'm working with Python. I have to implement multiple shooting with collocation. For the OCP, I treat the control vector as the optimization variables, and in every IPOPT iteration I have to solve a system of nonlinear equations (that is, I give an initial guess for the control vector and compute the state vector x by, for example, the Newton-Raphson method). Due to the structure of the method, I need to calculate the derivative dx/du, which is possible with the help of the constraint function. After this, I need to give IPOPT the gradient of the objective function and the Jacobian.
For clarity:

min_u  F(x(u), u)            (1)
s.t.   G(x(u), u) = 0        (2)
       umin <= u <= umax     (3)
       xmin <= x(u) <= xmax  (4)

This is the reason why I asked such a question. I want to supply the gradient of the objective function (1) and the Jacobian of the constraint (4) in every IPOPT iteration.
Constraint (3) is just bounds on the optimization variables.
System (2) I can solve with Newton's method.
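
For reference, the sensitivity dx/du that this requires follows from the implicit function theorem applied to (2): dx/du = -(dG/dx)^-1 (dG/du). A minimal CasADi sketch, with a toy G standing in for the real constraints:

    import casadi

    # Toy system: G(x, u) = 0 defines x implicitly as a function of u.
    x = casadi.SX.sym('x', 2)
    u = casadi.SX.sym('u', 2)
    G = casadi.vertcat(x[0] + x[1] - u[0],
                       x[0]*x[1] - u[1])

    Gx = casadi.jacobian(G, x)        # dG/dx
    Gu = casadi.jacobian(G, u)        # dG/du
    dxdu = -casadi.solve(Gx, Gu)      # implicit function theorem
    dxdu_fun = casadi.Function('dxdu', [x, u], [dxdu])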

I will be glad if you can please help me!

Best regards,
Jenya

myx...@gmail.com

Jun 13, 2013, 5:05:28 PM
to casadi...@googlegroups.com
I forgot to mention some issues.

Do you mean that I can use IPOPT without CasADi?
I am working in JModelica.org and supply the nonlinear model via Modelica. But first, I use CasADi to create the model, and it is very easy to use IpoptSolver there.
Or did you mean something else?

Thank you for your help, Joel!

Best regards,
Jenya

Joel Andersson

Jun 13, 2013, 5:25:16 PM
to casadi...@googlegroups.com
Hello!

I'm not sure I understand. What do you mean by "multiple shooting with collocation"? Do you mean multiple shooting using a collocation scheme for the integration (i.e. a type of implicit Runge-Kutta)? But if you use (2) to eliminate x, then you get single shooting.

In any case, your strategy does not make sense. As far as I know, IPOPT will introduce slack variables for the nonlinear constraints internally. So even if you use (2) to eliminate x, IPOPT will still reintroduce x to be able to treat (4). In addition, eliminating x will destroy the sparsity of the Jacobian and Hessian, and IPOPT's linear solvers are written to handle large, sparse NLPs.

Instead, you should keep both x and u as NLP variables. This makes the optimization problem much larger, but IPOPT is written exactly for handling that kind of large and sparse NLP.
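
A minimal sketch of this lifted formulation, assuming today's casadi.nlpsol API and a toy problem in place of the real dynamics:

    import casadi

    # Both states and controls enter the NLP as variables; G = 0 is
    # handled by IPOPT itself instead of an inner Newton solver.
    x = casadi.SX.sym('x', 2)   # states
    u = casadi.SX.sym('u', 1)   # controls
    w = casadi.vertcat(x, u)    # full NLP variable vector

    F = casadi.sumsqr(x) + casadi.sumsqr(u)   # objective (1)
    G = x[0] - x[1] + u[0]                    # coupling constraint (2)

    nlp = {'x': w, 'f': F, 'g': G}
    solver = casadi.nlpsol('solver', 'ipopt', nlp)
    sol = solver(x0=[0.1, 0.1, 0.1],
                 lbg=0, ubg=0,                         # G(x, u) = 0
                 lbx=[-1, -1, -0.5], ubx=[1, 1, 0.5])  # bounds (3), (4)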

About derivatives, I would try to rely on the automatic calculation. It should also work for very large-scale problems.

Hope this helps,
Joel

myx...@gmail.com

Jun 14, 2013, 4:26:52 AM
to casadi...@googlegroups.com
Hi Joel!

Yeah, you are absolutely right. Thank you for the help! Concerning derivatives: it will be better to do it automatically, as you advised.

I'm sorry that I gave you a wrong problem formulation for this combined method.

Nevertheless, my question is about how to provide additional information in every iteration.
For example, I could tell IPOPT to limit the number of iterations to one, then solve the nonlinear equation system, then call IPOPT again with the iteration limit set to 1, in a cycle.

Can you advise me something?

Best regards,
Jenya

Joel Andersson

Jun 14, 2013, 4:37:51 AM
to casadi...@googlegroups.com
Hello Jenya!

I'm not sure I understand. So you want to eliminate x after all? Why? Why not let IPOPT solve that nonlinear system of equations for you?

You cannot just set the maximum number of iterations in IPOPT to one, because IPOPT is an interior-point method, and before the method has converged you are not actually solving the right problem. You can provide a good primal and dual initial guess and hope that it converges faster, though.
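
For illustration, a primal and dual guess can be passed like this in today's casadi.nlpsol API (the problem is a toy example; enabling IPOPT's warm-start option is an assumption about what you want):

    import casadi

    x = casadi.SX.sym('x', 2)
    nlp = {'x': x, 'f': casadi.sumsqr(x), 'g': x[0] + x[1] - 1}
    solver = casadi.nlpsol('solver', 'ipopt', nlp,
                           {'ipopt.warm_start_init_point': 'yes'})

    sol = solver(x0=[0.5, 0.5],      # primal guess
                 lam_g0=[1.0],       # dual guess for g
                 lam_x0=[0.0, 0.0],  # dual guess for the bounds
                 lbg=0, ubg=0)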

Joel

myx...@gmail.com

Jun 14, 2013, 5:01:29 AM
to casadi...@googlegroups.com
Hi Joel,

no, the states will still be in IPOPT, but only as parameterized initial values for each interval. This reduces the total number of optimization variables. But, because of the continuity conditions, I have to know the end state of each interval. This value I can get by solving a nonlinear equation system.

Yes, you are right. But in every iteration IPOPT provides new values for the controls and for the parameterized initial state values in each interval. After a certain number of these cycles we will have a solution.
From the software side of IPOPT, can I limit the maximum number of iterations to 1? I understand that IPOPT cannot give me a solution of the NLP in one iteration.

Greetings!

Jenya

Joel Andersson

Jun 14, 2013, 5:18:17 AM
to casadi...@googlegroups.com
OK, so you are trying to do multiple shooting. In CasADi there are ways to embed solvers of nonlinear systems of equations into other functions, but I would not recommend using this until the feature is mature.
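
For completeness, a sketch of such an embedded solver in present-day CasADi, where this is exposed as rootfinder (in the CasADi of 2013 the corresponding class was ImplicitFunction); the system G here is a toy example:

    import casadi

    x = casadi.SX.sym('x', 2)   # unknowns
    u = casadi.SX.sym('u', 1)   # parameters (controls)
    G = casadi.vertcat(x[0] + x[1] - u[0],
                       x[0]*x[1] + 0.1)

    # Newton solver for G(x, u) = 0, embeddable in other expressions
    rf = casadi.rootfinder('rf', 'newton', {'x': x, 'p': u, 'g': G})
    x_sol = rf([0.5, -0.5], 1.0)   # initial guess for x, then value of u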

If I were you, I would just include all the states in the NLP, not only those at the initial times of the intervals, especially since you are using IPOPT. That is the most likely to work well, so if you want to solve your optimization problem, this is definitely the easiest way to go.

After one iteration of IPOPT, you will probably not have a feasible or optimal solution (in any sense) and the barrier parameter will not yet have been driven to zero, so the intermediate iterate has no meaning. Also, calling IPOPT several times with a maximum of 1 iteration is _not_ the same as calling it once with several iterations! If you want this kind of behaviour, you need to use an SQP method (with the right settings).

Joel

myx...@gmail.com

Jun 14, 2013, 5:27:58 AM
to casadi...@googlegroups.com
Joel, thank you very much for sharing your expertise! It was truly helpful!

Can you please give me a link and an example of how to use SQP in CasADi? What do you mean by "SQP method (with the right settings)"?
Is this in Python? And does this SQP method deal with symbolic expressions?

Jenya

Joel Andersson

Jun 14, 2013, 6:33:01 AM
to casadi...@googlegroups.com
If you want to do SQP, you can use WORHP. It's free for academic use, though the CasADi interface is still a bit shaky. With "the right settings" I mean that there are things that can be reset between calls, for example line-search parameters, regularization and Hessian approximations.

You can also write your own SQP method. If you just want a basic method, it's not too hard. You can use CasADi to calculate first- and second-order derivatives, together with CasADi's interfaces to QP solvers (currently qpOASES, OOQP and CPLEX).
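
To make this concrete, here is a minimal sketch of such a hand-written SQP loop: full steps, exact Hessian, no line search or regularization, and qpOASES assumed to be available as the QP solver. It is an illustration, not a robust method:

    import casadi

    # Toy equality-constrained problem: min f(x) s.t. g(x) = 0.
    x = casadi.SX.sym('x', 2)
    f = (1 - x[0])**2 + 100*(x[1] - x[0]**2)**2
    g = x[0]**2 + x[1]**2 - 1
    lam = casadi.SX.sym('lam')

    lag = f + lam*g                                  # Lagrangian
    H = casadi.hessian(lag, x)[0]                    # exact Lagrangian Hessian
    Jg = casadi.jacobian(g, x)
    funcs = casadi.Function('funcs', [x, lam],
                            [casadi.gradient(f, x), g, Jg, H])

    qpsolver = casadi.qpsol('qp', 'qpoases',
                            {'h': H.sparsity(), 'a': Jg.sparsity()})

    xk, lamk = casadi.DM([2.0, 1.0]), casadi.DM(0.0)
    for it in range(25):
        gf, gk, Jk, Hk = funcs(xk, lamk)
        # QP subproblem: min 0.5 dx'H dx + gf'dx  s.t.  Jk dx = -gk
        qp = qpsolver(h=Hk, g=gf, a=Jk, lba=-gk, uba=-gk)
        xk = xk + qp['x']                            # full step, no globalization
        lamk = qp['lam_a']                           # multipliers from the QP
        if float(casadi.norm_inf(qp['x'])) < 1e-8:
            break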

There is an SQPMethod class in CasADi which implements SQP using either an exact Hessian or a Gauss-Newton Hessian.

It's a very basic method and not yet mature, so don't expect it to be able to compete with IPOPT. But feel free to modify it and make it better. With GitHub that's easy; just make a fork.
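
In present-day CasADi, SQPMethod is available as the 'sqpmethod' plugin of nlpsol; a hedged usage sketch (qpOASES as the subproblem solver is an assumption):

    import casadi

    x = casadi.SX.sym('x', 2)
    nlp = {'x': x,
           'f': (1 - x[0])**2 + (x[1] - x[0]**2)**2,
           'g': x[0] + x[1] - 1}
    solver = casadi.nlpsol('solver', 'sqpmethod', nlp, {'qpsol': 'qpoases'})
    sol = solver(x0=[0.5, 0.5], lbg=0, ubg=0)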

There is also a second SQP method in CasADi, called SCPgen. But this is even less mature.

Greetings!
Joel


myx...@gmail.com

Jun 14, 2013, 6:37:34 AM
to casadi...@googlegroups.com
Joel, many thanks for your help!!!!

Arnab Joardar

Mar 30, 2023, 2:10:27 PM
to CasADi
Hello Joel,

Regarding the submission of custom Jacobian and Hessian information, I also needed to do the same thing. In my problem, due to the structure of the equations, the derivation of the Jacobian and Hessian is simple, and I expect to see a difference in solving speed because of this structure.
Regarding the example, I wasn't able to identify the line where the solver is supplied with a custom function. Could you please point out where it is? Is it defined in the file "nlp.so" or "nlp.c"?

I was hoping to construct the Hessian and Jacobian symbolically and then submit it to the solver.

Regards,
Arnab

victor forss

Mar 30, 2023, 8:37:13 PM
to CasADi
Here is what I did in Python to manually supply the gradient, Jacobian and Hessian functions to nlpsol when using IPOPT (nlp is the usual dict with entries 'x', 'f' and 'g'):

    import casadi

    w = nlp['x']
    J = nlp['f']
    g = nlp['g']
    lam_f = casadi.SX.sym("lam_f")                # multiplier of the objective
    lam_g = casadi.SX.sym("lam_g", g.sparsity())  # multipliers of the constraints
    param = casadi.SX.sym('param')                # placeholder; no parameters used

    # Objective and its gradient: (x, p) -> (f, grad_f)
    grad_f = casadi.Function("grad_f", [w, param], [J, casadi.gradient(J, w)])

    # Constraints and their Jacobian: (x, p) -> (g, jac_g)
    jac_g = casadi.Function("jac_g", [w, param], [g, casadi.jacobian(g, w)])

    # Hessian of the Lagrangian; only one triangle of the symmetric matrix is needed
    L = lam_f*J + casadi.dot(lam_g, g)
    H, _ = casadi.hessian(L, w)   # hessian() also returns the gradient, unused here
    Htriu = casadi.triu(H)
    hess_lag = casadi.Function("hess_lag", [w, param, lam_f, lam_g], [Htriu])

    opts_dict = {
        'grad_f': grad_f, 'jac_g': jac_g, 'hess_lag': hess_lag
    }

    solver = casadi.nlpsol('solver', 'ipopt', nlp, opts_dict)