error computing gradients with casadi.callback for nlp


sanket...@gmail.com

Nov 27, 2017, 11:11:01 AM
to CasADi
Hi Joris,

I managed to compute gradients, Hessians, etc. on the GPU using a Callback subclass in Python. I was trying to use the callbacks thus defined to solve an optimization problem with the nlpsol class.

I have defined three callbacks, one each to calculate the objective, the gradient, and the Hessian. I verified that all three return correct results when evaluated numerically.

Now, to set up the NLP problem, I called the objective callback with symbolic inputs:

x = cs.SX.sym('x',2,6)
y = cs.SX.sym('y',2,6)
objcl = objective.clcallback(x,y)   # the objective is a real valued function

nlp = { 'f': objcl, 'x': cs.reshape(y,1,-1), 'p': cs.reshape(x,1,-1) }

opts = {}
opts["ipopt.warm_start_init_point"] = "yes"
optimizer = cs.nlpsol('solver', 'ipopt', nlp, opts)

At this point I get an error during the creation of the optimizer:


RuntimeError: .../casadi/core/function_internal.cpp:123: Error calling IpoptInterface::init for 'solver':

Error in Function::factory for 'nlp' [MXFunction] at .../casadi/core/function.cpp:1348:

Failed to create nlp_grad_f:[x, p]->[f, grad:f:x] with {}:

.../casadi/core/factory.hpp:340: Gradient generation failed:

.../casadi/core/function_internal.cpp:1601: Assertion "ret.n_in()==n_in_ + n_out_" failed:

Notify the CasADi developers.


I don't understand what exactly is going wrong with the callback. Am I supposed to return both the function value and its gradient as outputs from my callback?
Currently my callbacks define the following two methods to help with the Jacobian computation.

class functioncallback(casadi.Callback):
    ...
    def has_jacobian(self):
        return True

    def get_jacobian(self, *args):
        return self.jacobiancb
    ...  


Here, self.jacobiancb points to the callback that computes the Jacobian of the objective function.

In the Jacobian callback, the corresponding member in turn points to the Hessian callback.

Am I doing something wrong here?

Best,
Sanket

sanket...@gmail.com

Nov 27, 2017, 2:20:31 PM
to CasADi
Thanks Joris,

I understand the error now.

An MX function with the same dimensions indeed has an extra dummy input corresponding to the output:

Function(jac_jobj:(i0[2x6],i1[2x6],out_o0[1x12,0nz])->(jac[12x24,48nz]) MXFunction)

whereas the Jacobian callback I defined has the form

Function(jacf:(i0[2x2],i1[2x2])->(o0[1x4]) SXFunction)

Is there a reason why the MX function comes with a redundant, dummy variable?




On Mon, Nov 27, 2017 at 5:56 PM, Joris Gillis wrote:
The error message suggests that your Jacobian is missing (dummy) inputs corresponding to the outputs of the nominal Function. Tip: create an MX function with the same dimensions as your problem and inspect its .jacobian()

Joel Andersson

Nov 27, 2017, 2:50:59 PM
to CasADi

Function(jacf:(i0[2x2],i1[2x2])->(o0[1x4]) SXFunction)

Is there a reason why the MX function comes with a redundant, dummy variable?


The Jacobian signature is always the same, and to calculate the Jacobian you can sometimes make use of the non-differentiated output.
Examples include Jacobians calculated using finite difference approximations and implicitly defined functions: f(z, x) = 0 <=> z = g(x).
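The finite-difference case can be illustrated with a plain-Python sketch (this is only an illustration of the idea, not CasADi's actual implementation): with the nominal output passed in, a forward-difference Jacobian needs only one extra function evaluation per input entry, instead of recomputing f(x) as well.

```python
import numpy as np

def jac_fd(f, x, f0, h=1e-7):
    """Forward-difference Jacobian of f at x, reusing the known value f0 = f(x)."""
    x = np.asarray(x, dtype=float)
    f0 = np.atleast_1d(f0)
    J = np.empty((f0.size, x.size))
    for j in range(x.size):
        xp = x.copy()
        xp[j] += h
        J[:, j] = (np.atleast_1d(f(xp)) - f0) / h   # f0 is reused, not recomputed
    return J

f = lambda x: np.array([x[0] ** 2 + x[1]])
x0 = np.array([1.0, 2.0])
J = jac_fd(f, x0, f(x0))   # approximately [[2.0, 1.0]]
```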

Joel



sanket...@gmail.com

Nov 28, 2017, 12:17:18 AM
to CasADi
Thanks Joel,

I added an extra dummy input to my Jacobian callback and the error went away.

But I got a new error that I can't interpret:

RuntimeError: .../casadi/core/function_internal.cpp:123: Error calling IpoptInterface::init for 'solver':

Error in Function::factory for 'nlp' [MXFunction] at .../casadi/core/function.cpp:1348:

Failed to create nlp_grad_f:[x, p]->[f, grad:f:x] with {}:

.../casadi/core/factory.hpp:340: Gradient generation failed:

.../casadi/core/mx.cpp:1020: Assertion "offset.back()==x.size1()" failed:

Notify the CasADi developers.


Best,
Sanket

sanket...@gmail.com

Nov 30, 2017, 1:12:04 PM
to CasADi
Hi guys,

I still haven't been able to get IPOPT working with the callback functions; I get the same error as in my previous post.

Upon digging a bit, I compared the signatures of my callbacks with those of the corresponding MX functions. The signatures differ because the output size of MXFunction.jacobian() is different from what it should be for solving an NLP: MXFunction.jacobian() calculates the Jacobian with respect to every input of the function, but for optimization I need derivatives only with respect to the decision variables, not with respect to the parameters, which are constant in the parametric optimization problem being solved. Below are my callback signatures compared with the MXFunction.jacobian() signatures.

In [1]: objfunc  # objective MX function

Out[1]: Function(obj:(i0[2x2000],i1[2x2000])->(o0) MXFunction)

In [2]: ffunc.clcallback   # objective callback 

Out[2]: Callback(f_cb:(i0[2x2000],i1[2x2000])->(o0) CallbackInternal)


In [3]: objfunc.jacobian()  # MXfunction.jacobian

Out[3]: Function(jac_obj:(i0[2x2000],i1[2x2000],out_o0[1x1,0nz])->(jac[1x8000]) MXFunction)

In [4]: jffunc.clcallback    # jacobian call back

Out[4]: Callback(jacf_cb:(i0[2x2000],i1[2x2000],i2[1x1,0nz])->(o0[1x4000]) CallbackInternal)


In [5]: a = objfunc.jacobian()

In [6]: a.jacobian()    # hessian MX function

Out[6]: Function(jac_jac_obj:(i0[2x2000],i1[2x2000],out_o0[1x1,0nz],out_jac[1x8000,0nz])->(jac[8000x8001,32000nz]) MXFunction)

In [7]: hffunc.clcallback  # hessian callback

Out[7]: Callback(hessf_cb:(i0[2x2000],i1[2x2000],i2[1x1,0nz],i3[1x4,0nz])->(o0[4000x4000,8000nz]) CallbackInternal)

So the signatures match for the objective function but differ for the Jacobian and the Hessian. Could you suggest what I can do to fix the error above? What should the correct signatures look like?

I don't see why my callbacks should have to compute derivatives with respect to the parameter values defined in the NLP, if that is indeed the problem. It increases the computational cost of the callback.

Any help would be appreciated.

Thanks,
Sanket