RuntimeError: .../casadi/core/function_internal.cpp:123: Error calling IpoptInterface::init for 'solver':
Error in Function::factory for 'nlp' [MXFunction] at .../casadi/core/function.cpp:1348:
Failed to create nlp_grad_f:[x, p]->[f, grad:f:x] with {}:
.../casadi/core/factory.hpp:340: Gradient generation failed:
.../casadi/core/function_internal.cpp:1601: Assertion "ret.n_in()==n_in_ + n_out_" failed:
Notify the CasADi developers.
The Jacobian function CasADi generates internally has the signature
Function(jac_jobj:(i0[2x6],i1[2x6],out_o0[1x12,0nz])->(jac[12x24,48nz]) MXFunction)
whereas the Jacobian callback I defined has the form
Function(jacf:(i0[2x2],i1[2x2])->(o0[1x4]) SXFunction)
Is there a reason why the MX function comes with a redundant, dummy variable?
The error message suggests that your Jacobian callback is missing the (dummy) inputs corresponding to the outputs of the nominal Function. Tip: create an MX function with the same dimensions as your problem and inspect its .jacobian()
RuntimeError: .../casadi/core/function_internal.cpp:123: Error calling IpoptInterface::init for 'solver':
Error in Function::factory for 'nlp' [MXFunction] at .../casadi/core/function.cpp:1348:
Failed to create nlp_grad_f:[x, p]->[f, grad:f:x] with {}:
.../casadi/core/factory.hpp:340: Gradient generation failed:
.../casadi/core/mx.cpp:1020: Assertion "offset.back()==x.size1()" failed:
Notify the CasADi developers.
In [1]: objfunc # objective MX function
Out[1]: Function(obj:(i0[2x2000],i1[2x2000])->(o0) MXFunction)
In [2]: ffunc.clcallback # objective callback
Out[2]: Callback(f_cb:(i0[2x2000],i1[2x2000])->(o0) CallbackInternal)
In [3]: objfunc.jacobian() # MXfunction.jacobian
Out[3]: Function(jac_obj:(i0[2x2000],i1[2x2000],out_o0[1x1,0nz])->(jac[1x8000]) MXFunction)
In [4]: jffunc.clcallback # jacobian call back
Out[4]: Callback(jacf_cb:(i0[2x2000],i1[2x2000],i2[1x1,0nz])->(o0[1x4000]) CallbackInternal)
In [5]: a = objfunc.jacobian()
In [6]: a.jacobian() # hessian MX function
Out[6]: Function(jac_jac_obj:(i0[2x2000],i1[2x2000],out_o0[1x1,0nz],out_jac[1x8000,0nz])->(jac[8000x8001,32000nz]) MXFunction)
In [7]: hffunc.clcallback # hessian callback
Out[7]: Callback(hessf_cb:(i0[2x2000],i1[2x2000],i2[1x1,0nz],i3[1x4,0nz])->(o0[4000x4000,8000nz]) CallbackInternal)
So the signatures match for the objective function but differ for the Jacobian and Hessian. Could you suggest what I can do to fix the error above? What should the correct signatures look like?
I don't see why my callbacks should have to compute derivatives with respect to the parameter values defined in the NLP, if that is the problem; it increases the computational cost of the callback.