User-defined functions


Mario Zanon

unread,
May 12, 2017, 8:01:57 AM5/12/17
to CasADi
Hi guys,

for some problems I am considering, it would be practical to write optimisation problems in which some functions are non-trivial algorithms whose derivatives exist and can be computed easily, but cannot be obtained by algorithmic differentiation.

Just to be clearer, imagine a QP: the algorithm has if statements and I would like to handle it as a black-box. First and second order sensitivities, however, can be easily computed using parametric sensitivity analysis.
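The parametric sensitivity idea can be sketched in plain NumPy for a toy equality-constrained QP (an illustration only, not Mario's actual solver; all matrices are made up). The point is that even though a QP solver's code path contains branches, the solution map x*(p) is characterised by the linear KKT system, so dx*/dp falls out of one extra linear solve:

```python
import numpy as np

# Toy QP:  min_x 1/2 x'Hx + g(p)'x  s.t.  Ax = b,  with g(p) = g0 + c*p.
H = np.array([[2.0, 0.0], [0.0, 4.0]])
A = np.array([[1.0, 1.0]])
b = np.array([1.0])
g0 = np.array([0.0, 0.0])
c = np.array([1.0, 0.0])

# KKT matrix is independent of p for this problem class
K = np.block([[H, A.T], [A, np.zeros((1, 1))]])

def solve_qp(p):
    """Primal solution x*(p) from the KKT system."""
    rhs = np.concatenate([-(g0 + c * p), b])
    return np.linalg.solve(K, rhs)[:2]

# First-order sensitivity: K [dx; dlam] = [-dg/dp; 0]
dx_dp = np.linalg.solve(K, np.concatenate([-c, np.zeros(1)]))[:2]

# Cross-check against a central finite difference
eps = 1e-6
fd = (solve_qp(0.3 + eps) - solve_qp(0.3 - eps)) / (2 * eps)
```

For a QP the solution map is piecewise affine in p, so within one active set the sensitivity is exact and the second-order sensitivity is zero.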

If I want to use such a function to e.g. define a constraint, at the moment I don't know how to pass this to the nlp solver class. Would it be possible to introduce a function class for which I define the derivatives myself? This would also allow me to e.g. neglect the contribution of the constraint to the Hessian and only pass the gradient to the optimiser.

Anyway, regardless of how I compute the function and its sensitivities, I would like to be able to just define a black-box function that returns its evaluation and sensitivities up to some order.

Ciao,
Mario

Andres Codas Duarte

unread,
May 12, 2017, 8:58:01 AM5/12/17
to CasADi
Hi Mario,

   I'm also interested in this sort of application.

   You can implement black-box functions by inheriting 'Callback'.  There are related messages on that in the forum.

    Did you try it?  Or are you after something different?

    Regards,
    Andres

Joel Andersson

unread,
May 12, 2017, 9:19:25 AM5/12/17
to CasADi
Please read Chapter 6 in the User Guide: http://guide.casadi.org/

There are some ways to provide derivatives, but it's not completely stable and still a work in progress. You can expect to see improvements for user-defined functions in the next couple of months.

Joel

Mario Zanon

unread,
May 12, 2017, 9:51:30 AM5/12/17
to CasADi
Thanks, I actually saw that right after I wrote. I also found this file

One problem I am having is that I would like to use things inside the function that are defined outside of it. To explain better: I would like to e.g. use previously defined variables, possibly in Joris style, and maybe also CasADi functions.

Also, I am not sure I understand how the Python function is transformed into a CasADi function in the file in the link. How are the input and output arguments linked to each other? Can I pass some extra argument to the Python function? How does the class actually work?

Thank you again!

Ciao,
Mario

Mario Zanon

unread,
May 12, 2017, 10:59:19 AM5/12/17
to CasADi
I think I'm starting to understand how the class approach works. Is it possible to directly define the Jacobian in the class without defining forward and adjoint derivatives? That will be less efficient in general, but very practical for testing things...

Ciao,
Mario

Joel Andersson

unread,
May 12, 2017, 11:16:26 AM5/12/17
to CasADi
"CustomFunction" does not exist anymore in CasADi 3+. The supported ways are described in the user guide.
About defining derivatives: you should be able to define any type of derivative information (forward directional derivatives, adjoint directional derivatives, Jacobian of all outputs with respect to all inputs), and CasADi will figure it out from there. We plan to add finite-difference support soon too, so that you always have some way of getting derivatives.
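The finite-difference fallback Joel mentions can be sketched in plain NumPy (this is just an illustration of the idea, not CasADi's internal scheme):

```python
import numpy as np

def fd_jacobian(f, x, h=1e-6):
    """Central-difference Jacobian of a black-box f: R^n -> R^m.

    Plain-NumPy sketch: perturb each input coordinate by +/- h and
    difference the outputs. Cost is 2n evaluations of f.
    """
    x = np.asarray(x, dtype=float)
    fx = np.atleast_1d(f(x))
    J = np.zeros((fx.size, x.size))
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        J[:, i] = (np.atleast_1d(f(x + e)) - np.atleast_1d(f(x - e))) / (2 * h)
    return J

# Example: f(x) = [sin(x0), x0*x1], so J = [[cos(x0), 0], [x1, x0]]
J = fd_jacobian(lambda x: np.array([np.sin(x[0]), x[0] * x[1]]), [0.5, 2.0])
```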

Joel

Mario Zanon

unread,
May 13, 2017, 8:05:09 AM5/13/17
to CasADi
Thanks for the heads up!

I am currently using CasADi 2.4.1 and the code for the callback function is not working. I suppose I have to update to the latest version...
If this is not strictly necessary, I'd prefer to stick to 2.4.1 to simplify managing my code.

If I have to update, I asked in a separate post how to have several versions of casadi installed.

Mario

Mario Zanon

unread,
May 15, 2017, 5:59:30 AM5/15/17
to CasADi
Hi Joel!

I had a look at the documentation and I tried to implement the Callback constructor, but I am facing some issues. In order to better explain what I want to do, I prepared some simple code:

from casadi import *

class test(Callback):
    def __init__(self, name, opts={}):
        Callback.__init__(self)
        self.construct(name, opts)

    def get_n_in(self):  return 1
    def get_n_out(self): return 1

    def init(self):
        print 'initialising object'

    def eval(self, arg):
        x = arg[0]
        f = sin(x)
        return [f]

    def has_jacobian(self): return True

    def get_jacobian(self, arg):
        x = arg[0]
        [f] = self.eval(arg)
        jac = cos(x)
        return [jac, f]

my_sin = test('my_sin')
x = SX.sym('x')

nlp = {'x': x, 'f': -my_sin(x)}
solver = nlpsol('solver', "ipopt", nlp)

arg = {}
arg['lbx'] = 0
arg['ubx'] = 3.14

sol = solver(**arg)

I get the following error
In [1]: run test.py
initialising object
---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
/usr/lib/python2.7/dist-packages/IPython/utils/py3compat.pyc in execfile(fname, *where)
    202             else:
    203                 filename = fname
--> 204             __builtin__.execfile(filename, *where)

/home/mzanon/data/git/communication_mpc/papers/comm_aware_rob_mpc/code/test.py in <module>()
     37
     38
---> 39 nlp = {'x':x,'f':-my_sin(x)}
     40
     41 solver = nlpsol('solver',"ipopt",nlp)

/home/mzanon/src/casadi-3/casadi/casadi.pyc in __call__(self, *args, **kwargs)
  12576       if len(args)>0:
  12577     # Ordered inputs -> return tuple
> 12578         ret = self.call(args)
  12579         if len(ret)==0:
  12580           return None

/home/mzanon/src/casadi-3/casadi/casadi.pyc in call(self, *args)
  11643
  11644         """
> 11645         return _casadi.Function_call(self, *args)
  11646
  11647

RuntimeError:  on line 1376 of file "/home/travis/build/casadi/binaries/casadi/casadi/core/function/function_internal.cpp"
'eval_sx' not defined for N6casadi16CallbackInternalE

My understanding is that I should provide some symbolic evaluation too. But then my question is: will casadi try to differentiate the symbolic call to the function instead of using my provided jacobian?

Thanks in advance!

Ciao,
Mario

Mario Zanon

unread,
May 16, 2017, 7:27:14 AM5/16/17
to CasADi
Hi!

With the help of Joris I got the previous code to run!

Now I am having another problem: because I cannot compute the Jacobian symbolically, I need to use a slightly different structure in my code. I tried using two callback functions as follows:

from casadi import *

class test0(Callback):
    def __init__(self, name, opts={}):
        Callback.__init__(self)
        self.construct(name, opts)

    def get_n_in(self):  return 1
    def get_n_out(self): return 2

    def get_sparsity_in(self, n_in): return Sparsity.dense(2, 1)

    def get_sparsity_out(self, n_out):
        if n_out == 0: return Sparsity.dense(1, 2)
        if n_out == 1: return Sparsity.dense(1, 1)

    def init(self):
        print 'initialising object'

    def eval(self, arg):
        x = arg[0]
        f = sin(x[0])
        jac = horzcat(cos(x[0]), 0)
        return [jac, f]

    def has_jacobian(self): return True

    def get_jacobian(self, name, opts):
        return Function('jacobian', [x], [DM(2, 2), DM(2, 1)])


class test(Callback):
    def __init__(self, name, opts={}):
        Callback.__init__(self)
        self.construct(name, opts)

    def get_n_in(self):  return 1
    def get_n_out(self): return 1

    def get_sparsity_in(self, n_in): return Sparsity.dense(2, 1)
    def get_sparsity_out(self, n_out): return Sparsity.dense(1, 1)

    def init(self):
        self.low_level = test0('low_level')

    def eval(self, arg):
        [jac, f] = self.low_level(arg)
        return [f]

    def has_jacobian(self): return True

    def get_jacobian(self, name, opts):
        x = MX.sym('x', 2, 1)
        jacSym = self.low_level(x)[0]
        return Function('jacobian', [x], [jacSym])


my_sin = test('my_sin')

x = MX.sym('x', 2, 1)

nlp = {'x': x, 'f': -my_sin(x)}
solver = nlpsol('solver', "ipopt", nlp)

arg = {}
arg['lbx'] = 0
arg['ubx'] = 3.14

sol = solver(**arg)

I am actually not sure what I should put in get_jacobian of test0. Anyway, the shapes should be correct, but I get the following error:


In [1]: run test2.py
initialising object
---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
/Users/zmario/data/git/projects/communication_mpc/papers/comm_aware_rob_mpc/code/test2.py in <module>()
     67 nlp = {'x':x,'f':-my_sin(x)}
     68
---> 69 solver = nlpsol('solver',"ipopt",nlp)
     70
     71 arg = {}

/Users/zmario/src/casadi-py27-np1.9.1-v3.1.1/casadi/casadi.pyc in nlpsol(*args)
  15086
  15087     """
> 15088     return _casadi.nlpsol(*args)
  15089
  15090 def nlpsol_in(*args):

RuntimeError: The assertion "ret.n_out()==1" on line 1673 of file "/Users/travis/build/casadi/binaries/casadi/casadi/core/function/function_internal.cpp" failed.
Please notify the CasADi developers.



Ciao,
Mario

Mario Zanon

unread,
May 16, 2017, 8:11:00 AM5/16/17
to CasADi
OK I think I fixed it!



from casadi import *

class test0(Callback):
    def __init__(self, name, opts={}):
        Callback.__init__(self)
        self.construct(name, opts)

    def get_n_in(self):  return 1
    def get_n_out(self): return 2

    def get_sparsity_in(self, n_in): return Sparsity.dense(2, 1)

    def get_sparsity_out(self, n_out):
        if n_out == 0: return Sparsity.dense(1, 2)
        if n_out == 1: return Sparsity.dense(1, 1)

    def init(self):
        print 'initialising object'

    def eval(self, arg):
        x = arg[0]
        f = sin(x[0])
        jac = horzcat(cos(x[0]), 0)
        return [jac, f]

    def has_jacobian(self): return True

    def get_jacobian(self, name, opts):
        return Function('jacobian', [x], [vertcat(DM(2, 2), DM(1, 2))])


class test(Callback):
    def __init__(self, name, opts={}):
        Callback.__init__(self)
        self.construct(name, opts)

    def get_n_in(self):  return 1
    def get_n_out(self): return 1

    def get_sparsity_in(self, n_in): return Sparsity.dense(2, 1)
    def get_sparsity_out(self, n_out): return Sparsity.dense(1, 1)

    def init(self):
        self.low_level = test0('low_level')

    def eval(self, arg):
        [jac, f] = self.low_level(arg[0])
        return [f]

    def has_jacobian(self): return True

    def get_jacobian(self, name, opts):
        x = MX.sym('x', 2, 1)
        jacSym = self.low_level(x)[0]
        return Function('jacobian', [x], [jacSym])


my_sin = test('my_sin')
x = MX.sym('x', 2, 1)

nlp = {'x': x, 'f': -my_sin(x)}
solver = nlpsol('solver', "ipopt", nlp)

arg = {}
arg['lbx'] = 0
arg['ubx'] = 3.14
sol = solver(**arg)

nlp2 = {'x': x, 'f': -sin(x[0])}
solver2 = nlpsol('solver', "ipopt", nlp2)
sol2 = solver2(**arg)

print sol['x'] - sol2['x']

Of course my implementation of sin(x) converges more slowly, since I do not provide a Hessian, but this is expected.

Mario Zanon

unread,
May 16, 2017, 9:51:22 AM5/16/17
to CasADi
I actually found some weird behaviour if I add a simple constraint
nlp = {'x': x, 'f': -my_sin(x), 'g': x[1]}
with e.g.
arg['lbg'] = 1
arg['ubg'] = 1

Then Ipopt does not converge: though it finds the right solution, the KKT residuals are nonzero and it keeps iterating until it reaches max_iter.

Any clue why?

Thanks again!

Joel Andersson

unread,
May 16, 2017, 10:29:43 AM5/16/17
to CasADi
Not sure why IPOPT isn't converging - maybe it doesn't get the correct derivative information. Did you try enabling the derivative checker in IPOPT?

Also, as said, user-provided functions with derivative information are a work in progress.

Joel

Mario Zanon

unread,
May 16, 2017, 10:32:11 AM5/16/17
to CasADi
How can I enable the derivative-checker?
I did check and the derivatives seem correct, but I will check better, to make sure everything is alright!

I am wondering if this can be related with Ipopt scaling cost and constraints internally...

Ciao,
Mario

Andres Codas Duarte

unread,
May 16, 2017, 11:02:05 AM5/16/17
to CasADi
Hi Mario,

   I guess that IPOPT got lost because you provided a Hessian == 0.  I have been using Ipopt with BFGS approximations combined with Callbacks and that works very well.  For your example, I just provided an identity Hessian and Ipopt managed to converge again; however, I would go for BFGS if you can't provide a better approximation there.


  def get_jacobian(self, name, opts):
      print('getting jacobian')
      return Function('jacobian', [x], [vertcat(DM.eye(2), DM(1, 2))], {'verbose': True})

   Best,
   Andres

Joel Andersson

unread,
May 16, 2017, 11:11:25 AM5/16/17
to CasADi
Yeah, you're probably better off not providing any Hessian than providing a wrong one...

Derivative checker: Set the option "ipopt.derivative_test": https://www.coin-or.org/Ipopt/documentation/node54.html#opt:derivative_test
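In the CasADi Python interface that option goes into the options dict passed to nlpsol; a sketch (the option values are taken from the Ipopt documentation linked above):

```python
# Options dict enabling Ipopt's derivative checker via nlpsol
opts = {
    'ipopt.derivative_test': 'first-order',    # or 'second-order', 'only-second-order'
    'ipopt.derivative_test_print_all': 'yes',  # print every comparison, not just failures
}
# Usage sketch (nlp defined as in the posts above):
# solver = nlpsol('solver', 'ipopt', nlp, opts)
```

Ipopt then compares the user-provided Jacobian (and, for 'second-order', the Hessian) against finite differences at the starting point and reports mismatching entries.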

Joel

Mario Zanon

unread,
May 16, 2017, 12:18:51 PM5/16/17
to CasADi
True, I was doing this for constraints and forgot that the objective needs a positive-definite Hessian. I am still a bit surprised that the gradients were wrong, though...

Anyway, Joel, I am forced to provide a Hessian, otherwise I get an error. Or is there a way to not provide the Hessian?

Thanks again!
Mario

Mario Zanon

unread,
May 16, 2017, 12:26:50 PM5/16/17
to CasADi
One thing I find confusing in the code I posted is actually related to derivatives.

If I get it right, the Hessian is taken from the "low-level" callback, but the "low-level" Jacobian is overwritten by that given by the wrapper callback. This is rather confusing to me and I would find it more natural that derivatives are only required in the wrapper and not in the "low-level" callback.

Then I would also find it natural that if no derivative is provided in the wrapper, then the derivative is taken to be 0.

Andres Codas Duarte

unread,
May 16, 2017, 1:07:40 PM5/16/17
to CasADi
Hi Mario,

   Ipopt will not evaluate the Hessian function from your wrappers if you use the following option:


opts = {}
opts['ipopt.hessian_approximation'] = 'limited-memory'

solver = nlpsol("solver", "ipopt", nlp, opts)

   See: https://www.coin-or.org/Ipopt/documentation/node53.html#SECTION0001113010000000000000

   Best,
   Andres

Mario Zanon

unread,
May 16, 2017, 1:15:29 PM5/16/17
to CasADi
Hi Andres!

Thanks a lot for your advice ;-)

I got an error when building the NLP solver if I did not define the Jacobian for the lower-level function, maybe this error is indeed not thrown with the BFGS option you suggest (I did not try).

In any case, I find it a bit surprising from a user perspective that I need to define things that are then overwritten, and that some of my derivatives come from one function while others come from a different one.

Of course mine is not a complaint! 
In the end I got everything to work :-) it's more of a suggestion on how to improve this nice functionality.

Ciao,
Mario

Mario Zanon

unread,
May 17, 2017, 8:59:58 AM5/17/17
to CasADi
One more question about using callback in NLP solvers.
Now everything works, but it's extremely slow: at every iterate the callback evaluation takes longer than it would if evaluated directly in Python, which surprises me very much.
Is there a way to speed this up?

Thanks a lot again!

Ciao,
Mario

Joel Andersson

unread,
May 17, 2017, 10:39:54 AM5/17/17
to CasADi
Please try to keep the threads on topic and start new threads for new questions! But to answer your question: I don't know what is causing your observed overhead in the NLP callback function.
Joel

Mario Zanon

unread,
May 24, 2017, 11:27:53 AM5/24/17
to CasADi
Hi again :-)

I am wondering how I can pass the Hessian if it is not constant, but I need to evaluate it, just like I do with the Jacobian. Is there a similar way of passing it? Do I need to create a third callback which is called when the Hessian is generated in the low-level callback?

Ciao,
Mario

Mario Zanon

unread,
Oct 11, 2017, 9:42:53 AM10/11/17
to CasADi
Hi again!

I am wondering if it is possible to make my callback function parametric, which would be quite a handy feature for many uses...

I tried using a multiple-input definition, but that seems to break everything... Any idea?

I guess a brute-force approach could consist of passing the variable and parameter in the same vector, but I would find it cleaner to keep them as separate inputs...

Ciao,
Mario

Mario Zanon

unread,
Oct 11, 2017, 11:01:08 AM10/11/17
to CasADi
Again, thanks to Joris for the invaluable help!!

Here is the code so that everyone can benefit! Maybe one nice addition could be computing the Hessian too, then I guess it could be added to the documentation examples...

This is in casadi 3.1.1
from casadi import *

class test0(Callback):
    def __init__(self, name, opts={}):
        Callback.__init__(self)
        self.construct(name, opts)

    def get_n_in(self):  return 2
    def get_n_out(self): return 2

    def get_sparsity_in(self, n_in):
        if n_in == 0: return Sparsity.dense(2, 1)
        if n_in == 1: return Sparsity.dense(1, 1)

    def get_sparsity_out(self, n_out):
        if n_out == 0: return Sparsity.dense(1, 3)
        if n_out == 1: return Sparsity.dense(1, 1)

    def init(self):
        print 'initialising object'

    def eval(self, arg):
        x = arg[0]
        p = arg[1]
        f = sin(x[0] + p)
        jac = horzcat(cos(x[0] + p), 0, 0)
        return [jac, f]

    def has_jacobian(self): return True

    def get_jacobian(self, name, opts):
        x = MX.sym('x', 2, 1)
        p = MX.sym('p', 1, 1)
        return Function('jacobian', [x, p], [vertcat(DM.eye(3), DM(1, 3))])


class test(Callback):
    def __init__(self, name, opts={}):
        Callback.__init__(self)
        self.construct(name, opts)

    def get_n_in(self):  return 2
    def get_n_out(self): return 1

    def get_sparsity_in(self, n_in):
        if n_in == 0: return Sparsity.dense(2, 1)
        if n_in == 1: return Sparsity.dense(1, 1)

    def get_sparsity_out(self, n_out): return Sparsity.dense(1, 1)

    def init(self):
        self.low_level = test0('low_level')

    def eval(self, arg):
        [jac, f] = self.low_level(arg[0], arg[1])
        return [f]

    def has_jacobian(self): return True

    def get_jacobian(self, name, opts):
        x = MX.sym('x', 2, 1)
        p = MX.sym('p', 1, 1)
        jacSym = self.low_level(x, p)[0]
        return Function('jacobian', [x, p], [jacSym])


my_sin = test('my_sin')

x = MX.sym('x', 2, 1)
p = MX.sym('p', 1, 1)

nlp = {'x': x, 'p': p, 'f': -my_sin(x, p), 'g': x[1]}
solver = nlpsol('solver', "ipopt", nlp)

pnum = 0.5

arg = {}
arg['lbx'] = 0
arg['ubx'] = 3.14
arg['lbg'] = 1
arg['ubg'] = 1
arg['p'] = pnum

sol = solver(**arg)

nlp2 = {'x': x, 'f': -sin(x[0] + pnum), 'g': x[1]}
solver2 = nlpsol('solver', "ipopt", nlp2)
sol2 = solver2(**arg)

print sol['x'] - sol2['x']