Give function Derivative


Jonas

unread,
Jul 26, 2018, 7:10:40 AM7/26/18
to CasADi
Hey,
I have a follow-up question to my post from yesterday here. What I basically want to do is tell the Callback function its explicit analytic derivative; let's say f(x) = x^2, f'(x) = 2x.
My first idea was to use "derivative_of":
class derivative(Callback):

    # ... I don't paste __init__ etc. for brevity ...

    def eval(self, arg):
        f = 2*arg[0]
        return [f]



class example(Callback):

    def eval(self, arg):
        f = arg[0]**2
        return [f]


example = example("example")
derivative = derivative("derivative", {"derivative_of": example})
which fails with:
 /casadi/core/function_internal.cpp:2304: Assertion "has_derivative()" failed:
Derivatives cannot be calculated for example

as well as the second idea

class example_der(Callback):

    def has_forward(self, nfwd): return True

    def get_forward(self, name, nfwd, inames, onames, opt):
        opt = {}
        return Function("derivative", [arg], [2*arg[0]])

    def eval(self, arg):
        f = arg[0]**2
        return [f]


der = example_der("der", {"enable_forward": True})   # enable_forward probably not necessary if has_forward is already True

Within the second code I struggle to find the right syntax in get_forward. I cannot pass arg into the get_forward call properly:

.../casadi/core/callback_internal.cpp:124: Error calling "get_forward" for object der:
.../casadi/build/swig/casadiPYTHON_wrap.cxx:3757: name 'arg' is not defined

Do you guys have any suggestions on how to implement this?

Best regards,
Jonas

Joris Gillis

unread,
Jul 26, 2018, 7:13:11 AM7/26/18
to CasADi
Dear Jonas,


Best,
  Joris

Jonas

unread,
Jul 26, 2018, 8:35:33 AM7/26/18
to CasADi
Dear Joris,

Even though the example is a great help, I still don't see how to integrate the forward derivative into the original callback function.
test = 3
f = example_fwd("f")
x = MX.sym("x", 1)
J = Function('J', [x], [jacobian(f(x), x)])
print(J(vertcat(3)))

works fine, but solving the OCP with

L = example(variable)

crashes since it does not find a derivative for the callback "example":


class example(Callback):

    def __init__(self, name, opts={}):
        Callback.__init__(self)
        self.construct(name, opts)

    # Number of inputs and outputs
    def get_n_in(self): return 1
    def get_n_out(self): return 1

    def init(self):
        print('initializing object')

    def get_sparsity_in(self, i):
        return Sparsity.dense(1, 1)

    def get_sparsity_out(self, i):
        return Sparsity.dense(1, 1)

    def get_forward(self, arg):     # somehow integrate the derivative?!
        return self.fwd_callback

    def eval(self, arg):
        f = arg[0]**2
        return [f]


class example_fwd(example):

    def has_forward(self, nfwd):
        return nfwd == 1

    def get_forward(self, nfwd, name, inames, onames, opts):

        class ForwardFun(Callback):

            def __init__(self, opts={}):
                Callback.__init__(self)
                self.construct(name, opts)

            def get_n_in(self): return 3
            def get_n_out(self): return 1

            def get_sparsity_in(self, i):
                return Sparsity.dense(1, 1)

            def get_sparsity_out(self, i):
                # Forward sensitivity
                return Sparsity.dense(1, 1)

            # Evaluate numerically
            def eval(self, arg):
                return [2*arg[0]]

        self.fwd_callback = ForwardFun()
        return self.fwd_callback


Joris Gillis

unread,
Jul 26, 2018, 8:59:24 AM7/26/18
to CasADi
Probably because IPOPT is requesting second-order sensitivities?
Remember that you need 'limited-memory' if you only want to work with first-order sensitivities.


The following works for me:

x = MX.sym("x", 1)
f = example_fwd("f")

options = {"ipopt": {"hessian_approximation": "limited-memory"}}
solver = nlpsol("solver", "ipopt", {"x": x, "f": f(x)}, options)
solver(x0=2)

Best,
  Joris

Jonas

unread,
Jul 26, 2018, 9:16:50 AM7/26/18
to CasADi
Hey Joris,
I don't want to use the forward call as the rhs of the ODE system. It looks like that's what you implement in solver, right?

I want to let CasADi know the derivative of a function, so that the objective function is f and, while solving the problem, it evaluates the analytical derivative f' instead of resorting to finite differences.
It looks to me like this should be possible if I implement it the way I did above, using either

{"derivative_of": ...}

or

get_forward

My motivation is that evaluating a Gaussian Process Regressor takes a long time with finite differences, but I can calculate the GPR's gradient analytically, which is again a GPR. I will hopefully be more efficient if CasADi calls the "real" derivative rather than using finite differences (which means calling the original function at least twice).

Using example and example_fwd ends up in

Assertion "has_derivative()" failed:
Derivatives cannot be calculated for example

which shows that the connection between the class and the subclass is missing; I tried to link them but can't find a way.

Best,
Jonas

Joris Gillis

unread,
Jul 26, 2018, 9:25:13 AM7/26/18
to CasADi
Dear Jonas,

The example does not include an ODE system, so I'm not sure why you would interpret it that way.
The example serves to show several mechanisms to create custom functions with sensitivities.
Any mechanism will work when embedded in a solver that works with first-order sensitivities, albeit with wildly different performance.

Also, please check yesterday's private email to you...

Best,
Joris