Hi all,
I'm trying to implement a C++ MIMO external function for CasADi and import it in Python.
Let us say that my function is:
f : (x,y,z) -> (p,q).
For this example, let us assume that x, y, z, p and q are all vectors with 6 elements.
In C++, the function is implemented following the nomenclature described in Section 5.3 of the user guide. In particular, fname_n_in (=3), fname_n_out (=2), fname_sparsity_in (={(6,1,1); (6,1,1); (6,1,1)}) and fname_sparsity_out (={(6,1,1); (6,1,1)}) are implemented.
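To give an idea of the C++ side, the exported symbols look roughly like the sketch below (only a sketch: the evaluation body is omitted, the casadi_int typedef is just my shorthand for the integer type used in CasADi-generated code, and the exact signatures may differ slightly from my actual file):

typedef long long int casadi_int;  // integer type, as in CasADi-generated code

extern "C" {

  // number of inputs and outputs
  casadi_int fname_n_in(void)  { return 3; }
  casadi_int fname_n_out(void) { return 2; }

  // all inputs/outputs are dense 6x1 vectors, compact encoding (6,1,1) as above
  static const casadi_int vec6[] = {6, 1, 1};
  const casadi_int* fname_sparsity_in(casadi_int i)  { return vec6; }
  const casadi_int* fname_sparsity_out(casadi_int i) { return vec6; }

  // evaluation: arg[0..2] hold x, y, z and res[0..1] receive p, q
  int fname(const double** arg, double** res, casadi_int* iw, double* w, int mem) {
    // ... actual computation of p and q omitted ...
    return 0;
  }

}  // extern "C", so the symbols keep their unmangled names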
In C++ everything seems to work as desired.
The strange behavior that I'm experiencing in Python is as follows:
from casadi import *
import numpy as np

f = external('f', 'mylib.so', {'enable_fd': True})
input = MX.sym('input',3,6)
x = input[0,:]
y = input[1,:]
z = input[2,:]
fx = Function('fx',[input],[f(input)])
# Here I should be able to pass (x, y, z) as inputs, but:
print(f.n_in())           # Out: 1, should be 3
print(f.n_out())          # Out: 1, should be 2
print(f.sparsity_in(0))   # Out: 1x1, should be 6x1
print(f.sparsity_in(1))   # Out: Error M_range_check
print(f.sparsity_out(0))  # Out: 1x1, should be 6x1
print(f.sparsity_out(1))  # Out: Error M_range_check
# Hence I'm only able to pass the full input matrix, and then it results in:
print(fx.sparsity_in(0))  # Out: 3x6
print(fx.sparsity_out(0)) # Out: 3x6, should at least be 2x6
# Last point: despite the dimension mismatch, it succeeds in computing a
# symbolic finite-difference Jacobian
J = Function('J',[x],[jacobian(f(x),x)])
print(J.sparsity_out(0))  # Out: 18x18, which is consistent with the rest
In conclusion, I am confused about what is actually computed by f in Python, since the output should not have the same dimensions as the input.
And, as a last point, I am not able to debug numerically, since the following:
_q = np.array([1,2,3,4,5,6])
_dq = np.array([1,2,3,4,5,6])
_ddq = np.array([1,2,3,4,5,6])
_x = vertcat(_q,_dq,_ddq).reshape((3,6))
print(f(_x))
produces a runtime error:
.../casadi/core/function_internal.cpp:2815: Failed to evaluate 'eval_dm' for f:
.../casadi/core/function_internal.cpp:1418: 'eval', 'eval_dm' not defined for External
Any help or advice would really be appreciated!
Thanks for the support,
FB