The blog post mentions that general callback nonlinearities acting on parameters (functions such as exp, log, etc.) can currently only act on simple parametric variables, i.e., not on any kind of compound expression involving parameters.
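If I read that rule correctly, the distinction would be something like this (untested sketch, with variable names I made up):

a = sdpvar(1,1); y = sdpvar(1,1);
Pok  = optimizer([y >= exp(a)], y, [], a, y);    % parameter a enters exp() directly -> allowed by the rule
Pbad = optimizer([y >= exp(2*a)], y, [], a, y);  % compound expression 2*a inside exp() -> presumably hits the same error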
So apparently this restriction also applies to inequality/constraint expressions involving parameters, like
>> b=sdpvar(2,3,'full'); blim=sdpvar(1,1);
>> P=optimizer([blim<=b<=1-blim], sum(b.^2), [], blim, b); % this is OK
>> P=optimizer([blim<=b<=1-blim], -entropy(b), [], blim, b); % this fails
Error using optimizer (line 303)
Parameters are currently only allowed to enter function such as exp, sin etc as exp(a), sin(b) etc.
I just want to verify that this is the intended behavior, not a bug.
Now, in my actual problem, I somehow bypassed this error and created an optimizer object, but the optimizer then crashes when I call it with a parameter value, basically like this:
>> x=sdpvar(1,3); xlim=sdpvar(3,1); b=sdpvar(5,3,'full'); c=sdpvar(3,1); cons=[mean(b)==x, c >= x(:) .* xlim];
>> P=optimizer(cons, -sum(entropy(b))+sum(c), [], xlim, {x,c})
Optimizer object with 3 inputs (1 blocks) and 6 outputs (2 blocks). Solver: FMINCON-STANDARD
>> P([0.1 0.2 0.3]')
Matrix dimensions must agree.
Error in optimizer/subsref (line 302)
self.model.evalMap{k}.variableIndex = find(self.model.evalMap{k}.variableIndex == keptvariablesIndex);
Error in optimizer/subsref (line 9)
[varargout{1:nargout}] = subsref(self,subs);
But if I change the objective to something simple, then it works:
>> P=optimizer(cons, -sum(sum(b.^2))+sum(c), [], xlim, {x,c})
Optimizer object with 3 inputs (1 blocks) and 6 outputs (2 blocks). Solver: FMINCON-STANDARD
>> P([0.1 0.2 0.3]')
ans =
1×2 cell array
{1×3 double} {3×1 double}
Is this a bug?