I am trying to follow the documentation at
http://www.mathworks.com/help/toolbox/optim/ug/brhkghv-7.html
but I am having trouble. I have a complicated objective to minimize
subject to complicated nonlinear constraints, all of which are
functions of my three choice variables and a wide variety of parameters.
My script that starts everything looks like this:
% define a bunch of parameters
gamma= .2;
beta = .3;
x0=[.5 .5 .5];
%etc
solution = nested_minimization_program(x0,gamma,beta)
The function nested_minimization_program looks like this:

function out = nested_minimization_program(x0,gamma,beta)
options = optimset('GradObj','on');
out = fmincon(@objective,x0,[],[],[],[],[0 0 0],[1 1 1],@nonlin,options);
    function [obj, obj_gradient] = objective(x)
        [obj, obj_gradient] = complicated_objective(x,gamma,beta);
    end
    function [ineq_constraint, eq_constraint] = nonlin(x)
        [ineq_constraint, eq_constraint] = complicated_constraints(x,beta,gamma);
    end
end
The file complicated_objective returns the value of the
objective as its first output and the value of the analytical gradient
as its second. complicated_constraints returns a vector of nonlinear
inequality constraints as its first output and a vector of
nonlinear equality constraints as its second.
The reason to do this is so that I can use the @objective and @nonlin
syntax for fmincon; objective and nonlin are only functions of x, not
of the parameters, because they are subfunctions of a function that
has been passed the parameters already. I believe this is the form I
should use in order to pass the gradient and the nonlinear constraints
on to fmincon. My problem is that when I run this code, I get the
following error
>Warning: Trust-region-reflective algorithm does not solve this type of problem,
>using active-set algorithm. You could also try the interior-point or sqp
>algorithms: set the Algorithm option to 'interior-point' or 'sqp' and rerun.
>For more help, see Choosing the Algorithm in the documentation.
I.e., for some reason fmincon is leaving the trust-region-reflective
algorithm and going to active-set, which does not make use of my
analytical gradient. The requirements for fmincon to use analytical
gradients are, according to http://www.mathworks.com/help/toolbox/optim/ug/brhkghv-3.html:
>Write code that returns:
> The objective function (scalar) as the first output
> The gradient (vector) as the second output
>Set the GradObj option to 'on' with optimset.
objective returns a scalar value of the objective and a gradient as
required, and GradObj is turned on, so I don't see my problem.
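As a minimal sketch of the two-output pattern the documentation describes (the quadratic objective here is invented for illustration, not the real problem):

```matlab
% Minimal sketch: objective returning value and analytical gradient.
function [f, g] = myobj(x)
    f = sum(x.^2);        % scalar objective as the first output
    g = 2*x(:);           % gradient vector as the second output
end

% Caller: turn the gradient option on and pass the handle to fmincon.
% opts = optimset('GradObj','on');
% x = fmincon(@myobj, [.5 .5 .5], [],[],[],[], [0 0 0], [1 1 1], [], opts);
```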
You mean "warning".
> IE, for some reason fmincon is leaving the Trust-region-reflective
> algorithm
============================
because trust-region only supports
xor(bound constraints, linear equality constraints)
as mentioned in the FMINCON doc page.
> and going to active set, which does not make use of my
> analytical gradient.
====================
Active set will not ignore your analytical gradient.
It simply doesn't require you to supply it (unlike trust-region)
http://www.mathworks.com/help/toolbox/optim/ug/brhkghv-3.html#bsj1e55
However, you might try using interior-point as the warning message suggests to see if performance improves.
The active-set algorithm does make use of your gradient calculation
assuming that
1. You set GradObj to 'on'
2. You set GradConstr to 'on'
You might also want to try setting Algorithm to 'interior-point', as
recommended in the link above.
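A sketch of the option settings described in points 1 and 2 above, using the older optimset interface that appears elsewhere in this thread:

```matlab
% Enable both gradient options and pick the suggested algorithm.
options = optimset('Algorithm','interior-point', ...
                   'GradObj','on', ...      % objective supplies its own gradient
                   'GradConstr','on');      % nonlcon supplies constraint gradients
% out = fmincon(@objective, x0, [],[],[],[], lb, ub, @nonlin, options);
```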
It is possible that your nonlinear constraints are not written
correctly, since you indicate that you have more than one, and that you
give a vector as the gradient of the constraints. As described here:
http://www.mathworks.com/help/toolbox/optim/ug/brhkghv-11.html#brhkghv-16
the gradient of the constraints should be a matrix, with the same number
of columns as the number of constraints, and the number of rows is the
dimension of your vector x.
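A sketch of the shape requirement just described, with x in R^3 and two inequality constraints (the constraints themselves are invented for illustration):

```matlab
function [c, ceq, gradc, gradceq] = nonlin_example(x)
    c   = [x(1)^2 + x(2)^2 - 1;    % two inequality constraints, c <= 0
           x(3) - x(1)*x(2)];
    ceq = [];                      % no equality constraints in this sketch
    gradc = [2*x(1),  -x(2);       % column j is the gradient of c(j):
             2*x(2),  -x(1);       % 3 rows (dimension of x) by
             0,        1    ];     % 2 columns (number of constraints)
    gradceq = [];
end
```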
Good luck,
Alan Weiss
MATLAB mathematical toolbox documentation
But as I understand things, you do not have to do both, if you only want to supply an analytical gradient for one or the other....
I'm not sure I understand; I am not supplying any linear equality
constraints - those matrices are empty. I am using upper and lower
bounds and nonlinear constraints. Is the problem therefore the very
use of a nonlinear constraint, i.e., will the trust-region method not
work in my case in any event?
As for your advice to try interior point, I am again having some
trouble that I think stems from the use of parameters. Here is my
code:
---
function [out, fval] = nested_ramsey_minimization(b,g,lambda,gamma,rho,kavg,beta,delta,firm_alpha,initial_guess)
options = optimset('Algorithm','interior-point', ...
    'GradObj','on','GradConstr','on', ...
    'Hessian','user-supplied','HessFcn',@hessianfn);
A = [];
littleb = [];
Aeq = [];
littlebeq = [];
% [out, fval] = fmincon(@ramsey_obj,initial_guess,A,littleb,Aeq,littlebeq, ...
%     [0 0 0],[1 1 1],@nonlin,options);
[out, fval] = fmincon(@ramsey_obj,initial_guess,A,littleb,Aeq,littlebeq, ...
    [],[],@nonlin,options);
    function [obj_value, obj_gradient] = ramsey_obj(x)
        [obj_value, obj_gradient] = ramsey_direct_minimization(x,lambda,gamma,rho,kavg,beta);
    end
    function [cineq, ceq, gradc, gradceq] = nonlin(x)
        [cineq, ceq, gradc, gradceq] = ramsey_direct_constraints(x,b,g,lambda,gamma,rho,kavg,firm_alpha,delta,beta);
    end
    % Separate subfunction for the Hessian when using interior-point
    function out = hessianfn(x)
        out = ramsey_hessian(x,lambda,gamma,rho,kavg);
    end
end
---
What is different from before: the objective function @ramsey_obj now
supplies only the scalar objective and the gradient vector. The
Hessian is supplied by a separate function @hessianfn. This new
function is again defined as a subfunction so that it is only a
function of x, since the parameters required by ramsey_hessian have
been passed to nested_ramsey_minimization.
When I run this code I receive the following error:
--
??? Error using ==> nested_ramsey_minimization>hessianfn
Too many input arguments.
---
I don't understand the problem; the only input to hessianfn is x,
which is a 3-vector, and in the code for ramsey_hessian, only x(1)
through x(3) are used. So I don't know what this error is really
telling me.
Thanks very much for your help!
It is correct that I have two nonlinear constraints, and they are
returned in a vector. The 'gradient' for these constraints is then
returned as a 3x2 matrix, which I believe is correct. So I don't
think that is the problem. You are correct that it was not clear to
me that the trust-region methods do not support nonlinear constraints,
so I am attempting to rewrite as an interior-point problem, but as
detailed in my reply above, that is not quite working yet either.
Thanks for your help!
Yes.
> --
> ??? Error using ==> nested_ramsey_minimization>hessianfn
> Too many input arguments.
> ---
>
> I don't understand the problem; the only input to hessianfn is x,
> which is a 3-vector, and in the code for ramsey_hessian, only x(1)
> through x(3) are used. So I don't know what this error is really
> telling me.
=================
hessianfn is not supposed to be a function only of x. It is supposed to be a function of two arguments
hessian = hessianfn(x, lambda)
as described here
http://www.mathworks.com/help/toolbox/optim/ug/fmincon.html#brh002z
The error message is saying that FMINCON wants to pass more than 1 argument to hessianfn, but you have not allowed it to.
Granted, the FMINCON documentation could be clearer about this. It just says that the HessianFcn has to return a Hessian. It doesn't tell you that it's supposed to be the Hessian of the Lagrangian, as opposed to just the objective function.
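To make the two-argument requirement concrete, here is a hedged sketch of a HessFcn with the expected signature; lambda is a structure with fields ineqnonlin and eqnonlin, and hess_obj, hess_c, and hess_ceq are hypothetical helpers standing in for the user's own second-derivative code:

```matlab
% Sketch: Hessian of the Lagrangian, not of the objective alone.
function H = hessianfn(x, lambda)
    H = hess_obj(x);                          % Hessian of the objective
    for i = 1:numel(lambda.ineqnonlin)        % add lambda_i * Hessian of c(i)
        H = H + lambda.ineqnonlin(i) * hess_c(x, i);
    end
    for j = 1:numel(lambda.eqnonlin)          % add lambda_j * Hessian of ceq(j)
        H = H + lambda.eqnonlin(j) * hess_ceq(x, j);
    end
end
```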
I see, this makes it much more clear what is required. Thanks very
much for your help, I think I see what I need to do.