My problem concerns supplying the gradient of the objective function
when solving an optimization problem with fmincon.
At the start of the project, I only used the finite-difference
evaluation of the gradient, i.e. optimset('GradObj', 'off'). The results
are good, but the computation time is too long. That is why I now want
to supply the gradient of the objective function myself.
So I set optimset('GradObj', 'on') and added a second output to my
objective function: [f, g] = objfun(x).
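For reference, a gradient-supplying objective typically has this shape (a minimal sketch with a made-up quadratic objective, since the actual function is not shown in the thread):

```matlab
function [f, g] = objfun(x)
% Hypothetical example objective: f(x) = x'*x + x(1)*x(2).
f = x.'*x + x(1)*x(2);
if nargout > 1      % fmincon requests the gradient only when GradObj is 'on'
    g = 2*x;
    g(1) = g(1) + x(2);
    g(2) = g(2) + x(1);
end
```

The nargout guard matters: with GradObj set to 'off', fmincon calls the function with a single output and the gradient branch is never executed.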
fmincon doesn't give the same results and stops after the first
iteration. MATLAB displays the following output:
                                  max                 Directional  First-order
 Iter F-count      f(x)       constraint  Step-size    derivative   optimality  Procedure
    0      61    1423.91        0.6347                                          Infeasible start point
    1     122    1423.91        0.6347        2             0          32.2     infeasible
Optimization terminated: no feasible solution found. Magnitude of search
direction less than 2*options.TolX but constraints are not satisfied.
I have tried several starting points, but I always get the same warning,
and I don't understand it. Does someone have an idea?
Thank you
One way to do the check is this:
x <- random starting point
g = grad(f)(x)
take h "small"
compute c(h) = [f(x + h*g) - f(x)] / (h*|g|^2)
This c(h) should converge towards 1 as h -> 0. Usually you should get something like c(h)=0.999999 (with five or six "9"s) before roundoff kicks in and c(h) degenerates to 0.
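In MATLAB, the check above could be sketched like this (assuming the poster's [f, g] = objfun(x) interface and 6 variables, as reported later in the thread):

```matlab
x = randn(6, 1);                     % random starting point
[f0, g] = objfun(x);                 % value and supplied gradient at x
for h = 10.^(-(1:10))                % shrink h and watch c(h) approach 1
    c = (objfun(x + h*g) - f0) / (h * norm(g)^2);
    fprintf('h = %g   c(h) = %.8f\n', h, c);
end
```

If c(h) does not approach 1 before roundoff takes over, the supplied gradient is wrong.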
Bruno
I agree with Bruno that this may be due to an error in the supplied
gradient. An alternative way to check your supplied gradient is to use the
'DerivativeCheck' option for fmincon.
You can find more information on the 'DerivativeCheck' option in the
documentation: go to http://www.mathworks.com/access/helpdesk/help/toolbox/optim/
and select Argument and Options Reference -> Optimization Options -> Options
Structure from the left-hand menu.
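In practice the check is enabled alongside the supplied gradient; a sketch (the constraint arguments are placeholders for whatever the actual problem uses):

```matlab
options = optimset('GradObj', 'on', 'DerivativeCheck', 'on');
% A, b, Aeq, beq, lb, ub, and nonlcon stand in for the real problem data.
[x, fval] = fmincon(@objfun, x0, A, b, Aeq, beq, lb, ub, @nonlcon, options);
```

fmincon then compares the supplied gradient against a finite-difference estimate at the initial point and reports the maximum discrepancy before the iterations start.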
Hope this helps.
Best regards,
Paul
"Bruno Luong" <b.l...@fogale.findmycountry> wrote in message
news:gd2u3a$nm6$1...@fred.mathworks.com...
Thanks for your answers.
When I check the gradient of the objective function with the option
optimset('DerivativeCheck', 'on'), I obtain the following results:
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
Diagnostic Information
Number of variables: 6
Functions
Objective and gradient: optimfcnchk/checkfun
Hessian: finite-differencing (or Quasi-Newton)
Nonlinear constraints: optimfcnchk/checkfun
Gradient of nonlinear constraints: finite-differencing
Constraints
Number of nonlinear inequality constraints: 20610
Number of nonlinear equality constraints: 0
Number of linear inequality constraints: 0
Number of linear equality constraints: 0
Number of lower bound constraints: 0
Number of upper bound constraints: 0
Algorithm selected
medium-scale
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
End diagnostic information
Function derivative
Maximum discrepancy between derivatives = 1.83044e-006
                                  max                 Directional  First-order
 Iter F-count      f(x)       constraint  Step-size    derivative   optimality  Procedure
    0       7    1486.52        0.2751                                          Infeasible start point
    1      15    1478.57       0.04466        1         -5.82       6.08e+008
    2      23    1478.57       0.04466        1             0        0.000117   infeasible
Optimization terminated: no feasible solution found. Magnitude of search
direction less than 2*options.TolX but constraints are not satisfied.
So the difference between the analytic calculation of the gradient and
its finite-difference evaluation is, in this case, acceptable.
It's very strange.
Bruno
It does appear that you have implemented the derivatives correctly. Without
seeing your code it is difficult to diagnose why your problem converges with
gradients switched off and not with them switched on. If you can post your
problem, we can take a look at it for you.
However, looking at the iterative output of your optimization again, it
appears that the initial point that you supply to fmincon is infeasible.
This may be the reason why your problem takes a long time to solve (when you
run it without using gradients). If you can specify an initial point which
is feasible, this may provide a speed up.
If you cannot easily specify an initial feasible point, you could try
solving a relaxed form of your original problem by loosening some of your
constraints. The solution of the relaxed problem can then be used as an
initial point for your original problem.
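One way to sketch that relaxation idea in MATLAB (the tolerance value and the nonlcon name are assumptions, since the actual constraints are not shown in the thread):

```matlab
% Solve the relaxed problem first, then warm-start the original one:
%   xFeas = fmincon(@objfun, x0,    [],[],[],[],[],[], @relaxedcon, options);
%   x     = fmincon(@objfun, xFeas, [],[],[],[],[],[], @nonlcon,    options);

function [c, ceq] = relaxedcon(x)
% Loosen the original nonlinear inequalities from c(x) <= 0 to c(x) <= tol.
tol = 0.1;                  % assumed slack; problem-dependent
[c, ceq] = nonlcon(x);      % the original nonlinear constraint function
c = c - tol;                % c - tol <= 0  is equivalent to  c <= tol
```

If the relaxed solve ends with max(c) <= 0 for the original constraints, its solution is a feasible starting point for the unrelaxed problem.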
Hope this helps
Best regards,
Paul
<bgarab...@gmail.com> wrote in message
news:c30434de-a915-4b2b...@u29g2000pro.googlegroups.com...
What I need most is a possible gradient function for the objective
function I mentioned. I can try to choose a starting point for fmincon,
and I know how to check a supplied gradient with the DerivativeCheck
option, but at the moment I cannot figure out what the gradient of that
function could be; I have tried a few candidates without success.
Any further help would be of a lot of use.
Thanks...