
Problem with fmincon and the gradient of the objective function


bgarab...@gmail.com

Oct 14, 2008, 12:15:17 PM
Hi,

My problem concerns the use of the gradient of the objective function
when solving an optimization problem with fmincon.

At the start of the project, I only used the finite-difference
evaluation of the gradient -> optimset('GradObj', 'off'). The results
are good, but the computation time is too long. This is why I want to
supply the analytic gradient of the objective function.

Now I set optimset('GradObj', 'on') and I have added a second output to my
objective function: [f, g] = objfun(x).
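
A minimal sketch of what such a two-output objective function looks like
(the quadratic here is only a placeholder; your own objfun will differ):

```matlab
function [f, g] = objfun(x)
% Objective value, plus its analytic gradient as a second output.
f = sum(x.^2);        % example objective: f(x) = x'*x
if nargout > 1        % compute the gradient only when fmincon asks for it
    g = 2*x;          % analytic gradient of the example objective
end
end
```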

fmincon doesn't give the same results and stops after the first
iteration. MATLAB displays the following output:

                              max                  Directional   First-order
 Iter F-count      f(x)    constraint  Step-size   derivative    optimality   Procedure
    0      61   1423.91       0.6347                                          Infeasible start point
    1     122   1423.91       0.6347       2            0           32.2      infeasible

Optimization terminated: no feasible solution found. Magnitude of search
direction less than 2*options.TolX but constraints are not satisfied.

I have tried several starting points, but I always get the same warning.

I don't understand this warning. Does someone have an idea?

thank you

Bruno Luong

Oct 14, 2008, 4:06:02 PM
This is a typical error encountered when the supplied gradient is wrong. Have you checked the calculation?

One way to do the check is this:

x <- a random starting point
g = grad(f)(x)
take h a "small" number
compute c(h) = [f(x + h*g) - f(x)] / (h*|g|^2)

c(h) should converge towards 1 as h -> 0. Usually you should get something like c(h) = 0.999999 (with five or six "9"s) before roundoff kicks in and c(h) degenerates to 0.
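
A minimal MATLAB sketch of this check (objfun and x0 are placeholders for
your own function and point; objfun is assumed to return [f, g]):

```matlab
% Ratio test for an analytic gradient: c(h) -> 1 as h -> 0.
[f0, g] = objfun(x0);           % objective value and supplied gradient
for h = 10.^(-2:-1:-8)
    fh = objfun(x0 + h*g);      % objective after a small step along g
    c  = (fh - f0) / (h * norm(g)^2);
    fprintf('h = %g   c(h) = %.8f\n', h, c);
end
% A c(h) that does not approach 1 indicates a wrong gradient.
```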

Bruno

Paul Kerr-Delworth

Oct 15, 2008, 4:53:11 AM
Hi,

I agree with Bruno that this may be due to an error in the supplied
gradient. An alternative way to check your supplied gradient is to use the
'DerivativeCheck' option for fmincon.

You can find more information on the 'DerivativeCheck' option in the
documentation
Go to http://www.mathworks.com/access/helpdesk/help/toolbox/optim/
and select Argument and Options Reference -> Optimization Options -> Options
Structure from the left hand menu.
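
In code, the check can be switched on like this (a sketch; objfun, confun
and x0 stand for your own objective, constraint function and starting point):

```matlab
% Compare the supplied gradient against finite differences at the
% initial point; fmincon reports the maximum discrepancy found.
options = optimset('GradObj', 'on', 'DerivativeCheck', 'on');
[x, fval] = fmincon(@objfun, x0, [], [], [], [], [], [], @confun, options);
```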

Hope this helps.

Best regards,

Paul


bgarab...@gmail.com

Oct 15, 2008, 8:42:25 AM
Hello,

thanks for your answers

When I check the gradient of the objective function with the option
optimset('DerivativeCheck', 'on'), I obtain the following results:

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
Diagnostic Information

Number of variables: 6

Functions
   Objective and gradient:            optimfcnchk/checkfun
   Hessian:                           finite-differencing (or Quasi-Newton)
   Nonlinear constraints:             optimfcnchk/checkfun
   Gradient of nonlinear constraints: finite-differencing

Constraints
Number of nonlinear inequality constraints: 20610
Number of nonlinear equality constraints: 0

Number of linear inequality constraints: 0
Number of linear equality constraints: 0
Number of lower bound constraints: 0
Number of upper bound constraints: 0

Algorithm selected
medium-scale


%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
End diagnostic information

Function derivative
Maximum discrepancy between derivatives = 1.83044e-006

                              max                  Directional   First-order
 Iter F-count      f(x)    constraint  Step-size   derivative    optimality   Procedure
    0       7   1486.52       0.2751                                          Infeasible start point
    1      15   1478.57      0.04466       1          -5.82       6.08e+008
    2      23   1478.57      0.04466       1            0          0.000117   infeasible

Optimization terminated: no feasible solution found. Magnitude of search
direction less than 2*options.TolX but constraints are not satisfied.


So the discrepancy between the analytic calculation of the gradient and
its finite-difference evaluation is small; in this case the gradient
appears to be correct.

It's very strange.

Bruno

Paul Kerr-Delworth

Oct 17, 2008, 6:22:43 AM
Hi Bruno,

It does appear that you have implemented the derivatives correctly. Without
seeing your code it is difficult to diagnose why your problem converges with
gradients switched off and not with them switched on. If you can post your
problem, we can take a look at it for you.

However, looking at the iterative output of your optimization again, it
appears that the initial point that you supply to fmincon is infeasible.
This may be the reason why your problem takes a long time to solve (when you
run it without using gradients). If you can specify an initial point which
is feasible, this may provide a speed up.

If you cannot easily specify an initial feasible point, you could try
solving a relaxed form of your original problem by loosening some of your
constraints. The solution of the relaxed problem can then be used as an
initial point for your original problem.
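
This two-stage approach can be sketched as follows (confun_relaxed is a
hypothetical variant of your constraint function with loosened tolerances):

```matlab
% Stage 1: solve the relaxed problem to obtain a feasible-ish point.
opts = optimset('GradObj', 'on');
x_relaxed = fmincon(@objfun, x0, [], [], [], [], [], [], ...
                    @confun_relaxed, opts);

% Stage 2: restart the original problem from the relaxed solution.
[x, fval] = fmincon(@objfun, x_relaxed, [], [], [], [], [], [], ...
                    @confun, opts);
```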

Hope this helps

Best regards,

Paul


Satish

Jun 8, 2009, 4:02:01 AM
Hi Michael, thanks for the effort. However, I guess I have not stated my doubt clearly. The links you suggest talk about checking whether the supplied gradient is correct, which I had already done; I do not get correct results, which indicates the gradient needs to be improved. The second link talks about printing the values, which at this point I am not interested in.

What I need most is a possible gradient function for the objective function I mentioned. I can try to find a good starting point for fmincon, and I know how to check whether the supplied gradient is correct with the DerivativeCheck option. But at the moment I cannot figure out what the gradient of the function could be, as I have tried a few without success.

Any further help would be very useful.

Thanks...
