
fminsearch sticks in local minimum


kees de Kapper

Apr 21, 2009, 3:45:02 AM
Hi all,

I have an optimization problem that does not give a proper result, because the optimization appears to get stuck in a local minimum.

I have tried to limit the optimization to a minimum step size, which should be large enough to step over the local minimum. I did an exhaustive search and found that the local minimum is only a few percent below the start value, while the global minimum is about a factor of 100 below it. Still, the optimizer won't find the global one.
fminunc gives a similar result.

I did the following (simplified script):
==============================================
p0 = 0;
options = optimset( 'DiffMinChange', 0.1);
[param, fval]=fminsearch(@(param) FuncEval(param, f), p0, options);

[param, fval]=fminunc(@(param) FuncEval(param, f), p0, options);

function val = FuncEval(param, f)
v = testfunction(100*param);
val = -sum((v.*f).^2);
==============================================

Does anybody have a clue how to solve this problem?

Thanx,
Kees

Stefan

Apr 21, 2009, 4:19:01 AM
Hi there,

try a global optimization approach:

http://www.mat.univie.ac.at/~neum/glopt/software_g.html

One possibility would be DIRECT:

www.mat.univie.ac.at/~neum/glopt/mss/BjoeH99.pdf

Regards,
Stefan


"kees de Kapper" <kees_de...@hotmail.com> wrote in message <gsjthu$3b7$1...@fred.mathworks.com>...

John D'Errico

Apr 21, 2009, 7:36:02 AM
"kees de Kapper" <kees_de...@hotmail.com> wrote in message <gsjthu$3b7$1...@fred.mathworks.com>...


Many things to say here.

First of all, is f a vector? You are trying to
minimize

val = - sum((v.*f).^2)

so in effect, you are maximizing the sum of
squares of elements of (v.*f).

Are you sure that you really wish to MAXIMIZE
the sum of squares, and not MINIMIZE the sum
of squares? (lsqnonlin will be a better choice to
minimize a sum of squares.)
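For instance, if minimizing the sum of squares really is what was meant, a sketch along these lines might apply (FuncEval, testfunction, and f are just the placeholders from the original post; lsqnonlin squares and sums the residual vector internally, so the objective returns the residuals themselves, not a scalar sum):

```matlab
% Hypothetical sketch, reusing the OP's simplified setup.
% lsqnonlin minimizes sum(resid(param).^2), so the objective
% returns the residual vector, not -sum((v.*f).^2).
resid = @(param) testfunction(100*param) .* f;  % residual vector
p0 = 0;                                         % starting guess
[param, resnorm] = lsqnonlin(resid, p0);        % resnorm = sum of squares at the solution
```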

Next, there is no place in the help for fminsearch
where it tells you that DiffMinChange will have
any impact on the way fminsearch works. This is
good, because if you look at the code, you will
see that fminsearch has cheerfully ignored what
you have set. Telling fminsearch that your car is
blue and that the day is Tuesday will be of no
more impact than setting the value of a parameter
like DiffMinChange. Likewise, even fminunc will
not use that parameter for the purpose you have
chosen: it controls the minimum step for
finite-difference gradient estimates, not the
minimum step the optimizer takes.

I'll next suggest that a global optimization, as
suggested by Stefan, is likely to be difficult to
use, especially by an obvious rank novice. And
there is really no purpose in throwing a
sophisticated tool at the problem if this is not
the problem the OP really wishes to solve anyway.
Solve the correct problem, and things will work
much better without recourse to a big hammer.

Finally, when an optimizer gets stuck at the
wrong local minimizer, the real problem is almost
always that the starting values supplied are terrible.
Use somewhat reasonable starting values, and the
optimizer will converge. If you know that the
solution generated is crap, then your starting
values must be crap too! In the event that the
optimizer is diverging into a bad place, despite
having "good" starting values, one solution is to
use a constrained optimizer. This makes some
sense, but really it says that one of four things
has usually happened:

1. The starting values really were poor, even
though you thought they were decent.

2. Your code has a bug in it, and you are not
actually solving the problem you thought you
were solving.

3. Your data is poor, noisy, crap. You simply
cannot make a silk purse from a sow's ear. Desire
is not important.

4. You really do need a constrained optimizer.
This is rare, and often suggests that you actually
needed better starting values, and do not know
how to find them.

John

Dave

Jul 13, 2015, 8:08:12 PM
"Kees de Kapper" wrote in message <gsjthu$3b7$1...@fred.mathworks.com>...
_____________________________________________

Man, that last reply was a bit harsh. To all those out there actually trying to solve your problems:

If you can find the step size in the optimization function's (fminunc's) code, you can save your own version of the file with a different step size. The variable name probably has "delta" or something similar in it.

Otherwise, a parameter search can work: pick the best result from combinations of array inputs, then use values close to it as the start of your actual optimization. Before you decide on a strategy, plot each of your variables against the quantity you're optimizing, to see whether you have smooth curves. If it's some crazy function with a ton of local solutions, good luck with any local optimizer; you'll need something that tests the whole range of values. Or you can code it to run multiple times with multiple start values, which should still be way faster than full arrays.
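A minimal sketch of running from multiple start values and keeping the best result (FuncEval and f are the placeholders from the original post; the start range here is made up):

```matlab
% Simple multistart strategy: run fminsearch from several starting
% values and keep whichever local solution has the lowest objective.
starts = linspace(-10, 10, 21);   % candidate starting values (assumed range)
bestFval  = Inf;
bestParam = NaN;
for p0 = starts
    [p, fv] = fminsearch(@(param) FuncEval(param, f), p0);
    if fv < bestFval
        bestFval  = fv;   % best objective value so far
        bestParam = p;    % corresponding minimizer
    end
end
```

Each fminsearch run still only finds a local minimizer; the loop simply keeps the best of them.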

I, for one, need my code to run no matter how crappy the starting values are, since it's actually used by other people, with who knows what input. I'm using fminsearch because it comes standard with MATLAB. A larger step size produced crap results for me, so I recommend running with multiple start values.

Cheers, all!

John D'Errico

Jul 13, 2015, 9:57:11 PM
"Dave" <kittc...@yahoo.com> wrote in message <mo1jt3$7i8$1...@newscl01ah.mathworks.com>...

Of course, you just replied to a question that had been
untouched for 6 years. I wonder what the odds are that
the person who originally asked the question will actually
read your response?

> Man, that last reply was a bit harsh. To all those out there actually trying to solve your problems:
>

I gave a lot of good advice. More than you did.

> If you can find the step size in the optimization function's (fminunc's) code, you can save your own version of the file with a different step size. The variable name probably has "delta" or something similar in it.
>

User edits of fminunc, a TMW-provided code, are a TERRIBLE
recommendation. This will surely encourage some people to create
buggy versions of the code, which they will then forget they changed.

For example, if anyone edits some of my provided software and
THEN has a problem with it, it is THEIR code. If you edit the code,
you own any problems you have created.


> Otherwise, a parameter search can work: pick the best result from combinations of array inputs, then use values close to it as the start of your actual optimization. Before you decide on a strategy, plot each of your variables against the quantity you're optimizing, to see whether you have smooth curves. If it's some crazy function with a ton of local solutions, good luck with any local optimizer; you'll need something that tests the whole range of values. Or you can code it to run multiple times with multiple start values, which should still be way faster than full arrays.
>
> I, for one, need my code to run no matter how crappy the starting values are, since it's actually used by other people, with who knows what input. I'm using fminsearch because it comes standard with MATLAB. A larger step size produced crap results for me, so I recommend running with multiple start values.
>

If you need your code to solve the problem, no matter how
crappy your starting values are, then don't use fminsearch!

Use a global solver. fminsearch is NOT a global solver, and
it never will be. There is no expectation that fminsearch
will do any better than a local solution, and often it will
not even manage that.
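One sketch of what a global solver can look like, using GlobalSearch from the Global Optimization Toolbox (FuncEval and f are the placeholders from the original post, and the bounds here are made up):

```matlab
% GlobalSearch runs fmincon from many scattered start points
% within the bounds and returns the best local solution found.
problem = createOptimProblem('fmincon', ...
    'objective', @(param) FuncEval(param, f), ...
    'x0', 0, 'lb', -10, 'ub', 10);
gs = GlobalSearch;
[param, fval] = run(gs, problem);
```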

John