I have an optimization problem that does not give a proper result: the optimization appears to get stuck in a local minimum.
I have tried to limit the optimization to a minimum step size, which should be large enough to step over the local minimum. I did an exhaustive search and found that the local minimum is only a few percent less than the start value, while the global minimum is about a factor of 100 less than the start value. Still, it won't find the global one.
fminunc gives a similar result.
I did the following (simplified script)
==============================================
p0 = 0;
options = optimset('DiffMinChange', 0.1);
[param, fval] = fminsearch(@(param) FuncEval(param, f), p0, options);
[param, fval] = fminunc(@(param) FuncEval(param, f), p0, options);

function val = FuncEval(param, f)
v = testfunction(100*param);
val = -sum((v.*f).^2);
end
==============================================
Does anybody have a clue how to solve this problem?
Thanx,
Kees
Try a global optimization approach:
http://www.mat.univie.ac.at/~neum/glopt/software_g.html
one possibility would be Direct:
www.mat.univie.ac.at/~neum/glopt/mss/BjoeH99.pdf
Regards,
Stefan
"kees de Kapper" <kees_de...@hotmail.com> wrote in message <gsjthu$3b7$1...@fred.mathworks.com>...
Many things to say here.
First of all, is f a vector? You are trying to
minimize
val = - sum((v.*f).^2)
so in effect, you are maximizing the sum of
squares of elements of (v.*f).
Are you sure that you really wish to MAXIMIZE
the sum of squares, and not MINIMIZE the sum
of squares? (lsqnonlin will be a better choice to
minimize a sum of squares.)
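If the goal really is to minimize the sum of squares, the call could look something like this (a sketch only; it assumes the residual vector is v.*f, with testfunction and f as in the original post):
==============================================
% lsqnonlin minimizes sum(r.^2) of the residual vector r,
% so return the residuals themselves, not a negated scalar sum.
resid = @(param) testfunction(100*param) .* f;
p0 = 0;
[param, resnorm] = lsqnonlin(resid, p0);
==============================================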
Next, there is no place in the help for fminsearch
where it tells you that diffminchange will have
any impact on the way fminsearch works. This is
good, because if you look at the code, you will
see that fminsearch has cheerfully ignored what
you have set. Telling fminsearch that your car is
blue and that the day is Tuesday will be of no
more impact than setting the value of a parameter
like diffminchange. Likewise, that parameter will
not be used for the purpose you have chosen,
even by fminunc.
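For reference, fminsearch honors only a handful of options (Display, TolX, TolFun, MaxIter, MaxFunEvals, FunValCheck, OutputFcn); everything else, DiffMinChange included, is silently ignored. A sketch of a call setting options fminsearch actually reads:
==============================================
% Only these options influence fminsearch; DiffMinChange does not.
options = optimset('Display', 'iter', ...  % print each simplex step
    'TolX',   1e-8, ...   % termination tolerance on param
    'TolFun', 1e-8, ...   % termination tolerance on the objective
    'MaxIter', 2000, 'MaxFunEvals', 4000);
[param, fval] = fminsearch(@(p) FuncEval(p, f), p0, options);
==============================================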
I'll next suggest that a global optimization, as
suggested by Stefan, is likely to be difficult to
use, especially by an obvious rank novice. And
there is really no purpose in throwing a
sophisticated tool at the problem if this is not
the problem the OP really wishes to solve anyway.
Solve the correct problem, and things will work
much better without recourse to a big hammer.
Finally, when an optimizer gets stuck at the
wrong local minimizer, the cause is almost
always that the starting values supplied are terrible.
Use somewhat reasonable starting values, and the
optimizer will converge. If you know that the
solution generated is crap, then your starting
values must be crap too! In the event that the
optimizer is diverging into a bad place, despite
having "good" starting values, one solution is to
use a constrained optimizer. This makes some
sense, but really it says that one of four things
has usually happened:
1. The starting values really were poor, even
though you thought they were decent.
2. Your code has a bug in it, and you are not
actually solving the problem you thought you
were solving.
3. Your data is poor, noisy, crap. You simply
cannot make a silk purse from a sow's ear. Desire
is not important.
4. You really do need a constrained optimizer.
This is rare, and often suggests that you actually
needed better starting values, and do not know
how to find them.
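A cheap safeguard against one bad start point is a crude multi-start: run fminsearch from several starting values and keep the best local minimum found. A sketch (the range of start points here is purely an assumption; it has to come from knowledge of the actual problem):
==============================================
% Crude multi-start: the start range below is only a guess.
starts = linspace(-10, 10, 21);
bestval = Inf;
for k = 1:numel(starts)
    [p, v] = fminsearch(@(p) FuncEval(p, f), starts(k));
    if v < bestval       % keep the best local minimum seen so far
        bestval = v;
        bestp = p;
    end
end
==============================================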
John