
lsqcurvefit versus fminsearchbnd


Matthew

Nov 3, 2011, 3:39:11 PM
When is it appropriate to use lsqcurvefit versus fminsearchbnd?

I am fitting around 11 parameters, and lsqcurvefit seems to be superior. I am simply inquiring why.

Thanks

Alan Weiss

Nov 4, 2011, 10:29:42 AM
The algorithm in lsqcurvefit uses gradient information and is usually
much faster and more reliable. fminsearchbnd is based on fminsearch,
which uses the Nelder-Mead simplex algorithm. That algorithm does not
use gradients, performs poorly in more than a few dimensions, and is
not guaranteed to converge.

Alan Weiss
MATLAB mathematical toolbox documentation
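
For concreteness, here is a minimal sketch of how the same bounded fit is
posed for each solver. The two-parameter exponential model, the synthetic
data, and the bounds are assumed purely for illustration (they are not
Matthew's problem); fminsearchbnd is John's File Exchange function, and
lsqcurvefit requires the Optimization Toolbox.

    % Assumed model y = a*exp(-b*x) with synthetic noisy data
    model = @(p, x) p(1) * exp(-p(2) * x);
    xdata = linspace(0, 5, 50)';
    ydata = model([2; 0.7], xdata) + 0.05 * randn(size(xdata));

    p0 = [1; 1];      % initial guess
    lb = [0; 0];      % lower bounds
    ub = [10; 10];    % upper bounds

    % lsqcurvefit: pass the model and data directly; the solver exploits
    % the least-squares structure with a gradient-based algorithm.
    p_lsq = lsqcurvefit(model, p0, xdata, ydata, lb, ub);

    % fminsearchbnd: minimize a scalar objective, so the sum of squared
    % residuals must be formed by hand; Nelder-Mead, no gradients used.
    sse = @(p) sum((model(p, xdata) - ydata).^2);
    p_nm = fminsearchbnd(sse, p0, lb, ub);
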

John D'Errico

Nov 4, 2011, 12:46:13 PM
"Matthew" <matthew.mer...@utsouthwestern.edu> wrote in message <j8uqkv$6rg$1...@newscl01ah.mathworks.com>...
> When is it appropriate to use lsqcurvefit versus fminsearchbnd?
>
> I am fitting around 11 parameters, and lsqcurvefit seems to be superior. I am simply inquiring why.
>
> Thanks

In addition to Alan's comments, I will NEVER use fminsearch, or its
bounded cousin, fminsearchbnd, on large problems. "Large" is of
course in the eye of the beholder, but a typical upper limit of
practicality is somewhere around 6 variables, with 3 variables being
a more realistic limit for efficient behavior.

11 variables is well beyond my limit in any case.

John