
nlinfit vs. lsqcurvefit


Christoph B.

Oct 7, 2002, 4:30:16 AM
Dear specialists!

I have a problem involving minimization of a sum of squared
differences (non-linear in the parameters). Should I use nlinfit or
lsqcurvefit, and what is the difference between these two functions?
In any case, I do not have the Statistics Toolbox.

Thanks a lot, all comments are appreciated.

Peter Perkins

Oct 7, 2002, 10:51:51 AM
Hi Christoph -

> I have a problem involving minimization of a sum of squared
> differences (non-linear in the parameters). Should I use nlinfit or
> lsqcurvefit, and what is the difference between these two functions?
> In any case, I do not have the Statistics Toolbox.

If you have the Optimization Toolbox but not the Statistics Toolbox, then you
will want to use either LSQCURVEFIT or LSQNONLIN. From your description, it
sounds like you should be using LSQCURVEFIT.

NLINFIT (in Stats) and LSQCURVEFIT are very similar and have the same calling
API -- you specify a "regression" function that returns fitted values, and
LSQCURVEFIT computes the residuals, squares them, and sums them up. LSQCURVEFIT
is a bit more powerful in that you can specify constraints on the coefficients.

With LSQNONLIN, on the other hand, you specify an "error" function that returns
errors (not fitted values), and LSQNONLIN squares them and sums them up.
LSQNONLIN is for cases when you can't write the errors as e = fitted - observed.

The help says:

    LSQCURVEFIT solves problems of the form:
    min sum {(FUN(X,XDATA)-YDATA).^2}  where X, XDATA, YDATA and the values
     X                                 returned by FUN can be vectors or
                                       matrices.

    LSQNONLIN solves problems of the form:
    min sum {FUN(X).^2}  where X and the values returned by FUN can be
     x                   vectors or matrices.

Notice that with LSQCURVEFIT, FUN takes coefficients and predictors (XDATA), and
you specify responses (YDATA) separately. The problem is structured in terms of
regression.

With LSQNONLIN, FUN takes coefficients, and anything else it needs to compute
the errors can have any structure at all -- you use LSQNONLIN's P1,P2,...
arguments to pass FUN anything you want. This is often, but not always,
predictor and response variables. LSQNONLIN solves problems that cannot be
written as regression, and is more general in this sense.
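
To make the distinction concrete, here is a sketch (the data, model, and
starting values are made up for illustration, and the anonymous-function
syntax shown requires a newer MATLAB release than the one current when this
thread started):

```matlab
% Fit y = b(1)*exp(b(2)*x) to data two ways.
xdata = (0:0.5:4)';
ydata = 2*exp(-0.8*xdata) + 0.05*randn(size(xdata));
b0    = [1; -1];                          % starting guess

% LSQCURVEFIT: FUN returns fitted values; YDATA is supplied separately.
model = @(b,xdata) b(1)*exp(b(2)*xdata);
bc = lsqcurvefit(model, b0, xdata, ydata);

% LSQNONLIN: FUN returns the errors themselves.
errfun = @(b) b(1)*exp(b(2)*xdata) - ydata;
bn = lsqnonlin(errfun, b0);               % should match bc
```

The LSQNONLIN form is the more general one: errfun could just as well compute
errors that are not expressible as fitted minus observed.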

Hope this helps

- Peter Perkins
The MathWorks, Inc.

Lars Gregersen

Oct 7, 2002, 1:49:29 PM
On 7 Oct 2002 01:30:16 -0700, christo...@web.de (Christoph B.)
wrote:

>Dear specialists!
>
>I have a problem involving minimization of a sum of squared
>differences (non-linear in the parameters). Should I use nlinfit or
>lsqcurvefit, and what is the difference between these two functions?
>In any case, I do not have the Statistics Toolbox.

Anyway, then you can't use the nlinfit function...

Use lsqcurvefit instead.

http://www.mathworks.com/access/helpdesk/help/toolbox/stats/nlinfit.shtml
http://www.mathworks.com/access/helpdesk/help/toolbox/optim/lsqcurvefit.shtml

Lars

Lars Gregersen
www.rndee.dk
l...@rndee.dk

Christoph B.

Oct 9, 2002, 3:54:31 AM
Peter Perkins <pperkins-...@mathworks.com> wrote in message news:<ans727$lah$1...@news.mathworks.com>...


Thanks a lot for your comments on my problem!

Maybe you also know whether there is a way to obtain summary
statistics (standard errors, etc.) for lsqcurvefit, such as are usually
delivered when carrying out a regression (linear as well as non-linear)?

Thanks!!!!

Peter Perkins

Oct 10, 2002, 1:22:52 PM
> Maybe you also know whether there is a way to obtain summary
> statistics (standard errors, etc.) for lsqcurvefit, such as are usually
> delivered when carrying out a regression (linear as well as non-linear)?

Hi Christoph -

The Optimization Toolbox has lots of powerful algorithms for doing optimization,
but because the functions there are not specifically aimed at doing statistical
inference, they do not return things like std. errs. You do get all the pieces
that you need, but you have to compute them yourself.
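
For instance, the computation might look like this (a sketch only; the
variable names are illustrative, and the degrees-of-freedom correction
assumes an unconstrained fit):

```matlab
% Ask LSQCURVEFIT for the residual norm and the Jacobian as well.
[b,resnorm,resid,exitflag,output,lambda,J] = ...
    lsqcurvefit(model, b0, xdata, ydata);

n      = numel(ydata);                    % number of observations
p      = numel(b);                        % number of parameters
sigma2 = resnorm/(n-p);                   % estimate of the error variance
covb   = sigma2 * inv(full(J)'*full(J));  % approx. covariance of b
se     = sqrt(diag(covb));                % standard errors of the coefficients
```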

On the other hand, NLINFIT in the Statistics Toolbox has helper functions that
give you confidence and prediction intervals, but it is a little less powerful
than LSQCURVEFIT and LSQNONLIN in the sense that you cannot use it to fit
constrained models.
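
For example, something along these lines (a sketch; the model and variable
names are illustrative, and the argument order follows the Statistics Toolbox
help of the time):

```matlab
[beta,resid,J] = nlinfit(xdata, ydata, model, b0);
ci = nlparci(beta, resid, J);                            % 95% CIs for the coefficients
[ypred,delta] = nlpredci(model, xdata, beta, resid, J);  % prediction band
```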

John D'Errico

Oct 11, 2002, 8:26:14 AM
In article <ao4d1c$p6s$1...@news.mathworks.com>, Peter Perkins
<pperkins-...@mathworks.com> wrote:

Of course, when you do have active constraints on some
parameters in a model, the simple tools for confidence
estimation are no longer truly appropriate anyway.

John

--

Jason Park

Nov 15, 2010, 3:43:04 AM
Well, it's been years since this was posted, so I don't know whether anyone would respond to my question here.

John, since you've mentioned:

> Of course, when you do have active constraints on some
> parameters in a model, the simple tools for confidence
> estimation are no longer truly appropriate anyway

could you explain more about WHY that is so? And if there is any relevant reference, could you kindly let me know?

I fitted my model to the data with some constraints, and having calculated the covariance matrix estimate using the finite-difference Jacobian that LSQNONLIN yielded, none of the parameter estimates came out significant, and I freaked out! Maybe the reason is that the resulting covariance matrix estimate was not calculated at the global minimum because of the constraints, but I'm not very certain about my intuition.

Regards,
Jason

Donnacha

Feb 7, 2014, 6:45:07 AM
And even more years pass...

The statistics toolbox has nlinfit.m and fitnlm.m
The optimization toolbox has lsqcurvefit.m and lsqnonlin.m

Could someone tell me whether the previous (12-year-old) answer is still up to date and, if not, what the pros and cons of these fitters are?

DD


"Jason Park" wrote in message <ibqrqo$q1r$1...@fred.mathworks.com>...