I have a problem involving minimization of a sum of squared
differences (non-linear in the parameters). Should I use nlinfit or
lsqcurvefit, and what is the difference between these two functions?
Also, I don't have the Statistics Toolbox.
Thanks a lot, all comments are appreciated.
> I have a problem involving minimization of a sum of squared
> differences (non-linear in the parameters). Should I use nlinfit or
> lsqcurvefit, and what is the difference between these two functions?
> Also, I don't have the Statistics Toolbox.
If you have the Optimization Toolbox, but not the Statistics Toolbox, then you
will want to use either LSQCURVEFIT or LSQNONLIN. From your description, it
sounds like you should be using LSQCURVEFIT.
NLINFIT (in Stats) and LSQCURVEFIT are very similar and have the same calling
API -- you specify a "regression" function that returns fitted values, and
LSQCURVEFIT computes the residuals, squares them, and sums them up. LSQCURVEFIT
is a bit more powerful in that you can specify constraints on the coefficients.
With LSQNONLIN, on the other hand, you specify an "error" function that returns
errors (not fitted values), and LSQNONLIN squares them and sums them up.
LSQNONLIN is for cases when you can't write the errors as e = fitted - observed.
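To make the distinction concrete, here is a minimal sketch. The exponential model and the data are made up for illustration, and I'm using anonymous functions for brevity -- on older MATLAB releases you would put the functions in M-files instead:

```matlab
% Hypothetical data for the model y = b(1)*exp(-b(2)*x)
xdata = (0:9)';
ydata = 2*exp(-0.5*xdata) + 0.01*randn(10,1);
b0 = [1; 1];                        % initial guess for the coefficients

% LSQCURVEFIT: the model function returns FITTED VALUES;
% the solver forms the residuals model(b,xdata) - ydata itself.
model = @(b,x) b(1)*exp(-b(2)*x);
b1 = lsqcurvefit(model, b0, xdata, ydata);

% LSQNONLIN: the function returns the ERRORS directly,
% and the solver only squares and sums them.
errfun = @(b) b(1)*exp(-b(2)*xdata) - ydata;
b2 = lsqnonlin(errfun, b0);         % b2 should agree with b1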
The help says:
LSQCURVEFIT solves problems of the form:

    min  sum {(FUN(X,XDATA)-YDATA).^2}
     X

where X, XDATA, YDATA and the values returned by FUN can be vectors or
matrices.

LSQNONLIN solves problems of the form:

    min  sum {FUN(X).^2}
     X

where X and the values returned by FUN can be vectors or matrices.
Notice that with LSQCURVEFIT, FUN takes coefficients and predictors (XDATA), and
you specify responses (YDATA) separately. The problem is structured in terms of
regression.
With LSQNONLIN, FUN takes coefficients, and anything else it needs to compute the
errors can have any structure at all -- you use LSQNONLIN's P1,P2,... arguments
to pass FUN anything you want. This is often, but not always, predictor and
response variables. LSQNONLIN solves problems that cannot be written as
regression, and is more general in this sense.
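As an example of a problem that is not naturally a regression, consider solving a small nonlinear system in the least-squares sense (this is essentially the textbook LSQNONLIN example; treat the numbers as illustrative):

```matlab
% There is no "fitted - observed" structure here: the residuals are
% the amounts by which the two equations x1 = exp(-x1) and
% x2 = exp(-x2) (rewritten below) fail to hold.
fun = @(x) [2*x(1) - x(2) - exp(-x(1));
            -x(1) + 2*x(2) - exp(-x(2))];
x0 = [-5; -5];              % starting point
x = lsqnonlin(fun, x0);     % converges to approximately [0.5671; 0.5671]
```

On the MATLAB releases current at the time you would pass any extra data FUN needs through LSQNONLIN's trailing P1,P2,... arguments; here the function simply needs no extra data.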
Hope this helps
- Peter Perkins
The MathWorks, Inc.
>Dear specialists!
>
>I have a problem involving minimization of a sum of squared
>differences (non-linear in the parameters). Should I use nlinfit or
>lsqcurvefit, and what is the difference between these two functions?
>Also, I don't have the Statistics Toolbox.
Then you can't use the nlinfit function...
Use lsqcurvefit instead.
http://www.mathworks.com/access/helpdesk/help/toolbox/stats/nlinfit.shtml
http://www.mathworks.com/access/helpdesk/help/toolbox/optim/lsqcurvefit.shtml
Lars
Lars Gregersen
www.rndee.dk
l...@rndee.dk
Thanks a lot for your comments on my problem!
Maybe you also know whether there is a way to obtain summary
statistics (standard errors etc.) for lsqcurvefit, such as are usually
delivered when carrying out a regression (linear as well as non-linear)?
Thanks!
Hi Christoph -
The Optimization Toolbox has lots of powerful algorithms for doing optimization,
but because the functions there are not specifically aimed at doing statistical
inference, they do not return things like std. errs. You do get all the pieces
that you need, but you have to compute them yourself.
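For concreteness, here is one way you might compute them yourself from LSQCURVEFIT's outputs. This is a sketch under assumptions: the names model, b0, xdata, ydata are hypothetical, and the intervals use a large-sample normal approximation:

```matlab
% LSQCURVEFIT returns the residual vector and the Jacobian
% at the solution among its later output arguments.
[b, resnorm, residual, exitflag, output, lambda, J] = ...
    lsqcurvefit(model, b0, xdata, ydata);

n = numel(residual);            % number of observations
p = numel(b);                   % number of parameters
mse = resnorm/(n - p);          % estimate of the error variance
J = full(J);                    % Jacobian may be returned sparse
covB = mse * inv(J'*J);         % asymptotic covariance of the estimates
se = sqrt(diag(covB));          % standard errors
ci = [b(:) - 1.96*se, b(:) + 1.96*se];  % approximate 95% intervals
```

This is the same large-sample machinery that nonlinear-regression tools use under the hood; it is only trustworthy when no constraints are active at the solution, which is the point John makes below.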
On the other hand, NLINFIT in the Statistics Toolbox has helper functions that
give you confidence and prediction intervals, but it is a little less powerful
than LSQCURVEFIT and LSQNONLIN in the sense that you cannot use it to fit
constrained models.
Of course, when you do have active constraints on some
parameters in a model, the simple tools for confidence
estimation are no longer truly appropriate anyway.
John
--
John, since you've mentioned:
> Of course, when you do have active constraints on some
> parameters in a model, the simple tools for confidence
> estimation are no longer truly appropriate anyway
could you explain more about WHY that is so? And if there is any relevant reference, could you kindly point me to it?
I fitted my model to the data with some constraints and, having calculated the covariance matrix estimate from the finite-difference Jacobian that LSQNONLIN returned, found that none of the parameter estimates was significant, and I freaked out! Maybe the reason is that the covariance matrix estimate was not calculated at an unconstrained minimum because of the active constraints, but I'm not very certain about my intuition.
Regards,
Jason