"Nonlinear least-squares regression" or "conjugate gradient" in julia

M R

May 9, 2013, 1:20:33 PM
to julia...@googlegroups.com
Does anyone know if there is a Julia implementation of nonlinear least-squares regression or conjugate gradient? This would be great!

cheers,

mike

John Myles White

May 9, 2013, 1:36:57 PM
to julia...@googlegroups.com
Yes, both are available in the Optim package: https://github.com/johnmyleswhite/Optim.jl/
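
For reference, here is a minimal sketch of what a conjugate-gradient call through Optim might look like. The `ConjugateGradient()` method name and the `optimize(f, g!, x0, method)` signature follow a later version of the package's API than the one current in this thread, so treat them as an assumption and check the Optim documentation:

```julia
using Optim

# Rosenbrock test objective
f(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2

# Gradient written into G in place; Optim can fall back to finite differences if this is omitted
function g!(G, x)
    G[1] = -2.0 * (1.0 - x[1]) - 400.0 * x[1] * (x[2] - x[1]^2)
    G[2] = 200.0 * (x[2] - x[1]^2)
end

result = optimize(f, g!, [0.0, 0.0], ConjugateGradient())
Optim.minimizer(result)
```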

-- John

M R

May 9, 2013, 2:01:15 PM
to julia...@googlegroups.com
Thanks for the information. Great!

Steven G. Johnson

May 9, 2013, 11:10:24 PM
to julia...@googlegroups.com


On Thursday, May 9, 2013 1:36:57 PM UTC-4, John Myles White wrote:
Yes, both are available in the Optim package: https://github.com/johnmyleswhite/Optim.jl/

It would be good to have a nonlinear least-squares algorithm that takes advantage of the special structure of the objective in nonlinear regression, e.g. Levenberg-Marquardt and others, in addition to just using generic optimization methods to minimize the sum-of-squares error.
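
For concreteness, the structure being referred to is that the objective is a sum of squared residuals, so a good approximate Hessian J'J can be built from the residual Jacobian alone. A bare-bones sketch of one damped Gauss-Newton (Levenberg-Marquardt) step, with a made-up exponential model and toy data purely for illustration:

```julia
using LinearAlgebra

# One damped Gauss-Newton (Levenberg-Marquardt) step for minimizing sum(abs2, r(beta)).
# The least-squares structure is exploited: the approximate Hessian J'J needs only the
# residual Jacobian, not second derivatives of the model.
function lm_step(r, J, beta, lambda)
    rv = r(beta)                                   # residuals at the current parameters
    Jm = J(beta)                                   # Jacobian of the residuals
    delta = -((Jm' * Jm + lambda * I) \ (Jm' * rv))
    return beta + delta
end

# Illustrative model y ≈ b1 * exp(-b2 * t) with toy data
t = [0.0, 1.0, 2.0, 3.0]
y = [2.0, 1.2, 0.7, 0.4]
r(b) = b[1] .* exp.(-b[2] .* t) .- y
J(b) = hcat(exp.(-b[2] .* t), -b[1] .* t .* exp.(-b[2] .* t))
beta = lm_step(r, J, [1.0, 0.5], 1e-3)
```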

Steven G. Johnson

May 9, 2013, 11:11:52 PM
to julia...@googlegroups.com
Various nonlinear optimization algorithms are provided in the NLopt package (https://github.com/stevengj/NLopt.jl).  And you can also access the nonlinear optimization and nonlinear fitting routines in the GNU Scientific Library via the GSL package (https://github.com/jiahao/GSL.jl).
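
For reference, a minimal NLopt call from Julia looks roughly like the following; the algorithm symbol and the `min_objective!` / `optimize` names follow the NLopt.jl README, and the quadratic objective is only a placeholder:

```julia
using NLopt

# Objective in NLopt's (x, grad) convention: fill grad in place when it is requested
function sse(x::Vector, grad::Vector)
    if length(grad) > 0
        grad[1] = 2.0 * (x[1] - 3.0)
        grad[2] = 2.0 * (x[2] + 1.0)
    end
    return (x[1] - 3.0)^2 + (x[2] + 1.0)^2
end

opt = Opt(:LD_LBFGS, 2)      # gradient-based algorithm, two parameters
min_objective!(opt, sse)
xtol_rel!(opt, 1e-8)
minf, minx, ret = optimize(opt, [0.0, 0.0])
```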

Blake Johnson

May 10, 2013, 11:48:25 AM
to julia...@googlegroups.com
The Optim package has an implementation of Levenberg-Marquardt. It is not exposed under the same API, but it is there...

For example usage, see: https://github.com/johnmyleswhite/Optim.jl/blob/master/test/levenberg_marquardt.jl. You can also use the curve_fit() method if you want to use approximate gradients.
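
For a rough idea of the curve_fit() route, here is a sketch; the signature shown follows the later LsqFit.jl package (where curve_fit eventually moved), and the model and data are purely illustrative:

```julia
using LsqFit   # curve_fit originally shipped with Optim and later moved here

# Model takes the independent variable and the parameter vector
model(t, p) = p[1] .* exp.(-p[2] .* t)

tdata = [0.0, 1.0, 2.0, 3.0]
ydata = [2.0, 1.2, 0.7, 0.4]

# Levenberg-Marquardt fit; the Jacobian is approximated by finite differences
# when no analytic Jacobian is supplied
fit = curve_fit(model, tdata, ydata, [1.0, 0.5])
fit.param   # fitted parameters
```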

--Blake

Douglas Bates

May 10, 2013, 12:25:55 PM
to julia...@googlegroups.com
Another enhancement to nonlinear regression algorithms is Golub and Pereyra's technique of projecting over any conditionally linear parameters. Statisticians could characterize it as profiling the residual sum-of-squares function down to the nonlinear parameters only.

It does help to stabilize the estimation, but most importantly it reduces the number of parameters for which initial estimates are required.

I wrote the implementation of the nonlinear least squares algorithms in R and would be happy to help with such implementations in Julia.  In fact it would be a good idea because I am supposed to be writing a second edition of our book on nonlinear regression (Bates and Watts, 1988, Wiley) but instead am writing Julia code for several other types of models.

Douglas Bates

May 10, 2013, 12:37:39 PM
to julia...@googlegroups.com
I forgot to give the reference: http://epubs.siam.org/doi/abs/10.1137/0710036 is the original paper. A 30-year retrospective by the same authors is http://iopscience.iop.org/0266-5611/19/2/201

The idea is that most model functions used in nonlinear regression have one or more parameters that occur linearly in the model, so the conditionally optimal values of those parameters can be determined by linear least squares given the values of the other parameters. For example, if your model is `y ~ a + b*exp(-r*t)`, where `y` is the response variable, `t` is the observed covariate (input variable), and `a`, `b`, and `r` are the parameters to be estimated, then given a value of `r` you can determine the conditionally optimal `a` and `b` by linear least squares. What Golub and Pereyra did was obtain the Jacobian of the projected residual sum-of-squares function.

I happen to think that my implementation of the calculation is one of the best, but unfortunately Mary Lindstrom and I published the derivation in an obscure conference proceedings, so it is not widely available.
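
As a bare-bones illustration of the projection idea for the model above (this is just the conditional linear fit plus a crude grid search over `r`, not the Golub and Pereyra Jacobian, and the data are made up):

```julia
# Variable-projection sketch for y ~ a + b*exp(-r*t): for a fixed r the conditionally
# optimal (a, b) solve a *linear* least-squares problem, so the residual sum of squares
# can be profiled down to a function of r alone.
function profiled_sse(r, t, y)
    X = hcat(ones(length(t)), exp.(-r .* t))   # design matrix for the linear parameters
    ab = X \ y                                  # conditionally optimal a and b
    return sum(abs2, y .- X * ab)
end

# Toy data and a crude one-dimensional search over r
t = [0.0, 1.0, 2.0, 3.0, 4.0]
y = [3.0, 2.2, 1.8, 1.5, 1.35]
rgrid = 0.05:0.05:3.0
rbest = rgrid[argmin([profiled_sse(r, t, y) for r in rgrid])]
```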

Blake Johnson

May 10, 2013, 3:14:17 PM
to julia...@googlegroups.com
That sounds very useful. Out of curiosity, have you seen these recent results on Levenberg-Marquardt with geodesic acceleration? (http://arxiv.org/pdf/1201.5885v1.pdf)

--Blake

Douglas Bates

May 10, 2013, 5:34:33 PM
to julia...@googlegroups.com
On Friday, May 10, 2013 2:14:17 PM UTC-5, Blake Johnson wrote:
That sounds very useful. Out of curiosity, have you seen these recent results on Levenberg-Marquardt with geodesic acceleration? (http://arxiv.org/pdf/1201.5885v1.pdf)

I have attended talks on those methods but haven't read too much yet. It sort of takes me back to the beginning of my career: my Ph.D. thesis was on applying differential geometry concepts to the "expectation surface" in nonlinear regression, and at one point I toyed with the idea of trying to find geodesics. But that was in the 1970s, well beyond the capability of computers at the time.