
Feb 1, 1996

What the bloody hell does 'conjugate' ACTUALLY MEAN? All the references
that I can find say something useless such as: 'and from equation 1, it
can clearly be seen that the new direction is conjugate, blah, blah...'.
This sort of nonsense is not helpful.


In a similar vein, does anybody have a really good explanation of
Levenberg-Marquardt? Once again most references seem to be written in
Ancient Greek; as far as I understand (which is not very far!) LM is a
sort of blend between gradient descent and conjugate gradients, depending
on some sort of confidence criterion.

Help!

Thanks

Max

Feb 1, 1996

M.J.Ra...@bris.ac.uk wrote:

: In a similar vein, does anybody have a really good explanation of
: Levenberg-Marquardt. Once again most references seem to be written in
: Ancient Greek; as far as I understand (which is not very far!) LM is a
: sort of blend between gradient descent and conjugate gradients, depending
: on some sort of confidence criterion.

For Levenberg-Marquardt, I found that the description in Numerical
Recipes was fine -- good enough for me to implement and use for my
thesis. As I recall, you have to read stuff from the preceding
sections -- you can't just read the L-M section on its own.

In the unlikely event that you haven't heard of it, the book is:

"Numerical Recipes in {C, Pascal, Fortran}: The Art of Scientific Computing"
Press, Teukolsky, Vetterling, Flannery
Cambridge University Press, 1992, ISBN 0-521-43108-5 (C version, hardback)

: Help!

: Thanks

: Max

David Drysdale

Feb 2, 1996

<M.J.Ra...@bris.ac.uk> writes:

>In a similar vein, does anybody have a really good explanation of
>Levenberg-Marquardt. Once again most references seem to be written in

You might try "C Curve Fitting and Modeling for Scientists and Engineers"
by Jens-Georg Reich, published by McGraw-Hill. This book does include
the equations, but presents a qualitative explanation alongside; you
may find this helpful.


Will Dwinnell

Commercial Intelligence Inc.


P.S. The ISBN is 0-07-051761-4

Feb 2, 1996

>>>>> "Max" == M J Ratcliffe <M.J.Ra...@bris.ac.uk> writes:

Max> What the bloody hell does 'conjugate' ACTUALLY MEAN? All the references
Max> that I can find say something useless such as: 'and from equation 1, it
Max> can clearly be seen that the new direction is conjugate, blah, blah...'.
Max> This sort of nonsense is not helpful.

Translate 'conjugate' into 'non-interfering'.

As far as conjugate gradient methods are concerned, conjugate directions
can be seen as non-interfering directions. By contrast, in a pure
gradient method, directions (opposites of gradients) interfere, in the
sense that a new direction can 'undo' what an older direction had done.
See a short and clear explanation in

@Book{recipes88,
  author    = "William H. Press and others",
  title     = "Numerical Recipes, The Art of Scientific Computing",
  publisher = "Cambridge University Press",
  year      = 1988
}

in the section on conjugate gradients.
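To make the 'non-interfering' idea concrete, here is a small Python sketch (my own toy illustration, not taken from Numerical Recipes): conjugate gradients minimizing a two-variable quadratic f(x) = (1/2) x^T A x - b^T x. Successive search directions d_i satisfy d_i^T A d_j = 0, so the second step never undoes the first, and the method terminates in n = 2 steps.

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matvec(A, v):
    return [dot(row, v) for row in A]

def conjugate_gradient(A, b, x0, iters):
    x = list(x0)
    r = [bi - ai for bi, ai in zip(b, matvec(A, x))]   # residual = -gradient
    d = list(r)                                        # first direction: steepest descent
    directions = []
    for _ in range(iters):
        Ad = matvec(A, d)
        alpha = dot(r, r) / dot(d, Ad)                 # exact line search along d
        x = [xi + alpha * di for xi, di in zip(x, d)]
        r_new = [ri - alpha * adi for ri, adi in zip(r, Ad)]
        beta = dot(r_new, r_new) / dot(r, r)           # Fletcher-Reeves coefficient
        directions.append(d)
        d = [rn + beta * di for rn, di in zip(r_new, d)]
        r = r_new
    return x, directions

A = [[3.0, 1.0], [1.0, 2.0]]    # symmetric positive definite
b = [1.0, 2.0]
x, ds = conjugate_gradient(A, b, [0.0, 0.0], 2)
# x now solves A x = b exactly, and ds[0], ds[1] are A-conjugate:
# dot(ds[1], matvec(A, ds[0])) is zero up to rounding.
```

Plain steepest descent on the same quadratic would zig-zag: each new negative-gradient direction partially cancels progress made along the previous one, which is exactly the 'interference' that conjugacy removes.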

Best regards,

--
Bruno Orsier E-mail: ors...@cui.unige.ch
University of Geneva WWW: http://cuiwww.unige.ch/AI-group/staff/orsier.html

Feb 5, 1996

ors...@cuisun38.unige.ch (Bruno Orsier) writes:

>Translate 'conjugate' into 'non-interfering'.
>
>As far as conjugate gradient methods are concerned, conjugate directions can
>be seen as non-interfering directions. On the contrary, in a pure gradient
>method, directions (opposite of gradients) interfere, in the sense that a new
>direction can 'undo' what an older direction had done. See a short and clear
>explanation in
>
>@Book{recipes88,
>  author    = "William H. Press and others",
>  title     = "Numerical Recipes, The Art of Scientific Computing",
>  publisher = "Cambridge University Press",
>  year      = 1988
>}
>
>in the section on conjugate gradients.

If you're interested, Numerical Recipes in C appears to be available
in PostScript form on the web
(http://cfatab.harvard.edu/nr/bookc.html)

--

Matthew McDonald ma...@cs.uwa.edu.au

Nim's longest recorded utterance was the sixteen-sign declarative

pronouncement, "Give orange me give eat orange me eat orange give me

eat orange give me you."

Feb 10, 1996

SUBJECT: Levenberg-Marquardt Method (LM)

Hi M, you wrote:

MJ>In a similar vein, does anybody have a really good explanation of
MJ>Levenberg-Marquardt. Once again most references seem to be written in
MJ>Ancient Greek; as far as I understand (which is not very far!) LM is
MJ>a sort of blend between gradient descent and conjugate gradients,
MJ>depending on some sort of confidence criterion.
MJ>Help!
MJ>Max

The LM method is explained pretty well in "Numerical Recipes in C" by
W H Press et al, Cambridge University Press. The latest book version is
online on the Web. If you have Netscrape (sic), do a search for
numerical recipes and you'll get the site address without much pain.
You will also need a PostScript viewer to read the files when you
download the chapters (for free) in EPS. You should get yourself
GhostScript for Windows (it's free and excellent), a GNU PostScript
viewer program.

If all this fails, then the following bit of referential instant
gratification is offered, borrowing from Numerical Recipes (and heavily
bastardized; see page 542 of the 1990 version):

"Levenberg-Marquardt" (LM) is a method of nonlinear curve fitting to a
data set. The LM method happens to be robust enough that it has become
very popular. The LM method involves solving some inverse matrix
problems. It attempts to reduce the value "chi-squared" (check out any
high school statistics swot book) of a fit between a set of X,Y points
with individual standard deviations and a nonlinear function. The
chi-squared value is your confidence-of-fit criterion; i.e. you use this
value to tell how close your curve-fit function is, and whether you
should retry with slightly different function coefficients or whether
the fit is good enough. The coefficients of the nonlinear function are
what you are solving for.

So the lower the chi-squared value (the confidence criterion) between
your data points and the points created by the nonlinear function
approximation, the better the function approximation. Using a nonlinear
function means that your data points can be all over the place and you
will still be able to "join the dots" (curve fit) reasonably well.

Clear as mud? Anyway, hope this helps. So hey, lemme know!
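For a bit more instant gratification, here is a bare-bones Python sketch of the idea (my own toy code, not the Numerical Recipes routine): fit y = a*exp(b*x) to data by repeatedly solving the damped normal equations, raising the Marquardt parameter lambda when a step makes chi-squared worse and lowering it when the step helps. The model, starting guess, and data are made up for illustration.

```python
import math

def model(x, a, b):
    return a * math.exp(b * x)

def chi2(xs, ys, a, b):
    # Sum of squared residuals (unit standard deviations assumed).
    return sum((y - model(x, a, b)) ** 2 for x, y in zip(xs, ys))

def lm_step(xs, ys, a, b, lam):
    # Accumulate J^T J (2x2) and J^T r (2-vector) for parameters (a, b).
    JtJ = [[0.0, 0.0], [0.0, 0.0]]
    Jtr = [0.0, 0.0]
    for x, y in zip(xs, ys):
        e = math.exp(b * x)
        grad = [e, a * x * e]          # d(model)/da, d(model)/db
        r = y - a * e                  # residual
        for i in range(2):
            Jtr[i] += grad[i] * r
            for j in range(2):
                JtJ[i][j] += grad[i] * grad[j]
    # Marquardt damping: inflate the diagonal by a factor (1 + lambda).
    JtJ[0][0] *= 1.0 + lam
    JtJ[1][1] *= 1.0 + lam
    # Solve the 2x2 normal equations by Cramer's rule.
    det = JtJ[0][0] * JtJ[1][1] - JtJ[0][1] * JtJ[1][0]
    da = (Jtr[0] * JtJ[1][1] - Jtr[1] * JtJ[0][1]) / det
    db = (JtJ[0][0] * Jtr[1] - JtJ[1][0] * Jtr[0]) / det
    return a + da, b + db

def fit(xs, ys, a, b, iters=50):
    lam = 1e-3
    for _ in range(iters):
        a_try, b_try = lm_step(xs, ys, a, b, lam)
        if chi2(xs, ys, a_try, b_try) < chi2(xs, ys, a, b):
            a, b, lam = a_try, b_try, lam / 10.0   # better fit: trust Gauss-Newton more
        else:
            lam *= 10.0                            # worse fit: retreat toward gradient descent
    return a, b

xs = [0.1 * i for i in range(10)]
ys = [2.0 * math.exp(0.5 * x) for x in xs]         # noiseless data from a=2, b=0.5
a, b = fit(xs, ys, 1.0, 0.1)                       # deliberately poor starting guess
```

The accept/reject test on chi-squared is the "confidence criterion" the original poster asked about: a falling chi-squared means the fit is improving, so the method is allowed to take bigger, more Gauss-Newton-like steps.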

* CMPQwk 1.42-R2 9380 * Talk is cheap because supply exceeds demand.

Feb 13, 1996

Someone wrote:

MJ>In a similar vein, does anybody have a really good explanation of
MJ>Levenberg-Marquardt. Once again most references seem to be written in
MJ>Ancient Greek; as far as I understand (which is not very far!) LM is
MJ>a sort of blend between gradient descent and conjugate gradients,
     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
MJ>depending on some sort of confidence criterion.
MJ>Help!
MJ>Max

This is partially incorrect. Marquardt's method (and Levenberg's
method II) is a blend of _scaled_ steepest descent and the Gauss-Newton
method for solving nonlinear least squares problems. It has nothing much
to do with conjugate gradient methods, which are a family of unstable,
low-storage methods for solving more general smooth optimization
problems.

The scaling (metrization) of the gradient vector in Marquardt's method
is important. Marquardt wrote an article on this before he invented
Marquardt's method:

"Solution of Nonlinear Chemical Engineering Models",
D. W. Marquardt,
Chemical Engineering Progress, Vol. 55, No. 6 (June 1959), 65-70.
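A tiny numerical illustration of the point (mine, with made-up values for J^T J and J^T r): the Marquardt step solves (J^T J + lambda*D) delta = J^T r with D = diag(J^T J). At lambda = 0 this is exactly the Gauss-Newton step, while for large lambda each component tends to (J^T r)_i / (lambda * D_ii), a gradient step scaled coordinate-by-coordinate by D -- scaled steepest descent, not the raw gradient.

```python
def solve2(M, v):
    # Solve the 2x2 linear system M x = v by Cramer's rule.
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [(v[0] * M[1][1] - v[1] * M[0][1]) / det,
            (M[0][0] * v[1] - M[1][0] * v[0]) / det]

def marquardt_step(JtJ, Jtr, lam):
    # (J^T J + lambda * diag(J^T J)) delta = J^T r, written as
    # multiplying each diagonal entry by (1 + lambda).
    damped = [[JtJ[0][0] * (1.0 + lam), JtJ[0][1]],
              [JtJ[1][0], JtJ[1][1] * (1.0 + lam)]]
    return solve2(damped, Jtr)

JtJ = [[4.0, 1.0], [1.0, 9.0]]   # made-up symmetric positive definite J^T J
Jtr = [1.0, 2.0]                 # made-up J^T r

gn = marquardt_step(JtJ, Jtr, 0.0)     # lambda = 0: the Gauss-Newton step
big = marquardt_step(JtJ, Jtr, 1e8)    # huge lambda: tiny damped step
# Large-lambda limit: component i tends to (J^T r)_i / (lambda * (J^T J)_ii),
# i.e. the gradient scaled per coordinate by the diagonal of J^T J.
scaled = [Jtr[0] / (1e8 * JtJ[0][0]), Jtr[1] / (1e8 * JtJ[1][1])]
```

The per-coordinate scaling by D is what makes the descent limit insensitive to the units of the individual parameters, which is the point Marquardt's 1959 article addresses.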

--

John Chandler

j...@a.cs.okstate.edu
