Any way to combine global solvers with local ones in LMFIT?


sam mahdi

unread,
Aug 6, 2023, 1:50:53 PM8/6/23
to lmfit-py
Hello, 

I was trying to figure out a way to combine global solvers with local ones in LMFIT. Say, for example, basinhopping with Levenberg-Marquardt. But I couldn't find anything in the documentation about this. Using minimize allows you to run basinhopping and LM, but only separately (and in basinhopping you can pass arguments into scipy for local solvers, but these do not include least-squares methods). If not, is there a way to convert the MinimizerResult from LMFIT into the OptimizeResult from scipy (this way I could build a custom method for scipy basinhopping that uses the output of LMFIT minimize)? 

I'm only asking whether these possibilities (using global with local, or converting an LMFIT MinimizerResult into scipy's OptimizeResult format) are built into LMFIT, or if there is a way to do them within the LMFIT package. 

Matt Newville

unread,
Aug 6, 2023, 6:54:30 PM8/6/23
to lmfi...@googlegroups.com
On Sun, Aug 6, 2023 at 12:50 PM sam mahdi <sammah...@gmail.com> wrote:
Hello, 

I was trying to figure out a way to combine global solvers with local ones in LMFIT. Say, for example, basinhopping with Levenberg-Marquardt. But I couldn't find anything in the documentation about this. Using minimize allows you to run basinhopping and LM, but only separately (and in basinhopping you can pass arguments into scipy for local solvers, but these do not include least-squares methods). If not, is there a way to convert the MinimizerResult from LMFIT into the OptimizeResult from scipy (this way I could build a custom method for scipy basinhopping that uses the output of LMFIT minimize)? 


I think there are a few different ways to think about how to "combine global and local solvers". You could run a local solver at each global solution; a few of the solvers such as `ampgo` support that. Or you could run a global solver to find a set of "best candidates" and refine the best of those with an LM solver.

I have certainly used `brute` in that way: do a brute force exploration of a few parameters, find the top 10% of those solutions, and then do a full fit with `leastsq` starting with each of those starting points.  
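A pure-scipy sketch of that workflow, with a hypothetical exponential-decay model and scipy's `brute` and `least_squares` standing in for lmfit's `brute` and `leastsq`:

```python
import numpy as np
from scipy.optimize import brute, least_squares

# hypothetical exponential-decay data
x = np.linspace(0, 5, 81)
rng = np.random.default_rng(3)
y = 4.0 * np.exp(-1.2 * x) + 0.01 * rng.standard_normal(x.size)

def residual(p):
    # array-valued objective, as a least-squares solver expects
    return y - p[0] * np.exp(-p[1] * x)

def cost(p):
    # scalar penalty for the grid scan
    r = residual(p)
    return (r * r).sum()

# coarse grid scan; finish=None disables brute's automatic polishing
_, _, grid, jout = brute(cost, ((0.1, 10.0), (0.1, 5.0)), Ns=8,
                         full_output=True, finish=None)
points = np.column_stack([g.ravel() for g in grid])  # every grid point
starts = points[np.argsort(jout.ravel())[:6]]        # keep the best few

# refine each candidate with a least-squares fit, keep the lowest cost
fits = [least_squares(residual, p0) for p0 in starts]
winner = min(fits, key=lambda f: f.cost)
print(winner.x)  # close to the true [4.0, 1.2]
```

In lmfit itself, the `brute` result also exposes the kept candidates (via `result.candidates`), so the same pattern works with `result.candidates[i].params` as the starting point for each `leastsq` refinement.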

I should admit that I find many of the global solvers to be really quite unreliable and not very useful. AMPGO and SHGO seem the most reliable, and Brute has definite use, but basin-hopping and differential evolution seem extremely inefficient to me. Apparently, some people find them useful, but I would suggest trying AMPGO or SHGO first. 

I'm only asking if these possibilities (using global with local, or conversion of LMFIT MinimizerResult into Scipy OptimizeResult format) are built into LMFIT, or if there is a way to do these within LMFIT package. 

Well, you can certainly do a fit with one method and then use that result (or those results) as the starting point(s) for fits with other methods. Changing solvers is very easy, and using `results.params` from one fit as the input Parameters to another fit is definitely supported.



 

--
You received this message because you are subscribed to the Google Groups "lmfit-py" group.
To unsubscribe from this group and stop receiving emails from it, send an email to lmfit-py+u...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/lmfit-py/17ac5a34-e555-46a7-990f-335533bfb931n%40googlegroups.com.


--
--Matt Newville <newville at cars.uchicago.edu> 630-327-7411

sam mahdi

unread,
Aug 6, 2023, 8:02:32 PM8/6/23
to lmfit-py
I was looking more for global techniques (such as basinhopping and AMPGO) that work in conjunction with local ones. However, the selection of local solvers is limited to only those scipy has available, e.g. in the case of AMPGO:
local: str (default is 'L-BFGS-B') Name of the local minimization method. Valid options are: 'L-BFGS-B', 'Nelder-Mead', 'Powell', 'TNC', 'SLSQP'. But what I was looking for is a way to use Levenberg-Marquardt instead of this set list. From my understanding, the way scipy incorporates the local solvers with the global ones is by feeding the output of the local solver, in "OptimizeResult" format, into the global one. So one could write their own function that generates this type of output. In scipy, though, different solvers have different outputs (e.g. least_squares, leastsq), so they don't all work with minimize. Scipy's global solvers can only use the minimize output. LMFIT has a single uniform output for all methods, "MinimizerResult", which means all methods can be used interchangeably. My thought was then a method where you could convert the "MinimizerResult" to scipy's expected format, "OptimizeResult".

Is this what you meant by 
"I think there are a few different ways to think about how to "combine global and local solvers". You could run a local solver at each global solution; a few of the solvers such as `ampgo` support that. Or you could run a global solver to find a set of "best candidates" and refine the best of those with an LM solver."

If so, I'm not entirely sure how I would set this up. The only idea I had is as above: just have method=func, where func is the output from a local solver (but for this to work it would have to be in the OptimizeResult format). 

As to why to use a global solver (and why it would be useful): I do find other solutions with different starting conditions. As to which one to use, I generally like basinhopping, since I often just use it as a check to ensure I have the global minimum and am not stuck in a local one (it beats just throwing in random starting conditions to make sure I keep getting the same solution, or a grid search). From my understanding, SHGO is a bit slow for higher-dimension problems. AMPGO looks very interesting and useful, but I've just never had an issue with basinhopping (outside of it being a touch slow), so never had a reason to use it. 

Matt Newville

unread,
Aug 6, 2023, 10:41:34 PM8/6/23
to lmfi...@googlegroups.com
On Sun, Aug 6, 2023 at 7:02 PM sam mahdi <sammah...@gmail.com> wrote:
I was looking more for global techniques (such as basinhopping and AMPGO) that work in conjunction with local ones.

Well, what does "in conjunction" mean to you?  Do you mean to run a local solver at every candidate solution of the global solver?  Is it necessary to do it for every candidate?  If not, how does one decide which solver to run when?  


However, the selection of local solvers is limited to only those scipy has available, e.g. in the case of AMPGO:
local: str (default is 'L-BFGS-B') Name of the local minimization method. Valid options are: 'L-BFGS-B', 'Nelder-Mead', 'Powell', 'TNC', 'SLSQP'. But what I was looking for is a way to use Levenberg-Marquardt instead of this set list. From my understanding, the way scipy does this is by using the output "OptimizeResult".

I think the general assumption is that if you are using a global solver, then you have a scalar objective function, or at least may not want the restriction that the number of values in the residual array exceeds the number of variable parameters. 

I'm not sure, but I would guess that scipy uses its OptimizeResult because it is the only place they store the best-fit values.  
For lmfit, you would definitely want to use the result.params.  

But what I was looking for is a way to use levenberg marquardt instead of this set list.

Does your objective return a residual array?

From my understanding the way scipy incorporates the local solvers with the global is by using the output of the local solver "OptimizeResult" format into the global one. So in this manner one could generate their own function that generates this type of output. In scipy tho different solvers have different outputs (e.g. least squares, leastsq), so they don't work with minimize.
Scipys global solvers can only use the minimize output.

Perhaps I do not understand, but I doubt that the scipy global solvers can only use an OptimizeResult.  My experience is that global solvers often have "meta parameters" that need to be set up.

 
LMFIT has a single uniform output for all methods "MinimizerResult", which means all methods can be used interchangeably. My thought was then a method where you could convert the "MinimizerResult" to Scipys expected format "OptimizeResult"

I am pretty sure we do not want to do that. 


Is this what you meant by 
"I think there are a few different ways to think about how to "combine global and local solvers". You could run a local solver at each global solution; a few of the solvers such as `ampgo` support that. Or you could run a global solver to find a set of "best candidates" and refine the best of those with an LM solver."

If so, I'm not entirely sure how I would set this up. The only idea I had is as above: just have method=func, where func is the output from a local solver (but for this to work it would have to be in the OptimizeResult format). 


Well, you do a fit with one method, then do a fit using the resulting parameters and another method:

    result = minimize(obj_function, params, method=method1, ...)
    final = minimize(obj_function, result.params, method=method2, ...) 

That's pretty simple to do with lmfit, and supports trying a global method or two (or 10) and selecting which ones to "finalize" with "leastsq", based on "result.chisqr" or similar statistics. 
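In plain scipy terms, that two-stage pattern might look like this (hypothetical sine model; `differential_evolution` as the global pass and `least_squares` as the finishing local fit, standing in for the two lmfit `minimize` calls):

```python
import numpy as np
from scipy.optimize import differential_evolution, least_squares

# hypothetical data: y = 3*sin(1.7*x) plus a little noise
x = np.linspace(0, 10, 201)
rng = np.random.default_rng(1)
y = 3.0 * np.sin(1.7 * x) + 0.02 * rng.standard_normal(x.size)

def residual(p):
    # array-valued objective for the local least-squares stage
    return y - p[0] * np.sin(p[1] * x)

def cost(p):
    # scalar chi-square-like penalty for the global stage
    r = residual(p)
    return (r * r).sum()

# stage 1: global search on the scalar cost (many local minima in frequency)
coarse = differential_evolution(cost, bounds=[(0.1, 10.0), (0.1, 5.0)], seed=2)

# stage 2: local least-squares refinement starting from the global result
final = least_squares(residual, coarse.x)
print(final.x)  # close to the true [3.0, 1.7]
```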

As to why to use a global solver (and why it would be useful): I do find other solutions with different starting conditions. As to which one to use, I generally like basinhopping, since I often just use it as a check to ensure I have the global minimum and am not stuck in a local one (it beats just throwing in random starting conditions to make sure I keep getting the same solution, or a grid search). From my understanding, SHGO is a bit slow for higher-dimension problems. AMPGO looks very interesting and useful, but I've just never had an issue with basinhopping (outside of it being a touch slow), so never had a reason to use it. 

If you are interested in a reliable solution with global solvers, try multiple solvers.  My experience is that basin-hopping is not very good (see also  https://infinity77.net/global_optimization/multidimensional.html).   But, I'm sure others have other experiences.

I quite like `brute` when I have a few parameters for which I am very unsure of the scale of initial values. For example, that might be 625 function evals for 4 variables (5 values each), and then sorting those results to pick 25 starting values for leastsq. I should also say that almost all of the fitting problems I have are multivariate (more observations than parameters) and generally differentiable. But if you want to refine global fits with leastsq, your problem probably is too. 

--Matt


sam mahdi

unread,
Aug 6, 2023, 11:13:25 PM8/6/23
to lmfit-py
I apologize, I'm a bit confused. Maybe I am misunderstanding how these global solvers work?

If we use Basinhopping as an example. You start with some initial values, use a local solver to minimize, then "hop" to another area with some step size, and then use a local solver again. Rinse and repeat basically. But the local solver is built within the global one. 
So you cannot set it up like this:  
result = minimize(obj_function, params, method=method1, ...)
final = minimize(obj_function, result.params, method=method2, ...) 

Because in basinhopping, method1 and method2 are built in, e.g.:

final = minimize(obj_function, result.params, method=method1, fit_kws={'method': method2, ...}) (i.e. you cannot remove method2, so in your setup, in the "global" fit which you call final, you are still using a local solver; it's just the default local solver for whatever global solver you use. I am trying to modify that built-in local solver)

The output from method2 is then modified by the global solver (method1) and used as input again for the local one (method2). The two techniques are intertwined: the local gives info to the global, the info is modified by the global and fed back into the local. This is what I meant by "in conjunction": they work together. The output from one is used as the input of the other, in a loop, repeated until the number of iterations is completed. There is no way to separate the global/local into two separate processes (they are intertwined, from my understanding). What I was looking for was a way of having: 

final = minimize(obj_function, result.params, method='basinhopping', fit_kws={'method': 'least_squares', ...})
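The intertwined loop described above can be written out as a runnable schematic (a hypothetical 1-D double-well cost; greedy acceptance standing in for basinhopping's Metropolis rule, and scipy's `minimize` as the local solver):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(7)

def cost(p):
    # hypothetical double-well: shallow minimum near x = +1, deeper near x = -1
    return (p[0] ** 2 - 1.0) ** 2 + 0.3 * p[0]

# local solve, hop, local solve again: output of one feeds the other
best = minimize(cost, [2.0])
for _ in range(50):
    trial = best.x + rng.uniform(-1.5, 1.5, size=1)  # the global "hop"
    cand = minimize(cost, trial)                      # the local refinement
    if cand.fun < best.fun:                           # greedy acceptance
        best = cand
print(best.x)  # ends up in the deeper well, near x = -1
```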

Matt Newville

unread,
Aug 6, 2023, 11:31:27 PM8/6/23
to lmfi...@googlegroups.com
On Sun, Aug 6, 2023 at 10:13 PM sam mahdi <sammah...@gmail.com> wrote:
I apologize, I'm a bit confused. Maybe I am misunderstanding how these global solvers work?

If we use Basinhopping as an example. You start with some initial values, use a local solver to minimize, then "hop" to another area with some step size, and then use a local solver again. Rinse and repeat basically. But the local solver is built within the global one. 
So you cannot set it up like this:  
result = minimize(obj_function, params, method=method1, ...)
final = minimize(obj_function, result.params, method=method2, ...) 

Because in basinhopping, method1 and method2 are built in. E.G.

Yes, that is true, if you use basin hopping.  Which I really cannot recommend.

 

final = minimize(obj_function, result.params, method=method1, fit_kws={'method': method2, ...}) (i.e. you cannot remove method2, so in your setup, in the "global" fit which you call final, you are still using a local solver; it's just the default local solver for whatever global solver you use. I am trying to modify that built-in local solver)

Well, you *can* do that with any set of methods.  The question is whether you *want* to run a local solver at every step of a global solver.

FWIW, I think there are many solvers in lmfit that we just cannot support well.  I think it is possible that we should just remove several of these, including basinhopping.

sam mahdi

unread,
Aug 6, 2023, 11:50:11 PM8/6/23
to lmfit-py
"The question is whether you *want* to run a local solver at every step of a global solver." 

That's the whole point of the technique, though. The global solver cannot be run without running the local solver at every step, so I'm confused by what you mean here.

But back to my original question then: LMFIT doesn't have anything for incorporating other local solvers into the global ones that use them (e.g. least squares within basinhopping or ampgo). So would you recommend I use the brute method then? I'm really just trying to ensure I am at the global minimum and there are no other solutions outside of the one found using my initial starting conditions. 

Laurence Lurio

unread,
Aug 7, 2023, 9:54:31 AM8/7/23
to lmfi...@googlegroups.com
Sam,

I have some experience using the differential_evolution option, and I can say it is significantly better at finding the best fit when there are multiple local minima. It is much slower than the least-squares option. In principle there is a flag (workers) that allows you to do multiprocessing, which could speed things up, but I have never been able to get that to work.

Larry 

Matt Newville

unread,
Aug 7, 2023, 10:27:39 AM8/7/23
to lmfi...@googlegroups.com
Hi Sam, Laurence, 

Yeah, I definitely admit that my experience with the different global minimizers is limited, and I know that some people are strong advocates of differential evolution.   My limited experience is that shgo and ampgo are pretty reliable.  Again, I am sure that will vary with problem type, number of parameters vs number of observations, and how differentiable the cost function is with respect to the parameter values.  I think there just is not a single solution.

For basin-hopping with lmfit, it is completely possible that we are not wrapping that scipy.optimize function very well (or perhaps something changed that we were not aware of).  I know that I tried using Scipy's basin-hopping many (10+) years ago and it seemed to just never actually work.  I assume it does work now, but I am not at all sure that lmfit is using it well.  If basin-hopping is calling another scipy.optimize solver with the assumption that the objective function is written for scipy.optimize conventions, then that might be a very serious problem.   I have not investigated that....


sam mahdi

unread,
Aug 7, 2023, 10:41:18 AM8/7/23
to lmfit-py
Basinhopping does work in lmfit and is wrapped properly. The whole purpose of me trying to change the local solvers is that the local solvers available in minimize either don't work for me, or are too slow. 

local: str (default is 'L-BFGS-B') Name of the local minimization method. Valid options are: 'L-BFGS-B', 'Nelder-Mead', 'Powell', 'TNC', 'SLSQP'


So in this case, L-BFGS-B doesn't work very well for me. Nelder-Mead works but is very slow in higher dimensions. Powell is apparently broken in scipy. And I haven't had much luck with TNC or SLSQP either. LM is fast regardless of dimensionality, and works quite well (I am dealing with a least-squares problem). That's why I've been refraining from using basinhopping or ampgo (ampgo and basinhopping would have the same issues here, because it is the local solver that is the problem, not the global one), and why I didn't want to use dual_annealing, since that is slow too. 

I was just curious if there was something I had missed or overlooked and wanted to double check if LMFIT could do this or not. I don't think there is anything wrong or improper with how LMFIT incorporates the global solvers. They use them/treat them the same way scipy does. 

Laurence Lurio

unread,
Aug 7, 2023, 10:56:20 AM8/7/23
to lmfi...@googlegroups.com
Matt,

One more question regarding differential_evolution, since you guys got me thinking about this.  I tried the workers flag and got an error when I called my model with the ".fit" method. (The keyword argument workers does not match any arguments of the model function. It will be ignored.).  However, if I use minimize directly, then this does not give an error.  Is there a reason I need to use minimize instead of just calling the .fit method?

Larry 

Matt Newville

unread,
Aug 7, 2023, 11:21:05 AM8/7/23
to lmfi...@googlegroups.com
Hi Sam, 

Sorry for the noise here; I looked at the code for `basinhopping` in lmfit in more detail. I believe that it should work fine with the local solvers it has: it does do the mapping of "array of variable values" to a Parameters dict, including handling parameter bounds.

But, as with many of the global solvers (at least the way lmfit uses them), it also always reduces the output of the user's residual function to a single "penalty" value for a scalar minimizer, which is what all of the "local solvers" listed are. Even the scipy optimize docs (https://docs.scipy.org/doc/scipy/reference/optimize.html#local-multivariate-optimization) describe these as "Minimization of scalar function of one or more variables". That is, the objective function is expected to return a scalar "penalty" or "cost" value that is to be minimized.

As far as I can tell, all of the global optimizers in scipy.optimize really do assume that the objectives are all "scalar": that would have to be the case for "basinhopping" calling "nelder", for example, and from a non-exhaustive search of the scipy docs, it seems that all of the examples use scalar objective functions. I don't know that this is actually a requirement, but it might be. If anyone here knows for sure, please let us know. 

Anyway, for sure lmfit is currently using `Minimizer.penalty()` for the scalar minimizers and the global minimizers: if you write an objective that returns multiple values, this function will reduce that residual to a scalar, with "(r*r).sum()" being the default reduction. 
That means that currently, `leastsq` cannot be used as an "inner solver" for basinhopping, as `leastsq` requires a residual with an array with more values than the number of variables.
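A minimal sketch of that reduction (a hypothetical linear residual; the `(r*r).sum()` mirrors the default reduction described above, but this is not lmfit's actual code):

```python
import numpy as np

def residual(p, x, y):
    # array-valued objective, as `leastsq` expects
    return y - (p[0] * x + p[1])

def penalty(p, x, y):
    # scalar reduction handed to scalar and global solvers
    r = residual(p, x, y)
    return (r * r).sum()

x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 5.0, 7.0])  # exactly y = 2x + 1
print(penalty([2.0, 1.0], x, y))    # -> 0.0 at the true parameters
```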

In principle, if a global solver allowed a residual array, it is possible that this restriction might be lifted.  But, I believe that Scipy basinhopping may not allow that. It is described as "Basin-hopping is a stochastic algorithm which attempts to find the global minimum of a smooth scalar function of one or more variables", which is to say it expects a scalar objective function.





 

 


sam mahdi

unread,
Aug 7, 2023, 2:48:33 PM8/7/23
to lmfit-py
Hi Matt, 

I don't think it matters that the global solver takes a scalar, though (perhaps for some methods it does). For basinhopping, for example, all you are looking at is whether you can get a lower chi2 with some stepsize from the minimized solution (i.e. a single chi2 value). For least squares, the local solver, you do need an array of residuals, but that can be set up. What I was trying to see is whether I would have to build my own custom setup or if there was something set up within LMFIT (I was trying to see if LMFIT had a converter; it turns out scipy already has an OptimizeResult built in). Scipy itself allows for custom methods. One of these custom methods can be least squares from LMFIT. The chi2 output of the least-squares solution, which is a scalar, can then be returned to the global solver (BUT the minimization occurring at the local level uses the array of residuals). I don't see any issue with this, though? I am able to set it up, so for example:

def func():
   return array_of_residuals
def custom_method(fun, x0, args, **kwargs,**options):
    params=Parameters()
    params.add('var',value=x0[0],value=1)
    params.add('2var',value=x0[1],value=1)
    lmfit_solution=minimize(fun,params,args=args)
    return OptimizeResult(x=lmfit_solution.params,fun=lmfit_solution.chisqr)
basinhopping(func,minimizer_kwargs={'method'=custom_method},x0=[values]

So you can run your function that returns an array of residuals for least squares, sum it to a scalar, pass that to basinhopping, and then basinhopping will return another set of parameters for you to continue. I don't see any issue here with using a scalar for the global solver though? 

Matt Newville

unread,
Aug 7, 2023, 3:43:42 PM8/7/23
to lmfi...@googlegroups.com
Sam,


On Mon, Aug 7, 2023 at 1:48 PM sam mahdi <sammah...@gmail.com> wrote:
Hi Matt, 

I don't think it matters that the global solver takes a scalar though (perhaps for some methods maybe it does).

Yes, it does matter.

For basinhopping for example, all you are looking at is if you can get a lower chi2 with some stepsize from the minimized solution (i.e. a singular chi2 value).

Yes.

For least squares, the local solver, you do need an array of residuals, but that can be set up.

Hm...

What I was trying to see if I would have to setup my own custom setup or if there was something setup within LMFIT (I was trying to see if LMFIT had a converter, turns out Scipy already had an OptimizeResult built in). 

Nope, it does not. 

Scipy itself allows for custom methods. One of these custom methods can be least squares from LMFIT. The chi2 output of least squares solution, which is a scalar, can then be returned to the global solver (BUT the minimization occurring on a local level uses the array of residuals). I don't see any issue with this though? I am able to set it up so for example:

def func():
   return array_of_residuals
def custom_method(fun, x0, args, **kwargs,**options):
    params=Parameters()
    params.add('var',value=x0[0],value=1)
    params.add('2var',value=x0[1],value=1)
    lmfit_solution=minimize(fun,params,args=args)
    return OptimizeResult(x=lmfit_solution.params,fun=lmfit_solution.chisqr)
basinhopping(func,minimizer_kwargs={'method'=custom_method},x0=[values]

So you can run your function that returns an array of residuals for least squares, sum it to a scalar, pass that to basinhopping, and then basinhopping will return another set of parameters for you to continue. I don't see any issue here with using a scalar for the global solver though? 

I don't think there is a problem using a scalar function for a global solver - I think that is expected.  

The objective function called by `leastsq` must return an array.  I do not know what a global solver like `basinhopping` will do if its objective function returns an array, but I suspect "fail" is what it will do.    

It seems pretty surprising that one could call a different objective function for "global step" and "local refinement".  Like, what if they give totally different answers?  If that is permitted, you probably need to be careful to make sure the returned values actually match (you say you "sum it to a scalar", but you do not).  I would guess that `x=lmfit_solution.params` would not work for Scipy's basinhopping: you probably need an array of values of the variables.

But, I guess I am not sure if you are saying the above code works (it does not ;), or asking how to make it work.

sam mahdi

unread,
Aug 7, 2023, 3:54:14 PM8/7/23
to lmfit-py
I'm a bit confused here. Do you mean it won't work in the sense that the answer will be wrong? Or that it won't work as in the code will not run? I can confirm the above setup does indeed work (it runs). I can also confirm you will get the proper answer (with the proper errors, covariances, etc.). I have not tried it for more complicated methods, but for my basic setups/examples, the above setup of using basinhopping as a global with LMFIT least squares as a local has worked with no problems. 

Basinhopping must also take in a scalar (i.e. you cannot return an array of values, hence why I used the sum of squared residuals in my above example). The local refinement seems unrelated to the global step (at least in this instance?), which is why scipy explicitly set up cases for custom methods in global methods like basinhopping (this is at least my understanding). 

Matt Newville

unread,
Aug 7, 2023, 4:17:00 PM8/7/23
to lmfi...@googlegroups.com
Sam,

On Mon, Aug 7, 2023 at 2:54 PM sam mahdi <sammah...@gmail.com> wrote:
I'm a bit confused here. Do you mean it won't work in the sense the answer will be wrong?

The code you posted had 
     params.add('var',value=x0[0],value=1)

That's a syntax error.   I cannot guess what you actually did.

I have no idea what a scipy solver would do with the `x` value from 

      OptimizeResult(x=lmfit_solution.params,fun=lmfit_solution.chisqr)

I suspect that if that ever does work, it will be fragile.


sam mahdi

unread,
Aug 7, 2023, 4:25:47 PM8/7/23
to lmfit-py
Oh, my apologies, that was just a basic "setup" idea, not actual code. I accidentally typed the value twice (it should just be value=x0[0], since it is using the new point from basinhopping as the new starting conditions for local minimization). My point was that using least squares as a local solver, by defining it as a custom method in, say, basinhopping, does work, in a similar format as above (the code I posted wasn't meant to be workable, just an example format). 

And from my understanding, basinhopping is jumping from one solution to another, using the chisqr to determine which one is the global minimum. So it uses the x value from the OptimizeResult, I presume, as a starting point for its next step? Again, it does work, and again I have been able to replicate getting proper answers. I cannot say how fragile it is, since I have yet to test it extensively on more difficult cases, but I have run the code using the built-in local solvers in scipy, as well as other solvers such as least squares, and you get the same answers, same errors, correlations, etc. So everything appears to be working properly?
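For reference, here is a runnable pure-scipy version of the setup described in this thread: a custom local minimizer that fits the residual array with `least_squares` and hands a scalar cost back to `basinhopping` in an `OptimizeResult`. The model and data are hypothetical, and scipy's `least_squares` stands in for lmfit's `leastsq`:

```python
import numpy as np
from scipy.optimize import OptimizeResult, basinhopping, least_squares

# hypothetical exponential-decay data
x = np.linspace(0, 10, 101)
rng = np.random.default_rng(0)
y = 3.0 * np.exp(-0.5 * x) + 0.01 * rng.standard_normal(x.size)

def residual(p):
    # array-valued objective, as a least-squares solver wants
    return y - p[0] * np.exp(-p[1] * x)

def cost(p):
    # scalar sum of squared residuals, as basinhopping wants
    r = residual(p)
    return (r * r).sum()

def lm_local(fun, x0, args=(), **unused):
    # custom local minimizer: refine the residual array with least_squares,
    # then report the scalar cost back to the global solver
    sol = least_squares(residual, x0)
    return OptimizeResult(x=sol.x, fun=cost(sol.x),
                          success=sol.success, nfev=sol.nfev)

result = basinhopping(cost, x0=[1.0, 1.0], niter=10,
                      minimizer_kwargs={'method': lm_local})
print(result.x)  # close to the true [3.0, 0.5]
```

Scipy's `minimize` accepts a callable as `method`, which is what makes this hook possible; note the returned `x` must be an array of variable values (not an lmfit Parameters object), per Matt's point above.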