Hello, I was trying to figure out a way to combine global solvers with local ones in LMFIT, for example basinhopping with Levenberg-Marquardt, but I couldn't find anything in the documentation about this. Using minimize lets you run basinhopping and LM, but only separately (and while basinhopping can pass arguments through to scipy's local solvers, those do not include least-squares methods). If that isn't possible, is there a way to convert LMFIT's MinimizerResult into scipy's OptimizeResult? That way I could build a custom method for scipy's basinhopping that uses the output of LMFIT's minimize.
I'm only asking whether these possibilities (combining a global solver with a local one, or converting an LMFIT MinimizerResult into scipy's OptimizeResult format) are built into LMFIT, or whether there is a way to do them within the LMFIT package.
--
You received this message because you are subscribed to the Google Groups "lmfit-py" group.
To unsubscribe from this group and stop receiving emails from it, send an email to lmfit-py+u...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/lmfit-py/17ac5a34-e555-46a7-990f-335533bfb931n%40googlegroups.com.
I was looking more for global techniques (such as basinhopping and AMPGO) that work in conjunction with local ones.
However, the selection of local solvers is limited to those scipy has available. For example, in the case of AMPGO:

local: str (default is 'L-BFGS-B')
    Name of the local minimization method. Valid options are:
    - 'L-BFGS-B'
    - 'Nelder-Mead'
    - 'Powell'
    - 'TNC'
    - 'SLSQP'

But what I was looking for is a way to use Levenberg-Marquardt instead of this set list.
From my understanding, the way scipy incorporates local solvers into the global ones is by feeding the local solver's output, in the OptimizeResult format, into the global one. So one could write their own function that generates this type of output. In scipy, though, different solvers have different outputs (e.g. least_squares, leastsq), so they don't work with minimize.
Scipy's global solvers can only use the minimize output.
LMFIT has a single uniform output for all methods, MinimizerResult, which means all methods can be used interchangeably. My thought was then a method to convert the MinimizerResult into scipy's expected OptimizeResult format.
Is this what you meant by "I think there are a few different ways to think about how to 'combine global and local solvers'. You could run a 'local solver' at each global solution -- a few of the solvers such as `ampgo` support that. You could run a global solver to find a set of 'best candidates' and refine the best of those with an LM solver."? If so, I'm not entirely sure how I would set this up. The only idea I had is as above: just have method=func, where func is the output from a local solver (but for this to work it would have to be in the OptimizeResult format).
As to why to use a global solver (and why it would be useful): I do find other solutions with different starting conditions. As to which one to use, I generally like basinhopping, since I often just use it as a check to ensure I have the global minimum and am not stuck in a local one (it beats throwing in random starting conditions to make sure I keep getting the same solution, or a grid search). From my understanding SHGO is a bit slow for higher-dimensional problems. AMPGO looks very interesting and useful, but I've never had an issue with basinhopping (apart from it being a touch slow), so I've never had a reason to use it.
I apologize, I'm a bit confused. Maybe I am misunderstanding how these global solvers work? If we use basinhopping as an example: you start with some initial values, use a local solver to minimize, then "hop" to another area with some step size, and then use a local solver again. Rinse and repeat, basically. But the local solver is built within the global one. So you cannot set it up like this:

result = minimize(obj_function, params, method=method1, ...)
final = minimize(obj_function, result.params, method=method2, ...)

because in basinhopping, method1 and method2 are built in. E.g.
final = minimize(obj_function, result.params, method=method1, **fit_kws) with fit_kws = {'method': method2, ...}

(i.e. you cannot remove method2, so in your setup the "global" step, which you call final, is still using a local solver; it's just the default local solver for whatever global solver you use. I am trying to modify that built-in local solver.)
Hi Matt,

I don't think it matters that the global solver takes a scalar, though (perhaps for some methods it does).
For basinhopping, for example, all you are looking at is whether you can get a lower chi2 some step size away from the minimized solution (i.e. a single chi2 value).
For least squares, the local solver, you do need an array of residuals, but that can be set up.
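In other words, the same objective can be exposed both ways: an array of residuals for the least-squares local step, and a scalar sum of squares for the global accept test. A minimal sketch of that setup (the function names and exponential model are illustrative, not from lmfit):

```python
import numpy as np

def residuals(x, xdata, ydata):
    # array-valued objective, as a least-squares local solver needs
    a, tau = x
    return a * np.exp(-xdata / tau) - ydata

def chi2(x, xdata, ydata):
    # scalar view of the same objective, as a global solver needs
    r = residuals(x, xdata, ydata)
    return np.dot(r, r)
```

Since chi2 is just the sum of squared residuals, both views are minimized at exactly the same parameter values.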
What I was trying to see is whether I would have to build my own custom setup or whether something was already set up within LMFIT (I was trying to see if LMFIT had a converter; it turns out scipy already has an OptimizeResult built in).
Scipy itself allows for custom methods. One of these custom methods can be least squares from LMFIT. The chi2 output of the least-squares solution, which is a scalar, can then be returned to the global solver (BUT the minimization occurring on the local level uses the array of residuals). I don't see any issue with this, though? I am able to set it up so that, for example:

def func(x, *args):
    return array_of_residuals

def custom_method(fun, x0, args=(), **options):
    params = Parameters()
    params.add('var', value=x0[0])
    params.add('var2', value=x0[1])
    lmfit_solution = minimize(fun, params, args=args)
    return OptimizeResult(x=lmfit_solution.params, fun=lmfit_solution.chisqr)

basinhopping(func, minimizer_kwargs={'method': custom_method}, x0=[values])

So you can run your function that returns an array of residuals for least squares, sum it to a scalar, pass that to basinhopping, and then basinhopping will return another set of parameters for you to continue. I don't see any issue here with using a scalar for the global solver, though?
I'm a bit confused here. Do you mean it won't work in the sense that the answer will be wrong?