Re: LMFit - constraint as a function of variables


Matt Newville

Apr 1, 2015, 10:22:37 PM
to Quentin CAUDRON, lmfit-py
Hi Quentin,

On Wed, Apr 1, 2015 at 6:11 PM, Quentin CAUDRON <qcau...@princeton.edu> wrote:
Hi Matt,

Thanks very much for the awesome work on lmfit. I've been using it quite a lot recently, and I love how it works. I hope emailing you here is alright.

It's fine to ask me questions, but we have a mailing list, so I'm including that in my reply. The mailing list has the benefit of having multiple people who can give different (and usually better) answers than I can.
 

I was wondering if it was possible to impose an inequality constraint on a quantity defined not just by the parameters, but also by the variables. For instance, let's say I wanted to fit a simple quadratic to some data, but impose that the derivative cannot go below -1. Is there any way to work this into lmfit?
 

Hmm, that's an interesting suggestion, but I'm not entirely sure I understand. Do you mean to add a constraint (or perhaps, add a penalty) for the derivative of the residual array? The derivative with respect to what -- an independent variable "x" or one of the parameters?


It could be that this is not easily feasible with lmfit's approach to constraints, but I think that's worth thinking about. There was a similar question a couple of weeks back (see the discussion at https://groups.google.com/forum/?fromgroups=#!topic/lmfit-py/iwxUR-ogE50). I'm definitely open to suggestions for improving this...


--Matt Newville

Quentin CAUDRON

Apr 2, 2015, 12:02:14 AM
to lmfi...@googlegroups.com, qcau...@princeton.edu, newv...@cars.uchicago.edu
Thanks, Matt!

I'll explain the problem I'm working with. I'm trying to fit a smooth, parametric curve to the following data, whilst ensuring that its derivative never goes below -1. 


In this figure, I'm showing an older, more basic fit: just a piecewise linear regression, where the model parameters (including the breakpoint) are fit with lmfit. Instead of two lines, I'd like to fit a quadratic, to allow the gradient to change smoothly as a function of the population. Mechanistically, however, I should never see a gradient of less than -1. The piecewise linear fit didn't have this constraint, but it estimated -1.026, which is fairly close, and I'm confident that constraining this would result in a fit that was nearly as good.

Working from http://lmfit.github.io/lmfit-py/constraints.html, I can see how to constrain parameters by inequality. However, for a quadratic, the derivative is a function of the independent variable, and I don't think lmfit can deal with this out of the box. I tried writing something in the form of the "Using Inequality Constraints" section on that page and, of course, I can't phrase the constraint to take in the array of independent-variable values. I then tried just constraining the derivative at a given point in x, hoping that might limit the gradient later in the curve (for the plot above, I tried constraining the gradient at x = 6.5 to be >= -1).

This shows the quadratic fit with gradient >= -1 at x = 6.5; the right-most plot is the gradient of the fit. It's not a great model, but aside from that, I hope it illustrates what I'm trying to achieve!
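Since lmfit's expression-based inequality constraints can't reference the array of independent-variable values, one common workaround for "the derivative must never go below -1 anywhere on the data" is to fold the constraint into the residual as a soft penalty evaluated at every x. This is a sketch, not from the thread: it uses synthetic data and scipy.optimize.least_squares rather than lmfit, and the penalty weight is an arbitrary choice.

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic data from a quadratic whose true gradient stays above -1
rng = np.random.default_rng(0)
x = np.linspace(0.0, 6.5, 40)
y = 5.0 - 0.9 * x + 0.02 * x**2 + rng.normal(0.0, 0.1, x.size)

def residuals(p, x, y, weight=100.0):
    a, b, c = p
    fit = a + b * x + c * x**2
    grad = b + 2 * c * x                              # d(fit)/dx at every data point
    penalty = weight * np.maximum(0.0, -1.0 - grad)   # nonzero only where grad < -1
    return np.concatenate([fit - y, penalty])

res = least_squares(residuals, [1.0, -1.0, 0.0], args=(x, y))
a, b, c = res.x
print("fitted:", a, b, c, "min gradient:", (b + 2 * c * x).min())
```

Because the penalty is soft, the bound is discouraged rather than strictly enforced; increasing `weight` tightens it, at the cost of a stiffer least-squares problem.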

Thanks again,
Quentin

Matt Newville

Apr 5, 2015, 2:06:55 PM
to Quentin CAUDRON, lmfit-py, Quentin CAUDRON
Hi Quentin,

On Wed, Apr 1, 2015 at 11:02 PM, Quentin CAUDRON <quentin...@gmail.com> wrote:
Thanks, Matt !

I'll explain the problem I'm working with. I'm trying to fit a smooth, parametric curve to the following data, whilst ensuring that its derivative never goes below -1. 

The derivative with respect to what?   Some independent variable?  A parameter?

 


In this figure, I'm showing an older, more basic fit : just a piecewise linear regression, where the model parameters (including breakpoint) are fit with lmfit. Instead of two lines, I'd like to fit a quadratic, to allow the gradient to change smoothly as a function of the population.

You mean the gradient with respect to the x-axis, right?
 
Mechanistically, however, I shouldn't ever see a gradient of less than -1 - the piecewise linear fit didn't have this constraint, but it estimated -1.026, which is fairly close, and I'm confident that constraining this would result in a fit that was nearly as good.

But if your model is a quadratic function (a + b*x + c*x*x), the gradient (b + 2*c*x) depends on the value of x, and isn't easily constrained, right?
 

Working from http://lmfit.github.io/lmfit-py/constraints.html, I can see how to constrain parameters by inequality. However, for a quadratic, the derivative is a function of the independent variable, and I don't think lmfit can deal with this out of the box - I tried writing something in the form of the "Using Inequality Constraints" on this page and, of course, I can't phrase the constraint to take in the array of independent variable values. I then tried just constraining the derivative at a given point in x, hoping that might limit the gradient later in the curve ( for the above plot, I tried constraining the gradient at x = 6.5 to be >= -1 ). 


If you know the extreme value of x and want the gradient there (b + 2*c*x_max) to be >= -1, perhaps constraining c to be '(delta - b - 1) / (2*x_max)', where delta >= 0, would be OK? That is:

   from lmfit import Parameters

   params = Parameters()
   params.add('a', value=0)
   params.add('b', value=-0.5)
   params.add('delta', value=0.5, min=0)
   params.add('x_max', value=x.max(), vary=False)  # assumes the independent variable is called 'x'
   params.add('c', expr='(delta - b - 1) / (2*x_max)')

Does that seem close to what you need?
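The algebra behind this reparameterization can be checked directly (plain Python; the values of b, delta, and x_max are just examples): with c = (delta - b - 1)/(2*x_max), the gradient at x_max is b + 2*c*x_max = delta - 1, so the bound delta >= 0 guarantees the gradient there stays >= -1.

```python
def grad_at(x, b, c):
    """Derivative of a + b*x + c*x*x with respect to x."""
    return b + 2 * c * x

b, delta, x_max = -0.5, 0.5, 6.5          # example values; delta has min=0 in the fit
c = (delta - b - 1) / (2 * x_max)         # the expr constraint from the snippet above
print(grad_at(x_max, b, c))               # → -0.5, i.e. delta - 1
```

Note this only pins the gradient at x_max; for a quadratic with c >= 0 that is also the minimum gradient over [0, x_max], but for c < 0 the minimum occurs at x = 0 instead.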

This shows the quadratic fit with gradient >= -1 at x=6.5; the right-most plot is the gradient of the fit. It's not a great model, but aside from that, I hope it illustrates what I'm trying to achieve !

Thanks again,
Quentin
 
Hope that helps!
 
--Matt
