Thanks, Matt!
I'll explain the problem I'm working on: I'm trying to fit a smooth, parametric curve to the following data, while ensuring that its derivative never goes below -1.

In this figure, I'm showing an older, more basic fit: a piecewise linear regression, where the model parameters (including the breakpoint) are fit with lmfit. Instead of two lines, I'd like to fit a quadratic, to allow the gradient to change smoothly as a function of the population. Mechanistically, however, I shouldn't ever see a gradient of less than -1. The piecewise linear fit didn't have this constraint, but it estimated -1.026, which is fairly close, so I'm confident that constraining this would give a fit that's nearly as good.
Working from http://lmfit.github.io/lmfit-py/constraints.html, I can see how to constrain parameters by inequality. However, for a quadratic the derivative is a function of the independent variable, and I don't think lmfit can deal with this out of the box. I tried writing something in the form of the "Using Inequality Constraints" example on that page and, of course, I can't phrase the constraint so that it takes in the whole array of independent-variable values. I then tried constraining the derivative at a single point in x, hoping that might limit the gradient later in the curve (for the plot above, I constrained the gradient at x = 6.5 to be >= -1).

This shows the quadratic fit with gradient >= -1 at x = 6.5; the right-most plot is the gradient of the fit. It's not a great model, but aside from that, I hope it illustrates what I'm trying to achieve!
Thanks again,
Quentin