Scipy.optimize.curve_fit Download


Kendall Paschel

Jul 22, 2024, 10:12:39 AM
to aninexiv

Yesterday I struggled with the error message
"ValueError("x0 is infeasible.")"
from the function scipy.optimize.curve_fit(); this is my contribution for others starting to work with this well-tested method. The error means that the initial guess p0 lies outside the bounds you supplied. If you work with measured data and would like to use the bounds feature (which often makes sense), you should spend some effort to ensure that every entry of p0 falls within its lower and upper bound.
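A minimal sketch of how to avoid the error (not the poster's original code; the model and data here are invented for illustration): make sure each entry of p0 lies within its bounds before calling curve_fit.

```python
import numpy as np
from scipy.optimize import curve_fit

def decay(t, amp, tau):
    # Hypothetical model: simple exponential decay.
    return amp * np.exp(-t / tau)

t = np.linspace(0, 5, 30)
y = decay(t, 1.0, 1.5) + np.random.default_rng(0).normal(0, 0.02, t.size)

bounds = ([0.0, 0.1], [10.0, 10.0])   # (lower, upper) for (amp, tau)

# Infeasible: the tau guess of 0.0 is below its lower bound of 0.1,
# so curve_fit raises ValueError before any fitting happens.
try:
    curve_fit(decay, t, y, p0=[1.0, 0.0], bounds=bounds)
except ValueError:
    pass  # "x0 is infeasible." -- p0 violates the bounds

# Feasible: every entry of p0 lies within its bounds, so the fit runs.
popt, pcov = curve_fit(decay, t, y, p0=[1.0, 1.0], bounds=bounds)
```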

I had some luck with the scipy.optimize.curve_fit function; however, it seemed very sensitive to deviations in the initial guess. curve_fit also didn't let me provide bounds on the parameters (note: bounds support was added to curve_fit in SciPy 0.17), which I think would greatly simplify the problem. Here is the code I used with curve_fit:
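The original snippet did not survive the scrape; below is a hypothetical reconstruction of what such a call might look like on a current SciPy, combining a rough initial guess with the `bounds` keyword (model and data are invented):

```python
import numpy as np
from scipy.optimize import curve_fit

def model(x, a, b):
    # Hypothetical power-law model standing in for the poster's function.
    return a * x**b

x = np.linspace(1, 10, 50)
y = model(x, 2.0, 1.5) + np.random.default_rng(2).normal(0, 0.1, x.size)

# A rough p0 plus bounds keeps the search in a sensible region even
# when the initial guess deviates somewhat from the true values.
popt, _ = curve_fit(model, x, y, p0=[1.0, 1.0], bounds=([0, 0], [10, 5]))
```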

I have a set of data points (x and y in the code below), and I am trying to draw a line of best fit through them using scipy.optimize.curve_fit. My code produces a line, but not a line of best fit. I have tried giving the function initial guesses for my gradient and intercept, but each time it produces exactly the same line, which does not fit my data points.
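For reference, a minimal working linear fit looks like the sketch below (the data values are made up). A frequent cause of "the exact same line every time" is plotting the model evaluated with the initial guesses rather than with the fitted parameters popt, or a model function that reads fixed globals instead of its own arguments.

```python
import numpy as np
from scipy.optimize import curve_fit

# Example data; replace with your own x and y arrays.
x = np.array([1, 2, 3, 4, 5], dtype=float)
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

def linear(x, m, c):
    # The model must take x first, then the parameters being fitted.
    return m * x + c

popt, pcov = curve_fit(linear, x, y)
m, c = popt  # fitted gradient and intercept

# When plotting, evaluate the model with popt, not with the guesses:
y_fit = linear(x, *popt)
```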

With our fit function in place, we now need to supply initial guesses for the parameter values, given by the kwarg p0. (We don't have to do this, but scipy.optimize.curve_fit() will guess a value of 1 for all parameters, which is generally not a good idea. You should always explicitly supply your own initial guesses.) In looking at the plot, we see that we indeed have a nonzero background signal, somewhere around $a \approx 0.2$. We also see that $I_0 \approx 0.9$ and $\lambda \approx 0.3$. We would normally use these as our approximate guesses, but to show an additional lesson, we will guess $\lambda = 1$, making the solver do a little more work.

Oh no! We got an exception because some of the parameters went negative! This occurred because, under the hood, scipy.optimize.curve_fit() tries many sets of parameter values as it searches for those that bring the theoretical curve closest to the observed data. (It is much more complicated than that, but we won't get into the details here.)
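The lesson above can be sketched as follows. The exact model isn't shown in this thread, so the form $I(x) = a + I_0 e^{-\lambda x}$ is an assumption based on the quoted parameter values; passing `bounds=(0, np.inf)` keeps the solver from ever trying negative parameters.

```python
import numpy as np
from scipy.optimize import curve_fit

def intensity(x, a, I0, lam):
    # Assumed model: constant background plus exponential decay.
    return a + I0 * np.exp(-lam * x)

rng = np.random.default_rng(3)
x = np.linspace(0, 10, 80)
y = intensity(x, 0.2, 0.9, 0.3) + rng.normal(0, 0.02, x.size)

# Guesses from eyeballing the plot: a ~ 0.2, I0 ~ 0.9; lambda is
# deliberately started at 1 to make the solver work harder. The
# bounds ensure no parameter can go negative during the search.
popt, pcov = curve_fit(intensity, x, y, p0=[0.2, 0.9, 1.0],
                       bounds=(0, np.inf))
```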

In this example we use a nonlinear curve-fitting function, scipy.optimize.curve_fit, to find the parameters of a function we define that best fit the data. The scipy.optimize.curve_fit function also returns the covariance matrix, which we can use to estimate the standard error of each parameter. Finally, we scale the standard error by a student-t value, which accounts for the additional uncertainty in our estimates due to the small number of data points we are fitting.
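The procedure just described can be sketched like this (the model and data are invented for illustration): take the square root of the covariance diagonal for the standard errors, then widen them by a student-t multiplier from scipy.stats.

```python
import numpy as np
from scipy import stats
from scipy.optimize import curve_fit

def model(x, a, b):
    # Hypothetical nonlinear model: exponential growth.
    return a * np.exp(b * x)

x = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5])
y = np.array([1.1, 1.6, 2.6, 4.2, 7.6, 12.1])

popt, pcov = curve_fit(model, x, y, p0=[1.0, 1.0])

# Standard error of each parameter from the covariance diagonal.
perr = np.sqrt(np.diag(pcov))

# Scale to a 95% confidence interval with a student-t value, which
# matters when the number of data points is small.
dof = len(x) - len(popt)          # degrees of freedom
tval = stats.t.ppf(0.975, dof)    # two-sided 95%
ci = [(p - tval * e, p + tval * e) for p, e in zip(popt, perr)]
```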

The fit works, but it takes time - about ten times as long as SciPy does using scipy.optimize.curve_fit. If I move it onto the GPU (by adding device='cuda' to the tensor functions), it takes a further ten times as long! (5 sec)

This is a simple script which tries to find the global minimum using scipy.optimize.curve_fit combined with a search over the parameter space. It first generates ntol random models, then selects the ntol*returnnfactor best models and runs scipy.optimize.curve_fit on each of them, finally returning the best model of them all. This script improves on scipy.optimize.curve_fit in two ways: first, there is no need to give initial values, so you get the global minimum instead of a local one; second, it automatically normalizes and standardizes the data.
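The script itself is not shown, so here is a sketch of the multi-start idea it describes (the parameter names ntol and returnnfactor follow the description, but this implementation is a guess, and the normalization step is omitted):

```python
import numpy as np
from scipy.optimize import curve_fit

def model(x, a, b, c):
    # Hypothetical model used to exercise the multi-start search.
    return a * np.exp(-b * x) + c

rng = np.random.default_rng(4)
x = np.linspace(0, 5, 60)
y = model(x, 2.0, 1.2, 0.4) + rng.normal(0, 0.05, x.size)

def multistart_fit(f, x, y, lo, hi, ntol=50, returnnfactor=0.1):
    """Sample ntol random parameter sets within [lo, hi], keep the
    ntol*returnnfactor best by residual, refine each with curve_fit,
    and return the overall best parameters."""
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    candidates = rng.uniform(lo, hi, size=(ntol, lo.size))
    resid = [np.sum((f(x, *p) - y) ** 2) for p in candidates]
    keep = candidates[np.argsort(resid)[: max(1, int(ntol * returnnfactor))]]
    best_p, best_r = None, np.inf
    for p0 in keep:
        try:
            popt, _ = curve_fit(f, x, y, p0=p0, bounds=(lo, hi))
        except RuntimeError:   # a start that fails to converge
            continue
        r = np.sum((f(x, *popt) - y) ** 2)
        if r < best_r:
            best_p, best_r = popt, r
    return best_p

popt = multistart_fit(model, x, y, [0, 0, 0], [5, 5, 5])
```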
