iteration: 1051 par_m : 1.371219604552536 par_b : -4.293066874884573 xi^2 1.7191296062209818
iteration: 1051 par_m : 1.371219604552536 par_b : -4.293066874884573 xi^2 1.7191296062209818
iteration: 1052 par_m : -76.56942327989591 par_b : -4.293066874884573 xi^2 38790133.57808929
iteration: 1053 par_m : 74.7768453852581 par_b : -4.293066874884573 xi^2 34407410.5376032
iteration: 1054 par_m : 90.35056507851729 par_b : 92.63245655527132 xi^2 74277469.70569123
iteration: 1055 par_m : 90.35056507851729 par_b : -89.05936415713427 xi^2 34978585.3295675
...
iteration: 1169 par_m : 1.0936850178317172 par_b : -4.5703498472581 xi^2 702.9040023042832
iteration: 1170 par_m : 1.371219604552536 par_b : -3.738403848812368 xi^2 104.9619027026542
iteration: 1171 par_m : 1.371219604552536 par_b : -4.847597610959724 xi^2 104.91227600870884
The fit_report is this:
[[Fit Statistics]]
    # fitting method   = differential_evolution
    # function evals   = 1050
    # data points      = 10
    # variables        = 2
    chi-square         = 1.71912961
    reduced chi-square = 0.21489120
    Akaike info crit   = -13.6076697
    Bayesian info crit = -13.0024995
[[Variables]]
    par_m:  1.37121960 +/- 0.01000214 (0.73%) (init = 1)
    par_b: -4.84759761 +/- 0.04363044 (0.90%) (init = 0)
[[Correlations]] (unreported correlations are < 0.100)
    C(par_m, par_b) = -0.815
As one can see, the reported chi^2 is the best value found over all iterations, but the reported parameters are those of the last iteration, not of the best one.
In the least-squares case this does not seem to happen (although there the last iteration also happens to have the best chi^2). The prints are:
iteration: -1 par_m : 1.0 par_b : 0.0 xi^2 3265.741717022397
iteration: 0 par_m : 1.0 par_b : 0.0 xi^2 3265.741717022397
iteration: 1 par_m : 1.0 par_b : 0.0 xi^2 3265.741717022397
iteration: 2 par_m : 1.0000000149006638 par_b : 0.0 xi^2 3265.7417989467594
iteration: 3 par_m : 1.0 par_b : 1.4901161193847656e-06 xi^2 3265.7446913150293
iteration: 4 par_m : 1.3712136626022726 par_b : -4.291755132717512 xi^2 1.7196891631086437
iteration: 5 par_m : 1.3712136830336732 par_b : -4.291755132717512 xi^2 1.7196892253485263
iteration: 6 par_m : 1.3712136626022726 par_b : -4.291755068804676 xi^2 1.719689218495164
iteration: 7 par_m : 1.3712213896957053 par_b : -4.293073734974456 xi^2 1.7191295992862299
iteration: 8 par_m : 1.3712214101272053 par_b : -4.293073734974456 xi^2 1.7191295992889075
iteration: 9 par_m : 1.3712213896957053 par_b : -4.293073671041995 xi^2 1.7191295992876123
iteration: 10 par_m : 1.3712213896957053 par_b : -4.293073734974456 xi^2 1.7191295992862299
And this is the fit result:
[[Fit Statistics]]
    # fitting method   = leastsq
    # function evals   = 9
    # data points      = 10
    # variables        = 2
    chi-square         = 1.71912960
    reduced chi-square = 0.21489120
    Akaike info crit   = -13.6076698
    Bayesian info crit = -13.0024996
[[Variables]]
    par_m:  1.37122139 +/- 0.01000215 (0.73%) (init = 1)
    par_b: -4.29307373 +/- 0.04363043 (1.02%) (init = 0)
[[Correlations]] (unreported correlations are < 0.100)
    C(par_m, par_b) = -0.815
Comparing the two results, one can also see that the differential-evolution fit really does not end at its minimum.
Have I done something wrong in applying the methods, or is there some other reason why this does not work?
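For what it's worth, the effect can be reproduced outside lmfit as well. Below is a minimal sketch (not the original script; the data, noise level, and bounds are made up) using scipy's differential_evolution, which is what lmfit wraps for method='differential_evolution'. Every objective evaluation is recorded, so one can compare the *last* evaluated parameter set with the *best* one found:

```python
import numpy as np
from scipy.optimize import differential_evolution

# synthetic straight-line data with noise (placeholder for the real data)
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 10)
y = 1.37 * x - 4.29 + rng.normal(scale=0.4, size=x.size)

trace = []  # (chi^2, par_m, par_b) for every objective evaluation


def chi2(p):
    """Sum of squared residuals of the linear model m*x + b."""
    m, b = p
    val = float(np.sum((y - (m * x + b)) ** 2))
    trace.append((val, m, b))
    return val


result = differential_evolution(chi2, bounds=[(-100, 100), (-100, 100)], seed=1)

best = min(trace, key=lambda t: t[0])
print("best chi^2 seen  :", best[0])
print("last chi^2 eval  :", trace[-1][0])
print("reported minimum :", result.fun)
```

The optimizer tracks the best point internally, so the parameters of the final objective evaluation (which is what a per-iteration print shows last) need not coincide with the reported minimum.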
Thanks a lot for the help already!!!!!
Kind regards,
Frederic
(Attached figures: plots of the data and fits, with legend entries "params0", "least square", and "differential evolution".)
--
You received this message because you are subscribed to the Google Groups "lmfit-py" group.
To unsubscribe from this group and stop receiving emails from it, send an email to lmfit-py+u...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/lmfit-py/7210df2c-476b-43d7-ba54-93339ef9ba70o%40googlegroups.com.
import lmfit
import numpy as np
import matplotlib.pyplot as plt
[[Model]]
    Model(linear)
[[Fit Statistics]]
    # fitting method   = differential_evolution
    # function evals   = 750
    # data points      = 10
    # variables        = 2
    chi-square         = 0.15486430
    reduced chi-square = 0.01935804
    Akaike info crit   = -37.6779109
    Bayesian info crit = -37.0727407
[[Variables]]
    intercept: -4.48384150 +/- 0.08177605 (1.82%) (init = 0)
    slope:      1.28546989 +/- 0.01531806 (1.19%) (init = 1)
[[Correlations]] (unreported correlations are < 0.100)
    C(intercept, slope) = -0.843

chissq_returned          0.15486430450850974
chisqr mmin-returned    -1.5265566588595902e-15
chisqr final - returned  0.8630829382245354
which now shows that (big) difference clearly, so maybe this only happens with smaller data sets, or you were just "luckier". Nevertheless, I also get the difference with your program.
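A quick way to check this kind of inconsistency is to recompute chi-square by hand from the reported parameter values and compare it with the reported chi-square. A sketch with placeholder data (the original arrays are not in the thread), using numpy's exact least-squares solution in place of the fitter's reported values:

```python
import numpy as np

# placeholder straight-line data with noise
rng = np.random.default_rng(42)
x = np.linspace(0.0, 10.0, 10)
y = 1.37 * x - 4.29 + rng.normal(scale=0.4, size=x.size)


def chi2(m, b):
    """Sum of squared residuals of the linear model m*x + b."""
    return float(np.sum((y - (m * x + b)) ** 2))


# exact least-squares solution, standing in for the fitter's best-fit values
m_fit, b_fit = np.polyfit(x, y, 1)

# If a fit report is internally consistent, chi-square recomputed from the
# reported parameters should match the reported chi-square value.
print("recomputed chi^2:", chi2(m_fit, b_fit))
```

In the output quoted above, the recomputed value differs from the returned one, which is exactly what this check would expose.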
--Matt
Okay, I spoke too quickly; I should have waited for Matt to open the issue (https://github.com/lmfit/lmfit-py/issues/655) and/or read the messages a bit more carefully. It seemed to me your problem was with the reported chi-square, but I now realize that I misunderstood: everything in the output coming directly from the fit is correct (i.e., result_diff.chisqr, result_diff.residual), but for some reason there is a small discrepancy in one of the parameter values, which results in a different chi-square if you *recalculate* it from those parameters. It clearly has to do with the numdifftools code, but I am not yet sure where it originates…