Global optimization algorithms get stuck at parameter boundaries


Yehor Kosiachkin

Mar 19, 2024, 2:14:07 PM
to lmfit-py
Dear colleagues, good afternoon.

Thanks for your work.

I am currently trying to fit a system of 4 ODEs to experimental data. While I managed to do that with local minimization algorithms (least_squares, nelder, powell, ...), I cannot get reasonable results with global optimization algorithms (which I want to use to be sure I am landing somewhere near the global minimum). They produce a much worse fit, with the parameters stuck at their boundary values. Could you check what I am doing wrong, please?

A simplified outline of the steps in my script:

 1. I define the solver for the ODE system with scipy.solve_ivp (a rough sketch of it is shown after the residual function below)
 2. I define the residual function

def residual_step(params, tt, init, data_pc, data_pv, data_sst, data_vip):
    """
    Difference between model and data for further minimization
    """
    model = simulate_step(tt, init, params)
    pc_r = np.array(model['f_e'].values - data_pc)*weights.ravel() #/np.array(pc_sen_err_new)
    pv_r = np.array(model['f_pv'].values - data_pv)*weights.ravel()
    sst_r = np.array(model['f_sst'].values - data_sst)*weights.ravel()
    vip_r = np.array(model['f_vip'].values - data_vip)*weights.ravel()

    arr = np.concatenate((pc_r, pv_r, sst_r, vip_r), axis=0)
    return arr
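
For context, simulate_step is roughly of the following form; the right-hand side rhs here is only a placeholder standing in for my actual equations, and the column names match what residual_step expects:

import numpy as np
import pandas as pd
from scipy.integrate import solve_ivp

def simulate_step(tt, init, params):
    """
    Integrate the 4-ODE system and return the trajectories
    evaluated at the experimental time points tt.
    """
    p = params.valuesdict()  # plain dict of the current parameter values

    def rhs(t, y):
        # placeholder dynamics; the real model uses p['w_0'], p['tau_0'], ...
        return -y / p['tau_0']

    sol = solve_ivp(rhs, (tt[0], tt[-1]), init, t_eval=tt, method='LSODA')
    return pd.DataFrame({'f_e': sol.y[0], 'f_pv': sol.y[1],
                         'f_sst': sol.y[2], 'f_vip': sol.y[3]})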

 3. I set up the fit parameters

params = Parameters()
params.add('w_0', value=w[0], vary=True, min=0.1, max=7)
params.add('w_1', value=w[1], vary=True, min=0.1, max=7)
params.add('w_2', value=w[2], vary=True, min=0.1, max=7)
... (several more similar parameters)
params.add('tau_0', value=tau[0], vary=True, min=0.01, max=0.08)
params.add('tau_1', value=tau[1], vary=True, min=0.01, max=0.08)
... (several more similar parameters)
params.add('i_0', value=i[0], vary=True, min=0.005, max=1)
params.add('i_1', value=i[1], vary=True, min=0.005, max=1)
... (several more similar parameters)
params.add('ampl_1', value=ampl_1, vary=True, min=0.1, max=7)
params.add('ampl_2', value=ampl_2, vary=False, min=0, max=7)
params.add('kl_1', value=kl_1, vary=False, min=-15, max=7)
params.add('kl_2', value=kl_2, vary=False, min=0, max=10)
params.add('s_1', value=s_1, vary=True, min=0.005, max=2)
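
For reference, a quick sanity check that every starting value lies strictly inside its bounds (just to rule out a trivial setup mistake):

params.pretty_print()  # table of each parameter's value, bounds and vary flag

for name, p in params.items():
    assert p.min <= p.value <= p.max, f"{name} starts outside its bounds"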

 4. I run a local minimization (this works: it finds a model closely matching the data points)

%%time

%matplotlib inline

result_nelder = minimize(residual_step, params, method='nelder', args=(t_exp, init, data_pc_all, data_pv, data_sst, data_vip), max_nfev=2000, nan_policy='propagate', tol=1e-13)
print()
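
To have a baseline for comparison with the global runs, the local chi-square can be read directly from the result:

report_fit(result_nelder)
print('local chi-square:', result_nelder.chisqr)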

 5. I run different global optimizers (basinhopping, ampgo, shgo, dual_annealing); only the ampgo call is shown below

%%time
%matplotlib inline
result_ampgo = minimize(residual_step, params, method='ampgo', args=(t_exp, init, data_pc_all, data_pv, data_sst, data_vip), nan_policy='propagate', max_nfev=4*max_nfev, local='Nelder-Mead')
report_fit(result_ampgo)
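
The other global optimizers are called in the same way; only the method name changes (solver-specific options omitted here), e.g.:

result_da = minimize(residual_step, params, method='dual_annealing',
                     args=(t_exp, init, data_pc_all, data_pv, data_sst, data_vip),
                     nan_policy='propagate')
report_fit(result_da)

result_bh = minimize(residual_step, params, method='basinhopping',
                     args=(t_exp, init, data_pc_all, data_pv, data_sst, data_vip),
                     nan_policy='propagate')
report_fit(result_bh)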

At this point every global optimizer returns a final result in which almost all parameters are stuck at their min (or, less often, max) bound, and the chi-square is about 100 times higher than the local minimization result. I plot the resulting fit at each iteration and clearly see "better" fits (closer to the data) than the final result. I have also played with the options of the optimization algorithms, but that did not help either.
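
For reference, the intermediate fits can be tracked with lmfit's iter_cb hook, roughly like this (the bookkeeping around it, best and track_best, is just my own naming):

best = {'chisqr': np.inf, 'params': None}

def track_best(params, iteration, resid, *args, **kws):
    # called by lmfit after each residual evaluation
    chisqr = np.sum(np.asarray(resid)**2)
    if chisqr < best['chisqr']:
        best['chisqr'] = chisqr
        best['params'] = params.copy()  # deep copy of the Parameters at this point

result_ampgo = minimize(residual_step, params, method='ampgo',
                        args=(t_exp, init, data_pc_all, data_pv, data_sst, data_vip),
                        nan_policy='propagate', iter_cb=track_best)
print('best intermediate chi-square:', best['chisqr'])
print('final chi-square:', result_ampgo.chisqr)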

Could you please point out what I am doing wrong with the global optimization? Or should the optimization result be accessed in a different way than through the report_fit function?

Please let me know if any additional information is needed.

Looking forward to hearing from you.

With kind regards, 
Yehor

Yehor Kosiachkin

Mar 19, 2024, 9:14:07 PM
to lmfit-py
Sorry for the misleading information. It seems that dual_annealing does not get stuck at the boundaries and gives better results. However, the problem with ampgo and basinhopping remains: I see better fits among the intermediate results, but in the final result almost all parameters sit at their boundary values.

Thank you!

Kind regards,
Yehor