Hi,
I'm running a custom MCMC modelling code and using emcee.autocorr.integrated_time to test whether the parameters in my models have converged: any model whose chain gives 50 * integrated_time / chain_length > 1 (i.e. the chain is shorter than 50 autocorrelation times) is discarded.
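For reference, here's roughly the check I'm applying (a minimal sketch; `chain` is a placeholder for one model's sample array, not my actual variable name):

```python
import numpy as np
import emcee

# `chain` stands in for one model's samples, shape
# (n_steps, n_walkers, n_params) as produced by emcee.
# quiet=True returns the estimate with a warning (instead of
# raising AutocorrError) when the chain is too short.
tau = emcee.autocorr.integrated_time(chain, quiet=True)  # one tau per parameter

# Discard the model if any parameter's chain is shorter than
# 50 autocorrelation times, i.e. 50 * tau / n_steps > 1.
n_steps = chain.shape[0]
if np.any(50 * tau / n_steps > 1.0):
    print("discarding model: chain too short relative to tau")
```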
However, I'm finding that this test often throws out models whose posteriors look well converged while passing models that look less so. I've attached corner plots from two models fitted to the same data: the two_exp model fails the test but the exp_gauss model passes, which seems backwards.
Thanks,
Lucy