Hi everybody,
I have a population growth model that I fit to a time series, with the growth rate lambda itself modeled as a linear function of covariates. In JAGS it looks like this:
> lambda[t] <- beta0 + beta1*X[t] + beta2*N[t]
> N_determin[t+1] <- log(max(1,lambda[t]*N[t]))
> N[t+1] ~ dlnorm(N_determin[t+1], tau)
Here X[t] is an environmental covariate, and beta2 is included as a measure of density dependence in the population growth rate.
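For completeness, the full model block looks roughly like this (I have simplified the priors and names for this sketch; N, X and n.years are supplied as data):

> model {
>   # Vague priors (simplified for this sketch)
>   beta0 ~ dnorm(0, 0.001)
>   beta1 ~ dnorm(0, 0.001)
>   beta2 ~ dnorm(0, 0.001)
>   tau ~ dgamma(0.001, 0.001)
>   for (t in 1:(n.years-1)) {
>     # Growth rate as a linear function of the covariate and of density
>     lambda[t] <- beta0 + beta1*X[t] + beta2*N[t]
>     # Expected log abundance; the max() keeps the log defined
>     N_determin[t+1] <- log(max(1, lambda[t]*N[t]))
>     # Lognormal error around the deterministic prediction
>     N[t+1] ~ dlnorm(N_determin[t+1], tau)
>   }
> }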
What I do is simulate data using JAGS and then fit the Bayesian model to the simulated data, to check that I can recover the known parameter values. To get the “S” shape typical of density dependence, I need to use a very small value of beta2 (-0.0001).

When I fit the Bayesian model, the Gelman-Rubin diagnostic is 1.0001, the chains look well mixed, and the posterior density plot is nicely bell-shaped. However, I get an effective sample size of 1 for beta2, out of a total of 240,000 samples after thinning. Does anyone know what this might mean, whether it is a cause for concern, and if so, whether there is any recommended practice to fix it?
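In case it is useful, here is roughly how I generate the simulated series (the names below are just for this sketch): I run the same process equations forward in JAGS, but with beta0, beta1, beta2 (= -0.0001), tau, N1 and X supplied as data rather than given priors, leave N[2:n.years] unobserved, and monitor N, so a single draw gives one simulated trajectory.

> # Simulation model: same equations, but the parameters enter as data, not priors
> model {
>   N[1] <- N1
>   for (t in 1:(n.years-1)) {
>     lambda[t] <- beta0 + beta1*X[t] + beta2*N[t]
>     N_determin[t+1] <- log(max(1, lambda[t]*N[t]))
>     N[t+1] ~ dlnorm(N_determin[t+1], tau)
>   }
> }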
Thanks,
Juan Pablo