Hi Everyone,
Does anyone have any guidance/tips for working around memory allocation failures during optimization:
opt <- nlminb(obj$par, obj$fn, obj$gr)
Error in sparseHessianFun(env, skipFixedEffects = skipFixedEffects) :
Memory allocation fail in function 'MakeADHessObject2'
I'm developing a multispecies age-structured model with normally distributed random effects (recruitment deviates) for fisheries applications. It works fine when I "fix" the variance of the random effects and run a penalized-likelihood approach. However, when I try to estimate the variance and treat the deviates as true random effects, I get memory allocation failures for a few of the 16 sensitivity runs I am doing (I clear the environment and memory using "gc" and "FreeADFun" between runs). The odd part to me is that one sensitivity run will work while another run with the same parameters (fixed and random), but parameterized slightly differently, will not.
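For reference, the cleanup I do between sensitivity runs looks roughly like the following (a sketch only; `obj` is the object returned by `MakeADFun`, and the DLL name "mymodel" is a placeholder):

```r
library(TMB)

# Sketch of per-run cleanup between sensitivity runs
run_sensitivity <- function(data, parameters, random) {
  obj <- MakeADFun(data, parameters, random = random,
                   DLL = "mymodel")          # "mymodel" is a placeholder
  opt <- nlminb(obj$par, obj$fn, obj$gr)
  FreeADFun(obj)  # release the AD tapes held on the C++ side
  rm(obj)         # drop the R reference as well
  gc()            # return freed memory to the OS where possible
  opt
}
```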
I've tried a few things that don't quite get me where I want to be for all the sensitivity runs:
1) Sensitivity runs with shorter time series work (i.e. a reduced number of estimated parameters).
2) Reducing the size of the model helps (the multispecies formulation is solved by iterating a for loop around the dynamics, and reducing the number of iterations helps, but then the multispecies model is not technically converged).
3) Moving from a 32 GB to a 64 GB machine did not help.
4) Changing "tape.parallel" didn't help.
I have a few 7-dimensional arrays of derived quantities, and perhaps reducing those may help, or reducing some 4-D fixed-effects parameter arrays? Perhaps using a different inner optimizer?
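In case it helps the discussion, these are the memory-related knobs I'm aware of in TMB (a sketch of options to try, not a tested recipe; availability depends on the TMB version, and "mymodel"/"rec_devs" are placeholder names):

```r
library(TMB)

# Run taping single-threaded, which can lower peak memory during taping
config(tape.parallel = 0, DLL = "mymodel")  # placeholder DLL name
openmp(1)

# Newer TMB versions support the TMBad framework, which is often more
# memory-efficient than CppAD for large models:
# compile("mymodel.cpp", framework = "TMBad")
# With TMBad, intern = TRUE evaluates the Laplace approximation on the
# C++ side, which may change the memory profile (worth testing):
# obj <- MakeADFun(data, parameters, random = "rec_devs",
#                  intern = TRUE, DLL = "mymodel")
```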
If anyone has come across this issue and has a few pointers or tips that would be much appreciated!
Cheers,
Grant