Error: Vector memory exhausted


André Luís Luza

Sep 28, 2018, 2:32:53 PM
to hmec...@googlegroups.com

Hi,

I am fitting a dynamic occupancy model similar to Bled et al.’s (2011, Ecology) to mosquito detection data from 496 sites over 104 primary periods. The analysis runs fine in a JAGS-based Bayesian formulation, implemented in R via jagsUI. The problems arise when I ask JAGS to save a gigantic matrix of site- and time-specific posterior samples of occupancy probability. With 20,000 iterations I get:

"Error: vector memory exhausted (limit reached?)
Error during wrapup: vector memory exhausted (limit reached?)"

I need to monitor occupancy, because one of the key products of this project is a dynamic map of occupancy through time. I am using a MacBook Pro (macOS High Sierra, Intel Core i7, 8 cores, 16 GB of RAM). The R version is 3.5.0, 64-bit, which should facilitate memory allocation. I have tried running more chains in parallel with fewer iterations each, and monitoring/saving only the matrix I want, but the problem persists.

Does anyone know of a workaround that does not require a change of computer?

Thank you for any clues and my apologies if this question has been asked already.
--


André Luís Luza

Pesquisador de Pós-Doutorado
Programa de Pós-Graduação em Ecologia 
Universidade Federal do Rio Grande do Sul
Porto Alegre, Rio Grande do Sul - Brazil

Kery Marc

Sep 28, 2018, 2:44:02 PM
to André Luís Luza, hmec...@googlegroups.com
Hi Andre

that's a super-cool model! A couple of ideas for your memory problem:
- Go really low with the number of saved samples per parameter, e.g., save just 50 or so. For mapping purposes the posterior mean of those may be enough. So, for instance, run 2 chains with 100,000 iterations each and a burn-in of 50,000, then thin by 1 in 1,000, which still gives you 100 samples from the posterior of every parameter.
- Do you really mean occupancy probability (i.e., psi), or do you mean the realized occurrence indicators (the z's)? If you mean the latter, then you probably don't need to save the full 496 x 104 matrix, but can simply save the samples of a much smaller number of parameters which describe the spatial and temporal variation in this matrix. Then, in R, you can compute the posterior of the psi parameter.
- Get your MCMC samples from multiple runs. Perhaps it helps to save half of the z matrix in one run and the other half in another? I have never tried this, but it might work if, in BUGS, you first define zx to be half of the full z matrix and then put zx into the list of parameters whose samples you want to save.
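[Editor's note: the first suggestion above might look like the following in jagsUI. This is a hedged sketch only; the data list, inits, parameter name "psi" and model file "dynocc.txt" are placeholders for the poster's own objects.]

```r
library(jagsUI)

# Sketch of the heavy-thinning idea: 2 chains of 100,000 iterations,
# 50,000 burn-in, thinned 1 in 1,000, which keeps only
# (100,000 - 50,000) / 1,000 * 2 = 100 draws per monitored node.
out <- jags(data = jags.data,                # your data list
            inits = inits,                   # your initial-values function
            parameters.to.save = c("psi"),   # the big site-by-period matrix
            model.file = "dynocc.txt",       # your model file
            n.chains = 2,
            n.iter = 100000,
            n.burnin = 50000,
            n.thin = 1000)

# Posterior means for mapping, one value per site x period cell
psi.mean <- out$mean$psi
```

With only 100 retained draws per node, the saved output for a 496 x 104 matrix is roughly 100 x 496 x 104 doubles (about 40 MB), rather than the tens of GB that 20,000 unthinned iterations would produce.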

I am sure there must be other tricks, but this is what comes to my mind now.

Best regards  ----- Marc



--
*** Three hierarchical modeling email lists ***
(1) unmarked: for questions specific to the R package unmarked
(2) SCR: for design and Bayesian or non-bayesian analysis of spatial capture-recapture
(3) HMecology (this list): for everything else, especially material covered in the books by Royle & Dorazio (2008), Kéry & Schaub (2012), Kéry & Royle (2015)
---
You received this message because you are subscribed to the Google Groups "hmecology: Hierarchical Modeling in Ecology" group.
To unsubscribe from this group and stop receiving emails from it, send an email to hmecology+...@googlegroups.com.
To post to this group, send email to hmec...@googlegroups.com.
Visit this group at https://groups.google.com/group/hmecology.
To view this discussion on the web visit https://groups.google.com/d/msgid/hmecology/CADLPOyomczKRCidY2P-v7D8nsc7Xuyi6A_320HQPQwfxmh9LEA%40mail.gmail.com.
For more options, visit https://groups.google.com/d/optout.

Kery Marc

Sep 28, 2018, 2:45:21 PM
to André Luís Luza, hmec...@googlegroups.com
Darn, below I meant 'if you mean the former (i.e., psi)'.



From: Kery Marc
Sent: 28 September 2018 20:42
To: André Luís Luza; hmec...@googlegroups.com
Subject: RE: Error: Vector memory exhausted

Andy Crosby

Sep 28, 2018, 2:49:35 PM
to André Luís Luza, hmec...@googlegroups.com

Hi Andre,

I’ve run into a similar problem quite often trying to fit multi-species models with >400 sites and >100 species. The problem for me (and I assume for you as well) was that the computer did not have a big enough chunk of free memory to keep track of all the parameters being saved. We’re talking tens of GB of memory here; even on a machine with 64 GB of RAM there was not enough.

My solution, after a long time of trying, was to severely limit the number of iterations and thin a lot, run it with the jags.basic() function from jagsUI (so it only saves the raw chains), and just keep updating until it converged. That way I was able to keep the memory requirements low enough and simply save the output as .Rdata files. The chains from each update can be combined after the fact if need be.
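[Editor's note: a hedged sketch of this run-update-save workflow, assuming jagsUI's jags.basic() with save.model = TRUE so the fitted model can be passed to update(); file names and model objects are hypothetical.]

```r
library(jagsUI)

# Short, heavily thinned run that keeps only the raw chains
fit <- jags.basic(data = jags.data, inits = inits,
                  parameters.to.save = c("psi"),
                  model.file = "dynocc.txt",
                  n.chains = 3, n.iter = 5000, n.burnin = 2500,
                  n.thin = 50,
                  save.model = TRUE)   # keep the model so it can be updated
saveRDS(fit$samples, "chunk1.rds")     # mcmc.list only; small on disk

# Keep updating until convergence, writing each chunk to disk
fit <- update(fit, n.iter = 5000)
saveRDS(fit$samples, "chunk2.rds")

# Afterwards, stack the chunks into one matrix of posterior draws
draws <- do.call(rbind,
                 lapply(list.files(pattern = "chunk.*\\.rds"),
                        function(f) as.matrix(readRDS(f))))
```

Because each chunk is written to disk and dropped from memory before the next update, peak RAM use stays near the size of a single chunk.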


Good luck,


Andrew D. Crosby
Postdoctoral Fellow
Boreal Avian Modelling Project
751 General Services Building
University of Alberta
Edmonton, AB T6G 2H1


From: André Luís Luza
Sent: Friday, September 28, 2018 12:32 PM
To: hmec...@googlegroups.com
Subject: Error: Vector memory exhausted



Perry de Valpine

Sep 28, 2018, 3:00:10 PM
to cro...@ualberta.ca, luza....@gmail.com, hmecology: Hierarchical Modeling in Ecology
Hi André, Andy and Marc,

With apologies for again replying just to point out a nimble feature of interest, here it is: nimble allows you to keep two sets of monitored variables, each with a different thinning interval. This makes it possible to monitor a small number of parameters at high frequency and a large number of latent states at lower frequency.

Another option would be to use nimble to integrate over the occupancy latent states during estimation and then draw from the state distributions as needed after the MCMC has run.
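[Editor's note: the dual-monitor feature Perry mentions is exposed through `monitors`/`monitors2` and `thin`/`thin2` in nimble's MCMC configuration. A hedged sketch, assuming a compiled nimble model with hypothetical node names:]

```r
library(nimble)

# Two monitor sets with different thinning intervals: small top-level
# parameters at every iteration, the large latent z matrix only every
# 100th iteration.
conf <- configureMCMC(model,
                      monitors  = c("phi", "gamma", "p"),  # small set, thin = 1
                      monitors2 = "z",                     # large latent matrix
                      thin  = 1,
                      thin2 = 100)
mcmc   <- buildMCMC(conf)
cmodel <- compileNimble(model)
cmcmc  <- compileNimble(mcmc, project = model)

samples <- runMCMC(cmcmc, niter = 20000, nburnin = 10000)
# samples$samples  -> monitors  at thin  = 1
# samples$samples2 -> monitors2 at thin2 = 100
```

Storing z at 1/100th the frequency cuts its memory footprint by the same factor while leaving the convergence diagnostics for the top-level parameters at full resolution.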

-Perry


Dusit Ngoprasert

Sep 29, 2018, 2:37:09 AM
to Perry de Valpine, cro...@ualberta.ca, luza....@gmail.com, hmecology: Hierarchical Modeling in Ecology
I used the “saveJAGS” package when the computer couldn’t allocate memory with data cloning. Not sure if this helps in your case.

Best, 
Dusit