Big run and vector memory exhausted (limit reached)?


arapa...@gmail.com

Aug 14, 2018, 8:44:07 AM8/14/18
to BioGeoBEARS
Hello all,

I am trying to run a DEC+J analysis on a big dataset (360 terminals and 10 areas).

All the input files are working fine, and check_BioGeoBEARS_run() says it is good to go.

Once I get the run going, after a while R returns the following error:

Error: vector memory exhausted (limit reached?)

Is this due to my computer or due to the run setup?

This is the script I am using:

setwd("~/")
source("scripts/biogeobears-utilities.R") 
require(parallel)
require(FD)
require(snow)
require(minqa)
load.biogeobears()

calc_loglike_sp = compiler::cmpfun(calc_loglike_sp_prebyte)
calc_independent_likelihoods_on_each_branch = compiler::cmpfun(calc_independent_likelihoods_on_each_branch_prebyte)

decj.potamo <- define_BioGeoBEARS_run()

phylo.path <- "data/potamo_ultra.newick" 
moref(phylo.path)
potamo.tree <- read.tree(phylo.path)

# uncomment if needed
# plot(potamo.tree)
# prt(potamo.tree, get_tipnames = T)

decj.potamo$trfn = "data/potamo_ultra.newick"
decj.potamo$trfn #check if correctly assigned

geo.path <- "data/potamo_code_geog.data"
moref(geo.path)
getranges_from_LagrangePHYLIP(lgdata_fn=geo.path) #infer ranges from phylip file


decj.potamo$geogfn = "data/potamo_code_geog.data"
decj.potamo$geogfn #check if correctly assigned


decj.potamo = configure.standard.biogeobears.run(decj.potamo)

decj.potamo$BioGeoBEARS_model_object@params_table["j","type"] = "free"
decj.potamo$BioGeoBEARS_model_object@params_table["j","init"] = 0.0001
decj.potamo$BioGeoBEARS_model_object@params_table["j","est"] = 0.0001
decj.potamo$BioGeoBEARS_model_object@params_table["j","min"] = 0.0001

decj.potamo$BioGeoBEARS_model_object@params_table

decj.potamo$return_condlikes_table = TRUE
decj.potamo$calc_TTL_loglike_from_condlikes_table = TRUE
decj.potamo$calc_ancprobs = TRUE

check_BioGeoBEARS_run(decj.potamo)
decj.potamo

decj.results = bears_optim_run(decj.potamo)
save(decj.results, file="results/decj.potamo.Rdata")


If needed, I can provide the tree file and the area data.

Cheers

JP

Nick Matzke

Aug 14, 2018, 2:37:33 PM8/14/18
to bioge...@googlegroups.com
Hi -- this is the second time this morning someone has mentioned an error like this; I wonder if they changed something in R recently that lowers the default memory allocation. 10 areas and 360 tips have never been a problem before...

(The issue, probably, is that for parallel processing of the matrix exponentiations, BioGeoBEARS sets up an array of size numstates x numstates x numbranches. That would be 1024x1024x359 for you, so it can get big.)
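As a rough back-of-the-envelope check (my arithmetic, assuming 8-byte doubles for the likelihood values), one copy of that array alone is close to 3 GB:

```r
# Approximate size of a numstates x numstates x numbranches array of doubles
numstates   <- 2^10        # 10 areas -> 1024 possible ranges (states)
numbranches <- 359         # per the message above
bytes <- numstates^2 * numbranches * 8   # 8 bytes per double
round(bytes / 1024^3, 1)   # ~2.8 GiB for a single copy of the array
```

With parallel workers each holding their own slice or copy, the total can easily exceed a default memory cap.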

The short-term solution is probably to set the number of cores to 1, or to google how to increase R's memory allocation.
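For the first option, a minimal sketch against JP's script above (this is a fragment, not a standalone script; `num_cores_to_use` is a field of the BioGeoBEARS run object, and `decj.potamo` is the run object from the original post):

```r
# Run serially so the big per-branch conditional-likelihood array is not
# duplicated across parallel worker processes
decj.potamo$num_cores_to_use = 1

check_BioGeoBEARS_run(decj.potamo)
decj.results = bears_optim_run(decj.potamo)
```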

Cheers,
Nick



--
You received this message because you are subscribed to the Google Groups "BioGeoBEARS" group.
To unsubscribe from this group and stop receiving emails from it, send an email to biogeobears+unsubscribe@googlegroups.com.
To post to this group, send email to bioge...@googlegroups.com.
Visit this group at https://groups.google.com/group/biogeobears.
To view this discussion on the web visit https://groups.google.com/d/msgid/biogeobears/0b6aa80f-b590-4cb0-9955-94c692389ffb%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.

Nick Matzke

Aug 14, 2018, 2:37:56 PM8/14/18
to bioge...@googlegroups.com
It looks like this is the cause/solution of the memory issue in R 3.5:
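[Editor's note: the link did not survive the archive. For context, a known R 3.5 change on macOS is the R_MAX_VSIZE cap on the vector heap, which produces exactly the "vector memory exhausted (limit reached?)" error. A commonly suggested workaround, sketched here with an assumed 16 Gb value you should adjust to your machine's RAM, is to raise the cap in ~/.Renviron and restart R:]

```shell
# ~/.Renviron -- raise R's vector-heap cap (macOS, R >= 3.5)
R_MAX_VSIZE=16Gb
```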


arapa...@gmail.com

Aug 14, 2018, 2:41:59 PM8/14/18
to BioGeoBEARS
Thanks for the reply, Nick.

I will dig into the memory issue in R 3.5.

About the number of cores: I was having trouble making bears_optim_run() work... it kept telling me I had NA cores available.

But then I ran your configure.standard.biogeobears.run() and the problem disappeared.

Do you think that has something to do with the problem?

Cheers

JP