Hello,
I am trying to run ENMeval with 9 rasters at 30m resolution covering the entire state of Wisconsin. I converted each raster to TIF, loaded them into R, created a raster stack, and ran the ENMeval code. I've tried computers with 32GB and 64GB RAM, and even a server with 500GB RAM, and I continue to get errors about allocating vector space (both running in parallel and not).
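For reference, the stack is built roughly like the sketch below; the file paths and column names are placeholders, and only Kstack and OCCS are the actual object names used in the calls that follow.

library(raster)

# Placeholder path: the nine 30m Wisconsin predictors, already converted to TIF
tif_files <- list.files("path/to/wisconsin_tifs", pattern = "\\.tif$", full.names = TRUE)
Kstack <- stack(tif_files)   # predictor stack passed to ENMevaluate

# Occurrence coordinates (placeholder file and column names)
OCCS <- read.csv("path/to/occurrences.csv")[, c("longitude", "latitude")]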
On a 32GB RAM computer, this call failed with an error that it could not write a 13.2GB raster:
eval2 <- ENMevaluate(OCCS, Kstack, method='randomkfold', kfolds=3, RMvalues=c(1,2), fc='LQP', algorithm='maxnet')
On a 64GB RAM computer...
> eval2 <- ENMevaluate(OCCS, Kstack, method='checkerboard2', RMvalues=c(1,2,3), fc=c('L','LQ','LQP','LQPT','LQHPT'), algorithm='maxnet')
*** Running ENMevaluate using maxnet v.0.1.2 ***
Doing evaluations using checkerboard 2...
There are 88 background records with NA for at least one predictor variable.
Removing these records from analysis, resulting in 9912 records...
|======= | 7%
1583336 grid cells found with at least one NA value: these cells were excluded from raster predictions.
Error: cannot allocate vector of size 10.7 Gb
and...
> eval3 <- ENMevaluate(OCCS, Kstack, method='checkerboard2', RMvalues=c(1,2,3), fc=c('L','LQ','LQP','LQPT','LQHPT'), algorithm='maxnet', parallel=TRUE)
*** Running ENMevaluate using maxnet v.0.1.2 ***
Doing evaluations using checkerboard 2...
There are 117 background records with NA for at least one predictor variable.
Removing these records from analysis, resulting in 9883 records...
Of 16 total cores using 16
Running in parallel...
Error in { : task 1 failed - "Failure during raster IO
On a 500GB RAM server, I tried the call below and got the following error message; 96% of the server's memory was in use at the time:
> eval4 <- ENMevaluate(OCCS, Kstack, method='randomkfold', kfolds=2, RMvalues=c(1,2), fc=c('L','LQ','LQP'), algorithm='maxnet', parallel=TRUE)
*** Running ENMevaluate using maxnet v.0.1.2 ***
Doing random k-fold evaluation groups...
There are 101 background records with NA for at least one predictor variable.
Removing these records from analysis, resulting in 9899 records...
Of 48 total cores using 48
Running in parallel...
Error in { :
task 2 failed - "long vectors not supported yet: ../include/Rinlinedfuns.h:519"
Is there something I can do to make the files smaller, or to write intermediate files so that everything doesn't have to stay in memory?
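For example, I wasn't sure whether tightening the raster package's memory options would push the processing through temporary files on disk instead of RAM. Something like the sketch below is what I had in mind; the specific values and the scratch directory are just guesses, not settings I know to work with ENMevaluate.

library(raster)

# Guessed values: process in smaller chunks and write temp files to disk
rasterOptions(maxmemory = 1e9,                       # bytes kept in memory at once
              chunksize = 1e8,                       # bytes processed per chunk
              todisk    = TRUE,                      # always write results to temp files
              tmpdir    = "path/to/scratch/dir")     # placeholder scratch directory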
Thanks,
Megan