I am trying to run a large program in R using large text files, and I get the following error:
Error: cannot allocate vector of size 6.5 Mb
The size of the largest text file is 221,207 KB.
I am running Windows XP Pro Version 2002.
The paging file size is 2046 MB. The R process uses about 1.3 GB of memory when trying to load the file.
Thanks in advance.
Mike
4. I'm running out of memory; what should I do?
On Windows, by default R gets 1 GB of memory (or the amount of RAM on
your computer, if that is less than 1 GB). If you have 2 GB of RAM,
you need to use the command-line flag --max-mem-size to get access to
the additional memory.
Right-click on the R icon that you use to start R and select
"Properties". Then select the tab "Shortcut" and modify the "Target"
to include something like --max-mem-size=2G.
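For example, the modified "Target" field might look like the following (the install path shown is just an illustration; yours will depend on where and which version of R you installed):

```
"C:\Program Files\R\R-2.1.0\bin\Rgui.exe" --max-mem-size=2G
```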
Alternatively, you can change the memory limit within R using the
memory.limit function, giving a new limit in MB. (For example,
type memory.limit(2048) to change the memory limit to 2 GB.)
See also the R for Windows FAQ and, within R, type ?Memory and
?memory.size.
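Within R, a quick check-and-raise sequence might look like this (memory.size and memory.limit are Windows-only functions; 2048 is just an example value):

```r
memory.size()       # memory currently in use by R, in MB
memory.limit()      # the current memory limit, in MB
memory.limit(2048)  # raise the limit to 2 GB
```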
5. I'm still running out of memory; what should I do?
Of course, you are limited by the memory available on your computer,
and so there are not many options.
First, clean up your workspace by removing objects that you no longer
need. You can save objects to disk with the save function and then
delete them with rm.
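For instance, for a large object named bigdata (the object and file names here are hypothetical):

```r
save(bigdata, file="bigdata.RData")  # write the object to disk
rm(bigdata)                          # remove it from the workspace
gc()                                 # prompt R to release the freed memory
# later, load("bigdata.RData") restores the object
```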
The multiple imputation method, as implemented, uses a particularly
large amount of memory. Consider using a small number of imputations
(n.draws) or a coarser grid (step) in sim.geno, or focusing on a
subset of the chromosomes.
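With R/qtl, those suggestions might look like the following sketch (mycross stands in for your cross object, and the particular values of n.draws, step, and chr are just modest illustrative choices, not recommendations):

```r
library(qtl)  # assumes the R/qtl package is installed

# fewer imputations and a coarser grid to reduce memory use
mycross <- sim.geno(mycross, n.draws=8, step=5)

# or restrict the analysis to a subset of chromosomes
mycross.sub <- subset(mycross, chr=c(1, 4, 15))
```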
karl