I have recently been trying to generate bigWig tracks from STAR-aligned RNA-seq data, where the BAM files range from 25 GB to 45 GB. However, when I run bamCoverage on the 25 GB BAM file, it runs out of memory; the program stops making progress, but the command never exits. I'm working on a desktop with 128 GB of RAM.
Is this simply a case of needing more RAM, or is there something I can do to make bamCoverage more memory-efficient?
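In case it helps frame the question, one workaround I've been considering (just a sketch, and I'm not certain it's sound) is splitting the run per chromosome with --region and keeping the worker count at one, on the assumption that peak memory scales with how much of the genome is processed at once:

    # Hypothetical per-chromosome run to try to cap peak memory;
    # sample.bam and chr1 are placeholders for my actual data.
    bamCoverage -b sample.bam -o sample_chr1.bw \
        --binSize 50 --region chr1 --numberOfProcessors 1

The downside is that I'd then have to stitch the per-chromosome bigWigs back together, so I'd prefer a single-run solution if one exists.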
I'm actually using a bin size of 50, and what I mean is that when I run bamCoverage it gradually consumes the full 128 GB of RAM and then starts eating into the ~30 GB of swap as well. At that point the program hangs indefinitely: RAM and CPU usage drop off completely.
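If it would help with diagnosing this, I can re-run the failing command under GNU time to record the peak memory; a minimal sketch, assuming a GNU/Linux system where /usr/bin/time supports -v:

    # Re-run the failing command under GNU time; -v reports
    # "Maximum resident set size" among other statistics.
    # sample.bam is a placeholder for the 25 GB BAM file.
    /usr/bin/time -v bamCoverage -b sample.bam -o sample.bw --binSize 50

From watching the system monitor, it's that resident memory figure that climbs past physical RAM before the hang.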