DeepTools: computeMatrix RAM usage

Nicolas De Jay

Sep 19, 2017, 12:05:36 PM
to deep...@googlegroups.com

Hi,

First of all, thanks for the great piece of software. I'm currently in the process of migrating my workflow from ngsplot to deeptools because of the more extensive documentation and more modular workflow.

However, I am running into a problem whereby RAM usage of the computeMatrix command seems to scale significantly with the number of samples included, exceeding 120 GB for 8 samples. Do you know if I am doing something wrong or if this is a known issue?

Many thanks,

Nicolas

Devon Ryan

Sep 19, 2017, 5:27:12 PM
to Nicolas De Jay, deep...@googlegroups.com

Memory will scale with the number of samples, the number of regions, and the final number of bins per region. In particular, if you have a very large BED file and are using a very small bin size, you may see high memory usage. This is partly inherent and partly a limitation of how poorly Python generally handles multiple cores and memory management.
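As a rough illustration of that scaling, here is a back-of-envelope sketch. The numbers (8 samples, 50,000 regions, 3 kb per region) are made up for illustration, and the formula assumes one 8-byte value per bin while ignoring per-process copies and Python overhead, which inflate the real footprint well beyond this estimate:

```shell
# Illustrative back-of-envelope only; NOT deepTools internals.
# Assumes one 8-byte float per bin; overhead and per-worker copies ignored.
samples=8
regions=50000
region_bp=3000
bin_bp=10

bins=$(( region_bp / bin_bp ))
values=$(( samples * regions * bins ))
bytes=$(( values * 8 ))

echo "bins per region: ${bins}"
echo "matrix values:   ${values}"
echo "approx bytes:    ${bytes}"   # ~0.9 GiB before any copies
```

Halving the number of regions, or multiplying the bin size by 10, shrinks the matrix proportionally, which is why trimming the BED file or coarsening the bins helps.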

If you can increase your bin size then that'll likely bring things down to a more reasonable level.
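A hypothetical invocation sketching those knobs; the bigWig and BED file names are placeholders, not files from this thread:

```shell
# Placeholder inputs: sample1.bw, sample2.bw, regions.bed.
computeMatrix scale-regions \
    -S sample1.bw sample2.bw \
    -R regions.bed \
    --regionBodyLength 3000 \
    --binSize 100 \
    -p 2 \
    -o matrix.gz
# --binSize 100 : larger bins mean fewer values per region (default is 10)
# -p 2          : fewer worker processes means fewer in-memory copies
# Listing fewer samples after -S and running computeMatrix several times
# also caps peak RAM, at the cost of merging results afterwards.
```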

Devon


-- 
Devon Ryan, PhD
Bioinformatician / Data manager
Bioinformatics Core Facility
Max Planck Institute for Immunobiology and Epigenetics
Email: dpry...@gmail.com

Nicolas De Jay

Sep 20, 2017, 3:28:16 PM
to Devon Ryan, deep...@googlegroups.com
Thanks for the tips, Devon. I'll try your suggestions, i.e., cutting back on the number of cores, using a smaller BED file, and increasing the bin size.
--
Nicolas De Jay