Hi,
First of all, thanks for the great piece of software. I'm currently in the process of migrating my workflow from ngsplot to deepTools because of its more extensive documentation and more modular workflow.
However, I am running into a problem where the RAM usage of the computeMatrix command seems to scale significantly with the number of samples included, exceeding 120 GB for 8 samples. Do you know if I am doing something wrong, or is this a known issue?
Many thanks,
Nicolas
Memory will scale with the number of samples, the number of regions, and the final number of bins per region. In particular, if you have a very large BED file and are using a very small bin size, you may see high memory use. This is partly inherent to the computation and partly a limitation of how poorly Python generally handles multiple cores and memory management.
If you can increase your bin size then that'll likely bring things down to a more reasonable level.
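As a rough illustration of why memory grows this way, here is a back-of-the-envelope sketch (not deepTools code, just an estimate) of the raw matrix computeMatrix must hold: one value per (region, bin, sample) cell. It assumes 8-byte floats and ignores interpreter overhead, which can multiply real usage several-fold:

```python
def matrix_gib(n_regions, region_length_bp, bin_size_bp, n_samples,
               bytes_per_value=8):
    """Estimate the raw size, in GiB, of a regions x (bins * samples) matrix."""
    bins_per_region = region_length_bp // bin_size_bp
    n_values = n_regions * bins_per_region * n_samples
    return n_values * bytes_per_value / 1024**3

# Example: 100k regions of 10 kb, 10 bp bins, 8 samples
# -> 100000 * 1000 * 8 values * 8 bytes, roughly 6 GiB for the raw
# values alone; a 100 bp bin size cuts this tenfold.
print(round(matrix_gib(100_000, 10_000, 10, 8), 2))
```

Doubling the bin size halves the bins-per-region term, which is why increasing `--binSize` is usually the quickest way to bring memory down.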
Devon
--
Devon Ryan, PhD
Bioinformatician / Data manager
Bioinformatics Core Facility
Max Planck Institute for Immunobiology and Epigenetics
Email: dpry...@gmail.com