Error in the process trying to get a heatmap

Ana Rabaza

Oct 26, 2016, 2:19:08 PM
to Qiime 1 Forum
Hello:
I am trying to make a heatmap, but I get this error:

qiime@qiime-190-virtual-box:~/Downloads/LR$ make_otu_heatmap.py -i otu_table_no_singletons_no_chimeras.biom -o heatmap/
Traceback (most recent call last):
  File "/usr/local/bin/make_otu_heatmap.py", line 254, in <module>
    main()
  File "/usr/local/bin/make_otu_heatmap.py", line 231, in main
    otu_order = get_clusters(data, axis='row')
  File "/usr/local/lib/python2.7/dist-packages/qiime/make_otu_heatmap.py", line 135, in get_clusters
    row_dissims = pw_distances(x, ids=map(str, range(nr)), metric='euclidean')
  File "/usr/local/lib/python2.7/dist-packages/skbio/diversity/beta/_base.py", line 59, in pw_distances
    squareform(distances, force='tomatrix', checks=False), ids)
  File "/usr/local/lib/python2.7/dist-packages/scipy/spatial/distance.py", line 1465, in squareform
    M = np.zeros((d, d), dtype=np.double)
MemoryError

Can someone help me please?

Ana.-

Jamie Morton

Oct 26, 2016, 6:56:48 PM
to Qiime 1 Forum
Hi Ana,

I've forwarded this issue to one of the developers. In the meantime, what is the size of your BIOM table (number of samples and OTUs)?
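
If it helps, the biom command-line tool installed with QIIME 1.9.1 should report this directly; something along these lines (using the file name from your command) ought to work:

biom summarize-table -i otu_table_no_singletons_no_chimeras.biom -o table_summary.txt
head table_summary.txt

The first few lines of the summary give the number of samples and the number of observations (OTUs).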

Best,
Jamie

Daniel McDonald

Oct 27, 2016, 8:35:54 PM
to Qiime 1 Forum
Hi Ana,

How many OTUs are in your BIOM table, and how much memory do you have available? If the number of OTUs is very large, the resulting heatmap is unlikely to be interpretable anyway...
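
For a rough sense of scale: your traceback shows the allocation failing in np.zeros((d, d), dtype=np.double), i.e. a full OTU-by-OTU matrix of 8-byte floats, so memory use grows with the square of the OTU count. A quick back-of-the-envelope estimate (the 50000 here is just an example; substitute your own OTU count):

python -c "n_otus = 50000; print '%.1f GB for the OTU-by-OTU distance matrix' % (n_otus * n_otus * 8 / 1e9)"

With 50,000 OTUs that is already on the order of 20 GB, which is more RAM than a typical desktop or virtual machine has available.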

Best,
Daniel

Garima Raj

Dec 6, 2016, 2:10:26 AM
to Qiime 1 Forum
I am facing the same problem:

Traceback (most recent call last):
  File "/home/garima/miniconda3/envs/qiime1/bin/make_otu_heatmap.py", line 4, in <module>
    __import__('pkg_resources').run_script('qiime==1.9.1', 'make_otu_heatmap.py')
  File "/home/garima/miniconda3/envs/qiime1/lib/python2.7/site-packages/pkg_resources/__init__.py", line 744, in run_script
    self.require(requires)[0].run_script(script_name, ns)
  File "/home/garima/miniconda3/envs/qiime1/lib/python2.7/site-packages/pkg_resources/__init__.py", line 1499, in run_script
    exec(code, namespace, namespace)
  File "/home/garima/miniconda3/envs/qiime1/lib/python2.7/site-packages/qiime-1.9.1-py2.7.egg-info/scripts/make_otu_heatmap.py", line 254, in <module>
    main()
  File "/home/garima/miniconda3/envs/qiime1/lib/python2.7/site-packages/qiime-1.9.1-py2.7.egg-info/scripts/make_otu_heatmap.py", line 213, in main
    sample_order = get_clusters(data, axis='column')
  File "/home/garima/miniconda3/envs/qiime1/lib/python2.7/site-packages/qiime/make_otu_heatmap.py", line 135, in get_clusters

    row_dissims = pw_distances(x, ids=map(str, range(nr)), metric='euclidean')
  File "/home/garima/miniconda3/envs/qiime1/lib/python2.7/site-packages/skbio/diversity/beta/_base.py", line 57, in pw_distances
    distances = pdist(counts, metric)
  File "/home/garima/miniconda3/envs/qiime1/lib/python2.7/site-packages/scipy/spatial/distance.py", line 1220, in pdist
    dm = np.zeros((m * (m - 1)) // 2, dtype=np.double)
MemoryError

Is this actually a RAM error, meaning I need more RAM, or is there a problem with the script or my sample data?
Please help.

Antonio González Peña

Dec 12, 2016, 8:10:02 AM
to Qiime 1 Forum
Well, we have only seen that error when trying to compute a distance matrix so large that it can't be held in memory, so my guess is that you are running out of RAM. Have you tried a larger machine?
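
If a larger machine is not an option, a common workaround is to shrink the table before plotting, for example by dropping rare OTUs; as Daniel noted above, a heatmap with tens of thousands of OTUs is not really readable anyway. Something along these lines should work, though the file names are just placeholders and you should check filter_otus_from_otu_table.py -h for the exact options and pick thresholds that make sense for your data:

# keep only OTUs with a total count of at least 10 that occur in at least 2 samples
filter_otus_from_otu_table.py -i otu_table.biom -o otu_table_abund.biom -n 10 -s 2
make_otu_heatmap.py -i otu_table_abund.biom -o heatmap/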