Error while using compute core microbiome script


SG

Jul 20, 2016, 12:20:53 AM
to Qiime 1 Forum
Hi,
I was trying to use the compute_core_microbiome.py script, and I got an error:

shashank@shashank-HP-Z620-Workstation:~/Desktop/TRISUTRA_New_Analysis$ compute_core_microbiome.py -i merged_otu_table.biom -o compute_core_microbiom
Traceback (most recent call last):
  File "/home/shashank/miniconda2/bin/compute_core_microbiome.py", line 4, in <module>
    __import__('pkg_resources').run_script('qiime==1.9.1', 'compute_core_microbiome.py')
  File "/home/shashank/miniconda2/lib/python2.7/site-packages/setuptools-20.3-py2.7.egg/pkg_resources/__init__.py", line 726, in run_script
   
  File "/home/shashank/miniconda2/lib/python2.7/site-packages/setuptools-20.3-py2.7.egg/pkg_resources/__init__.py", line 1484, in run_script
   
  File "/home/shashank/miniconda2/lib/python2.7/site-packages/qiime-1.9.1-py2.7.egg-info/scripts/compute_core_microbiome.py", line 171, in <module>
    main()
  File "/home/shashank/miniconda2/lib/python2.7/site-packages/qiime-1.9.1-py2.7.egg-info/scripts/compute_core_microbiome.py", line 156, in main
    write_biom_table(core_table, output_table_fp)
  File "/home/shashank/miniconda2/lib/python2.7/site-packages/qiime/util.py", line 569, in write_biom_table
    "Attempting to write an empty BIOM table to disk. "
qiime.util.EmptyBIOMTableError: Attempting to write an empty BIOM table to disk. QIIME doesn't support writing empty BIOM output files.


FYI

shashank@shashank-HP-Z620-Workstation:~/Desktop/TRISUTRA_New_Analysis$ print_qiime_config.py -t

System information
==================
         Platform:    linux2
   Python version:    2.7.11 |Continuum Analytics, Inc.| (default, Dec  6 2015, 18:08:32)  [GCC 4.4.7 20120313 (Red Hat 4.4.7-1)]
Python executable:    /home/shashank/miniconda2/bin/python

QIIME default reference information
===================================
For details on what files are used as QIIME's default references, see here:
 https://github.com/biocore/qiime-default-reference/releases/tag/0.1.3

Dependency versions
===================
          QIIME library version:    1.9.1
           QIIME script version:    1.9.1
qiime-default-reference version:    0.1.3
                  NumPy version:    1.11.0
                  SciPy version:    0.17.1
                 pandas version:    0.18.1
             matplotlib version:    1.4.3
            biom-format version:    2.1.5
                   h5py version:    2.6.0 (HDF5 version: 1.8.16)
                   qcli version:    0.1.1
                   pyqi version:    0.3.2
             scikit-bio version:    0.2.3
                 PyNAST version:    1.2.2
                Emperor version:    0.9.51
                burrito version:    0.9.1
       burrito-fillings version:    0.1.1
              sortmerna version:    SortMeRNA version 2.0, 29/11/2014
              sumaclust version:    SUMACLUST Version 1.0.00
                  swarm version:    Swarm 1.2.19 [Mar  1 2016 23:41:10]
                          gdata:    Installed.

QIIME config values
===================
For definitions of these settings and to learn how to configure QIIME, see here:
 http://qiime.org/install/qiime_config.html
 http://qiime.org/tutorials/parallel_qiime.html

                     blastmat_dir:    None
      pick_otus_reference_seqs_fp:    /home/shashank/miniconda2/lib/python2.7/site-packages/qiime_default_reference/gg_13_8_otus/rep_set/97_otus.fasta
                         sc_queue:    all.q
      topiaryexplorer_project_dir:    None
     pynast_template_alignment_fp:    /home/shashank/miniconda2/lib/python2.7/site-packages/qiime_default_reference/gg_13_8_otus/rep_set_aligned/85_otus.pynast.fasta
                  cluster_jobs_fp:    start_parallel_jobs.py
pynast_template_alignment_blastdb:    None
assign_taxonomy_reference_seqs_fp:    /home/shashank/miniconda2/lib/python2.7/site-packages/qiime_default_reference/gg_13_8_otus/rep_set/97_otus.fasta
                     torque_queue:    friendlyq
                    jobs_to_start:    1
                       slurm_time:    None
            denoiser_min_per_core:    50
assign_taxonomy_id_to_taxonomy_fp:    /home/shashank/miniconda2/lib/python2.7/site-packages/qiime_default_reference/gg_13_8_otus/taxonomy/97_otu_taxonomy.txt
                         temp_dir:    /tmp/
                     slurm_memory:    None
                      slurm_queue:    None
                      blastall_fp:    blastall
                 seconds_to_sleep:    1

QIIME base install test results
===============================
.........
----------------------------------------------------------------------
Ran 9 tests in 0.065s

OK

I have tried using this script on a sample biom file as well and still got the same error.

Any suggestions?

Best
Shashank

Rachel C.

Jul 21, 2016, 3:42:33 PM
to Qiime 1 Forum
I'm having the exact same issue. The same biom table I'm trying to compute the core microbiome from works with other scripts but fails with this one. Hope you get an answer.

Gene Blanchard

Jul 21, 2016, 3:53:53 PM
to Qiime 1 Forum
Is there a chance that no OTUs are present in at least 50% of your samples? Maybe try reducing --min_fraction_for_core to 0.25 and see if you get anything.
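For intuition, the "core" criterion behind --min_fraction_for_core can be sketched in a few lines of plain Python. This is a toy illustration with made-up data and a hypothetical helper, not QIIME's actual implementation; presence is taken to mean a nonzero count:

```python
# Toy OTU table: OTU id -> counts across 4 samples.
otu_counts = {
    "OTU_1": [5, 0, 3, 2],   # present in 3 of 4 samples (0.75)
    "OTU_2": [0, 0, 1, 0],   # present in 1 of 4 samples (0.25)
    "OTU_3": [2, 4, 1, 9],   # present in 4 of 4 samples (1.00)
}

def core_otus(table, min_fraction):
    """Return OTUs present (count > 0) in at least min_fraction of samples."""
    core = []
    for otu, counts in table.items():
        fraction = sum(1 for c in counts if c > 0) / float(len(counts))
        if fraction >= min_fraction:
            core.append(otu)
    return core

print(core_otus(otu_counts, 0.5))   # ['OTU_1', 'OTU_3']
print(core_otus(otu_counts, 0.25))  # ['OTU_1', 'OTU_2', 'OTU_3']
```

With real data, if no OTU clears the threshold the resulting table is empty, which is exactly the situation that triggers the EmptyBIOMTableError above.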

Rachel C.

Jul 21, 2016, 4:17:27 PM
to Qiime 1 Forum
I had this thought myself and tried with an extremely small fraction (I am sure that there is at least one OTU present in all samples):

compute_core_microbiome.py -i rarefaction_1000_1.biom -o core_otu_table --min_fraction_for_core 0.005


And I still end up with the same error.

Colin Brislawn

Jul 21, 2016, 4:42:15 PM
to Qiime 1 Forum
Hello Rachel,

I noticed this and thought I would 'qiime in.'
rarefaction_1000_1.biom
I find that 1000 reads per sample is a pretty low rarefaction depth. It's possible that at this low depth, no single OTU appears in all samples. Have you tried this script on your unrarefied table, or on a table with a higher rarefaction depth?

I hope that helps!
Colin

Rachel C.

Jul 21, 2016, 4:57:33 PM
to Qiime 1 Forum
I had a few samples that did not have many reads; in order to maintain enough replicates, 1000 was the highest depth I could go while still having enough data to analyze all of my factors (too many samples had under 1000 reads; not the best MiSeq run, apparently).

I did try with an unrarefied table and still got the same result (using 25% as my minimum fraction): it said it couldn't write an empty biom table. I am certain that nearly every sample had at least the false "OTU" of "No Blast Hit", and I'm pretty certain there are a few OTUs in every, or almost every, sample as well. I did make a heatmap with the rarefied table, and I can see several OTUs that appear in most of the samples (it looks like far more than 50%).

Sorry, Shashank, I don't mean to take over your question; I'm hoping that my joining the thread will help you get an answer as well.

SG

Jul 26, 2016, 1:01:15 AM
to Qiime 1 Forum
Hi,
 
I don't understand how this happens; to use the compute_core_microbiome.py script I installed QIIME 1.8.0 on another system, and it works there.

Secondly, when I was trying to merge two biom files in QIIME 1.9.1 using the script

merge_otu_tables.py -i otu_table1.biom,otu_table2.biom -o merged_otu_table.biom

the output file merged_otu_table.biom is in HDF (application/x-hdf) format, and I guess QIIME does not support this format for its analyses.

So I think the QIIME 1.8.0 version works fine for some of the scripts.
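A quick way to tell the two .biom flavors apart without QIIME: HDF5 files begin with a fixed 8-byte magic number, while the older QIIME 1.8-era .biom files are plain JSON text. A stdlib-only sketch (the helper name and the throwaway demo files are made up for illustration):

```python
import tempfile

# Every HDF5 file starts with this 8-byte signature.
HDF5_MAGIC = b"\x89HDF\r\n\x1a\n"

def biom_flavor(path):
    """Return 'hdf5' or 'json' based on the file's leading bytes."""
    with open(path, "rb") as fh:
        return "hdf5" if fh.read(8) == HDF5_MAGIC else "json"

# Demo on two throwaway files standing in for real .biom tables.
with tempfile.NamedTemporaryFile(suffix=".biom", delete=False) as fh:
    fh.write(HDF5_MAGIC + b"\x00")
    hdf5_path = fh.name
with tempfile.NamedTemporaryFile(suffix=".biom", delete=False) as fh:
    fh.write(b'{"id": null, "format": "Biological Observation Matrix 1.0.0"}')
    json_path = fh.name

print(biom_flavor(hdf5_path))  # hdf5
print(biom_flavor(json_path))  # json
```

To make an HDF5 table usable in QIIME 1.8.0, the biom-format 2.1 package bundled with QIIME 1.9.1 ships a converter, roughly `biom convert -i table.biom -o table_json.biom --to-json --table-type="OTU table"`; check `biom convert --help` in your install for the exact options.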

Best,
Shashank

Colin Brislawn

Jul 26, 2016, 2:02:32 PM
to Qiime 1 Forum
Hello Shashank,

You mention something important:
is in HDF document (application/x-hdf) format
Newer versions of QIIME, like 1.9.1, support the HDF5 format, while older versions like 1.8.0 do not. If you want to use an HDF5 .biom file with QIIME 1.8.0, you should convert the .biom file first.

Here is how you can convert HDF5 .biom files so they are compatible with other versions of QIIME:

Let me know if this helps,
Colin

SG

Jul 28, 2016, 12:07:22 AM
to Qiime 1 Forum
Hi,
Well thanks for the information about HDF files.

But what about the main question: why does compute_core_microbiome.py not work in QIIME 1.9.1, while it works in QIIME 1.8.0?

Best
Shashank

Colin Brislawn

Jul 28, 2016, 3:29:00 PM
to Qiime 1 Forum
Hello Shashank,

QIIME 1.9.1 not working for compute_core_microbiome.py,
That script should still work... 

If there is still an issue with that script, could you run print_qiime_config.py followed by compute_core_microbiome.py and post the full output of both commands? Thanks!

Colin

SG

Jul 28, 2016, 11:36:31 PM
to Qiime 1 Forum
Hello Colin,

I would like to point you to another question of mine posted on the QIIME forum; you can find it here.

When I run the print_qiime_config.py script, it shows an error:

shashank@shashank-HP-Z620-Workstation:~$ print_qiime_config.py -t

Traceback (most recent call last):
  File "/home/shashank/miniconda2/bin/print_qiime_config.py", line 4, in <module>
    __import__('pkg_resources').run_script('qiime==1.9.1', 'print_qiime_config.py')
  File "/home/shashank/miniconda2/lib/python2.7/site-packages/setuptools-20.3-py2.7.egg/pkg_resources/__init__.py", line 2900, in <module>
  File "/home/shashank/miniconda2/lib/python2.7/site-packages/setuptools-20.3-py2.7.egg/pkg_resources/__init__.py", line 2886, in _call_aside
  File "/home/shashank/miniconda2/lib/python2.7/site-packages/setuptools-20.3-py2.7.egg/pkg_resources/__init__.py", line 2913, in _initialize_master_working_set
  File "/home/shashank/miniconda2/lib/python2.7/site-packages/setuptools-20.3-py2.7.egg/pkg_resources/__init__.py", line 644, in _build_master
  File "/home/shashank/miniconda2/lib/python2.7/site-packages/setuptools-20.3-py2.7.egg/pkg_resources/__init__.py", line 657, in _build_from_requirements
  File "/home/shashank/miniconda2/lib/python2.7/site-packages/setuptools-20.3-py2.7.egg/pkg_resources/__init__.py", line 830, in resolve
pkg_resources.DistributionNotFound: The 'matplotlib!=1.4.2,>=1.1.0' distribution was not found and is required by qiime



Thanks.
Shashank

Colin Brislawn

Jul 29, 2016, 6:51:38 PM
to Qiime 1 Forum
Hello Shashank,

I'm afraid I'm not much help here. The current guide for installing QIIME works for me, and I have not received this error before.

Colin



AC M

Aug 16, 2016, 3:27:13 AM
to Qiime 1 Forum
Hi,

I have had the same error ("Attempting to write an empty BIOM table to disk. QIIME doesn't support writing empty BIOM output files."). I found out, though, that despite the error the output file I wanted was still computed :) I wanted OTUs that occur in at least 25% of my samples, so I set --min_fraction_for_core 0.25. The problem is that --max_fraction_for_core still defaults to 1, i.e. 100% of samples, and --num_fraction_for_core_steps defaults to 11. The latter causes the script to make files at different percentages from 25% up to 100% in 11 steps. So even if we have OTUs in 25% of samples, it also calculates for 32%, 40%, 48%, etc. (up to 100% in 11 steps) and creates an output file for each. At the higher percentages, where no OTUs span that many samples anymore, it tries to write empty biom files, which causes the error. The 25% or 50% file we wanted in the beginning is nevertheless fine! At least mine was ;-)

To avoid having all these additional files, I set --num_fraction_for_core_steps to 2 (apparently the minimum number of steps allowed). Then I get one file at 25% (or whatever threshold you choose) and one at 100% (which still causes the error, but at least I don't get 11 output files anymore).

Hope this helps

Jai Ram Rideout

Aug 16, 2016, 1:43:53 PM
to Qiime 1 Forum
Hello,

Thanks for following up with details about the workaround. You should be able to use the smaller-percentage tables just fine. The script raises an error when it attempts to write an empty (i.e. completely filtered) .biom file for the higher-percentage tables. It computes the tables in ascending order, starting at --min_fraction_for_core and increasing the percentage in --num_fraction_for_core_steps evenly spaced steps until --max_fraction_for_core is reached. You can supply a lower value for --max_fraction_for_core to avoid these errors.
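For anyone following along, that fraction series can be sketched in a few lines of Python (evenly spaced values from the minimum to the maximum fraction; the parameter names mirror the script's options, but this is an illustration, not the script's actual source):

```python
def core_fractions(min_fraction, max_fraction, num_steps):
    """Evenly spaced fractions from min_fraction to max_fraction, inclusive."""
    if num_steps < 2:
        return [min_fraction]
    step = (max_fraction - min_fraction) / float(num_steps - 1)
    return [min_fraction + i * step for i in range(num_steps)]

# AC M's run: min 0.25 -> max 1.0 in the default 11 steps.
print([round(f, 3) for f in core_fractions(0.25, 1.0, 11)])
# Starts 0.25, 0.325, 0.4, 0.475, ... and ends at 1.0.

# The workaround: only two steps, so just the two endpoints.
print(core_fractions(0.25, 1.0, 2))  # [0.25, 1.0]
```

One output table is written per fraction in the list, which is why lowering --max_fraction_for_core (or the step count) reduces the number of high-percentage tables that come out empty.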

Best,
Jai