duplicate reads


df61...@gmail.com

Nov 5, 2014, 9:19:26 PM
to metaphl...@googlegroups.com
When you map reads to the MetaPhlAn marker genes, does
MetaPhlAn, running Bowtie2, map them as paired-end or single-end?

How did you deal with possible PCR or optical duplicate
reads in your metagenomic sample? If you remove them
("dedup"), what software do you use?

If I use Picard Tools MarkDuplicates with single-end reads,
it seems to remove identical reads that map to the same
location, and these reads may not all be duplicates. I am
concerned that "dedup'ing" with MarkDuplicates will remove
too many reads, affecting abundance estimates. On the other
hand, if duplicates are left in, that also affects abundance
estimates.
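To make the concern concrete, here is a minimal sketch (this is not Picard's actual algorithm, and the read names, marker name, and coordinates are hypothetical) of position-based single-end deduplication. Any two reads with the same reference, start position, and strand are collapsed, even when they come from distinct biological fragments:

```python
def dedup_by_position(reads):
    """Naive single-end dedup: keep one read per (ref, pos, strand).

    Distinct DNA fragments that happen to start at the same coordinate
    are collapsed along with true PCR duplicates, which is the
    over-removal concern described above.
    """
    seen = set()
    kept = []
    for read in reads:  # read = (name, ref, pos, strand)
        key = read[1:]  # ignore the read name
        if key not in seen:
            seen.add(key)
            kept.append(read)
    return kept

reads = [
    ("r1", "marker_42", 100, "+"),  # original fragment
    ("r2", "marker_42", 100, "+"),  # PCR duplicate of r1
    ("r3", "marker_42", 100, "+"),  # a *different* fragment, same start
    ("r4", "marker_42", 250, "-"),
]
print(len(dedup_by_position(reads)))  # 2 of the 4 reads survive
```

Here r3 is removed even though it is not a PCR artifact, so abundance is underestimated; short marker genes make such coordinate collisions more likely than on full-length contigs.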

How do you handle this issue of possible duplicates?

Nicola Segata

Nov 6, 2014, 2:41:41 AM
to df61...@gmail.com, metaphl...@googlegroups.com
Hi,
 thanks for getting in touch. MetaPhlAn maps the reads (using Bowtie2) as single-end reads. There are several reasons for this, connected to the fact that we map against short markers rather than full contigs, but the bottom line is that we get more precise results with this setting.

For duplicates, we do not remove them in our own metagenomic projects for the task of taxonomic profiling. It is true that there may be some duplicates due to PCR artifacts, but the robust average implemented in MetaPhlAn will in any case disregard markers with inconsistently high abundances compared to other markers of the same clade.
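The robust-average idea can be illustrated with a trimmed mean (a minimal sketch, not MetaPhlAn's actual implementation; the coverage values below are hypothetical): markers whose coverage is inflated by duplicates fall in the discarded tail and barely shift the clade estimate.

```python
def robust_average(marker_coverages, trim_fraction=0.1):
    """Truncated mean: drop the highest and lowest `trim_fraction`
    of marker coverages before averaging, so outlier markers
    (e.g. inflated by PCR duplicates) do not dominate the estimate."""
    values = sorted(marker_coverages)
    k = int(len(values) * trim_fraction)
    trimmed = values[k:len(values) - k] if len(values) > 2 * k else values
    return sum(trimmed) / len(trimmed)

# One marker inflated by duplicates (250.0) barely moves the result:
coverages = [10.2, 9.8, 10.5, 9.9, 250.0, 10.1, 10.0, 9.7, 10.3, 10.4]
print(round(robust_average(coverages), 2))  # ≈ 10.15, vs ≈ 34.1 for a plain mean
```

With a plain mean the single duplicated marker would triple the abundance estimate; the trimmed mean leaves it near the consensus of the other markers, which is why leaving duplicates in is tolerable for profiling.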

I hope this helps,
thanks
Nicola


--
You received this message because you are subscribed to the Google Groups "MetaPhlAn-users" group.
To unsubscribe from this group and stop receiving emails from it, send an email to metaphlan-users+unsubscribe@googlegroups.com.
For more options, visit https://groups.google.com/d/optout.

df61...@gmail.com

Nov 6, 2014, 5:28:26 PM
to metaphl...@googlegroups.com, df61...@gmail.com, nicola...@unitn.it

Yes, that does help.

Thanks a lot for the reply (and also for the promptness)!
