Discrepancies between benchmark results from Maaslin2 and the study by Weiss et al. 2017


Li Yanxian

Aug 7, 2019, 10:01:39 AM
to MaAsLin-users
Dear Maaslin2 developers,

I was fortunate to have access to the Maaslin2 presentation given by Dr. Huttenhower for the STAMPS 2019 workshop. A wide range of normalization, transformation, and differential abundance testing methods were benchmarked during the development of Maaslin2. One of the slides showed the sensitivity and false discovery rate of various state-of-the-art methods for differential abundance testing on simulated data. When I compared the Maaslin2 benchmark results to those of Weiss et al., 2017 (Fig. 6), I found discrepancies for many of the methods tested, such as DESeq2 and ANCOM. I am now unsure which methods are best suited for differential abundance testing. Could you comment on this and give your recommendations for the "best" methods?
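For context, here is roughly how I understand a default Maaslin2 run is set up in R. The file paths and the fixed effect are placeholders of my own, and the parameter names and defaults reflect my reading of the package documentation:

library(Maaslin2)

# Minimal sketch with placeholder inputs and the package's default
# normalization, transformation, and model choices.
fit <- Maaslin2(
    input_data      = "feature_table.tsv",   # features x samples abundance table (placeholder path)
    input_metadata  = "sample_metadata.tsv",  # per-sample metadata (placeholder path)
    output          = "maaslin2_output",
    fixed_effects   = c("diet"),              # hypothetical covariate of interest
    normalization   = "TSS",                  # total sum scaling (package default)
    transform       = "LOG",                  # log transformation (package default)
    analysis_method = "LM"                    # linear model (package default)
)

As far as I can tell, the per-feature associations are then written to the output folder (e.g., significant_results.tsv), which is what I would compare against the other methods.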

Regards,
Yanxian

Results from Weiss et al., 2017:

Weiss et al., 2017.png


Results from Maaslin2 benchmarks:

maaslin2.png
