Part of the testing includes real-world performance testing. This isn't new; I've said it multiple times before in these discussions. I said it when ESET was your favorite, and nothing has changed. The testing includes work with the following:
Microsoft Office products (Excel, PowerPoint, OneNote, and Word);
email (Opera, Outlook, Pegasus, and Thunderbird);
Web surfing (IE 6 through 9, Mozilla, Chrome, Opera, and Safari).
Recently (late 2010/early 2011) we added cloud drives from Box, Dropbox, Google Drive, and SugarSync.
If you want us to add something else, great: make a suggestion.
Opinion is NOT fact. Your statement that MSE is (among) the worst is simply not substantiated by facts.
I didn't say that the average user experience isn't important. I did say that opinion isn't fact. I did say that we do, and have always done, real-world performance testing. If you don't want to accept that we attempt to duplicate real-world conditions, that's your choice. The organizations that pay for the lab regularly review the performance testing to make sure it provides an accurate sample of their work environments. This has been the norm since the beginning. So... your opinion about it is just that, your opinion.
Are there circumstances where any particular tool will be less than optimal for a given situation? Absolutely! Again, that's why there are multiple products in the space.
Is MSE the best or only tool? No, I never said it was. In fact, I initially panned MSE in this group. However, the product and the results changed. It is a worthy contender! You don't want to accept that. OK, change can be difficult. That doesn't change the facts. MSE is NOT the worst, or even close to it. It does not belong in that category. To claim otherwise, based on anecdotal or occasional experience, is absolutely misleading when the opinion-based claim is stated as fact.
David