We see exactly the same issue and have done similar things to prove
it. Since we don't have source and the Quadrant developers won't
respond to the email address on their website, we have to guess, but we
believe they are incorrectly trying to request a 16-bit depth buffer, and
only S/W GL supports that. Hence they don't use hardware GL and you see
the artifacts. We'd like to learn more about the logic they use in their
EGL config chooser. This should only be a five-minute conversation or
email, and the benchmark could easily be fixed.
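
For anyone else poking at this, here is a rough sketch (in Java, against
the standard GLSurfaceView.EGLConfigChooser interface) of the kind of
chooser logic we suspect. The class name and the exact-match depth test
are our guesses about what Quadrant might be doing, not their actual
code:

```java
import javax.microedition.khronos.egl.EGL10;
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.egl.EGLDisplay;

import android.opengl.GLSurfaceView;

/**
 * Hypothetical chooser illustrating the suspected problem: insisting on a
 * depth buffer of exactly 16 bits. On GPUs whose driver only exposes
 * 24-bit (or 0-bit) depth configs, this matches nothing hardware
 * accelerated, so the surface ends up on the software GL renderer.
 */
public class Exact16DepthChooser implements GLSurfaceView.EGLConfigChooser {
    @Override
    public EGLConfig chooseConfig(EGL10 egl, EGLDisplay display) {
        // Enumerate every config the display offers.
        int[] num = new int[1];
        egl.eglGetConfigs(display, null, 0, num);
        EGLConfig[] configs = new EGLConfig[num[0]];
        egl.eglGetConfigs(display, configs, configs.length, num);

        int[] value = new int[1];
        for (EGLConfig config : configs) {
            egl.eglGetConfigAttrib(display, config, EGL10.EGL_DEPTH_SIZE, value);
            // The suspected bug: an exact-match test instead of ">= 16".
            if (value[0] == 16) {
                return config;
            }
        }
        // A safer chooser would relax the requirement here (e.g. accept any
        // config with at least a 16-bit depth buffer) instead of giving up
        // or settling for a software GL config.
        return null;
    }
}
```

If their chooser does something along these lines, relaxing the depth
test to "at least 16 bits" (or falling back when the exact match fails)
should let them pick up a hardware config, which is why we think this is
a quick fix.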
-Michael