I wonder if anyone can shed light on what's happening in some browsers with this rather simple HTML document.
This page has an <img> behind a <canvas>, which is a WebGL canvas with {premultipliedAlpha: true} set in its context attributes.
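For concreteness, here is a minimal sketch of how such a context might be created (the variable and function names here are mine, not the page's actual code; per the WebGL spec, premultipliedAlpha defaults to true anyway, so setting it is just being explicit):

```javascript
// Context attributes as described above. With premultipliedAlpha: true,
// the page compositor treats the drawing buffer's RGB as already
// multiplied by its alpha channel.
const contextAttributes = {
  alpha: true,               // drawing buffer keeps an alpha channel
  premultipliedAlpha: true,  // the spec default, stated explicitly
};

// Hypothetical helper: obtain a WebGL context with those attributes.
function getGL(canvas) {
  return canvas.getContext('webgl', contextAttributes);
}
```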
https://madeinhaste.github.io/webgl-premultipliedalpha-quirk/
I'm rendering 3 orange gradients onto this WebGL canvas (just using scissor/clear for simplicity - hit "view page source" to see the full code):
1) the first gradient is (rgb * alpha, alpha), where I would expect a "normal"/"source-over" interaction with the background, fading from opaque to transparent.
2) the second gradient is (rgb * alpha, 0), where I was hoping for an additive effect on the background.
3) the third gradient is (rgb * alpha, 1/255), which I'd expect to be similar to the second.
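To make the three cases concrete, here's a sketch of how a gradient row can be drawn with nothing but scissor/clear. This is illustrative only (names and layout are mine, not the page's actual source) - the point is that the RGB ramp is premultiplied by the ramp value t, while the alpha channel follows a per-gradient rule:

```javascript
// Sketch: draw one horizontal gradient row using only scissor + clear.
// alphaFor(t) supplies the alpha channel for ramp position t in [0, 1].
function drawGradientRow(gl, y, height, alphaFor) {
  const w = gl.drawingBufferWidth;
  gl.enable(gl.SCISSOR_TEST);
  for (let x = 0; x < w; x++) {
    const t = x / (w - 1);   // ramp value, 0 -> 1 across the row
    const a = alphaFor(t);   // per-gradient alpha rule
    gl.scissor(x, y, 1, height);
    // Orange (1.0, 0.5, 0.0), premultiplied by the ramp value t:
    gl.clearColor(1.0 * t, 0.5 * t, 0.0 * t, a);
    gl.clear(gl.COLOR_BUFFER_BIT);
  }
  gl.disable(gl.SCISSOR_TEST);
}

// Gradient 1: alpha tracks the ramp  -> expected normal source-over fade.
// Gradient 2: alpha is always 0      -> expected additive effect.
// Gradient 3: alpha is a tiny constant, 1/255.
const alphaRules = [t => t, t => 0, t => 1 / 255];
```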
(You can mouse over the canvas to see the pixel values read back from the drawing buffer).
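My expectation for the page compositing, stated as code (this is my reading of source-over with premultiplied colours, not anything I can point to in a spec): out = src + (1 - srcAlpha) * dst per channel, which for srcAlpha = 0 degenerates to pure addition onto the background.

```javascript
// Expected source-over compositing of a premultiplied-alpha source
// pixel over an opaque background pixel, per channel:
//   out = src + (1 - srcAlpha) * dst
function compositePremultiplied(src, srcAlpha, dst) {
  return src.map((c, i) => c + (1 - srcAlpha) * dst[i]);
}

// With srcAlpha = 1 the source fully replaces the background;
// with srcAlpha = 0 the source is simply added to the background,
// which is why I expected gradient 2 to look additive.
```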
For me, the appearance of this page depends on the browser:
- Chrome & Firefox on Linux, Firefox on Windows 10, Safari on my iPhones 7 & 14, Chrome on my Pixel 3, all display gradients 1, 2 & 3 "correctly" - that is, with normal and additive blending effects.
- Chrome & Edge on Windows 10, and Chrome & Safari on a MacBook running Ventura display only gradients 1 & 3. The second gradient with alpha=0 is invisible.
I can't see anything in the WebGL spec about the intended blending behaviour here, and so far I have not found any other posts about this. Just wondering if anyone has any insight, as this variance surprised me a little.
Dan.