On Feb 25, 2015, at 1:15 PM, Alecazam <al...@figma.com> wrote:

But the swizzles are in GL3/ES3 precisely to emulate L and LA; they're just not going to be supported by WebGL 2. I'm suggesting that you need one or the other: either expose the swizzles, or keep L and LA support (these formats have DX equivalents already, and WebGL 2 implementations can use the swizzles internally without exposing them).
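For reference, a CPU-side sketch of what the ES3 swizzle state does (a hypothetical helper, not any WebGL API): reading an R8 texture with swizzle RRR1 gives the same result the old LUMINANCE format did.

```javascript
// Hypothetical illustration of ES3 texture swizzling on the CPU: given
// one texel and a swizzle (components drawn from r/g/b/a/0/1), produce
// the value a shader would see. Swizzle "rrr1" makes a single-channel
// (R8) texture read like the old LUMINANCE format; "rrrg" on an RG8
// texture emulates LUMINANCE_ALPHA.
function applySwizzle(texel, swizzle) {
  return swizzle.split("").map((c) => {
    if (c === "0") return 0.0;
    if (c === "1") return 1.0;
    return texel[c]; // c is one of "r", "g", "b", "a"
  });
}

// An R8 texture read: g/b/a come back as defaults.
const r8Texel = { r: 0.5, g: 0.0, b: 0.0, a: 1.0 };
applySwizzle(r8Texel, "rrr1"); // [0.5, 0.5, 0.5, 1.0] (LUMINANCE behavior)
```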
On Tuesday, February 24, 2015 at 8:10:54 PM UTC-8, Mark Callow wrote:
> On Feb 25, 2015, at 11:16 AM, Alecazam <al...@figma.com> wrote:
>
> This will prevent simple emulation of L and LA textures. These formats were stripped from the GL3.2 core spec in a breaking change to existing apps. Texture swizzles emulate them, but Apple's initial implementation of 3.2 core didn't include swizzle on all platforms (but did in later OS updates).
>
> Unfortunately, it looks like WebGL 2 has decided not to support swizzles either. Won't this be a breaking change for WebGL 1 apps that rely on these formats? Apps rely on them to feed grayscale data to color shaders without having to modify the shader. Each texture op would now require manual swizzling in the shaders, which then has to be disabled for color formats.
All true, but L & LA textures remain a (deprecated) part of OpenGL ES 3 and therefore WebGL 2.
Regards
-Mark
--
You received this message because you are subscribed to the Google Groups "WebGL Dev List" group.
To unsubscribe from this group and stop receiving emails from it, send an email to webgl-dev-lis...@googlegroups.com.
For more options, visit https://groups.google.com/d/optout.
To clarify my last statement: DX9 had the L and LA formats, but DX11 only has R and forces the shader to monkey-patch in the swizzles. The hardware has all sorts of programmable swizzle support internally (as evidenced by the GL texture swizzle), and maybe DX12 will re-expose that. I just think this will be a mess for WebGL developers as well.

Sorry Mark, I just saw your reply. I re-read your sentence, and it's good to see these are still in WebGL 2; on first read I thought you said they were deprecated. Now I'm just not sure how the DX11/ANGLE layer emulates them efficiently other than by modifying the shader.
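A sketch of the shader-side monkey-patching being described, assuming an app that uploads its grayscale data as R8 (the helper name and rewrite rule here are hypothetical, not anyone's actual implementation):

```javascript
// Hypothetical sketch of the manual swizzle a shader needs when only
// single-channel (R) formats exist: broadcast .r to rgb and force alpha
// to 1.0, which is what sampling the old LUMINANCE format returned.
function emulateLuminanceSample(sampleExpr) {
  return `vec4(vec3((${sampleExpr}).r), 1.0)`;
}

// A color (RGBA) texture sample must be left untouched; only samples of
// grayscale textures get rewritten, so the same shader source can't
// serve both formats without this per-format patching.
const glslExpr = emulateLuminanceSample("texture(u_tex, v_uv)");
// glslExpr: "vec4(vec3((texture(u_tex, v_uv)).r), 1.0)"
```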
On Feb 25, 2015, at 1:37 PM, Alecazam <al...@figma.com> wrote:

I did a search for LUMINANCE in the WebGL 2 spec but didn't see it. I'm assuming it's inherited from the WebGL 1 context.
> ANGLE over D3D11 expands L/LA to RGBA behind the scenes, at a performance cost on texture upload/readback time.

Oh, that just made me sad, but I do appreciate that piece of info. Is there any 1- or 2-channel 8-bit format (A8?) that doesn't get expanded? This would be good to note in the WebGL 1 performance docs. It basically makes these formats highly unpredictable for memory tracking, on an API that won't even tell you how much video memory you have.
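What that expansion amounts to can be sketched on the CPU (a hypothetical helper; ANGLE's actual conversion path isn't shown here): every LUMINANCE byte becomes four RGBA bytes, so a grayscale texture quadruples in size.

```javascript
// Hypothetical sketch of expanding LUMINANCE (1 byte/texel) pixel data
// to RGBA (4 bytes/texel), as the reply above says ANGLE over D3D11
// does behind the scenes: L is replicated to R, G and B; alpha is 255.
function expandLuminanceToRGBA(src) {
  const dst = new Uint8Array(src.length * 4);
  for (let i = 0; i < src.length; i++) {
    dst[4 * i + 0] = src[i];
    dst[4 * i + 1] = src[i];
    dst[4 * i + 2] = src[i];
    dst[4 * i + 3] = 255; // opaque, matching LUMINANCE's implicit alpha
  }
  return dst;
}

// A 2-texel grayscale image becomes 8 bytes of RGBA — the 4x memory
// growth that makes these formats hard to track.
expandLuminanceToRGBA(new Uint8Array([0, 128]));
```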