WebGL and Depth-Stencil Renderbuffers


Alecazam

unread,
Mar 22, 2014, 3:03:16 PM3/22/14
to webgl-d...@googlegroups.com
I've been reading the WebGL 1.0 spec (section 6.6) and various previous posts on depth and stencil.  I'd like to know the recommended course of action for WebGL platforms that still do not support WEBGL_depth_texture.  Trying to work out what the (non-specific) spec means on Chrome has cost me several hours of work.  Depth textures work as per the ES2 spec, but renderbuffers do not.  How do I achieve the following depth formats in WebGL, and can specifics for these be added to the spec?  I'm not sure that the browsers even comply if I don't use a depth texture.

D16     - pass DEPTH_COMPONENT16 to create, attach to DEPTH_ATTACHMENT  <- the only one that works for me in Chrome
D16S8 - pass DEPTH_COMPONENT16 to create, attach to DEPTH_STENCIL_ATTACHMENT
D24X8 - ?
D24S8 - pass DEPTH_STENCIL to create, attach to DEPTH_STENCIL_ATTACHMENT
S8      -  pass STENCIL_INDEX8 to STENCIL_ATTACHMENT, pass nothing to DEPTH or DEPTH_STENCIL_ATTACHMENT

Alecazam

unread,
Mar 22, 2014, 3:08:40 PM3/22/14
to webgl-d...@googlegroups.com
Lemme correct that last item:

S8      -  pass STENCIL_INDEX8 to create, attach to STENCIL_ATTACHMENT only

Alecazam

unread,
Mar 22, 2014, 3:51:22 PM3/22/14
to webgl-d...@googlegroups.com
Chrome and Firefox both report GL_STENCIL_BITS of 0, despite supporting depth_texture with combined depth-stencil.  That makes them impossible to distinguish from IE's WebGL, which still doesn't support stencil (and also reports 0).  Also, passing GL_DEPTH_STENCIL to create and attaching that to GL_DEPTH_STENCIL_ATTACHMENT results in an invalid FBO.  Shouldn't getting texture and depth formats correct be part of the validation suite?  This is a pretty fundamental requirement for using a GPU.

Alecazam

unread,
Mar 22, 2014, 6:08:59 PM3/22/14
to webgl-d...@googlegroups.com
Looks like the spec works on Firefox/Chrome after all, but I had to disable checks for GL_STENCIL_BITS.


D16     - pass DEPTH_COMPONENT16 to create, attach to DEPTH_ATTACHMENT  <- the only one that works for me in Chrome
D16S8 or D24S8 - pass DEPTH_STENCIL to create, attach to DEPTH_STENCIL_ATTACHMENT
S8      -  pass STENCIL_INDEX8 to create, attach to STENCIL_ATTACHMENT only

Seems like D24X8 is missing, but that must be because the depth formats aren't known (despite GL_DEPTH_BITS).
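
In code, that recipe works out to roughly this (a minimal sketch; width/height are placeholders):

var fbo = gl.createFramebuffer();
gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);

// Combined depth-stencil: create with DEPTH_STENCIL, attach at DEPTH_STENCIL_ATTACHMENT.
var ds = gl.createRenderbuffer();
gl.bindRenderbuffer(gl.RENDERBUFFER, ds);
gl.renderbufferStorage(gl.RENDERBUFFER, gl.DEPTH_STENCIL, width, height);
gl.framebufferRenderbuffer(gl.FRAMEBUFFER, gl.DEPTH_STENCIL_ATTACHMENT, gl.RENDERBUFFER, ds);

// D16 instead: DEPTH_COMPONENT16, attached at DEPTH_ATTACHMENT.
// S8 instead: STENCIL_INDEX8, attached at STENCIL_ATTACHMENT.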


Jeff Dash

unread,
Mar 22, 2014, 6:55:40 PM3/22/14
to webgl-d...@googlegroups.com
I'm a little confused about what's being asked here, but I can dump some info.

Available in WebGL 1:
renderbufferStorage:
DEPTH_COMPONENT16 (D16)
DEPTH_STENCIL (D24S8)
STENCIL_INDEX8 (S8)

With WEBGL_depth_texture enabled:
texImage2D:
DEPTH_COMPONENT/UNSIGNED_SHORT (D16)
DEPTH_COMPONENT/UNSIGNED_INT (D32)
DEPTH_STENCIL/UNSIGNED_INT_24_8_WEBGL (D24S8)

D16 and D32 attach at DEPTH_ATTACHMENT
S8 attaches at STENCIL_ATTACHMENT
D24S8 attaches at DEPTH_STENCIL_ATTACHMENT
Other format/attachment-point combos are invalid.

There are no D16S8 or D24X8 formats currently defined for WebGL 1 or its extensions.
WebGL doesn't have an extension for S8 textures, either.
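
As a rough sketch of the depth-texture path (assuming WEBGL_depth_texture is present, an FBO is already bound, and width/height are placeholders):

var ext = gl.getExtension('WEBGL_depth_texture');
if (ext) {
  var tex = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, tex);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
  // D24S8 texture: DEPTH_STENCIL / UNSIGNED_INT_24_8_WEBGL, attached at DEPTH_STENCIL_ATTACHMENT.
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.DEPTH_STENCIL, width, height, 0,
                gl.DEPTH_STENCIL, ext.UNSIGNED_INT_24_8_WEBGL, null);
  gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.DEPTH_STENCIL_ATTACHMENT, gl.TEXTURE_2D, tex, 0);
  // D16 would use DEPTH_COMPONENT / UNSIGNED_SHORT, and D32 DEPTH_COMPONENT / UNSIGNED_INT,
  // both attached at DEPTH_ATTACHMENT.
}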

There's additional info here:

We have a bunch of tests which should be testing this stuff. If you think you've found a bug, please detail it in a bug filed against the/all browser(s), and if possible, provide a testcase.

-Jeff





Alecazam

unread,
Mar 22, 2014, 7:20:15 PM3/22/14
to webgl-d...@googlegroups.com
Use of DEPTH_STENCIL as a renderbuffer creation format was what threw me off.  That's not standard ES2 from what I've used, but I realize it's because WebGL can't be specific about depth resolution, given the mobile/desktop differences.  It still feels like WebGL impls (Chrome/Firefox) should return 8 for GL_STENCIL_BITS.  IE11 still doesn't have stencil support, so I check that value to see if STENCIL is even possible, but if all the browsers return 0, then it doesn't work as a test.

Thanks also for the D32 texture format info.  I didn't realize that was a valid texture format.  OpenGL and ES both lack simple mechanisms to query texture caps (unlike DX), so a lot of this format stuff is trial and error.  When a format fails as a renderbuffer, you don't realize it works as a texture, and vice versa.  It also doesn't help that GL and ES differ on sized format, internal format, format, and type (and between ES2 and ES3).  In DX, these are all a single simple value (or untyped, with a view to cast).
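
For what it's worth, my trial-and-error probing boils down to something like this (a hypothetical helper, nothing official; it only checks texture creation, so a render-target probe would additionally need an FBO completeness check):

function textureFormatWorks(gl, internalFormat, format, type) {
  var tex = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, tex);
  while (gl.getError() !== gl.NO_ERROR) { }   // flush any stale errors first
  gl.texImage2D(gl.TEXTURE_2D, 0, internalFormat, 4, 4, 0, format, type, null);
  var ok = (gl.getError() === gl.NO_ERROR);
  gl.deleteTexture(tex);
  return ok;
}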

Jeff Dash

unread,
Mar 22, 2014, 7:34:19 PM3/22/14
to webgl-d...@googlegroups.com

D32 might not actually be D32; query the depth bits for that. Stencil bits should be non-zero for depth-stencil, I should think.

Alecazam

unread,
Mar 22, 2014, 7:42:19 PM3/22/14
to webgl-d...@googlegroups.com
I can query gl.RENDERBUFFER_STENCIL_SIZE, but I test gl.STENCIL_BITS for 0 before trying to create a buffer, and switch to a non-stencil format if it's 0.  That's what's currently broken.  I submitted another Bugzilla item on this.
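
To be concrete, the two queries I mean (a sketch; rb is an already-created renderbuffer):

// Per-renderbuffer size (needs the renderbuffer bound):
gl.bindRenderbuffer(gl.RENDERBUFFER, rb);
var rbStencilSize = gl.getRenderbufferParameter(gl.RENDERBUFFER, gl.RENDERBUFFER_STENCIL_SIZE);

// What I test up front:
var stencilBits = gl.getParameter(gl.STENCIL_BITS);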

Alecazam

unread,
Mar 22, 2014, 7:58:51 PM3/22/14
to webgl-d...@googlegroups.com
It looks like my understanding of gl.STENCIL_BITS isn't correct.  You must have an FBO bound to obtain these values correctly in Firefox/Chrome, but then it's a bit late to use them to test whether the system supports stencil.  I guess it's back to trying the depth-stencil format, and if it fails, marking it as unusable.  Thanks for the very helpful info, Jeff.
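
Something along these lines is what I mean by trying the format (a sketch of a hypothetical probe, not anything standard):

function depthStencilWorks(gl) {
  var fbo = gl.createFramebuffer();
  gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);

  // WebGL generally wants a color attachment for a complete FBO.
  var color = gl.createRenderbuffer();
  gl.bindRenderbuffer(gl.RENDERBUFFER, color);
  gl.renderbufferStorage(gl.RENDERBUFFER, gl.RGBA4, 4, 4);
  gl.framebufferRenderbuffer(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.RENDERBUFFER, color);

  var ds = gl.createRenderbuffer();
  gl.bindRenderbuffer(gl.RENDERBUFFER, ds);
  gl.renderbufferStorage(gl.RENDERBUFFER, gl.DEPTH_STENCIL, 4, 4);
  gl.framebufferRenderbuffer(gl.FRAMEBUFFER, gl.DEPTH_STENCIL_ATTACHMENT, gl.RENDERBUFFER, ds);

  var ok = (gl.checkFramebufferStatus(gl.FRAMEBUFFER) === gl.FRAMEBUFFER_COMPLETE);

  gl.bindFramebuffer(gl.FRAMEBUFFER, null);
  gl.deleteFramebuffer(fbo);
  gl.deleteRenderbuffer(color);
  gl.deleteRenderbuffer(ds);
  return ok;
}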

Alecazam

unread,
Mar 23, 2014, 1:30:52 PM3/23/14
to webgl-d...@googlegroups.com
On further exploration, this still looks like a bug.  At startup, even without allocating a default depth target, I see gl.DEPTH_BITS return 24, but gl.STENCIL_BITS return 0.  That means DEPTH_BITS isn't tied to the current FBO, and neither should gl.STENCIL_BITS be.  There is already gl.RENDERBUFFER_STENCIL_SIZE, and it feels like gl.STENCIL_BITS has been confused with that in Chrome/Firefox.

Alecazam

unread,
Mar 23, 2014, 2:04:43 PM3/23/14
to webgl-d...@googlegroups.com
I filed a Firefox bug on this here.  But Chrome has the same issue.


Here's what a native OpenGL test app reports for the very same quantities (with no FBO depth or stencil allocated), using those same constants.  I believe that WebGL is interpreting the constant incorrectly.  It's supposed to reflect the capability to use stencil (0 or 8), not the current stencil bits of an FBO.  It is a "max bits" for specification of RED/GREEN/BLUE/ALPHA/DEPTH/STENCIL.  You should be able to query these before any FBO is allocated.

texDepthBits   = 32
texStencilBits = 8

Shannon Woods

unread,
Mar 23, 2014, 2:31:46 PM3/23/14
to webgl-d...@googlegroups.com
 It's supposed to reflect capability to use Stencil (0 or 8), and not the current stencil bits of an FBO.

I don't think that assertion is correct. In the OpenGL ES 2.0.25 spec (https://www.khronos.org/registry/gles/specs/2.0/es_full_spec_2.0.25.pdf), section 4.4.6 states that all the values from table 6.2.1 (Implementation Dependent Pixel Depths, which includes STENCIL_BITS) are framebuffer-dependent, and will change with the state of FRAMEBUFFER_BINDING. The ES man page for glGet also reflects this (http://www.khronos.org/opengles/sdk/docs/man/xhtml/glGet.xml) -- "params returns one value, the number of bitplanes in the stencil buffer of the currently bound framebuffer."

Unless there's something in the WebGL spec which causes this value to have a different meaning than in ES, which I'm not seeing, the value you're querying should be entirely dependent on the currently bound framebuffer.



Alecazam

unread,
Mar 23, 2014, 2:42:04 PM3/23/14
to webgl-d...@googlegroups.com
Thanks Shannon for the spec reference.  The spec says that when an FBO is not bound, then the results are implementation dependent.  I'm saying that aspect of WebGL does not appear to match what native OpenGL does (returning max bits for each BITS request).  I'm seeing 24/0 on WebGL, and 32/8 on native OpenGL in my test app.  I'll give the calls a try with an FBO bound too.  

Even if this behavior could be fixed, it sounds like I'll still need a workaround that doesn't rely on these values.  This behavior still seems redundant with gl.RENDERBUFFER_DEPTH/STENCIL_SIZE.  Why have two constants for the same thing?  See the text about that right above 6.1.4 in the spec.

Alecazam

unread,
Mar 23, 2014, 2:44:14 PM3/23/14
to webgl-d...@googlegroups.com
I suppose gl.DEPTH/STENCIL_BITS might also report an FBO with a depth texture attached, where there's no renderbuffer in the FBO.  Okay, now these values make more sense on the currently bound FBO, and seem less redundant.

Shannon Woods

unread,
Mar 24, 2014, 11:53:23 AM3/24/14
to webgl-d...@googlegroups.com
Thanks Shannon for the spec reference.  The spec says that when an FBO is not bound, then the results are implementation dependent.  I'm saying that aspect of WebGL does not appear to match what native OpenGL does (returning max bits for each BITS request).

I wouldn't assume that's what native OpenGL is doing. If you're querying DEPTH_BITS and STENCIL_BITS on the default framebuffer, the values returned are likely the depths of those channels in the default framebuffer. Likewise, if you're querying the default framebuffer in WebGL, you're probably getting the actual bit depth of the default framebuffer. (Assuming these queries are passed through to the underlying ES implementation as is, that is definitely what you are getting on at least one implementation.)

Are you requesting stencil at context creation time? If I do not pass context creation parameters to canvas.getContext() in Chrome on Windows, it appears to create a default framebuffer without stencil, and gl.STENCIL_BITS is 0 when the framebuffer binding is 0. If I use {stencil: true}, the default framebuffer gets an 8-bit stencil buffer, and gl.STENCIL_BITS is 8 when the framebuffer binding is 0. I would be very unsurprised if the default channel sizes created when you leave these unspecified differ wildly between implementations and APIs, so if you are leaving these values up to the implementation rather than specifying them, that may account for the difference you're seeing.
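
Roughly, for concreteness (a minimal sketch; canvas is assumed):

var gl = canvas.getContext('webgl', { stencil: true });
gl.bindFramebuffer(gl.FRAMEBUFFER, null);            // query the default framebuffer
var depthBits   = gl.getParameter(gl.DEPTH_BITS);
var stencilBits = gl.getParameter(gl.STENCIL_BITS);  // 8 here vs. 0 without the stencil flag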

Alecazam

unread,
Mar 24, 2014, 8:06:24 PM3/24/14
to webgl-d...@googlegroups.com
It turns out that a default depth buffer (and no stencil) was being allocated, so that's what was being reflected back in those bits.  It's good because I can fix that and save a full depth buffer.  On ES, I thought there is always a default FBO, whereas on OpenGL one must be explicitly bound.  That may be why these quantities differ between Mac and WebGL impls.
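
For the record, roughly what I'll change (a sketch; canvas is assumed):

// All rendering goes to my own FBOs, so skip the default depth/stencil buffers entirely.
var gl = canvas.getContext('webgl', { depth: false, stencil: false });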

Shannon Woods

unread,
Mar 25, 2014, 1:13:43 AM3/25/14
to webgl-d...@googlegroups.com
On ES, I thought there is always a default FBO, where on OpenGL one must be explicitly bound.

ES 2.0 always has a default window-system-provided framebuffer (which is different from an FBO, if I'm being pedantic-- those are user-created), while desktop OpenGL 3.0+ and OpenGL ES 3.0+ can be used without such a framebuffer (using only FBOs, meaning that binding 0 results in an incomplete framebuffer, and rendering occurs only off-screen). I suspect in your example applications you were seeing a window-system-provided framebuffer in both cases, though-- it was just that the ES/WebGL system framebuffer didn't contain a stencil buffer. This is more of a difference between the companion API behavior (EGL/CGL/GLX/WGL/etc) and the underlying window system than between ES and GL themselves. EGL in particular specifies that the default value for stencil size in a config description is zero, so if you don't specify one, and the system can provide a default framebuffer without stencil, then you don't get stencil. (It's slightly more complicated than that-- EGL is queried for matching configurations, and a list is returned, and the application can select between them, but I don't think this is pertinent to the view from WebGL.)


