Best way to sample depth-values from previously rendered depth buffer?


Andre Weissflog

Feb 28, 2014, 2:36:21 PM
to webgl-d...@googlegroups.com
Hi,

I want to sample depth values in a deferred pre-light-pass renderer to reconstruct the view-space position. Depth is written in an initial depth/normal pass to a framebuffer-attached depth texture; this is the only pass which writes to the depth buffer. All following passes (light and material) only read the depth buffer, but they also need to sample depth values in the fragment shader.

My first solution was to simply sample from the depth texture, which is still attached to the framebuffer. On some platforms this isn't a problem in WebGL, since it's only undefined behaviour when writing to and reading from a texture at the same time. The ANGLE backend recently disabled this though, since it's not possible in DX11 (see here: https://code.google.com/p/chromium/issues/detail?id=347182).

My next attempt was to use copyTexImage2D to duplicate the depth buffer into a second depth texture. This works on desktop GL, but the WEBGL_depth_texture extension specifically forbids copyTexImage2D with the DEPTH_COMPONENT and DEPTH_STENCIL internal formats.
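For reference, that attempt looked roughly like this (a sketch; the helper and texture names are made up, and the source framebuffer with the depth attachment is assumed to be bound):

    // Duplicate the current depth attachment into a second depth texture.
    // Works on desktop GL; in WebGL the WEBGL_depth_texture extension
    // disallows copyTexImage2D with DEPTH_COMPONENT / DEPTH_STENCIL.
    function copyDepthTexture(gl: WebGLRenderingContext,
                              width: number, height: number): WebGLTexture {
      const depthCopy = gl.createTexture()!;
      gl.bindTexture(gl.TEXTURE_2D, depthCopy);
      gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
      gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
      gl.copyTexImage2D(gl.TEXTURE_2D, 0, gl.DEPTH_COMPONENT, 0, 0, width, height, 0);
      return depthCopy;
    }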

Sooo... next I will try an additional fullscreen-quad pass: read from the depth texture (which is still attached to a framebuffer, but that framebuffer is not bound), and write to a float render target (which additionally requires the OES_texture_float extension). Later passes then sample depth values from this float texture instead of the depth texture, while the original depth texture continues to serve as the depth buffer (with z-writes disabled).
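The fragment shader for that transfer pass would be something like this (a sketch; the uniform and varying names are made up):

    // Fullscreen-quad pass: read the depth texture, write the value into a
    // float render target (OES_texture_float). The depth value comes back in .x.
    const depthTransferFS = `
      precision highp float;
      uniform sampler2D u_depthTex;   // depth texture from the depth/normal pass
      varying vec2 v_uv;              // fullscreen-quad texcoord in [0..1]
      void main() {
        float depth = texture2D(u_depthTex, v_uv).x;  // window-space depth
        gl_FragColor = vec4(depth, 0.0, 0.0, 1.0);    // stored in the float target
      }
    `;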

Is there a better way which I'm currently missing?

Thanks!
-Floh.

Shannon Woods

Feb 28, 2014, 4:17:11 PM
to webgl-d...@googlegroups.com
This doesn't answer your question, but I wanted to clarify-- it's undefined according to the ES spec if the texture's bound to the current framebuffer, bound to a texture unit, and either the current vertex or fragment shader samples from it. It's not actually contingent upon performing a write to the texture.

-- Shannon --


Kenneth Russell

Feb 28, 2014, 4:37:34 PM
to webgl-d...@googlegroups.com
Is it possible for you to just detach the depth texture from the framebuffer when sampling from it? Or have a second framebuffer which has the same color attachments as the first, but no depth attachment, and swap framebuffers when you want to sample the previously-rendered depth texture?
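For the second option, something along these lines (just a sketch, with made-up names):

    // Same color attachment in two framebuffers: one with the depth texture
    // attached (for the depth-writing pass), one without (for passes that
    // sample the depth texture).
    function createSwapFramebuffers(gl: WebGLRenderingContext,
                                    colorTex: WebGLTexture,
                                    depthTex: WebGLTexture) {
      const fboWithDepth = gl.createFramebuffer()!;
      gl.bindFramebuffer(gl.FRAMEBUFFER, fboWithDepth);
      gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, colorTex, 0);
      gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.DEPTH_ATTACHMENT, gl.TEXTURE_2D, depthTex, 0);

      const fboNoDepth = gl.createFramebuffer()!;
      gl.bindFramebuffer(gl.FRAMEBUFFER, fboNoDepth);
      gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, colorTex, 0);

      gl.bindFramebuffer(gl.FRAMEBUFFER, null);
      return { fboWithDepth, fboNoDepth };
    }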

-Ken

Andre Weissflog

Feb 28, 2014, 5:31:50 PM
to webgl-d...@googlegroups.com
I've been thinking in this direction too, but I need the depth buffer in the follow-up passes for hidden surface removal, while at the same time sampling depth values in the fragment shader to reconstruct the view-space position for lighting and other effects.

Here's a quick (and simplified) overview of my render-pipeline:

There are 3 framebuffers/rendertargets:

(1) Normal/Depth-Buffer: RGB8 + DEPTH24
(2) Light-Buffer: RGBA8 + depth buffer from render target (1)
(3) Color-Buffer: RGBA8 + depth buffer from render target (1)

So all render-targets share the same depth-buffer.
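The setup is roughly this (a sketch; the helper and texture names are made up, WEBGL_depth_texture assumed for the shared depth texture):

    // Attach a color texture plus the shared depth texture to a framebuffer.
    function makeRenderTarget(gl: WebGLRenderingContext,
                              colorTex: WebGLTexture,
                              sharedDepthTex: WebGLTexture): WebGLFramebuffer {
      const fbo = gl.createFramebuffer()!;
      gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
      gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, colorTex, 0);
      gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.DEPTH_ATTACHMENT, gl.TEXTURE_2D, sharedDepthTex, 0);
      gl.bindFramebuffer(gl.FRAMEBUFFER, null);
      return fbo;
    }

    // (1) normal/depth, (2) light and (3) color buffers all reference the same depth texture:
    // const normalDepthFbo = makeRenderTarget(gl, normalTex, depthTex);
    // const lightFbo       = makeRenderTarget(gl, lightTex,  depthTex);
    // const colorFbo       = makeRenderTarget(gl, colorTex,  depthTex);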

There are 3 render passes: first a geometry pass to record the surface normals and depth of the scene, then a lighting pass where the light sources are rendered, and finally another geometry pass which combines the material color with the lighting values.

Normal/Depth Pass: the scene is rendered into the Normal/Depth-Buffer with Z-Writes enabled, and Z-Func set to less-equal. The per-pixel view-space surface normal is written to RGB, and the depth buffer is written with the vanilla fragment depth values.

Lighting-Pass: this renders one global light and any number of local (point-) light sources. Color blending is set to additive and Z-writes are disabled. For local lights the Z-test is enabled and the Z-func is set to GREATER, so the fragment shader only runs where local light volumes intersect the scene. For lighting I generally need the view-space normal (no problem, I'm getting this from render target (1)), and specifically for local lights I need to reconstruct the view-space position from the depth at that pixel. And here's the problem: I need both the current frame's depth buffer functionality and the ability to sample the depth in the fragment shader. The result of the lighting pass is the surface light color in RGB, and an N.H-derived value in A.
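The view-space-position reconstruction itself is the usual unproject-by-inverse-projection, something like this GLSL helper (a sketch; the uniform and varying names are made up):

    const viewPosFromDepthGLSL = `
      precision highp float;
      uniform sampler2D u_depthTex;
      uniform mat4 u_invProj;      // inverse of the camera projection matrix
      varying vec2 v_uv;           // screen-space texcoord in [0..1]

      vec3 viewPosFromDepth(vec2 uv) {
        float d = texture2D(u_depthTex, uv).x;               // window-space depth [0..1]
        vec4 clip = vec4(uv * 2.0 - 1.0, d * 2.0 - 1.0, 1.0);
        vec4 view = u_invProj * clip;
        return view.xyz / view.w;                            // view-space position
      }
    `;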

Material-Pass: Z-writes remain disabled, Z-func is set to EQUAL (for opaque materials) or LESS-EQUAL (for alpha-blended materials). Simple materials only sample the previously rendered light buffer, but some materials (reflective, depth-modulated alpha, etc...) also need to read depth values, so it's the same problem: the depth buffer is needed for the Z-test, but the fragment shader also needs to read depth values.
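The depth/blend state for these two passes boils down to something like this (sketch, plain WebGL calls):

    // Lighting pass: additive blending, z-test GREATER, no z-writes.
    function applyLightPassState(gl: WebGLRenderingContext): void {
      gl.enable(gl.DEPTH_TEST);
      gl.depthMask(false);
      gl.depthFunc(gl.GREATER);
      gl.enable(gl.BLEND);
      gl.blendFunc(gl.ONE, gl.ONE);
    }

    // Material pass: z-writes stay disabled, EQUAL for opaque, LEQUAL for alpha-blended.
    function applyMaterialPassState(gl: WebGLRenderingContext, opaque: boolean): void {
      gl.enable(gl.DEPTH_TEST);
      gl.depthMask(false);
      gl.depthFunc(opaque ? gl.EQUAL : gl.LEQUAL);
    }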

Originally I encoded both depth and normal into the RGBA channels of the normal/depth render target (16 bit each), but 16-bit depth is too little precision, and only 16 bit for the normals is also bad for surface detail.

I just noticed that both Firefox and Chrome have support for WEBGL_draw_buffers, so there are more options:

(1) if draw_buffers and texture_float are available, I could render the depth to an additional float32 render target during the initial normal/depth pass
(2) if draw_buffers is available, but not texture_float, I could encode the depth to an additional RGBA render target during the initial normal/depth pass
(3) if only texture_float is available, I would need an additional fullscreen-quad pass to transfer the depth-buffer-values into a float32 texture
(4) if EXT_frag_depth is available (Firefox Nightly seems to have this, but not Chrome?) I would also need an additional fullscreen-quad-pass, but could write to another depth-texture instead of a float-texture.
(5) if nothing but depth_texture is available I'd have to do the fullscreen-quad transfer pass, reading from the depth-texture, and encoding the value into an RGB value

Blargh, I guess I'll first check around what extensions are available on different WebGL platforms. draw_buffers+texture_float sounds like the most elegant solution. 
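Option (1) would look roughly like this (a sketch; the framebuffer and texture variables are placeholders):

    // During the normal/depth pass, write depth into a second (float) color
    // attachment via WEBGL_draw_buffers + OES_texture_float.
    function setupDepthMrt(gl: WebGLRenderingContext,
                           normalDepthFbo: WebGLFramebuffer,
                           depthFloatTex: WebGLTexture): boolean {
      const drawBuffers = gl.getExtension('WEBGL_draw_buffers');
      const floatTex = gl.getExtension('OES_texture_float');
      if (!drawBuffers || !floatTex) return false;

      gl.bindFramebuffer(gl.FRAMEBUFFER, normalDepthFbo);
      gl.framebufferTexture2D(gl.FRAMEBUFFER, drawBuffers.COLOR_ATTACHMENT1_WEBGL,
                              gl.TEXTURE_2D, depthFloatTex, 0);   // RGBA/FLOAT texture
      drawBuffers.drawBuffersWEBGL([
        drawBuffers.COLOR_ATTACHMENT0_WEBGL,   // view-space normals
        drawBuffers.COLOR_ATTACHMENT1_WEBGL,   // depth
      ]);
      return true;
    }

    // In the normal/depth fragment shader (#extension GL_EXT_draw_buffers : require):
    //   gl_FragData[0] = vec4(viewSpaceNormal * 0.5 + 0.5, 1.0);
    //   gl_FragData[1] = vec4(depth, 0.0, 0.0, 1.0);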

-Floh.

Andre Weissflog

Feb 28, 2014, 6:00:50 PM
to webgl-d...@googlegroups.com
Thanks for the clarification. I wonder whether there's hardware out there for which this restriction is actually necessary. It's pretty clear that reading from and rendering to a texture at the same time is a bad idea, but it's not as clear why reading from a depth texture which is also still used as a (read-only) depth buffer is not allowed. The only sensible reason would be if the depth-buffer backing store isn't in a format which can also easily be sampled as a texture.

Won

Mar 4, 2014, 4:48:21 PM
to webgl-d...@googlegroups.com
First of all, let me say that I think it is reasonable to expect that depth textures may be simultaneously bound as attachments if depth writes are disabled. It has long been possible in OpenGL to do funkier things like 1:1 post-processing with textures simultaneously bound as attachments, and what Floh is suggesting sounds pretty vanilla to me. Maybe the WebGL implementation could perform the underlying copy behind the scenes on platforms that do not enable the copy-free behavior.

Floh, 

As a workaround, I would suggest the following format:

2-channel normal + gloss + reverse depth in a 4-channel 16-bit float target.

The 2-channel normal should be stereographically projected (other options are also explored here): http://aras-p.info/texts/CompactNormalStorage.html#method07stereo

Instead of storing depth directly, store 1 - depth. The reason: perspective Z piles all the precision at the near plane, and using floating point makes this worse, since floats pile their precision near 0. You can recover a lot of that precision by inverting Z, so that the extra float precision ends up near the far plane.
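A rough GLSL sketch of that packing (the normal encoding follows the linked article; the function and parameter names, and the channel layout, are just illustrative):

    const packGBufferGLSL = `
      precision highp float;

      // stereographic projection of a unit view-space normal into 2 channels
      vec2 encodeNormalStereo(vec3 n) {
        return n.xy / (n.z + 1.0);
      }

      // RGBA16F layout: normal.xy, gloss, reversed depth
      vec4 packNormalGlossDepth(vec3 viewNormal, float gloss, float depth01) {
        // depth01 = post-projection depth in [0..1]; storing 1.0 - depth01
        // moves the float precision toward the far plane
        return vec4(encodeNormalStereo(viewNormal), gloss, 1.0 - depth01);
      }
    `;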

-Won

PS: if you have a global light, it might be better to apply it in your material pass to avoid a per-pixel blend.

Mark Callow

Mar 4, 2014, 9:29:45 PM
to webgl-d...@googlegroups.com

On 2014/03/05 6:48, Won wrote:
First of all, let me express my opinion that I think it is reasonable to expect that depth textures may be simultaneously bound as attachments if depth writes are disabled. It has long been possible in OpenGL to do funkier things like 1:1 post processing with textures simultaneously bound as attachments, and what Floh is suggesting sounds pretty vanilla to me.

This and "funkier things" may work on *some* OpenGL and OpenGL ES implementations but both spec's are very clear that these things are rendering feedback loops and the behavior is undefined. The API, therefore, provides no way to determine if it will work. On  the massively parallel machines that are modern GPUs even something that seems like it should "just work" can trip over.

It is possible that the definition of what constitutes a feedback loop could be narrowed in a future OpenGL {,ES} spec revision, as it does seem too broad at the moment.


Maybe the WebGL implementation could perform the underlying copy behind-the-scenes on platforms that do not enable the copy-free behavior.

WebGL implementations cannot do this. See above. 

Regards

    -Mark

