WebGL2: Rendering to 3D textures?


Jerry Ylilammi

May 9, 2015, 11:55:05 AM5/9/15
to webgl-d...@googlegroups.com
Now that WebGL 2 provides 3D texture support, is it possible to write to 3D textures or are we limited to reading from them?

I'd like to implement 3D fluid simulation using shaders, but to be efficient I'd need to be able to write to 3D textures.

Jamie Madill

May 9, 2015, 2:18:44 PM5/9/15
to webgl-d...@googlegroups.com
Hey Jerry,

You can attach a single layer of a 3D texture or a 2D array texture to an FBO with 'framebufferTextureLayer', see [1].

Unfortunately you can't bind a whole 3D texture as a render target like you can in desktop GL. This is because we don't have access to geometry shaders, which are what let you pick the layer to render into (via gl_Layer) after the vertex shader stage.
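
Rough, untested sketch of the per-layer approach (assuming 'gl' is a WebGL2 context and 'tex3D' / 'depth' are your 3D texture and its depth):

var fbo = gl.createFramebuffer();
gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
for (var z = 0; z < depth; ++z) {
  // Attach z-slice 'z' of the 3D texture as the single color target.
  // framebufferTextureLayer(target, attachment, texture, level, layer)
  gl.framebufferTextureLayer(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, tex3D, 0, z);
  // ... draw a full-screen quad whose fragment shader computes slice z ...
}

So each slice costs you a separate attachment change plus a draw call.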

I was thinking about this restriction for my own project, and the idea I was considering was to use MRT: each color attachment corresponds to one z plane, and the solve is condensed into a sequence of draw calls over groups of slices. This at least cuts the number of draw calls down by a factor of the MAX_DRAW_BUFFERS limit; a rough sketch follows. There may be other solutions as well, maybe others have alternate suggestions.
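
Untested sketch of what I had in mind ('sim3D', 'numSlices', and the actual draw are placeholders):

var maxBuffers = gl.getParameter(gl.MAX_DRAW_BUFFERS);
var fbo = gl.createFramebuffer();
gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);

for (var base = 0; base < numSlices; base += maxBuffers) {
  var count = Math.min(maxBuffers, numSlices - base);
  var buffers = [];
  for (var i = 0; i < count; ++i) {
    // One color attachment per z slice of the 3D texture.
    gl.framebufferTextureLayer(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0 + i, sim3D, 0, base + i);
    buffers.push(gl.COLOR_ATTACHMENT0 + i);
  }
  gl.drawBuffers(buffers);
  // One draw call now writes 'count' slices; the fragment shader declares
  // 'count' outputs (layout(location = i) out vec4 ...) in ESSL 3.00.
  // ... draw full-screen quad ...
}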



Jerry Ylilammi

May 11, 2015, 1:03:00 PM5/11/15
to webgl-d...@googlegroups.com
Hey Jamie,

That sounds like it might be viable. console.log("MaxDrawBuffers " + gl.MAX_DRAW_BUFFERS); gives me MaxDrawBuffers 34852. I'll have to check if I can do 200 buffers; that'd give me a 200x200x200 grid for the simulation, which would be plenty. It might still be way too slow though, because the Jacobi iterations of the simulation alone are around 20-30 draw calls. So one step of the fluid simulation is ~35 draw calls, and at the ~30 FPS I'm aiming for that's about 1050 draw calls per second, plus rebinding all those buffers each time.

Thanks for this, I'll return at some point with results :)

- Jerry

Zhenyao Mo

May 11, 2015, 1:19:55 PM5/11/15
to webgl-d...@googlegroups.com
I always assumed MAX_DRAW_BUFFERS could be at most 16 because GL only defines DRAW_BUFFER0 through DRAW_BUFFER15. I could be wrong, but 34852 sounds like a crazy number.


Shannon Woods

May 11, 2015, 1:20:00 PM5/11/15
to webgl-d...@googlegroups.com
On Mon, May 11, 2015 at 1:02 PM, Jerry Ylilammi <jerry.y...@gmail.com> wrote:
Hey Jamie,

That sounds like it might be viable. console.log("MaxDrawBuffers " + gl.MAX_DRAW_BUFFERS); gives me MaxDrawBuffers 34852.

That doesn't seem right. AFAIK even the best current consumer cards have at most 8 MAX_DRAW_BUFFERS, and I don't know of anything that'll let you have tens of thousands of simultaneous render targets. That seems more like a max dimension than a number of buffers...

Won

May 11, 2015, 1:24:01 PM5/11/15
to webgl-d...@googlegroups.com
Pretty sure you have to gl.GetInteger (or whatever) and that MAX_DRAW_BUFFERS is just an enumerant.

Jamie Madill

May 11, 2015, 2:38:05 PM5/11/15
to webgl-d...@googlegroups.com
Correct, the literal value of GL_MAX_DRAW_BUFFERS is 0x8824, or 34852. What you want is the value returned from the query (gl.getParameter in WebGL, glGetIntegerv in C), which will probably be something like 4 or 8.
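
e.g.

// gl.MAX_DRAW_BUFFERS is just the enum value (0x8824 == 34852);
// the actual limit comes back from getParameter:
console.log("MaxDrawBuffers " + gl.getParameter(gl.MAX_DRAW_BUFFERS));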