Offscreen rendering of float textures into FBO


Ivan Adanja

unread,
Aug 21, 2014, 11:56:44 AM8/21/14
to vi...@googlegroups.com
Hey guys,

I hope you can find some time to help an OpenGL newbie out by shedding some light on the questions below :)

I'm looking at the examples/howto/offscreen.py (bleeding edge github @6f6bde03) and wondering what it would take to retrieve a float buffer instead of an RGBA one?

My understanding is that _screenshot() calls read_pixels(), which performs the following OpenGL call to retrieve the color buffer:

   im = gl.glReadPixels(x, y, w, h, gl.GL_RGB, gl.GL_UNSIGNED_BYTE)

But I noticed that gl.glReadPixels only seems to support the following formats: GL_ALPHA, GL_RGB, and GL_RGBA, and only with type GL_UNSIGNED_BYTE.

[Q1] What needs to be added to get a float image out? Where is the dtype of the returned numpy array set?

[Q2] Ultimately, my goal is to be able to write shaders that output a float (custom depth buffers) or triplets of floats (normal maps) into a texture that I can retrieve to the CPU.
       I don't understand the FrameBuffer mechanism well enough to see what is missing to allow me to do that.
       The code in examples/howto/offscreen.py hints at this working, since the render texture, _rendertex, is a float. But the image returned by the shader and _screenshot is RGBA.

[Q3] Also, I've seen solutions that retrieve textures using glGetTexImage instead of glReadPixels. Do you know what the difference is between these approaches?

Thanks in advance,
Ivan

Cyrille Rossant

unread,
Aug 21, 2014, 12:06:00 PM8/21/14
to vi...@googlegroups.com
Hi Ivan

I'll let Almar or Nicolas answer in more detail as they're the ones
who wrote gloo. I assume that doing everything on the red channel and
using RGBA textures is not satisfactory for you because of
performance? (And it would only be a temporary solution anyway!)

> [Q3] Also, I've seen solutions that retrieve textures using glGetTexImage
> instead of glReadPixels. Do you know what the difference is between these
> approaches?

I think that glGetTexImage is not available in OpenGL ES 2.0, which is
the spec we currently use in Vispy. However, it should be possible to
add this function to the non-ES namespace when OpenGL desktop is
available -- that might simplify certain operations like the offscreen
rendering example.

Best,
Cyrille

Nicolas P. Rougier

unread,
Aug 21, 2014, 12:21:30 PM8/21/14
to vi...@googlegroups.com

On 21 Aug 2014, at 17:56, Ivan Adanja <ada...@gmail.com> wrote:

> Hey guys,
>
> I hope you can find some time to help an OpenGL newbie out by shedding some light on the questions below :)
>
> I'm looking at the examples/howto/offscreen.py (bleeding edge github @6f6bde03) and wondering what it would take to retrieve a float buffer instead of an RGBA one?
>
> My understanding is that _screenshot() calls read_pixels(), which performs the following OpenGL call to retrieve the color buffer:
>
> im = gl.glReadPixels(x, y, w, h, gl.GL_RGB, gl.GL_UNSIGNED_BYTE)
>
> But I noticed that gl.glReadPixels only seems to support the following formats: GL_ALPHA, GL_RGB, and GL_RGBA, and only with type GL_UNSIGNED_BYTE.
>
> [Q1] What needs to be added to get a float image out? Where is the dtype of the returned numpy array set?


> im = gl.glReadPixels(x, y, w, h, gl.GL_ALPHA, gl.GL_UNSIGNED_BYTE)

With this you would get a (h,w,1) array with dtype np.uint8


>
> [Q2] Ultimately, my goal is to be able to write shaders that output a float (custom depth buffers) or triplets of floats (normal maps) into a texture that I can retrieve to the CPU.
> I don't understand the FrameBuffer mechanism well enough to see what is missing to allow me to do that.

It depends on what you want to achieve with the depth or normal buffer. If you intend to use them in a shader, you'd better use textures for them and pass these textures as regular textures to your shader.


> The code in examples/howto/offscreen.py hints at this working, since the render texture, _rendertex, is a float. But the image returned by the shader and _screenshot is RGBA.

For a depth buffer, you will need a (h,w) texture, not a (h,w,3) one.
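A rough numpy sketch of that shape convention (no GL involved; the sizes and array names here are made up for illustration):

```python
import numpy as np

h, w = 480, 640  # hypothetical framebuffer size

# A depth attachment holds one float per pixel: shape (h, w).
depth_image = np.zeros((h, w), dtype=np.float32)

# A normal map holds three floats per pixel: shape (h, w, 3).
normal_image = np.zeros((h, w, 3), dtype=np.float32)
```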

>
> [Q3] Also, I've seen solutions that retrieve textures using glGetTexImage instead of glReadPixels. Do you know what the difference is between these approaches?

glReadPixels reads from the framebuffer (the screen in most situations), while glGetTexImage can read from a texture.

Hope that helps.

>
> Thanks in advance,
> Ivan
>

Ivan Adanja

unread,
Aug 22, 2014, 11:12:37 AM8/22/14
to vi...@googlegroups.com
Cyrille and Nicolas, thank you both for your speedy replies!

To answer your suggestions on using one 8bit channel:

> im = gl.glReadPixels(x, y, w, h, gl.GL_ALPHA, gl.GL_UNSIGNED_BYTE) 
> With this you would get  a (h,w,1) array with dtype np.uint8 

> I assume that doing everything on the Red channel and 
> using RGBA textures is not satisfactory for you because of 
> performance? (and it would only be a temporary solution anyway!) 

No, indeed, 8 bits is not enough resolution for my needs. I need 32-bit floats.



>> [Q2] Ultimately, my goal is to be able to write shaders that output a float (custom depth buffers) 
>>  or triplets of floats (normal maps) into a texture that I can retrieve to the CPU. 
>>        I don't understand the FrameBuffer mechanism well enough to see what is missing to allow me to do that. 
> It depends on what you want to achieve with the depth or normal buffer. 
> if you intend to use them in a shader, you'd better use texture for them and pass these texture as regular texture in your shader. 

No, I would like to use the buffer as output of the shader, rather than its input.


Let me give a concrete example, so I avoid misusing terminology, as I'm not very familiar with it:
I want to render a float32 depth image offscreen and get it back to the CPU for further processing.

Here's the fragment shader I want to use:

layout(location = 0) out float my_depth;
in vec4 camera_view_position;

void main()
{
   my_depth = camera_view_position.z;
}


AFAIU, the code in examples/howto/offscreen.py renders to a FrameBuffer (_fbo), whose color attachment is a float32 texture:

# Texture where we render the scene.
self._rendertex = gloo.Texture2D(shape=self.size, dtype=np.float32)
# FBO.
self._fbo = gloo.FrameBuffer(self._rendertex,
                            gloo.DepthBuffer(self.size))


However, _screenshot() retrieves an RGBA image (four 8-bit channels).

So I guess the question persists: how can I set up and retrieve a float32 attachment?

Best regards,
Ivan

Cyrille Rossant

unread,
Aug 22, 2014, 11:16:31 AM8/22/14
to vi...@googlegroups.com
> No, indeed, 8 bits is not enough resolution for my needs. I need 32bit
> floats.

Probably not the best solution, but this *might* work for you in the
meantime: http://stackoverflow.com/questions/18453302/how-do-you-pack-one-32bit-int-into-4-8bit-ints-in-glsl-webgl
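The CPU side of that byte-packing trick can be sketched with numpy alone; assuming the shader writes the four bytes of each float32 into the RGBA channels, the readback can be reinterpreted like this (array names are made up, and the GLSL half of the trick is what the linked answer covers):

```python
import numpy as np

# Pretend this is the float32 image the shader wants to output.
depth = np.array([[0.25, 0.5], [0.75, 1.0]], dtype=np.float32)

# What an RGBA8 readback would contain if each float32 were packed
# byte-by-byte into the four color channels.
rgba_bytes = depth.view(np.uint8).reshape(depth.shape + (4,))

# Unpacking on the CPU: reinterpret the four channels as one float32 again.
unpacked = rgba_bytes.reshape(-1).view(np.float32).reshape(depth.shape)

assert np.array_equal(depth, unpacked)
```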

Ivan Adanja

unread,
Aug 22, 2014, 11:29:41 AM8/22/14
to vi...@googlegroups.com
Thanks, Cyrille. Good reference to have; I'll give it a try.

But I would still like to understand where the pipeline needs to be changed to accommodate float buffer output (needed, e.g., for vec3 floats). I'm just looking for some pointers so I can experiment.
Basically, I would like to know whether it would be easy enough for an OpenGL newbie to do some small hacks, or whether there are architectural limitations that I won't be able to fix easily.

Thanks,
Ivan

Nicolas P. Rougier

unread,
Aug 22, 2014, 11:57:16 AM8/22/14
to vi...@googlegroups.com

You can also try:

gl.glReadPixels(x, y, w, h, gl.GL_ALPHA, gl.GL_FLOAT)

and you should get a (h,w) np.float32 array
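The readback-to-array step can be illustrated without a GL context by faking the byte buffer that glReadPixels would return (the data here is a stand-in, not real GL output):

```python
import numpy as np

h, w = 4, 8

# Stand-in for the raw buffer glReadPixels(x, y, w, h, GL_ALPHA, GL_FLOAT)
# would return: h*w float32 values, row by row.
fake_readback = np.arange(h * w, dtype=np.float32).tobytes()

# Interpreting the buffer as a single-channel (h, w) float32 image.
im = np.frombuffer(fake_readback, dtype=np.float32).reshape(h, w)
```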

Ivan Adanja

unread,
Aug 22, 2014, 12:09:49 PM8/22/14
to vi...@googlegroups.com
Great. Thanks Nicolas.

Ivan



Eric Larson

unread,
Sep 3, 2014, 3:35:51 PM9/3/14
to vi...@googlegroups.com
Some of these issues have been addressed by a PR (returning float instead of uint8):


Ivan, you're welcome to open an issue regarding returning only the `alpha` channel. I didn't enable it in my PR above because we'd need to decide how the `read_pixels` parameters should work, which will require a little bit of discussion.

Cheers,
Eric

Ivan Adanja

unread,
Sep 3, 2014, 5:27:43 PM9/3/14
to vi...@googlegroups.com
Thanks Eric!
I'll try it out as soon as I get access to a proper graphics card again and open an issue.

Currently I get a runtime error (e.g., when running offscreen.py) on both Intel HD Graphics 4000 and 5000:
RuntimeError: Combination of internal formats used by attachments is not supported.
(in vispy/gloo/framebuffer.py, line 374, in _attach)

I guess this warrants an issue too.

Best regards,

Ivan