[osg-users] Custom Camera and Multi-stage Rendering (RenderStage)


Paul Leopard

Jun 8, 2012, 11:30:14 AM
to osg-...@lists.openscenegraph.org
I'm a little confused here ... I'm working on an application and I think I'm close to completing it, but I've come to a point where I need some help.
First, some background:

The application requires anti-aliasing of a scene and the subsequent creation of floating-point color and depth textures that can be sampled and used like any ordinary texture. Specifically, I need to do the following:
1) Render the scene to an offscreen FBO that has float color and depth render buffers attached to it and is configured for AA
2) Automatically blit the color and depth buffers in the FBO to another FBO that has attached float color and depth textures.
3) Query/sample the depth texture and map the color texture onto a quad

I have gotten this to work in standard OpenGL. Now I am trying to understand how to do the same thing in an OpenSceneGraph context. I see that RenderStage has what I need in terms of the render-buffer/resolve-buffer blit ... good so far ...

So … I suspect that what I need to do is to get a pointer to a RenderStage and configure that instance. I have a custom camera class (class IRCamera : public osg::Camera) that I am using for offscreen rendering. Currently it just has a color texture attached to it and I am using that rendered texture successfully. I want to rework this custom camera class to use the multi-stage approach so that I can take advantage of AA and still have the rendered results in a texture.

Is there a way for me to get a pointer to a RenderStage instance associated with my custom camera? Is this the proper approach for what I want to do?
Thanks in advance,
Paul

------------------
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=48142#48142

_______________________________________________
osg-users mailing list
osg-...@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org

Paul Martz

Jun 8, 2012, 12:25:08 PM
to osg-...@lists.openscenegraph.org
I'm fairly certain you can access the RenderStage in a Camera callback, but I'm
not sure you're approaching the problem correctly.

It seems like the multipass rendering you describe below could be accomplished
by using a series of Cameras. In OSG these must be arranged hierarchically, but
you set the Cameras as pre- or post-render to specify the sequential rendering
order. There's an osgprerender example that uses multiple Cameras, which you
might find helpful.
-Paul

Tim Moore

Jun 8, 2012, 1:15:12 PM
to OpenSceneGraph Users
There's also the osgfpdepth example, which does render to an FBO with
a floating point depth buffer and then renders the image to the window
color buffer.

Tim

Paul Leopard

Jun 8, 2012, 2:11:23 PM
to osg-...@lists.openscenegraph.org
Thanks for your help, I'll check into the Camera callback. I've already dissected the osgfpdepth example ... that's where I learned to do the floating-point depth buffer and the anti-aliasing setup for a framebuffer object. I'll look into it again to see if I've missed something ...

I'm fairly certain that I'm approaching it the correct way, as RenderStage calls glBlitFramebuffer(...) to blit from a buffer to a texture according to a buffer attachment map owned by its resolve FBO:


Code:
GLbitfield blitMask = 0;
// find which buffer types should be copied
for (FrameBufferObject::AttachmentMap::const_iterator
         it = _resolveFbo->getAttachmentMap().begin(),
         end = _resolveFbo->getAttachmentMap().end(); it != end; ++it)
{
    switch (it->first)
    {
        case Camera::DEPTH_BUFFER:
            blitMask |= GL_DEPTH_BUFFER_BIT;
            break;
        case Camera::STENCIL_BUFFER:
            blitMask |= GL_STENCIL_BUFFER_BIT;
            break;
        case Camera::PACKED_DEPTH_STENCIL_BUFFER:
            blitMask |= GL_DEPTH_BUFFER_BIT | GL_STENCIL_BUFFER_BIT;
            break;
        case Camera::COLOR_BUFFER:
            blitMask |= GL_COLOR_BUFFER_BIT;
            break;
        default: ...
    }
}
// Bind the resolve framebuffer to blit into.
_fbo->apply(state, FrameBufferObject::READ_FRAMEBUFFER);
_resolveFbo->apply(state, FrameBufferObject::DRAW_FRAMEBUFFER);
if (blitMask)
{
    // Blit to the resolve framebuffer.
    fbo_ext->glBlitFramebuffer(
        0, 0, static_cast<GLint>(_viewport->width()), static_cast<GLint>(_viewport->height()),
        0, 0, static_cast<GLint>(_viewport->width()), static_cast<GLint>(_viewport->height()),
        blitMask, GL_NEAREST);
}
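Stripped of the OSG types, that mask-building loop is plain bitfield accumulation over the attachment map. A self-contained sketch with stand-in constants (the real code uses the GL_*_BUFFER_BIT values and osg::Camera::BufferComponent keys):

```cpp
#include <cassert>
#include <map>

// Stand-in keys and bit values, mirroring osg::Camera::BufferComponent and
// the GL blit-mask bits (values chosen to match the GL constants).
enum BufferComponent
{
    DEPTH_BUFFER,
    STENCIL_BUFFER,
    PACKED_DEPTH_STENCIL_BUFFER,
    COLOR_BUFFER
};
const unsigned DEPTH_BIT   = 0x00000100;  // GL_DEPTH_BUFFER_BIT
const unsigned STENCIL_BIT = 0x00000400;  // GL_STENCIL_BUFFER_BIT
const unsigned COLOR_BIT   = 0x00004000;  // GL_COLOR_BUFFER_BIT

// Reduce an attachment map to a blit mask, as the RenderStage loop does.
unsigned computeBlitMask(const std::map<BufferComponent, int>& attachments)
{
    unsigned mask = 0;
    for (std::map<BufferComponent, int>::const_iterator it = attachments.begin();
         it != attachments.end(); ++it)
    {
        switch (it->first)
        {
            case DEPTH_BUFFER:                mask |= DEPTH_BIT;               break;
            case STENCIL_BUFFER:              mask |= STENCIL_BIT;             break;
            case PACKED_DEPTH_STENCIL_BUFFER: mask |= DEPTH_BIT | STENCIL_BIT; break;
            case COLOR_BUFFER:                mask |= COLOR_BIT;               break;
            default:                                                           break;
        }
    }
    return mask;
}
```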

------------------
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=48145#48145

John Richardson

Jun 8, 2012, 6:24:55 PM
to osg-...@lists.openscenegraph.org
Hello,

Delurking.

8 August 2012. 10AM - 11AM.

Commence making those videos and slides. Be the first geek in your
neighborhood to reserve a slot....:)

John F. Richardson

Paul Leopard

Jun 12, 2012, 3:13:15 PM
to osg-...@lists.openscenegraph.org
I'm almost there; I just can't seem to figure out how to attach the osg::FrameBufferObject instances to the camera in a way that gets them recorded in the buffer attachment map. I've summarized the process below, replacing some variables with uppercase descriptions of what they represent. Does anyone have an idea of how to fill in the TODO tags below?


Code:

void IRCamera::initBuffers( osg::GraphicsContext* pGfxContext )
{
    // Get graphics context state and ID and the FBO extensions
    osg::State* pGfxContextState = pGfxContext->getState();
    unsigned int gfxContextID = pGfxContextState->getContextID();
    osg::FBOExtensions* pFBOExtensions = osg::FBOExtensions::instance( gfxContextID, true );

    // Extract some info ...
    bool isMultiSample = N_COLOR_SAMPLES > 0;
    bool isCSAA = N_COVERAGE_SAMPLES > N_COLOR_SAMPLES;
    size_t texWidth = COLOR_TEXTURE->getTextureWidth();
    size_t texHeight = COLOR_TEXTURE->getTextureHeight();

    // Create the framebuffer objects and apply the graphics context state to them
    m_RenderFBO = new osg::FrameBufferObject();
    m_RenderFBO->apply( *pGfxContextState );

    m_ResolveFBO = new osg::FrameBufferObject();
    m_ResolveFBO->apply( *pGfxContextState );

    // Non-multisample setup just attaches textures directly to the FBOs
    if ( !isMultiSample )
    {
        // Attach color texture to the render FBO
        m_RenderFBO->setAttachment(
            osg::Camera::COLOR_BUFFER,
            osg::FrameBufferAttachment( COLOR_TEXTURE ) );

        // Attach depth texture to the render FBO
        m_RenderFBO->setAttachment(
            osg::Camera::DEPTH_BUFFER,
            osg::FrameBufferAttachment( DEPTH_TEXTURE ) );

        // @TODO@: ATTACH THE RENDER FBO SOMEHOW SO THAT IT IS REGISTERED
        // IN THE BUFFER ATTACHMENT MAP
    }
    // Multisample setup
    else
    {
        // Create and attach a color renderbuffer to the RENDER FBO
        osg::RenderBuffer* pColorRB = new osg::RenderBuffer(
            texWidth, texHeight, COLOR_BUFFER_FORMAT,
            N_COVERAGE_SAMPLES, N_COLOR_SAMPLES );
        m_RenderFBO->setAttachment(
            osg::Camera::COLOR_BUFFER,
            osg::FrameBufferAttachment( pColorRB ) );

        // Create and attach a depth renderbuffer to the RENDER FBO
        osg::RenderBuffer* pDepthRB = new osg::RenderBuffer(
            texWidth, texHeight, DEPTH_BUFFER_FORMAT,
            N_COVERAGE_SAMPLES, N_COLOR_SAMPLES );
        m_RenderFBO->setAttachment(
            osg::Camera::DEPTH_BUFFER,
            osg::FrameBufferAttachment( pDepthRB ) );

        // @TODO@: ATTACH THE RENDER FBO SOMEHOW SO THAT IT IS REGISTERED
        // IN THE BUFFER ATTACHMENT MAP

        // Attach color texture to RESOLVE FBO
        m_ResolveFBO->setAttachment(
            osg::Camera::COLOR_BUFFER,
            osg::FrameBufferAttachment( COLOR_TEXTURE ) );

        // Attach depth texture to RESOLVE FBO
        m_ResolveFBO->setAttachment(
            osg::Camera::DEPTH_BUFFER,
            osg::FrameBufferAttachment( DEPTH_TEXTURE ) );

        // @TODO@: ATTACH THE RESOLVE FBO SOMEHOW SO THAT IT IS REGISTERED
        // IN THE BUFFER ATTACHMENT MAP
    }

    // Double check everything
    size_t nBuffers = this->getBufferAttachmentMap().size();
    std::cout << "BUFFER ATTACHMENT MAP SIZE : " << nBuffers << std::endl;

    sgp_core::String errorMsg;
    try
    {
        sgp_osg::CheckFramebufferStatus( pGfxContext );
    }
    catch( ... )
    {
        pFBOExtensions->glBindFramebuffer( GL_FRAMEBUFFER_EXT, 0 );
        SGP_RETHROW;
    }
}




------------------
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=48253#48253

Paul Leopard

Jun 12, 2012, 5:36:34 PM
to osg-...@lists.openscenegraph.org
I guess what I am asking is this ...

I have attached render buffers and textures to the two FrameBufferObject instances m_ResolveFBO and m_RenderFBO ...

Now, how do I attach the two FrameBufferObject instances to a Camera's buffer attachment map?

I see two Camera 'attach(...)' methods and neither deals with a FrameBufferObject ...


Thanks,
Paul

------------------
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=48255#48255

Paul Leopard

Jun 13, 2012, 2:14:24 PM
to osg-...@lists.openscenegraph.org
Now I'm thinking that I need to just create an instance of RenderStage, make it a member of my custom camera class (derived from osg::Camera), set it up with the framebuffer objects and attachments, and then use it by adding a post-draw callback to the camera ... Comments on this approach?

------------------
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=48289#48289