My program has four render-to-texture cameras that get composited into the frame buffer via a custom shader. I want to make a screen capture of one of those textures in response to an event.
Attaching an osg::Image to the camera works, but it copies the data to system memory after every frame, not just when I need it.
I tried calling camera->attach(image); viewer->frame(); osgDB::writeImageFile(image); camera->attach(texture); in my event handler, but the saved image was blank.
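In case it helps, the event-handler attempt looked roughly like this (simplified; "capture.png" is just a placeholder filename):

```cpp
#include <osg/Camera>
#include <osgDB/WriteFile>
#include <osgViewer/Viewer>

// One-shot capture attempt from inside the event handler
// (this is the version that produced a blank image):
void captureOnce(osgViewer::Viewer* viewer, osg::Camera* camera,
                 osg::Texture2D* texture, osg::Image* image)
{
    camera->attach(osg::Camera::COLOR_BUFFER, image);    // swap the image in
    viewer->frame();                                     // render one frame
    osgDB::writeImageFile(*image, "capture.png");        // comes out blank
    camera->attach(osg::Camera::COLOR_BUFFER, texture);  // restore the texture
}
```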
I also tried calling texture->setImage(image), but that image is never updated with fresh data.
I also tried calling osg::Image::readPixels() in a Camera post-draw callback. This doesn't work because the RenderStage detaches the FBO before the callback runs, so I get a screenshot of the framebuffer instead of the texture. Would calling RenderStage::setDisableFboAfterRender(false) fix this? What's the right place to call that function from? And will it break anything when I later render to the actual framebuffer?
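If setDisableFboAfterRender() is the way to go, my guess is that the place to reach the RenderStage is a cull callback on the RTT camera, something like this (KeepFboBoundCallback is just a name I made up, and I'm not sure about the side effects of leaving the FBO bound):

```cpp
#include <osg/NodeCallback>
#include <osgUtil/CullVisitor>
#include <osgUtil/RenderStage>

// Cull callback that grabs the camera's RenderStage and asks it to leave
// the FBO bound after rendering, so a post-draw callback can readPixels()
// from the texture attachment instead of the window framebuffer.
struct KeepFboBoundCallback : public osg::NodeCallback
{
    virtual void operator()(osg::Node* node, osg::NodeVisitor* nv)
    {
        osgUtil::CullVisitor* cv = dynamic_cast<osgUtil::CullVisitor*>(nv);
        if (cv)
        {
            osgUtil::RenderStage* stage = cv->getCurrentRenderBin()->getStage();
            if (stage) stage->setDisableFboAfterRender(false);
            // Presumably this should be set back to true once the capture
            // frame is done, so later stages aren't affected?
        }
        traverse(node, nv);
    }
};

// usage (on the RTT camera):
// rttCamera->setCullCallback(new KeepFboBoundCallback);
```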
Is there another option I haven't thought of? Perhaps manually binding the FBO and calling readPixels?
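Or, instead of touching the FBO at all, could a final-draw callback re-bind the texture and read it back with osg::Image::readImageFromCurrentTexture()? A rough sketch of what I mean (CaptureTextureCallback, the requested flag, and "capture.png" are all made-up names):

```cpp
#include <osg/Camera>
#include <osg/Texture2D>
#include <osgDB/WriteFile>

// Final-draw callback that, when a capture has been requested, re-binds the
// RTT texture and copies its contents back into an osg::Image. Since this
// reads from the texture object itself (glGetTexImage under the hood), it
// should not matter that the RenderStage has already detached the FBO.
struct CaptureTextureCallback : public osg::Camera::DrawCallback
{
    osg::ref_ptr<osg::Texture2D> texture;
    mutable bool requested;  // set to true from the event handler

    CaptureTextureCallback(osg::Texture2D* tex)
        : texture(tex), requested(false) {}

    virtual void operator()(osg::RenderInfo& renderInfo) const
    {
        if (!requested) return;
        requested = false;

        osg::State& state = *renderInfo.getState();

        // Bind the texture on the current context, then read it back.
        texture->apply(state);

        osg::ref_ptr<osg::Image> image = new osg::Image;
        image->readImageFromCurrentTexture(state.getContextID(), false);
        osgDB::writeImageFile(*image, "capture.png");
    }
};

// usage:
// rttCamera->setFinalDrawCallback(new CaptureTextureCallback(texture.get()));
```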
Thanks,
John
------------------
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=48059#48059
_______________________________________________
osg-users mailing list
osg-...@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org