Realtime rendering to texture possible?


Luke Tramps

Jan 11, 2016, 9:09:40 AM1/11/16
to Genome2D
Would it be possible to render a part of the viewport to a texture on every visible frame, so that you end up with a kind of 'video texture'?

Maybe someone has already achieved this and could provide a minimal example?

Thanks :)

Luke Tramps

Jan 12, 2016, 6:32:07 AM1/12/16
to Genome2D
From what I have found out so far, it should be possible, and it works when I render textures to a texture via the context directly. However, I would like to render a complete scene onto a texture via GCamera, also capturing the transformations applied to the renderable node (a particle system).

I know I could do that by hand, but I think GCameraController should be able to do it for me. However, all my approaches result in a

"Sampler 0 binds a texture that is also bound for render to texture" error (3604)

Build c8246234d8631dcba680df86f3507402
Version 1.1
Date 2015-05-10 14:29:03

sHTiF

Jan 12, 2016, 6:59:21 AM1/12/16
to Genome2D
I have quite a lot of work on my shoulders at the moment, but send me a minimal example where this error occurs for you and I'll see what I can do.

Luke Tramps

Jan 12, 2016, 7:57:43 AM1/12/16
to Genome2D
Hello Peter!

I thought it would work like this:

var renderTex = GTextureManager.createRenderTexture(...);
genome.onPreRender.add(function():void {
    cameraController.renderTarget = renderTex;
    cameraController.render();
    cameraController.renderTarget = null;
});

But it seems I must trigger the Context to render first, and I don't see how. I've included a minimal example below.

//Full source
package
{
    import com.genome2d.components.GCameraController;
    import com.genome2d.components.renderable.GSprite;
    import com.genome2d.context.GContextConfig;
    import com.genome2d.Genome2D;
    import com.genome2d.node.GNode;
    import com.genome2d.textures.GTexture;
    import com.genome2d.textures.GTextureManager;
    import flash.display.BitmapData;
    import flash.display.MovieClip;
    import flash.events.Event;
    import flash.geom.Point;

    /**
     * ...
     * @author Lukas Damian Opacki
     */
    [SWF(width="800",height="800",frameRate="60")]
    public class TestGenome2DRenderTextureMinimal extends MovieClip
    {
        private var genome:Genome2D;
        private var camNode:GNode;

        public function TestGenome2DRenderTextureMinimal()
        {
            haxe.initSwc(this);

            var genomeConfig:GContextConfig = new GContextConfig(stage);
            genomeConfig.enableDepthAndStencil = true;
            genome = Genome2D.getInstance();
            genome.onInitialized.addOnce(onGenome2dReady);
            genome.init(genomeConfig);
        }

        private function onGenome2dReady():void
        {
            addSecondaryCam();
            addTestAnimation();
            addRenderTexture();
        }

        private function addSecondaryCam():void
        {
            camNode = GNode.createWithComponent(GCameraController, "renderTexCameraNode").node;
        }

        // Render and display the render texture.
        private function addRenderTexture():void
        {
            var renderTex:GTexture = GTextureManager.createRenderTexture("renderTexture", 500, 500);
            var camCtrl:GCameraController = camNode.getComponent(GCameraController);

            // Render each frame of the sprite animation into the render texture.
            genome.onPreRender.add(function():void
            {
                camCtrl.renderTarget = renderTex;
                camCtrl.render();
                camCtrl.renderTarget = null;
            });

            // Display the render texture.
            var renderTexSprite:GSprite = GNode.createWithComponent(GSprite);
            renderTexSprite.texture = renderTex;
            genome.root.addChild(renderTexSprite.node);

            // Move the renderable to distinguish it.
            stage.addEventListener(Event.ENTER_FRAME, function(e:*):void
            {
                renderTexSprite.node.x = stage.mouseX;
                renderTexSprite.node.y = stage.mouseY;
            });
        }

        // Adds a moving sprite that should be captured onto the render texture.
        private var sprite:GSprite;
        private var spritePos:Point;
        private var spriteSpeed:Number;
        private var spriteTravelDistance:Number = 100;

        private function addTestAnimation():void
        {
            sprite = GNode.createWithComponent(GSprite);
            sprite.texture = GTextureManager.createTexture("greenTexture", new BitmapData(32, 32, false, 0x00ff00));

            spritePos = new Point(200, 200);
            sprite.node.x = spritePos.x;
            sprite.node.y = spritePos.y;

            camNode.addChild(sprite.node);

            stage.addEventListener(Event.ENTER_FRAME, animateSprite);
        }

        private function animateSprite(e:Event):void
        {
            if (sprite.node.x <= spritePos.x)
                spriteSpeed = 5;
            else if (sprite.node.x >= spritePos.x + spriteTravelDistance)
                spriteSpeed = -5;

            sprite.node.x += spriteSpeed;
        }
    }
}

sHTiF

Jan 17, 2016, 6:26:37 AM1/17/16
to Genome2D
OK, I just put together a simple example using the camera.renderTexture feature. Basically I have two cameras: one renders to a texture, and the second renders a sprite with that texture to the screen. You also need two containers: one holding all the content that camera 1 renders into the texture, and a second one rendered only by camera 2 to the back buffer. That separation is what avoids the "Sampler 0 binds a texture that is also bound for render to texture" error: the texture is never sampled by the same camera pass that renders into it.
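The setup described above could be sketched roughly like this. This is an untested outline, not sHTiF's actual example: the property names camera.renderTexture, camera.mask and node.cameraGroup are assumptions about the Genome2D camera-grouping API, so verify them against your build before use.

// Sketch of the two-camera setup (assumed property names, see note above).

// Camera 1 renders its group into the render texture instead of the back buffer.
var renderTex:GTexture = GTextureManager.createRenderTexture("rtt", 500, 500);
var texCam:GCameraController = GNode.createWithComponent(GCameraController, "texCam");
texCam.camera.renderTexture = renderTex; // render-to-texture target (assumed property)
texCam.camera.mask = 1;                  // only renders nodes in camera group 1 (assumed)

// Camera 2 renders its group to the screen.
var screenCam:GCameraController = GNode.createWithComponent(GCameraController, "screenCam");
screenCam.camera.mask = 2;

// Container 1: the animated content, visible only to camera 1.
var content:GNode = new GNode("content");
content.cameraGroup = 1;                 // assumed grouping property on GNode
genome.root.addChild(content);

// Container 2: a sprite displaying the render texture, visible only to camera 2.
var display:GSprite = GNode.createWithComponent(GSprite);
display.texture = renderTex;
display.node.cameraGroup = 2;
genome.root.addChild(display.node);

Because the content nodes are in group 1 and the display sprite is in group 2, renderTex is only sampled by camera 2 and only written by camera 1, which is the point of the two-container split.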

Luke Tramps

Jan 18, 2016, 1:20:00 PM1/18/16
to Genome2D
Awesome, thank you, Peter!