Sending an arbitrary GLES texture from a native app?


Asen Jechev

unread,
Dec 15, 2017, 1:54:03 AM12/15/17
to discuss-webrtc
Hello!
I'm trying to build an app that sends a modified video stream to peers. I want to draw over the raw camera input in real time and send that as the video instead of just the unmodified camera feed.
To do this, I've created a CustomVideoCapturer Java class based on WebRTC's VideoCapturer, which I feed my modified frames.
So far I've managed to get an OpenGL texture across by rendering what I want to render, converting the texture to YUV, reading the pixels back with glReadPixels, and sending the result via capturerObserver.onByteBufferFrameCaptured.
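For context, a minimal sketch of the kind of CPU-side RGBA-to-I420 conversion this path requires (this is an assumption about the approach, not my exact code; the coefficients are BT.601 video range). Doing this per frame on top of glReadPixels is exactly the per-pixel work that gets expensive:

```java
// Sketch: convert tightly packed RGBA pixels (as returned by glReadPixels)
// into a single I420 buffer (Y plane, then U, then V), BT.601 video range.
public class RgbaToI420 {
    public static byte[] convert(byte[] rgba, int width, int height) {
        byte[] i420 = new byte[width * height * 3 / 2];
        int uOff = width * height;          // U plane starts after Y
        int vOff = uOff + uOff / 4;         // V plane starts after U
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                int p = (y * width + x) * 4;
                int r = rgba[p] & 0xFF, g = rgba[p + 1] & 0xFF, b = rgba[p + 2] & 0xFF;
                i420[y * width + x] =
                    (byte) clamp((int) (0.257 * r + 0.504 * g + 0.098 * b + 16));
                // Chroma is subsampled 2x2; sample the top-left pixel of each block.
                if ((y & 1) == 0 && (x & 1) == 0) {
                    int c = (y / 2) * (width / 2) + (x / 2);
                    i420[uOff + c] =
                        (byte) clamp((int) (-0.148 * r - 0.291 * g + 0.439 * b + 128));
                    i420[vOff + c] =
                        (byte) clamp((int) (0.439 * r - 0.368 * g - 0.071 * b + 128));
                }
            }
        }
        return i420;
    }

    private static int clamp(int v) {
        return Math.max(0, Math.min(255, v));
    }
}
```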

Unfortunately, this approach has been really slow and makes most devices heat up significantly in a short time, so I'm looking for a way to stream that texture by handing WebRTC the texture ID instead.

However, so far the result has been that when I just call capturerObserver.onTextureFrameCaptured, the stream does not get established. I'm guessing I need to wrap my native texture in the SurfaceTexture from the SurfaceTextureHelper and do some EGL wizardry, but I'm not entirely sure how to implement that part.
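To show roughly what I mean, here is a hypothetical sketch of the texture path, assuming the 2017-era org.webrtc API (VideoCapturer, SurfaceTextureHelper, CapturerObserver); the wiring is my guess at what a working setup might look like, not something I have running:

```java
import android.content.Context;
import org.webrtc.SurfaceTextureHelper;
import org.webrtc.VideoCapturer;

// Sketch of a capturer that forwards frames rendered into the
// SurfaceTextureHelper's OES texture. Other VideoCapturer methods
// (stopCapture, dispose, etc.) are omitted for brevity.
public class CustomVideoCapturer implements VideoCapturer {
    private SurfaceTextureHelper surfaceTextureHelper;
    private CapturerObserver capturerObserver;

    @Override
    public void initialize(SurfaceTextureHelper helper, Context context,
                           CapturerObserver observer) {
        // The helper owns the SurfaceTexture and runs on a thread whose EGL
        // context is shared with WebRTC's encoder, so the texture IDs it
        // reports are valid on the consuming side.
        this.surfaceTextureHelper = helper;
        this.capturerObserver = observer;
    }

    @Override
    public void startCapture(int width, int height, int framerate) {
        surfaceTextureHelper.getSurfaceTexture()
            .setDefaultBufferSize(width, height);
        // Elsewhere: create an EGL window surface wrapping this SurfaceTexture
        // and draw the camera frame plus the overlay into it each frame.
        // Then forward each frame the helper reports:
        surfaceTextureHelper.startListening(
            (oesTextureId, transformMatrix, timestampNs) ->
                capturerObserver.onTextureFrameCaptured(
                    width, height, oesTextureId, transformMatrix,
                    /* rotation= */ 0, timestampNs));
        capturerObserver.onCapturerStarted(true);
    }
}
```

The key point, as far as I can tell, is that the texture ID passed to onTextureFrameCaptured has to be the helper's OES texture (i.e. rendered through its SurfaceTexture), not an arbitrary texture from my own GL context.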

Any pointers towards the right direction would be appreciated.