How to add effects to Webrtc video on Android


raulf tang

Jun 13, 2017, 6:59:06 AM
to discuss-webrtc

We are working on a project based on WebRTC for video communication, and we now want to add effects to the video. On Android, WebRTC uses code like the following to open the camera:

final int[] textureArray = new int[1];
GLES20.glGenTextures(1, textureArray, 0);
final int oesTextureId = textureArray[0];
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, oesTextureId);
GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
checkNoGLES2Error("generateTexture");

SurfaceTexture surfaceTexture = new SurfaceTexture(oesTextureId);
surfaceTexture.setOnFrameAvailableListener(new SurfaceTexture.OnFrameAvailableListener() {
  @Override
  public void onFrameAvailable(SurfaceTexture surfaceTexture) {
    surfaceTexture.updateTexImage();
    final float[] transformMatrix = new float[16];
    surfaceTexture.getTransformMatrix(transformMatrix);
    onTextureFrameAvailable(oesTextureId, transformMatrix);
  }
});

final android.hardware.Camera camera;
try {
  camera = android.hardware.Camera.open(cameraId);
} catch (RuntimeException e) {
  callback.onFailure(e.getMessage());
  return;
}

try {
  camera.setPreviewTexture(surfaceTexture);
} catch (IOException e) {
  camera.release();
  callback.onFailure(e.getMessage());
  return;
}

Then onTextureFrameAvailable(oesTextureId, transformMatrix) sends the texture to the renderer and the encoder separately.

For rendering, it uses the following GLSL to draw the texture:

VERTEX_SHADER_STRING:

varying vec2 interp_tc;
attribute vec4 in_pos;
attribute vec4 in_tc;
uniform mat4 texMatrix;
void main() {
  gl_Position = in_pos;
  interp_tc = (texMatrix * in_tc).xy;
}

FRAGMENT_SHADER_STRING:

#extension GL_OES_EGL_image_external : require
precision mediump float;
varying vec2 interp_tc;
uniform samplerExternalOES oes_tex;
void main() {
  gl_FragColor = texture2D(oes_tex, interp_tc);
}

The encoder, meanwhile, converts the texture into YUV data, builds an I420 frame, and encodes it.
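As a rough sketch of that conversion step: a fragment shader renders the luma plane from the OES texture into an offscreen target, which is then read back as bytes. The coefficients below are full-range BT.601 and are an assumption for illustration; the actual WebRTC conversion shader may use different constants and a multi-pass layout for U and V.

```
#extension GL_OES_EGL_image_external : require
precision mediump float;
varying vec2 interp_tc;
uniform samplerExternalOES oes_tex;
void main() {
  vec3 rgb = texture2D(oes_tex, interp_tc).rgb;
  // Full-range BT.601 luma. U and V would be produced in separate passes
  // with their own coefficient vectors.
  gl_FragColor = vec4(dot(rgb, vec3(0.299, 0.587, 0.114)), 0.0, 0.0, 1.0);
}
```

The resulting plane is then fetched with glReadPixels (or a pixel-buffer path) to assemble the I420 frame.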

If I want to add effects to the video, the idea is to copy the OES texture to an offscreen normal (2D) texture before onTextureFrameAvailable(oesTextureId, transformMatrix), then apply effects to the offscreen texture. All of this is done. But if I don't change the render and encoder process, I need to copy the modified offscreen texture back into another OES texture and deliver it to onTextureFrameAvailable for rendering and encoding, and I don't know how to do that. Can anyone help?

Also, if there is no such method, then I must change the renderer and encoder to accept the new offscreen texture, which seems like a bad idea (the effects couldn't be added and removed flexibly). Please confirm.

Niels Moller

Jun 16, 2017, 5:25:09 AM
to discuss...@googlegroups.com
On Tue, Jun 13, 2017 at 12:59 PM, raulf tang <raul...@gmail.com> wrote:
> If I want to add effects to the video, the idea is to copy the oes texture
> to a offscreen normal texture before onTextureFrameAvailable(oesTextureId,
> transformMatrix), then add effects to the offscreen texture. All these are
> done. But If don't change the render and encoder process, I need to copy the
> changed offscreen texture to another oes texture and deliver to
> onTextureFrameAvailable for render and encode. But I don't know how to.

To me, it seems a bit silly that the opengl shader language uses
different types for an "oes" texture (what we get from the camera) and
other textures. If you can find some nice way to hide this difference
in the java interfaces, that would be ideal.
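(For what it's worth: an external OES texture generally can't be attached to a framebuffer, so there's no portable way to copy *back into* one; the practical route is making the render/encode side accept either kind. The shader difference really is just the sampler declaration; a regular-texture version of the fragment shader above would be, roughly:

precision mediump float;
varying vec2 interp_tc;
// sampler2D instead of samplerExternalOES; no extension directive needed.
uniform sampler2D rgb_tex;
void main() {
  gl_FragColor = texture2D(rgb_tex, interp_tc);
}

Everything else stays the same.)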

Maybe there's some easy way, e.g., a class which represents a texture
and includes both the texture id and the correct opengl type as a
string, so that the proper type can be substituted into the fragment
shader program. If we go that way, the transformation matrix should
probably be moved into that same class as well.
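A minimal sketch of that idea — all names here are made up for illustration, not an existing WebRTC class:

```java
// Bundles a texture id, its GLSL sampler type, and the transform matrix,
// so a fragment shader can be built for either texture kind.
class GlTexture {
  static final String OES = "samplerExternalOES";
  static final String RGB = "sampler2D";

  final int textureId;
  final String samplerType;   // OES or RGB
  final float[] transformMatrix;

  GlTexture(int textureId, String samplerType, float[] transformMatrix) {
    this.textureId = textureId;
    this.samplerType = samplerType;
    this.transformMatrix = transformMatrix;
  }

  // Substitute the correct sampler type (and, for OES textures, the
  // required extension directive) into a fragment shader template.
  String buildFragmentShader(String template) {
    String header = samplerType.equals(OES)
        ? "#extension GL_OES_EGL_image_external : require\n" : "";
    return header + template.replace("%SAMPLER%", samplerType);
  }
}
```

A renderer would then write its shader with a `%SAMPLER%` placeholder, e.g. `"uniform %SAMPLER% tex;\nvoid main() { ... }"`, and let the texture object fill in the right type.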

(I wrote the original version of the fragment shader used for
texture-to-yuv conversion, with guidance from Magnus Jedvert, but I
don't have any deep opengl knowledge).

Regards,
/Niels

Sami Kalliomäki

Jun 16, 2017, 8:25:34 AM
to discuss-webrtc
We are already working on refactoring the Java VideoFrames. TextureBuffer will be used to represent these in the future.
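For anyone following along, the shape of such an abstraction could look roughly like this — a sketch only, with assumed names, not the final API:

```java
// A texture buffer that carries its own texture kind, so renderers and
// encoders can pick the matching sampler without caring where the
// texture came from. Illustrative only.
interface TextureBuffer {
  enum Type { OES, RGB }  // external OES texture vs. regular 2D texture

  Type getType();
  int getTextureId();
  float[] getTransformMatrix();
}
```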


