We are working on a project based on WebRTC for video communication, and we now want to add effects to the video. On Android, WebRTC opens the camera with code like the following:
final int textureArray[] = new int[1];
GLES20.glGenTextures(1, textureArray, 0);
int oesTextureId = textureArray[0];
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, oesTextureId);
GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
checkNoGLES2Error("generateTexture");
SurfaceTexture surfaceTexture = new SurfaceTexture(oesTextureId);
surfaceTexture.setOnFrameAvailableListener(new SurfaceTexture.OnFrameAvailableListener() {
  @Override
  public void onFrameAvailable(SurfaceTexture surfaceTexture) {
    surfaceTexture.updateTexImage();
    final float[] transformMatrix = new float[16];
    surfaceTexture.getTransformMatrix(transformMatrix);
    onTextureFrameAvailable(oesTextureId, transformMatrix);
  }
});
final android.hardware.Camera camera;
try {
  camera = android.hardware.Camera.open(cameraId);
} catch (RuntimeException e) {
  callback.onFailure(e.getMessage());
  return;
}
try {
  camera.setPreviewTexture(surfaceTexture);
} catch (IOException e) {
  camera.release();
  callback.onFailure(e.getMessage());
  return;
}

Then, in onTextureFrameAvailable(oesTextureId, transformMatrix), the texture is delivered to the renderer and the encoder separately.
For the renderer, the following GLSL shaders are used to draw the texture:
VERTEX_SHADER_STRING:
varying vec2 interp_tc;
attribute vec4 in_pos;
attribute vec4 in_tc;
uniform mat4 texMatrix;
void main() {
gl_Position = in_pos;
interp_tc = (texMatrix * in_tc).xy;
}

FRAGMENT_SHADER_STRING:
#extension GL_OES_EGL_image_external : require
precision mediump float;
varying vec2 interp_tc;
uniform samplerExternalOES oes_tex;
void main() {
gl_FragColor = texture2D(oes_tex, interp_tc);
}

The encoder, meanwhile, converts the texture into YUV data, wraps it as an I420 frame, and encodes it.
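As a side note, the (texMatrix * in_tc).xy step in the vertex shader above is plain column-major matrix math on the coordinates that SurfaceTexture.getTransformMatrix() returns. A minimal CPU sketch (TexMatrixSketch and its multiply helper are hypothetical names, not WebRTC code):

```java
// Mirrors `interp_tc = (texMatrix * in_tc).xy` from the vertex shader,
// using OpenGL's column-major 4x4 layout (the same layout that
// SurfaceTexture.getTransformMatrix() fills in).
public class TexMatrixSketch {
  // Multiply a column-major 4x4 matrix by a 4-component vector.
  static float[] multiply(float[] m, float[] v) {
    float[] r = new float[4];
    for (int i = 0; i < 4; i++) {
      r[i] = m[i] * v[0] + m[4 + i] * v[1] + m[8 + i] * v[2] + m[12 + i] * v[3];
    }
    return r;
  }

  public static void main(String[] args) {
    // A typical SurfaceTexture transform: flip the t (vertical) axis, t' = 1 - t.
    float[] flipT = {
      1,  0, 0, 0,
      0, -1, 0, 0,
      0,  0, 1, 0,
      0,  1, 0, 1
    };
    float[] tc = {0.25f, 0.75f, 0f, 1f}; // in_tc with w = 1
    float[] out = multiply(flipT, tc);
    System.out.println(out[0] + ", " + out[1]); // prints "0.25, 0.25"
  }
}
```

This is why the shader multiplies by texMatrix instead of using the raw attribute: the camera frame in the OES texture may be flipped or cropped, and the matrix maps quad coordinates to the correct region.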
To add effects to the video, the idea is to copy the OES texture into an offscreen normal (GL_TEXTURE_2D) texture before onTextureFrameAvailable(oesTextureId, transformMatrix) is called, and then apply the effects to that offscreen texture. All of this is done. But if the render and encoder paths are to stay unchanged, I need to copy the modified offscreen texture back into another OES texture and deliver that to onTextureFrameAvailable for rendering and encoding, and I don't know how to do that. Can anyone help?
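For reference, the "copy the OES texture to an offscreen normal texture" step described above looks roughly like this (a sketch only, assuming a current EGL context; width, height, and the full-screen-quad draw call come from the surrounding code and are placeholders here):

```java
// Create an offscreen GL_TEXTURE_2D that will receive the camera frame.
int[] ids = new int[1];
GLES20.glGenTextures(1, ids, 0);
int rgbTextureId = ids[0];
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, rgbTextureId);
GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, width, height, 0,
    GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null);
GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);

// Attach it to a framebuffer as the color attachment.
GLES20.glGenFramebuffers(1, ids, 0);
int fboId = ids[0];
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fboId);
GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0,
    GLES20.GL_TEXTURE_2D, rgbTextureId, 0);

// Draw a full-screen quad here, sampling oesTextureId with the OES shaders
// shown earlier; afterwards rgbTextureId holds the frame and effects can be
// rendered on top of it.
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
```

Note the reverse direction is the hard part being asked about: a GL_TEXTURE_2D cannot simply be bound as GL_TEXTURE_EXTERNAL_OES, which is why copying back into an OES texture is not symmetric with this snippet.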
If there is no such method, then I must change the renderer and the encoder to accept the new offscreen texture, which seems like a bad idea (the effects could no longer be added and removed flexibly). Please confirm.
--
You received this message because you are subscribed to the Google Groups "discuss-webrtc" group.