Can I edit video frames before they are sent?


Cheyenne Forbes

Oct 4, 2017, 4:43:18 PM
to discuss-webrtc
I want to add filters to video streams. Is it possible to edit video frames and add filters such as black and white before they are sent to the other person?

Cheyenne Forbes

Oct 7, 2017, 3:28:46 PM
to discuss-webrtc
Anyone?

Philipp Hancke

Oct 7, 2017, 3:41:53 PM
to WebRTC-discuss

You received this message because you are subscribed to the Google Groups "discuss-webrtc" group.
To unsubscribe from this group and stop receiving emails from it, send an email to discuss-webrtc+unsubscribe@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/discuss-webrtc/2655a2f5-446b-4ac4-8aac-8856aafcdfd8%40googlegroups.com.

For more options, visit https://groups.google.com/d/optout.

Cheyenne Forbes

Oct 7, 2017, 5:51:30 PM
to discuss-webrtc
No, I mean something like editing the libjingle library and using a Java or Objective-C image/video filtering library to add filters to the camera stream.



Silvia Pfeiffer

Oct 8, 2017, 4:44:35 PM
to discuss...@googlegroups.com
It's certainly possible; browsers do it for WebRTC.

If you want to do it natively rather than in the browser, you have to replicate what the browsers do with canvas etc. You're basically outside WebRTC with such an effort. You'll have to study the browser source code to learn how it's done.

Best Regards,
Silvia.


Eric Davies

Oct 8, 2017, 6:47:59 PM
to discuss-webrtc
It would depend on how competent you are in OpenGL.

I can only speak for the Android API, not the iOS API, at this point, but what you can do is:
1. Create a camera VideoTrack as usual.
2. Set up your own custom VideoRenderer to capture frames from the local camera media stream's video track. The frames usually represent the raster image as a texture so it can stay in the GPU.
3. Derive your own texture from each frame, applying your filter along the way.
4. Create your own custom VideoCapturer object that simply passes along that texture as a frame.
5. Use the VideoCapturer to build a VideoSource, use the VideoSource to build a VideoTrack, and add that VideoTrack to a local MediaStream.
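To illustrate the per-frame editing in step 3 with something self-contained: when a frame is available as a CPU-side I420 buffer rather than a GPU texture, the black-and-white filter from the original question reduces to neutralizing the two chroma planes, since the Y plane already holds the grayscale image. This is a minimal sketch; the class and method names are hypothetical and not part of the WebRTC API.

```java
import java.util.Arrays;

// Sketch of a black-and-white filter on a CPU-side I420 buffer.
// In I420, the Y plane holds brightness and the U/V planes hold color;
// filling U and V with the neutral value 128 yields a grayscale frame.
// This class and its method are illustrative, not part of org.webrtc.
public class GrayscaleFilter {

    /** Neutralize the chroma planes of an I420 frame in place. */
    public static void toBlackAndWhite(byte[] y, byte[] u, byte[] v) {
        // The Y plane is untouched: it already is the grayscale image.
        Arrays.fill(u, (byte) 128); // flatten blue-difference chroma
        Arrays.fill(v, (byte) 128); // flatten red-difference chroma
    }

    public static void main(String[] args) {
        int width = 4, height = 4;
        byte[] y = new byte[width * height];     // luma plane, full resolution
        byte[] u = new byte[width * height / 4]; // chroma planes are
        byte[] v = new byte[width * height / 4]; // subsampled 2x2 in I420
        Arrays.fill(y, (byte) 200);
        Arrays.fill(u, (byte) 50);               // a strongly tinted frame
        Arrays.fill(v, (byte) 220);

        toBlackAndWhite(y, u, v);
        // Luma is preserved, chroma is neutral:
        System.out.println((y[0] & 0xFF) + " " + (u[0] & 0xFF) + " " + (v[0] & 0xFF));
    }
}
```

The same idea carries over to the texture path: a fragment shader that samples the camera texture, converts to luminance, and writes it back does the equivalent work on the GPU without copying the frame to the CPU.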


deni...@webrtc.org

Oct 9, 2017, 6:37:29 AM
to discuss-webrtc
Similarly, on iOS you can use RTCCameraVideoCapturer and a chain of delegates to manipulate RTCVideoFrame objects (which contain the raw buffers).