First, I tried to create a new video track containing both the drawing canvas AND the frozen frame to draw on, but I couldn't get it to work. When creating a video track with peerConnectionFactory.createVideoTrack("ARDAMSv1_" + rand, videoSource); I have to specify the track's video source, but that source can only be a VideoSource, and a VideoSource can only be created from a VideoCapturer, which is tied directly to a device camera (with no drawing on it, of course). This explains why User2 is not seeing any drawing on his device.
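To make the constraint concrete, this is roughly the standard chain in the 1.0.x Android WebRTC library (a sketch, not copy-paste code: exact signatures vary between builds, and createCameraCapturer is a hypothetical helper wrapping Camera1Enumerator):

```java
// Standard chain: camera capturer -> VideoSource -> VideoTrack.
// There is no obvious place in this chain to inject a drawing overlay.
VideoCapturer capturer = createCameraCapturer(new Camera1Enumerator(false)); // hypothetical helper
VideoSource videoSource = peerConnectionFactory.createVideoSource(capturer);
VideoTrack localTrack =
    peerConnectionFactory.createVideoTrack("ARDAMSv1_" + rand, videoSource);
capturer.startCapture(640, 480, 30); // frames now flow camera -> source -> track
```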
My question here is: how can I create a VideoCapturer that streams the camera feed (the frozen frame) AND a canvas with the drawing on it?
So I tried to implement my own VideoCapturer to either:
1) Capture a View (for example, the layout containing the drawing and the frozen frame) and stream it as the VideoSource,
OR 2) Capture the camera frames but also draw onto each frame before streaming it.
I couldn't make either approach work because I have no idea how to manipulate the I420Frame object to draw on it and return it through the right callback.
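For approach 1), the missing piece is (I believe) the pixel-format conversion: render the layout into a Bitmap with view.draw(new Canvas(bitmap)), read its pixels with getPixels, convert the packed ARGB array to I420 (the planar YUV layout that the old frame callbacks expect), and hand the bytes to the capturer's observer. Below is a minimal, self-contained ARGB-to-I420 converter using the standard integer BT.601 formulas; the class name is mine, not from any library:

```java
// Hypothetical helper: converts a packed ARGB pixel array (as returned by
// Bitmap.getPixels) into an I420 byte array (full Y plane, then quarter-size
// U plane, then quarter-size V plane). Width and height should be even.
public final class YuvConverter {
    public static byte[] argbToI420(int[] argb, int width, int height) {
        int frameSize = width * height;
        byte[] i420 = new byte[frameSize * 3 / 2];
        int yIndex = 0;
        int uIndex = frameSize;               // U plane starts after Y plane
        int vIndex = frameSize + frameSize / 4; // V plane starts after U plane
        for (int j = 0; j < height; j++) {
            for (int i = 0; i < width; i++) {
                int p = argb[j * width + i];
                int r = (p >> 16) & 0xff;
                int g = (p >> 8) & 0xff;
                int b = p & 0xff;
                // Integer BT.601 RGB -> YUV conversion.
                int y = ((66 * r + 129 * g + 25 * b + 128) >> 8) + 16;
                int u = ((-38 * r - 74 * g + 112 * b + 128) >> 8) + 128;
                int v = ((112 * r - 94 * g - 18 * b + 128) >> 8) + 128;
                i420[yIndex++] = (byte) Math.max(0, Math.min(255, y));
                // Chroma is subsampled 2x2: one U and one V per 2x2 pixel block.
                if (j % 2 == 0 && i % 2 == 0) {
                    i420[uIndex++] = (byte) Math.max(0, Math.min(255, u));
                    i420[vIndex++] = (byte) Math.max(0, Math.min(255, v));
                }
            }
        }
        return i420;
    }
}
```

A custom VideoCapturer could then run this conversion on a timer and pass the resulting bytes to the CapturerObserver it received in initialize() — the exact observer method (e.g. a byte-buffer frame callback in older builds) differs between WebRTC versions, so check the CapturerObserver interface shipped with 1.0.19742.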
Maybe I am totally wrong with this approach and need to do something completely different; I am open to any suggestion. PS: I am using Android API 25 with WebRTC 1.0.19742, and I do NOT want to use any paid third-party SDK/lib.
Does anyone have a clue how to achieve simple WebRTC live drawing from one Android app to another?