Hello,
I also could not understand the implementation of VideoCapturer, so I changed direction and am trying to do screen sharing from Android using the MediaProjection API introduced in Android 5.0. It is not a good solution, as it continuously takes snapshots of the Android screen and sends them over a DataChannel to the other peer; on that peer, the continuously changing images give the feel of a shared screen.
I know it is just a workaround and not an efficient solution; I will switch once an AndroidScreenCapturer is provided by the WebRTC team.
My work is still under construction and not complete, but it may give you some direction (I am stuck on a silly issue where the images sent from Android are not being rendered on the canvas on the web side). The following question I asked might give you a hint about what I am trying to do:
Also, if you find a solution to this, please do answer :)
The SO question discusses the complete code, which captures the Android screen, takes snapshots continuously, and sends them over the data channel. The idea was taken from this open-source sample code, which uses the TemaSys SDK to share the screen from Android; I just did not want to use their SDK, so I tried to do it myself.
You might like to see the sample code I use to render the images sent from Android on the web. This is the part where I am having issues:
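For what it is worth, one common reason images fail to arrive intact is that DataChannel messages are size-limited (around 16 KB is the safe cross-browser ceiling), so each snapshot has to be split into chunks and reassembled before it can be drawn. Here is a minimal sketch of the web side under that assumption. All names (`makeFrameAssembler`, `drawFrame`) and the framing protocol (a JSON text message `{"size": N}` announcing a frame, followed by binary chunks totalling N bytes) are my own assumptions, not anything from the SO question:

```javascript
// Reassembles one JPEG frame from size-limited DataChannel chunks.
// Assumed protocol: a JSON text header {"size": N}, then binary chunks
// whose lengths add up to N. onFrame receives the complete Uint8Array.
function makeFrameAssembler(onFrame) {
  let expected = 0;
  let chunks = [];
  let received = 0;
  return function handleMessage(data) {
    if (typeof data === "string") {       // header: start of a new frame
      expected = JSON.parse(data).size;
      chunks = [];
      received = 0;
      return;
    }
    const bytes = new Uint8Array(data);   // binary chunk (ArrayBuffer)
    chunks.push(bytes);
    received += bytes.length;
    if (expected > 0 && received >= expected) {
      const frame = new Uint8Array(received);
      let offset = 0;
      for (const c of chunks) { frame.set(c, offset); offset += c.length; }
      onFrame(frame);
      expected = 0;                       // wait for the next header
    }
  };
}

// Browser-only part: decode the JPEG bytes and paint them on a canvas.
function drawFrame(canvas, jpegBytes) {
  const blob = new Blob([jpegBytes], { type: "image/jpeg" });
  const url = URL.createObjectURL(blob);
  const img = new Image();
  img.onload = () => {
    canvas.getContext("2d").drawImage(img, 0, 0, canvas.width, canvas.height);
    URL.revokeObjectURL(url);             // avoid leaking object URLs
  };
  img.src = url;
}
```

Wiring it up would look something like `channel.binaryType = "arraybuffer"; const asm = makeFrameAssembler(f => drawFrame(canvas, f)); channel.onmessage = e => asm(e.data);`. If the sender pushes whole images as single oversized messages instead, the channel can silently close, which would also explain nothing appearing on the canvas.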
Once again, I know it is not a good solution, but it is a workable workaround until this feature is officially supported on Android by the WebRTC team. Kindly also let me know if you succeed at this, as I am still unable to render the sent images on the web.
Regards