Hi!
I am trying to create an Android app that uses the libjingle WebRTC native Android library to project the user's Android screen to a peer over WebRTC. To that end, I have successfully used the
pristine.io libjingle mirror to recreate the Android apprtc example application using:
compile 'io.pristine:libjingle:10531@aar'
in my build.gradle file. The apprtc example works fine with the
https://apprtc.appspot.com/ demo web site. I have also created a separate app that records the user's screen to an H.264-encoded mp4 file using the MediaProjection API introduced in Android API 21.
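For context, the recording side currently looks roughly like this (simplified; error handling and the MediaProjectionManager permission Intent flow are omitted, and the size/density values are placeholders):

```java
// Sketch of my screen-recording setup (API 21+).
// Assumes mediaProjection was already obtained from
// MediaProjectionManager after the user's consent dialog.
MediaRecorder recorder = new MediaRecorder();
recorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
recorder.setVideoSize(1280, 720);
recorder.setVideoFrameRate(30);
recorder.setOutputFile(outputPath); // e.g. a file under getExternalFilesDir()
recorder.prepare();

// Mirror the screen into the recorder's input surface.
VirtualDisplay display = mediaProjection.createVirtualDisplay(
        "ScreenCapture",
        1280, 720, screenDensityDpi,
        DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR,
        recorder.getSurface(),
        null, null);

recorder.start();
```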
Now, I would like to marry these two ideas into an app that uses the H.264-encoded file as the video/audio stream for the WebRTC PeerConnection. Is this possible? The PeerConnection.addStream method expects an instance of MediaStream. How can you create an object of type MediaStream from an mp4 file?
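For reference, the working apprtc-style code creates its MediaStream from the camera roughly like this (simplified; the "ARDAMS" track/stream labels follow the apprtc demo, and frontCameraName is a placeholder for a camera device name). What I can't figure out is how to replace the VideoCapturerAndroid-backed source with one backed by the mp4 file:

```java
// Simplified camera-based stream setup from the libjingle apprtc example.
PeerConnectionFactory factory = new PeerConnectionFactory();

// Camera-backed video source and track.
VideoCapturer capturer = VideoCapturerAndroid.create(frontCameraName);
VideoSource videoSource = factory.createVideoSource(capturer, new MediaConstraints());
VideoTrack videoTrack = factory.createVideoTrack("ARDAMSv0", videoSource);

// Microphone-backed audio source and track.
AudioSource audioSource = factory.createAudioSource(new MediaConstraints());
AudioTrack audioTrack = factory.createAudioTrack("ARDAMSa0", audioSource);

// Bundle the tracks into a local MediaStream and hand it to the PeerConnection.
MediaStream localStream = factory.createLocalMediaStream("ARDAMS");
localStream.addTrack(videoTrack);
localStream.addTrack(audioTrack);

peerConnection.addStream(localStream); // works for the camera, but I need this from an mp4
```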
Thank you for any insight you might be able to provide!
Robert