Hello,
I'm trying to use WebRTC in an Android app to transmit video that I obtain frame by frame.
I have already developed the receiving side, and it works for video transmitted from browsers using getUserMedia().
As far as I can see, the Android WebRTC API is focused on obtaining 'user media' through VideoTrack/VideoSource, building a MediaStream from the cameras available on the device. However, the video I want to pass to the PeerConnection does not come from a camera on the device; I receive it frame by frame.
So I wonder whether it is possible to convert the received frames into a MediaStream object, and if not, what would be a better/easier alternative for video transmission from an Android app?
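From browsing the org.webrtc classes, what I imagine is something like the sketch below: create a VideoSource that is not tied to a camera capturer and push my decoded frames into its CapturerObserver myself. This is untested and I am guessing at the calls (in particular `getCapturerObserver()` and `JavaI420Buffer.allocate()` are my reading of the API, not something I have working):

```java
// Untested sketch -- my guess at the org.webrtc calls, not verified code.
PeerConnectionFactory factory = PeerConnectionFactory.builder()
        .createPeerConnectionFactory();

// A VideoSource without a camera capturer behind it.
VideoSource videoSource = factory.createVideoSource(/* isScreencast = */ false);
VideoTrack videoTrack = factory.createVideoTrack("video0", videoSource);

MediaStream stream = factory.createLocalMediaStream("stream0");
stream.addTrack(videoTrack);
// stream (or the track directly) would then be added to the PeerConnection.

// For every frame I receive, wrap its pixels in an I420 buffer and
// hand it to the source's observer:
void onFrameReceived(byte[] y, byte[] u, byte[] v, int width, int height,
                     long timestampNs) {
    VideoFrame.I420Buffer buffer = JavaI420Buffer.allocate(width, height);
    // ... copy the Y/U/V planes into buffer.getDataY()/getDataU()/getDataV() ...
    VideoFrame frame = new VideoFrame(buffer, /* rotation = */ 0, timestampNs);
    videoSource.getCapturerObserver().onFrameCaptured(frame);
    frame.release();
}
```

Is something along these lines the intended approach, or is there a proper API for feeding external frames?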
Thank you all for your time. I hope this is clear enough, and sorry for any English mistakes.
Sergi.