Can You Use Output of Android MediaProjection API as Source for org.webrtc.MediaStream Instance?

Robert Brinson

Dec 3, 2015, 4:03:54 PM
to discuss-webrtc
Hi!

I am trying to create an Android app that utilizes the libjingle WebRTC native Android library to project the user's Android screen to a peer using WebRTC. To that end, I have successfully used the pristine.io libjingle mirror to recreate the Android apprtc example application using:

compile 'io.pristine:libjingle:10531@aar'

in my build.gradle file. The apprtc example works fine with the https://apprtc.appspot.com/ demo web site. I have also created a separate app that records the user's screen to an H.264-encoded mp4 file using the MediaProjection API introduced in Android API 21.

Now, I would like to marry these two ideas into an app that uses the H.264-encoded file as the video/audio stream for the WebRTC PeerConnection. Is this possible? The PeerConnection.addStream method expects an instance of MediaStream. How can you create an object of type MediaStream from an mp4 file?
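
For reference, the local-stream setup I am talking about looks roughly like this, a sketch against the 2015-era libjingle Java API (signatures vary by revision; "factory" and "peerConnection" are assumed to already exist, and the IDs and camera device name are placeholders):

import org.webrtc.*;

// Sketch only: the apprtc-style local stream setup.
void addLocalStream(PeerConnectionFactory factory, PeerConnection peerConnection) {
    VideoCapturer capturer = VideoCapturerAndroid.create("front camera name");
    VideoSource videoSource = factory.createVideoSource(capturer, new MediaConstraints());
    VideoTrack videoTrack = factory.createVideoTrack("ARDAMSv0", videoSource);

    MediaStream stream = factory.createLocalMediaStream("ARDAMS");
    stream.addTrack(videoTrack);
    peerConnection.addStream(stream);
    // As far as I can tell there is no factory method that builds a MediaStream
    // from a file; streams are assembled from tracks backed by live sources.
}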

Thank you for any insight you might be able to provide!

Robert

Christoffer Jansson

Dec 4, 2015, 3:40:29 AM
to discuss-webrtc
Hi,

Unfortunately I do not know enough to provide you with a solution or workaround; however, I have answered inline with some pointers.
You would need the raw frames from the MediaProjection API (if it can provide them?); you do not want to re-encode already encoded frames. Also, writing to and reading from disk is a big no-no if you want real-time video.
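
As a rough, untested illustration: one way to get raw, unencoded frames out of MediaProjection is to point the virtual display at an ImageReader instead of a MediaRecorder/MediaCodec surface that writes an mp4. The display name, sizes, and buffer count below are placeholders.

import android.graphics.PixelFormat;
import android.hardware.display.DisplayManager;
import android.hardware.display.VirtualDisplay;
import android.media.Image;
import android.media.ImageReader;
import android.media.projection.MediaProjection;
import android.os.Handler;
import android.os.Looper;

// Sketch only (API 21+): raw RGBA screen frames delivered in memory,
// rather than an encoded file written to disk.
public class RawScreenCapture {
    private VirtualDisplay virtualDisplay;

    public void start(MediaProjection projection, int width, int height, int dpi) {
        final ImageReader reader =
                ImageReader.newInstance(width, height, PixelFormat.RGBA_8888, 2);
        reader.setOnImageAvailableListener(new ImageReader.OnImageAvailableListener() {
            @Override
            public void onImageAvailable(ImageReader r) {
                Image image = r.acquireLatestImage();
                if (image != null) {
                    // image.getPlanes()[0].getBuffer() holds the raw RGBA pixels;
                    // this is what a WebRTC capturer would convert (e.g. to I420)
                    // and hand to its frame observer.
                    image.close();
                }
            }
        }, new Handler(Looper.getMainLooper()));

        // Route the projection into the ImageReader's surface instead of an encoder.
        virtualDisplay = projection.createVirtualDisplay(
                "screen-share", width, height, dpi,
                DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR,
                reader.getSurface(), null, null);
    }
}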

Ron Aaron

Jul 3, 2018, 5:43:39 AM
to discuss-webrtc
Hi, Robert -

This is very similar to what I want to accomplish.  Did you find a way to do it?

Vishal Dalsania

Jul 3, 2018, 6:22:57 AM
to discuss-webrtc
Your answer is ScreenCapturerAndroid.java.

It is part of the WebRTC code base for Android and can stream the Android screen.
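
Roughly how it plugs in (a sketch, not tested, against recent versions of the org.webrtc Android API; exact signatures have changed between releases, and "permissionData", "factory", the EglBase instance, and the capture resolution are placeholders or assumed to exist already):

import android.content.Context;
import android.content.Intent;
import android.media.projection.MediaProjection;
import org.webrtc.*;

// Sketch: wire ScreenCapturerAndroid into a VideoSource/VideoTrack.
VideoTrack createScreenTrack(Context appContext, Intent permissionData,
                             EglBase eglBase, PeerConnectionFactory factory) {
    VideoCapturer capturer = new ScreenCapturerAndroid(
            permissionData, new MediaProjection.Callback() {
                @Override
                public void onStop() {
                    // The user revoked the screen-capture permission; stop streaming here.
                }
            });

    SurfaceTextureHelper helper =
            SurfaceTextureHelper.create("ScreenCaptureThread", eglBase.getEglBaseContext());
    VideoSource source = factory.createVideoSource(capturer.isScreencast());
    capturer.initialize(helper, appContext, source.getCapturerObserver());
    capturer.startCapture(1280, 720, 30);

    return factory.createVideoTrack("screen0", source);
}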

Ron Aaron

Jul 3, 2018, 6:27:38 AM
to discuss-webrtc
Ah, but I want to capture audio (e.g. from a file, or some other source) and stream it.

Vishal Dalsania

Jul 3, 2018, 6:33:03 AM
to discuss-webrtc
I don't have a solution at hand, but if you look at PeerConnectionClient.java for Android you should be able to figure out how to do that.


Look for addStream method calls.
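
They look roughly like this (a sketch of the pattern in PeerConnectionClient.java; the track/stream IDs are just the ones the apprtc example happens to use). Note that the AudioSource is fed by WebRTC's own microphone capture; as far as I know the public Java API does not expose an obvious way to feed it from a file instead.

import org.webrtc.*;

// Sketch: audio + video tracks assembled into the stream passed to addStream().
void addLocalMediaStream(PeerConnectionFactory factory, PeerConnection peerConnection,
                         VideoTrack screenVideoTrack) {
    AudioSource audioSource = factory.createAudioSource(new MediaConstraints());
    AudioTrack audioTrack = factory.createAudioTrack("ARDAMSa0", audioSource);

    MediaStream stream = factory.createLocalMediaStream("ARDAMS");
    stream.addTrack(audioTrack);
    stream.addTrack(screenVideoTrack); // e.g. the ScreenCapturerAndroid-backed track
    peerConnection.addStream(stream);
}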

Ron Aaron

Jul 3, 2018, 6:42:32 AM
to discuss-webrtc
Thank you.

But those are "org.webrtc.MediaStream" etc. I don't see any path to attaching an "android.media.AudioTrack" or similar to an org.webrtc.MediaStream (or to creating such a stream from a regular file, etc.).

Am I simply missing something?