Using Android 5.0 android.media.projection API for Screen Sharing


Sojharo Mangi

Mar 9, 2015, 10:02:51 AM
to discuss...@googlegroups.com
Hello,

For my Android app, I have to implement screen sharing with WebRTC.

While searching the Internet, I came to know about the new screen capturing API in Android 5.0.


I have the following questions about this API:

1. Can we use this API with WebRTC in Android application?

2. How compatible will it be with WebRTC?

3. It has some restrictions, e.g. it requires apps to display a dialog box before capturing the screen content. If the user presses a button in the app to share the screen, why show a separate permission dialog? Would it be a better option, instead of using this API, to go to the Android NDK and build our own screen capturer?

4. What other problems or concerns should we expect if we use it in a production app?

In one of my posts in this group, Alexandre Gouaillard had suggested the following:

"In a native application, if you manage to implement a capturer that is compatible with a native stack (e.g. inherit from desktopCapturer in the webrtc.org source code) then you can use the rest of the stack and send your desktop as any other video track."

Here is the link to the post: 


Some questions regarding the above suggestion:

1. What freedoms will I have with this option (compared to the Android 5.0 API), e.g. avoiding the restriction of asking for the user's permission via a separate dialog box?

2. Going by Alexandre's comment, do I have to rebuild WebRTC for Android if I inherit from WebRTC's desktopCapturer (I already have libjingle_peerconnection.jar built), or can I build desktopCapturer separately?

3. What are the drawbacks of using some other third-party open-source screen capturer written in C++ exclusively for Android, given that the desktopCapturer in the WebRTC source is very generic?

4. How much effort would it take to implement WebRTC's desktopCapturer compared to implementing the Android 5.0 screen sharing API?




--
Regards,
 
Sojharo


Jiayang Liu

Mar 12, 2015, 12:17:07 PM
to discuss...@googlegroups.com
WebRTC does not provide a desktopCapturer implementation for Android, which means you need to provide the implementation using whichever OS-supported API is available.

I think what Alexandre meant is: if you write an implementation of desktopCapturer for Android, you can easily use it with the WebRTC stack, e.g. to feed the stream into PeerConnection. But if you do not intend to use the rest of the WebRTC stack, there is no point in implementing desktopCapturer.

Benjamin Hamrick

Nov 23, 2015, 9:58:04 AM
to discuss-webrtc
Can you elaborate on how this could be done easily? I am trying to send the screen capture from the android.media.projection API over WebRTC.

Sethuraman Ramachandran

Apr 28, 2016, 8:01:03 PM
to discuss-webrtc
@Benjamin - I have the same question.
I am trying to build a screen-sharing feature in my Cordova-based Android app. I would like to use WebRTC.
Were you able to achieve this? 

Sethuraman Ramachandran

Apr 28, 2016, 8:01:03 PM
to discuss-webrtc
@Sojharo - did you find the answers you were looking for? I am looking to do something similar. Thanks

Sojharo Mangi

Apr 29, 2016, 3:57:51 PM
to discuss-webrtc
hi,

I was able to do screen sharing from Android to the web or to another Android device, but not exactly in the way that Liu explained. It is late at night here; I will write a detailed post on how I did it.

--

---
You received this message because you are subscribed to the Google Groups "discuss-webrtc" group.
To unsubscribe from this group and stop receiving emails from it, send an email to discuss-webrt...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/discuss-webrtc/b1239cf3-6aa8-4e45-91b0-6ea5a9f13114%40googlegroups.com.

For more options, visit https://groups.google.com/d/optout.



--
Regards,
 
Sojharo


Sethuraman Ramachandran

Apr 30, 2016, 1:58:31 AM
to discuss...@googlegroups.com

Thanks Sojharo. Appreciate it.


Sojharo Mangi

Apr 30, 2016, 9:01:26 AM
to discuss-webrtc
Hello,

To continue with my previous post: I used the Android Media Projection API along with a WebRTC DataChannel to implement screen sharing from Android to the web.

The Media Projection API gave me snapshots from a stream of screen captures. I then used the DataChannel to send these snapshots to the web, where they were rendered on a canvas. This gave the feeling of the Android screen being shared.

See the following links for how an image is sent over the data channel and then received and rendered on the web:



For how Android uses the Media Projection API to capture snapshots and send them as images to the web, see the following code:






--
Regards,
 
Sojharo


Sethuraman Ramachandran

May 2, 2016, 1:10:37 AM
to discuss...@googlegroups.com
@Sojharo - I will check this out today and email if I have any queries. Thanks

Sethuraman Ramachandran

May 9, 2016, 9:24:32 AM
to discuss...@googlegroups.com, Neeraj Rathore
@Sojharo - I tried this out, and what is happening is that on Android, the sendData call seems to work fine, but on the other end, where I have HTML/JS listening in a laptop browser (using Skylink web), I am not getting any events.
If, in the same code, I send out a message, I am able to receive it on the other end. Only for data is there no event.
We tried experimenting and receiving on another Android device instead, which worked!
So what is going wrong? We also saw one of your other posts:

Did you face similar problems sending from Android to web/desktop? What am I doing wrong? Any pointers would be helpful.
Thanks, Seth

Sojharo Mangi

May 9, 2016, 3:42:26 PM
to discuss-webrtc, Neeraj Rathore
Yes, we had faced similar problems. The image was not shown in the HTML in a browser.

We send the image as several binary chunks from Android. When one image has been fully sent, we send a string (it is basically the size of the next image in byteLength) and then send the binary chunks for the next image. We send the string so that the web side knows that the following binary chunks belong to the next image.

The following JS snippet should help:

(On an incoming DataChannel message, e.g. inside channel.onmessage = function (event) { ... })

        if (typeof event.data === 'string') {
          // String message: the byte length of the next image.
          buf = new Uint8ClampedArray(parseInt(event.data));
          count = 0;
          chunks = [];
          console.log('Expecting a total of ' + buf.byteLength + ' bytes');
          return;
        }
        // Binary message: one chunk of the current image.
        var imgdata = new Uint8ClampedArray(event.data);
        console.log('image chunk');
        buf.set(imgdata, count);
        chunks.push(event.data);
        count += imgdata.byteLength;
        if (count === buf.byteLength) {
          // We're done: all data chunks have been received.
          var builder = new Blob(chunks, { type: 'image/jpeg' });
          console.log('full image received');
          screenViewer.src = URL.createObjectURL(builder);
        }

screenViewer is an image element.

Hope it helps.
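Since the thread mentions that receiving on another Android device also worked, here is a rough sketch of the equivalent receive-side reassembly in plain Java. The class and method names are my own, made up for illustration; it simply mirrors the string-header-then-binary-chunks protocol used by the JS snippet.

```java
import java.io.ByteArrayOutputStream;

// Reassembles one image from a string header ("<byteLength>")
// followed by binary chunks.
class SnapshotAssembler {
    private int expected = -1;
    private ByteArrayOutputStream buf = new ByteArrayOutputStream();

    // Called for each string message: announces the next image's size.
    void onHeader(String sizeText) {
        expected = Integer.parseInt(sizeText.trim());
        buf = new ByteArrayOutputStream();
    }

    // Called for each binary message; returns the complete image once
    // all expected bytes have arrived, or null while chunks are pending.
    byte[] onChunk(byte[] chunk) {
        buf.write(chunk, 0, chunk.length);
        if (expected >= 0 && buf.size() >= expected) {
            byte[] image = buf.toByteArray();
            expected = -1;
            return image; // e.g. decode with BitmapFactory.decodeByteArray
        }
        return null;
    }
}
```

The returned byte array can then be decoded and drawn, just as the web side feeds its Blob into an image element.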





--
Regards,
 
Sojharo


Sethuraman Ramachandran

May 11, 2016, 1:11:05 AM
to discuss...@googlegroups.com, Neeraj Rathore
Hi Sojharo - the link that goes to skylinkshare/MainActivity.java has an onImageAvailable method like this:
int rowPadding = rowStride - pixelStride * displayWidth;
// create bitmap
bitmap = Bitmap.createBitmap(displayWidth + rowPadding / pixelStride,
        displayHeight, Bitmap.Config.ARGB_8888);
bitmap.copyPixelsFromBuffer(buffer);
if (skylinkConnection != null && !TextUtils.isEmpty(currentRemotePeerId)) {
    stream = new ByteArrayOutputStream();
    bitmap.compress(Bitmap.CompressFormat.JPEG, 5, stream);
    skylinkConnection.sendData(currentRemotePeerId, stream.toByteArray());
    Log.d(TAG, "sending data to peer :" + currentRemotePeerId);
}
Here I do not see the image splitting/chunking code, and it also uses skylinkConnection.sendData.
Is this an older implementation? Could you please provide snippets of the current one? The reason I ask is that when I use sendData, I am not getting any event on the other end in the web JS.
Appreciate your help - Seth

Sojharo Mangi

May 17, 2016, 8:01:35 AM
to discuss-webrtc, Neeraj Rathore
Hi, sorry for the late reply. Please notice the following line in the above code:

skylinkConnection.sendData(currentRemotePeerId, stream.toByteArray());

They are converting the image to a byte array, i.e. what goes over the wire is the raw binary data of the image :)

We had modified the above code. They use their Skylink SDK to send data; we use a DataChannel to send the image data.
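As a rough sketch of that modification in plain Java: split the compressed JPEG bytes into DataChannel-sized pieces, and announce each image with a size-header string. The class name, method names, and the 16 KB chunk size here are my own illustrative assumptions, not the actual app's code.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

class SnapshotChunker {
    // Illustrative chunk size; real DataChannel message limits vary
    // by implementation, so a conservative value is assumed here.
    static final int CHUNK_SIZE = 16 * 1024;

    // Header announcing the byte length of the next image, sent as a
    // string so the receiver can tell it apart from binary chunks.
    static String header(byte[] jpeg) {
        return String.valueOf(jpeg.length);
    }

    // Split one snapshot's JPEG bytes into DataChannel-sized pieces.
    static List<byte[]> chunk(byte[] jpeg) {
        List<byte[]> chunks = new ArrayList<>();
        for (int off = 0; off < jpeg.length; off += CHUNK_SIZE) {
            chunks.add(Arrays.copyOfRange(jpeg, off,
                    Math.min(off + CHUNK_SIZE, jpeg.length)));
        }
        return chunks;
    }
}
```

The sender would then make one data-channel send call for the header string and one per chunk; the JS snippet earlier in the thread uses that header to know when a complete image has arrived.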





--
Regards,
 
Sojharo

