Multiprocess rendering


S. Litherum

Oct 9, 2012, 9:03:38 PM
to android-...@googlegroups.com
Hello,
I'm curious about the state of multiprocess rendering in Android. In particular, I'm interested in creating an Application with a SurfaceView that interacts with a Service that runs GL commands. This architecture is similar to Chrome's GPU process, and I'm interested in this design for the same reasons Chrome is.

I know that one way to implement this is with the GraphicBuffer wrapper around gralloc. This works because the namespace that gralloc operates on is shared among processes, so if one process creates a GraphicBuffer, another process can use that as a texture to show whatever the first process renders, by way of an EGLImage. (Please correct me if I'm wrong)

However, the GraphicBuffer API isn't one of the NDK's "Stable APIs." Indeed, it appears to have been removed in recent versions of Android.

Hopefully, there is (or will be) a way to simply hand an opaque window handle (like an XID) to another process and let that process create an onscreen EGL context around it. Is there a way to do this? If not, are there plans to implement something like this?

Is there a best practice for multiprocess rendering?

Thanks,
Myles C. Maxfield

Romain Guy

Oct 9, 2012, 9:05:57 PM
to android-platform
Hi,

You can simply use a SurfaceTexture to achieve this. SurfaceTexture does pretty much what you've described, but it's exposed as an SDK API.
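The consumer side can be sketched roughly as follows. This is a minimal illustration, not from the original thread: the class name and texture id are hypothetical, and it assumes a GL context is already current on the thread that will call updateTexImage(), with textureId generated via glGenTextures and sampled as GL_TEXTURE_EXTERNAL_OES.

```java
import android.graphics.SurfaceTexture;

// Hypothetical consumer: wraps a GL texture name in a SurfaceTexture so that
// frames queued by a producer (possibly in another process) appear as that
// texture's contents.
public class FrameConsumer implements SurfaceTexture.OnFrameAvailableListener {
    private final SurfaceTexture surfaceTexture;

    public FrameConsumer(int textureId) {
        // textureId must be bound as GL_TEXTURE_EXTERNAL_OES when sampled.
        surfaceTexture = new SurfaceTexture(textureId);
        surfaceTexture.setOnFrameAvailableListener(this);
    }

    public SurfaceTexture getSurfaceTexture() {
        return surfaceTexture;
    }

    @Override
    public void onFrameAvailable(SurfaceTexture st) {
        // Called when the producer queues a new frame. Latch the frame on the
        // GL thread before drawing with it:
        //   surfaceTexture.updateTexImage();
    }
}
```

Note that updateTexImage() must be called on the thread where the GL context is current, not necessarily the thread that receives onFrameAvailable().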



--
Romain Guy
Android framework engineer
roma...@android.com

Myles C. Maxfield

Oct 10, 2012, 8:04:49 PM
to android-...@googlegroups.com
Thanks for the quick reply! I read the SurfaceTexture docs, and it seems pretty straightforward to use from the receiver's end.

The documentation describes the image stream as coming from either a 'Camera' object or a 'MediaPlayer' object. Is there any documentation on how those classes write to an image stream that is connected to a SurfaceTexture? Bonus points if the writer is in another process. (In addition, it would be interesting to see if/how this can be done from native code on either/both ends).

Android is open source, so I can look up the code for those classes myself, but I was hoping there was something out there you know off the top of your head that's a little friendlier to the uninitiated.
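For what it's worth, the producer side can be sketched in SDK terms: a Surface can be constructed around a SurfaceTexture, and anything that draws into that Surface feeds the texture. Surface is also Parcelable, so the handle can be sent to another process over Binder/AIDL and drawn into there. The class and method names below are hypothetical illustrations, not from the thread.

```java
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.SurfaceTexture;
import android.view.Surface;

// Hypothetical producer: draws one frame into a Surface backed by the
// consumer's SurfaceTexture. Camera and MediaPlayer do the native equivalent
// of this when handed a SurfaceTexture.
public class FrameProducer {
    public static void drawOnce(SurfaceTexture consumerTexture) {
        Surface surface = new Surface(consumerTexture);
        Canvas canvas = surface.lockCanvas(null);
        try {
            canvas.drawColor(Color.BLUE); // the frame's content
        } finally {
            // Queues the frame; the consumer's onFrameAvailable fires.
            surface.unlockCanvasAndPost(canvas);
        }
        surface.release();
    }
}
```

For the cross-process case, the Surface (rather than the SurfaceTexture) is what would be passed through an AIDL interface; the remote process draws into it and the frames arrive in the local SurfaceTexture.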

Thanks,
Myles