Creating render surfaces for both image and video


ava...@gmail.com

May 6, 2014, 6:47:02 AM
to chromi...@chromium.org
Hi,

I am learning the architecture of Chrome's media pipeline by browsing the source, and I have reached the point where a VideoLayer is created (inside webmediaplayer_impl.cc).
The layer is then wrapped inside a WebLayer. Why, and what is really happening here?

Where is the actual surface created? What is the graphics context if Skia is used? Can someone point me to the code/file?
If I want to create an OpenGL surface, where do I put the code?

Thanks in advance.

Dana Jansens

May 6, 2014, 11:11:21 AM
to ava...@gmail.com, chromium-dev, graphics-dev
On Tue, May 6, 2014 at 6:47 AM, <ava...@gmail.com> wrote:
Hi,

I am learning the architecture of Chrome's media pipeline by browsing the source, and I have reached the point where a VideoLayer is created (inside webmediaplayer_impl.cc).
The layer is then wrapped inside a WebLayer. Why, and what is really happening here?

WebLayer is a wrapper around cc::Layer that is our API to blink. The declaration of the WebLayer abstract class lives in the blink repo so that blink can know about it, while the implementation lives in chromium, where it can know about cc::Layer.
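
Schematically, the split looks something like this (a simplified sketch, not the exact headers):

    // blink repo: abstract API, declared with no cc dependency.
    class WebLayer {
     public:
      virtual ~WebLayer() {}
      virtual void setOpaque(bool opaque) = 0;
      // ... more pure virtuals for layer properties ...
    };

    // chromium repo: the implementation is allowed to see cc::Layer.
    class WebLayerImpl : public WebLayer {
     public:
      explicit WebLayerImpl(scoped_refptr<cc::Layer> layer) : layer_(layer) {}
      virtual void setOpaque(bool opaque) OVERRIDE {
        layer_->SetContentsOpaque(opaque);
      }

     private:
      scoped_refptr<cc::Layer> layer_;
    };

So when WebMediaPlayerImpl creates its VideoLayer, it wraps it in the WebLayer implementation so that blink can hold and attach the layer without depending on cc.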

Where is the actual surface created? What is the graphics context if Skia is used? Can someone point me to the code/file?
If I want to create an OpenGL surface, where do I put the code?

ava...@gmail.com

May 8, 2014, 9:43:49 AM
to chromi...@chromium.org, ava...@gmail.com, graphics-dev
Thanks Dana. This seems a bit complicated and confusing.

I don't see the Canvas2D and WebGL examples below being used in the current video rendering path (HTMLMediaElement -> WebMediaPlayerImpl -> VideoLayer). How do I enable them?

If I need to create only an OpenGL surface and redirect rendering to that surface instead of the existing one, while still using the WebMediaPlayerImpl::GetCurrentFrame path (I am assuming this is the accelerated path), where does the code go?

What is the need for /skia/gpu/? I understood Skia to be software rendered only. What is the role of the GPU in Skia?

Thanks a lot in advance.

Dana Jansens

May 8, 2014, 11:41:34 AM
to avatar9, chromium-dev, graphics-dev
On Thu, May 8, 2014 at 9:43 AM, <ava...@gmail.com> wrote:
Thanks Dana. This seems a bit complicated and confusing.

I don't see the Canvas2D and WebGL examples below being used in the current video rendering path (HTMLMediaElement -> WebMediaPlayerImpl -> VideoLayer). How do I enable them?

They are not the "video" path, but the simpler OpenGL display path. Video does not work the same way at all; it has VideoFrames involved and such. You'd want to look at WebMediaPlayerImpl, VideoResourceUpdater, and friends.
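
The shape of that path is roughly this (paraphrased from memory; see cc/layers/video_frame_provider.h for the real interface):

    // The compositor pulls frames from the player through an interface
    // approximately like this:
    class VideoFrameProvider {
     public:
      class Client {
       public:
        virtual void StopUsingProvider() = 0;
        virtual void DidReceiveFrame() = 0;  // a new frame is available
      };
      virtual void SetVideoFrameProviderClient(Client* client) = 0;
      virtual scoped_refptr<media::VideoFrame> GetCurrentFrame() = 0;
      virtual void PutCurrentFrame(
          const scoped_refptr<media::VideoFrame>& frame) = 0;
    };

VideoResourceUpdater is then what turns each media::VideoFrame, whether it holds YUV planes in memory or a texture, into resources the compositor can draw.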

If I need to create only an OpenGL surface and redirect rendering to that surface instead of the existing one, while still using the WebMediaPlayerImpl::GetCurrentFrame path (I am assuming this is the accelerated path), where does the code go?

I'm not sure what you mean. Do you want to draw the video frame into a texture and give that to the compositor? Then you could use a TextureLayer, as canvas does, and the VideoFrame would just be one of your inputs.

WebMediaPlayerImpl::GetCurrentFrame is used for both software and hardware video frames; the VideoFrame structure can hold either.
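
If you did go the texture route, a sketch would look something like the following. This is hypothetical: DrawFrameToTexture() and MakeMailboxForTexture() are made-up helpers, and the TextureLayerClient signature is from memory.

    class VideoToTextureBridge : public cc::TextureLayerClient {
     public:
      // The compositor calls this when it wants fresh layer contents.
      virtual bool PrepareTextureMailbox(
          cc::TextureMailbox* mailbox,
          scoped_ptr<cc::SingleReleaseCallback>* release_callback,
          bool use_shared_memory) OVERRIDE {
        scoped_refptr<media::VideoFrame> frame = player_->GetCurrentFrame();
        if (!frame)
          return false;
        // Draw the frame into our own GL texture, then publish that
        // texture as a mailbox the compositor can consume.
        DrawFrameToTexture(frame, texture_id_);         // hypothetical
        *mailbox = MakeMailboxForTexture(texture_id_);  // hypothetical
        return true;
      }

     private:
      WebMediaPlayerImpl* player_;
      unsigned texture_id_;
    };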
 

What is the need for /skia/gpu/? I understood Skia to be software rendered only. What is the role of the GPU in Skia?
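
Skia is not only software rendered: /skia/gpu/ is "Ganesh", Skia's GPU backend, which executes the same SkCanvas drawing API through GL instead of the CPU rasterizer. A rough sketch (2014-era API, names approximate):

    // Build a GrContext on top of a GL context, then draw through it.
    GrContext* gr_context = GrContext::Create(
        kOpenGL_GrBackend, (GrBackendContext)gl_interface);
    SkSurface* surface = SkSurface::NewRenderTarget(gr_context, info);
    surface->getCanvas()->drawRect(rect, paint);  // rasterized on the GPU

This is what hardware-accelerated canvas uses, for example.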

ava...@gmail.com

May 21, 2014, 5:14:29 AM
to chromi...@chromium.org, avatar9, graphics-dev
Thanks Dana. It took me some time to understand the flow; however, I am still confused.
I am following your lead: "draw the video frame into a texture and give that to the compositor... use a TextureLayer, as canvas does, and the VideoFrame would just be one of your inputs".

I see that WebExternalTextureLayer is created from the compositor (web_compositor_support), and there is no interface to pass a VideoFrame as the input directly. Is it PrepareMailbox that can do this?
What is the idea behind the mailbox concept?

I am sure that I am missing something basic here. Can you help?

Off topic: is the Canvas2D bridge enabled by default, or should I enable some compile-time/runtime switch?

Thanks in advance.

Dana Jansens

May 21, 2014, 11:11:33 AM
to avatar9, chromium-dev, graphics-dev
On Wed, May 21, 2014 at 5:14 AM, <ava...@gmail.com> wrote:
Thanks Dana. It took me some time to understand the flow; however, I am still confused.
I am following your lead: "draw the video frame into a texture and give that to the compositor... use a TextureLayer, as canvas does, and the VideoFrame would just be one of your inputs".

I see that WebExternalTextureLayer is created from the compositor (web_compositor_support), and there is no interface to pass a VideoFrame as the input directly. Is it PrepareMailbox that can do this?

If you pass a texture (mailbox), you'd use a cc::TextureLayer via WebExternalTextureLayer. If you want to pass a VideoFrame, then you'd use a cc::VideoLayer. See WebMediaPlayerImpl, VideoFrameProviderClientImpl, and VideoLayerImpl for how this works. Also see http://www.chromium.org/developers/design-documents/video-playback-and-compositor
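
In sketch form (call sites approximate, from memory of the tree at the time):

    // (a) You already render into a GL texture: give the compositor a
    //     TextureLayer. WebExternalTextureLayer is the blink-side wrapper.
    scoped_refptr<cc::TextureLayer> texture_layer =
        cc::TextureLayer::CreateForMailbox(texture_layer_client);

    // (b) You have media::VideoFrames: give the compositor a VideoLayer
    //     backed by a VideoFrameProvider; VideoResourceUpdater converts
    //     each frame into drawable resources. This is the <video> path.
    scoped_refptr<cc::VideoLayer> video_layer =
        cc::VideoLayer::Create(video_frame_provider);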

 
What is the idea behind the mailbox concept?
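
Roughly: a mailbox is a small fixed-size name under which one GL context can publish a texture so another context, possibly in another process, can consume it without the two sharing a context. A sketch of the GL_CHROMIUM_texture_mailbox extension (from memory; check the extension spec for exact semantics):

    GLbyte mailbox[GL_MAILBOX_SIZE_CHROMIUM];
    glGenMailboxCHROMIUM(mailbox);

    // Producer side: publish the currently bound texture under the name.
    glBindTexture(GL_TEXTURE_2D, producer_texture_id);
    glProduceTextureCHROMIUM(GL_TEXTURE_2D, mailbox);

    // Consumer side (e.g. the compositor): bind a texture and pull in
    // the published contents by name.
    glBindTexture(GL_TEXTURE_2D, consumer_texture_id);
    glConsumeTextureCHROMIUM(GL_TEXTURE_2D, mailbox);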

 

I am sure that I am missing something basic here. Can you help?

Off topic: is the Canvas2D bridge enabled by default, or should I enable some compile-time/runtime switch?

It's for hardware-accelerated canvases. I believe they have to be large enough to trigger it.

kenz kiran

May 21, 2014, 1:06:23 PM
to Dana Jansens, avatar9, chromium-dev, graphics-dev
In case it is useful: I traced Android's HW-accelerated video playback.

I have put together a fairly detailed architecture diagram and noted important pieces of code:

https://docs.google.com/document/d/1vzPIW0gAhI3m51ozwSY6g2Hz1wzWLOs3GaPGy9Y4y8w/edit?usp=sharing

-Ravi

ava...@gmail.com

May 22, 2014, 3:07:47 AM
to chromi...@chromium.org, Dana Jansens, avatar9, graphics-dev
Thanks Ravi, I am reading your doc and it looks impressive. Just curious: are you trying to use an accelerated surface for the Android media player to draw on?

kenz kiran

May 22, 2014, 10:17:45 AM
to avatar9, Chromium-dev, Dana Jansens, graphics-dev
I'm not sure what you mean by "accelerated surface". If you mean that we don't paint video frames using Skia or CPU-based libs, then yes.
We have ported Blink to our own platform, and our architecture is similar to Android's, i.e. the real video playback stack lives in the browser process and we use StreamTexture-like classes to provide "texture mailboxes" that are used to composite video along with the other layers.

-Ravi
