Hi, I am learning the architecture of Chrome's media pipeline by browsing the source, and I have reached the point where a VideoLayer is created (inside webmediaplayer_impl.cc). The layer is then wrapped inside a WebLayer. Why, and what is really happening here?
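My current guess is that the wrapping is just the adapter pattern: Blink programs against the abstract WebLayer interface and must not depend on cc directly, so the cc layer is handed to Blink behind a WebLayer implementation. Here is a minimal, self-contained sketch of that idea; all names below are simplified stand-ins, not the real Chromium classes:

    // Conceptual sketch only: stand-in names, not real Chromium code.
    #include <iostream>
    #include <memory>
    #include <utility>

    struct CcLayer {                  // stand-in for cc::VideoLayer
      virtual ~CcLayer() = default;
      virtual void Draw() { std::cout << "cc layer draws video frame\n"; }
    };

    struct WebLayer {                 // stand-in for the WebKit WebLayer API
      virtual ~WebLayer() = default;
      virtual void Paint() = 0;       // the interface Blink programs against
    };

    // The wrapper: owns the compositor layer, exposes the Blink-facing API.
    class WebLayerImpl : public WebLayer {
     public:
      explicit WebLayerImpl(std::unique_ptr<CcLayer> layer)
          : layer_(std::move(layer)) {}
      void Paint() override { layer_->Draw(); }

     private:
      std::unique_ptr<CcLayer> layer_;
    };

    int main() {
      // The media player creates the cc layer, wraps it, and hands the
      // WebLayer to the media element, which never sees cc directly.
      std::unique_ptr<WebLayer> web_layer =
          std::make_unique<WebLayerImpl>(std::make_unique<CcLayer>());
      web_layer->Paint();
      return 0;
    }

Is that roughly the right mental model?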
Where is the actual surface created? What is the graphics context if Skia is used? Can someone point me to the code/file? If I want to create an OpenGL surface, where do I put the code?
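To make the question concrete, this is roughly what I mean by "creating an OpenGL surface" at the platform level, sketched with plain EGL (an offscreen pbuffer for simplicity). I assume Chromium wraps calls like these behind its own abstractions rather than making them from the media code:

    // Minimal EGL sketch: display -> config -> surface -> context.
    // A window surface would take a native window handle instead of a pbuffer.
    #include <EGL/egl.h>
    #include <cstdio>

    int main() {
      EGLDisplay display = eglGetDisplay(EGL_DEFAULT_DISPLAY);
      if (display == EGL_NO_DISPLAY ||
          !eglInitialize(display, nullptr, nullptr)) {
        std::fprintf(stderr, "EGL init failed\n");
        return 1;
      }

      const EGLint config_attribs[] = {
          EGL_SURFACE_TYPE,    EGL_PBUFFER_BIT,
          EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
          EGL_NONE};
      EGLConfig config;
      EGLint num_configs = 0;
      eglChooseConfig(display, config_attribs, &config, 1, &num_configs);

      const EGLint pbuffer_attribs[] = {EGL_WIDTH, 64, EGL_HEIGHT, 64,
                                        EGL_NONE};
      EGLSurface surface =
          eglCreatePbufferSurface(display, config, pbuffer_attribs);

      const EGLint ctx_attribs[] = {EGL_CONTEXT_CLIENT_VERSION, 2, EGL_NONE};
      EGLContext context =
          eglCreateContext(display, config, EGL_NO_CONTEXT, ctx_attribs);
      eglMakeCurrent(display, surface, surface, context);

      // ... GL calls would go here ...

      eglMakeCurrent(display, EGL_NO_SURFACE, EGL_NO_SURFACE, EGL_NO_CONTEXT);
      eglDestroyContext(display, context);
      eglDestroySurface(display, surface);
      eglTerminate(display);
      return 0;
    }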
Thanks in advance.
Thanks Dana, this seems a bit complicated and confusing. I don't see the Canvas2D and WebGL examples below being used in the current video rendering path (HTMLMediaElement->WebMediaPlayerImpl->VideoLayer). How do I enable them?
If I need to create only an OpenGL surface and redirect rendering to it instead of the existing one, while still being able to use the WebMediaPlayerImpl::GetCurrentFrame path (I am assuming this is the accelerated path), where does the code go?
Also, what is the need for /skia/gpu/? I understood Skia to be software-rendered only. What is the role of the GPU in Skia?
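From reading the headers, /skia/gpu appears to hold Skia's GPU backend (Ganesh), which turns SkCanvas draws into GL commands, so Skia is not software-only. A rough sketch of how a GPU-backed surface would be created; I am assuming a GL context is already current, and the exact factory names have varied across Skia revisions:

    // Rough sketch, assuming a current GL context and a recent Skia
    // checkout (factory names differ between Skia versions).
    #include "include/core/SkCanvas.h"
    #include "include/core/SkColor.h"
    #include "include/core/SkImageInfo.h"
    #include "include/core/SkSurface.h"
    #include "include/gpu/GrDirectContext.h"
    #include "include/gpu/gl/GrGLInterface.h"

    void DrawWithGanesh() {
      // Wrap the current GL context for Skia.
      sk_sp<GrDirectContext> context =
          GrDirectContext::MakeGL(GrGLMakeNativeInterface());

      // A GPU-backed surface: draws to its canvas become GL commands.
      SkImageInfo info = SkImageInfo::MakeN32Premul(256, 256);
      sk_sp<SkSurface> surface = SkSurface::MakeRenderTarget(
          context.get(), SkBudgeted::kNo, info);

      SkCanvas* canvas = surface->getCanvas();
      canvas->clear(SK_ColorBLACK);      // recorded as GPU work
      context->flushAndSubmit();         // push the work to the GPU
    }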
Thanks Dana, it took me some time to understand the flow, but I am still confused. I am following your lead: "Draw the video frame into a texture and give that to the compositor? Then you could use a TextureLayer like canvas and the VideoFrame would just be one of your inputs."
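To check my understanding of that lead: "draw the video frame into a texture" would be a plain GL upload of the frame's pixels, roughly like the sketch below? (Plain GLES2, assuming a current context and a frame already converted to RGBA; the real path would presumably stay on the GPU rather than copying through system memory.)

    // Sketch: upload one RGBA video frame into a GL texture.
    // `rgba_pixels` stands in for a frame already converted from YUV.
    #include <GLES2/gl2.h>
    #include <cstdint>
    #include <vector>

    GLuint UploadFrameToTexture(const std::vector<uint8_t>& rgba_pixels,
                                int width, int height) {
      GLuint texture = 0;
      glGenTextures(1, &texture);
      glBindTexture(GL_TEXTURE_2D, texture);
      glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
      glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
      glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA,
                   GL_UNSIGNED_BYTE, rgba_pixels.data());
      // This texture would then be handed to the compositor (e.g. via a
      // TextureLayer) instead of being drawn directly.
      return texture;
    }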
I see that the WebExternalTextureLayer is created from the compositor (web_compositor_support), and there is no interface to pass a VideoFrame as the input directly. Is it prepareMailbox that can do this?
What is the idea behind the mailbox concept?
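My working mental model of the mailbox, sketched as plain C++ (conceptual only; no real GL or Chromium types): the GPU process keeps a registry mapping opaque names to textures, so a producer context and a consumer context can share a texture without sharing texture ids. Is this roughly right?

    // Conceptual model only; all types are stand-ins, not Chromium code.
    #include <array>
    #include <cstdint>
    #include <iostream>
    #include <map>

    using MailboxName = std::array<uint8_t, 64>;  // opaque 64-byte name

    struct Texture {
      uint32_t service_id;  // the real texture id on the service side
    };

    // Lives in the GPU process, shared by all command-buffer contexts.
    class MailboxManager {
     public:
      void Produce(const MailboxName& name, Texture tex) {
        registry_[name] = tex;
      }
      Texture Consume(const MailboxName& name) { return registry_.at(name); }

     private:
      std::map<MailboxName, Texture> registry_;
    };

    int main() {
      MailboxManager manager;
      MailboxName name{};  // in Chromium this would be random/unguessable
      name[0] = 42;

      // Producer context: the video frame was rendered into texture 7.
      manager.Produce(name, Texture{7});

      // Consumer context (the compositor) redeems the name for the texture.
      Texture tex = manager.Consume(name);
      std::cout << "consumer sees service texture " << tex.service_id << "\n";
      return 0;
    }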
I am sure that I am missing something basic here. Can you help? Off topic: is the Canvas2D bridge enabled by default, or should I enable some compile/runtime switch?