Hello guys,
I've developed an Android application that uses a custom build of the Chromium content shell to display websites with enhanced functionality.
For interaction with my backend via JavaScript, as well as for rendering video into the website with ExoPlayer (for formats Chrome doesn't support), I've developed a custom WebPlugin.
Currently, ExoPlayer renders into a Surface in the Android application that sits behind the browser. The video becomes visible through the browser because I punch a hole into it by painting the WebPlugin transparent (in my implementation of WebPlugin::paint).
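For context, the hole punching looks roughly like this. This is only a sketch, not my exact code: "MyWebPlugin" is a placeholder name, WebCanvas is an SkCanvas in this Chromium era, and the Skia blend-mode API may differ between revisions:

```cpp
// Sketch: clear the plugin's rect to fully transparent so the Surface
// behind the browser (which ExoPlayer renders into) shows through.
void MyWebPlugin::paint(blink::WebCanvas* canvas, const blink::WebRect& rect) {
  SkPaint paint;
  // kSrc replaces the destination pixels instead of alpha-blending with them,
  // so the result is actually transparent rather than unchanged.
  paint.setXfermodeMode(SkXfermode::kSrc_Mode);
  paint.setColor(SK_ColorTRANSPARENT);
  canvas->drawRect(
      SkRect::MakeXYWH(rect.x, rect.y, rect.width, rect.height), paint);
}
```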
I think this could be improved by rendering the video directly into a surface that is managed by the browser, as I've seen done in WebMediaPlayerAndroid.
WebMediaPlayerAndroid creates a cc::VideoLayer and sets it as the WebLayer on its WebMediaPlayerClient.
I've noticed that I can also set a WebLayer on the WebPluginContainer that holds my custom WebPlugin, so I hope the rendering could be done in a similar way.
What I don't understand is how the creation of the cc::VideoLayer leads to the creation of a MediaPlayerBridge, which gets a surface that seems to be connected to the cc::VideoLayer and is usable as a Java Surface. Can anybody explain how this works?
Is there any way to achieve something similar, i.e. get a Java Surface for a WebPlugin to render into? Can I reuse the classes used by WebMediaPlayer, or do I have to copy and modify them?
It would be nice if I could just create a cc::VideoLayer that is connected to a "WebPluginBridge" in the constructor of the WebPlugin and set it as the WebLayer on the WebPluginContainer.
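In other words, something like the following sketch, mirroring what WebMediaPlayerAndroid does with its client. "WebPluginBridge" is hypothetical (the counterpart to MediaPlayerBridge I'm asking about), and the exact compositor-binding types (e.g. webkit::WebLayerImpl) and signatures vary between Chromium revisions:

```cpp
// Sketch of the desired setup, done where the WebPluginContainer is
// available (e.g. in WebPlugin::initialize).
bool MyWebPlugin::initialize(blink::WebPluginContainer* container) {
  container_ = container;
  // Hypothetical bridge: would own the Java Surface that ExoPlayer renders
  // into and act as the cc::VideoFrameProvider feeding the layer.
  bridge_.reset(new WebPluginBridge());
  video_layer_ = cc::VideoLayer::Create(bridge_.get());
  // Wrap the cc::Layer in a WebLayer and hand it to the container, the way
  // WebMediaPlayerAndroid hands its layer to the WebMediaPlayerClient.
  web_layer_.reset(new webkit::WebLayerImpl(video_layer_));
  container_->setWebLayer(web_layer_.get());
  return true;
}
```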
Thank you in advance!
Best regards
Ludwig