Do you think it makes sense to use libmpv to implement WebMediaPlayer for playing video? It seems possible to me. Basically, we can ask libmpv to render a frame into a texture, then wrap that texture as a VideoFrame and let cc::VideoLayer grab it.
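For reference, a rough sketch of the libmpv side using its OpenGL render API (mpv >= 0.29). `get_gl_proc` and the FBO/texture ownership are placeholders for whatever GL plumbing Chromium's compositor side would provide, and all of the WebMediaPlayer/VideoFrame wrapping is omitted — this is only meant to show that libmpv can target a caller-supplied texture:

```c
#include <stddef.h>
#include <mpv/client.h>
#include <mpv/render_gl.h>

/* Placeholder: resolve GL entry points from the embedding GL context. */
static void *get_gl_proc(void *ctx, const char *name);

static mpv_render_context *create_render_ctx(mpv_handle *mpv) {
    mpv_opengl_init_params gl_init = {
        .get_proc_address = get_gl_proc,
    };
    mpv_render_param params[] = {
        {MPV_RENDER_PARAM_API_TYPE, MPV_RENDER_API_TYPE_OPENGL},
        {MPV_RENDER_PARAM_OPENGL_INIT_PARAMS, &gl_init},
        {0}  /* terminator */
    };
    mpv_render_context *ctx = NULL;
    mpv_render_context_create(&ctx, mpv, params);
    return ctx;
}

/* Per frame: have libmpv draw into an FBO whose color attachment is
 * the texture we would then hand to Chromium as a VideoFrame. */
static void render_frame(mpv_render_context *ctx, int fbo_id, int w, int h) {
    mpv_opengl_fbo fbo = {.fbo = fbo_id, .w = w, .h = h};
    int flip_y = 1;
    mpv_render_param params[] = {
        {MPV_RENDER_PARAM_OPENGL_FBO, &fbo},
        {MPV_RENDER_PARAM_FLIP_Y, &flip_y},
        {0}
    };
    mpv_render_context_render(ctx, params);
}
```

The hard part, of course, is not this call sequence but sharing the GL context/texture with Chromium's compositor and driving redraws from mpv_render_context_set_update_callback.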
Motivations:
Problematic video playing on Linux:
Video playback in Chromium is problematic on Linux. Some people cannot enable hardware decode acceleration at all. I can enable it, but it produces very poor video quality: the video is blurred, the colors are slightly off, and the image is distorted (e.g. 1920x1080 plays as 1920x1040). These are well-known problems with VAAPI rendering (decoding is fine; rendering is not). In contrast, libmpv uses VAAPI to decode but OpenGL to render. (If you force mpv to render with VAAPI, it shows similar problems, so this is not a Chromium bug.)
Configurable video playing:
I care most about the upscaling filter. If you play a 1920x1080 video on a 4K monitor, Chromium will blur the video because of the poor upscaling algorithm it uses (whether via VAAPI or software). libmpv lets you choose, or even supply your own, upscaling algorithm (even machine-learning-based ones), and it exposes many other configuration options as well.
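To illustrate the kind of configurability I mean, these mpv.conf-style options (which libmpv also accepts programmatically, e.g. via mpv_set_option_string) control upscaling; the shader path is just an example of loading a user-supplied scaler:

```
# High-quality built-in upscaler for the GPU video output
scale=ewa_lanczossharp

# Or a user-supplied GLSL shader, e.g. an ML-based upscaler
glsl-shaders=/path/to/upscaler.glsl
```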
To solve the first problem, we could of course just re-implement the VAAPI decoder so that it uses VAAPI only for decoding, not rendering. The second motivation, however, justifies using libmpv for its flexibility.
Also, I think media playback is really a separate concern that Chromium should not have to care too much about. Offloading it to a "professional" media playback library seems like a reasonable choice to me.
Any comments? I am interested in working on this if it has a reasonable chance of being merged, at least as an option that can be enabled in chrome://flags.