How to use a VPU for video decoding acceleration


rebilla...@gmail.com

unread,
Jan 5, 2015, 9:48:21 AM
to graphi...@chromium.org
Hi all,

I'm using Chromium with Qt 5.4 (QtWebEngine) on an ARM embedded Linux device.
This works quite well: I can launch a browser and navigate the web.

Nevertheless, my project's goal is to play video using the MSE and EME APIs, and the video performance is not optimal.
This is to be expected, as the video decoding is done in software using FFmpeg.
So, because the chipset has a VPU, I'd like to use it to improve decoding performance.

I started implementing a media::VideoDecoder subclass to replace the default FFmpeg-based one.
My problem is that I can't output a planar YUV format, which (as I understand from the code) are the only formats supported.
By default the VPU outputs NV12, and with the help of the hardware post-processor I can output various RGB formats and other semi-planar YUV formats, but no planar YUV.
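For context, the difference between the two layouts can be shown with a CPU-side repack. This is illustrative only (copying every frame on the CPU would defeat the purpose of hardware decode), and the function name is made up for the sketch:

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

// NV12: one Y plane (w*h bytes) followed by one interleaved UV plane (w*h/2).
// I420 (planar): one Y plane, then a separate U plane and V plane,
// each (w/2)*(h/2) bytes.
void Nv12ToI420(const uint8_t* nv12, int width, int height,
                uint8_t* y, uint8_t* u, uint8_t* v) {
  const int y_size = width * height;
  std::memcpy(y, nv12, y_size);
  const uint8_t* uv = nv12 + y_size;
  const int chroma_pixels = (width / 2) * (height / 2);
  for (int i = 0; i < chroma_pixels; ++i) {
    u[i] = uv[2 * i];      // U samples sit at even offsets in the UV plane...
    v[i] = uv[2 * i + 1];  // ...and V samples at odd offsets.
  }
}
```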

Therefore I have two questions:
- Is it correct that only planar YUV formats are allowed for video frame output?
- I've seen that there is a media::VideoDecodeAccelerator class: is it feasible to implement a subclass that decodes frames and outputs NV12 or RGB?

Thank you in advance for your answers.

Aubin


Andrew Scherkus

unread,
Jan 6, 2015, 5:28:15 PM
to rebilla...@gmail.com, graphi...@chromium.org
Accelerated video decode is typically handled by implementing a media::VideoDecodeAccelerator that runs in the GPU process.

Depending on your architecture, you can either use the opaque video format type NATIVE_TEXTURE or one of the existing outputs. The compositor handles the video formats inside VideoResourceUpdater:

Hope that helps,
Andrew

Aubin REBILLAT

unread,
Jan 7, 2015, 11:32:16 AM
to Andrew Scherkus, graphi...@chromium.org
Hi Andrew,

Thank you for your answer, it is indeed helpful.
I looked into the NATIVE_TEXTURE format and I think it is a good solution.

I don't know if you can help me, but I have some questions about it:
- To create a NATIVE_TEXTURE media::VideoFrame, I have to generate a texture containing the RGBA pixels returned by the VPU and assign it to a mailbox. Is this correct?
- From what I understand, the VideoDecoder runs on a different thread than the render thread, so how can I access a gpu::gles2::GLES2Implementation in the VideoDecoder?

Thank you for your answer anyway.
Aubin

Andrew Scherkus

unread,
Jan 7, 2015, 2:19:34 PM
to Aubin REBILLAT, graphi...@chromium.org
My memory is foggy, but yes, I believe generating textures via ProvidePictureBuffers() should be sufficient, as GpuVideoDecoder will take care of wrapping said textures in a NATIVE_TEXTURE.

Ideally, you shouldn't have to worry about VideoDecoder as the texture should get passed along down to the compositor via existing code.
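The handshake Andrew describes can be mocked up roughly as below. All types here are simplified stand-ins for the Chromium classes (the real media::VideoDecodeAccelerator interface and its client are much richer), so treat this as a sketch of the control flow, not the actual API:

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

struct PictureBuffer { int32_t id; uint32_t texture_id; };
struct Picture { int32_t picture_buffer_id; };

// Stand-in for the VDA client (GpuVideoDecoder in Chromium).
class Client {
 public:
  // Called by the decoder to request output textures up front.
  std::vector<PictureBuffer> ProvidePictureBuffers(size_t count) {
    std::vector<PictureBuffer> buffers;
    for (size_t i = 0; i < count; ++i) {
      // In Chromium the client generates GL textures here; the decoder then
      // binds its VPU output to them (e.g. via an EGLImage).
      buffers.push_back({static_cast<int32_t>(i), next_texture_id_++});
    }
    return buffers;
  }
  // Called when a frame is ready; this is where GpuVideoDecoder wraps the
  // texture into a NATIVE_TEXTURE VideoFrame for the compositor.
  void PictureReady(const Picture& picture) {
    ready_.push_back(picture.picture_buffer_id);
  }
  std::vector<int32_t> ready_;

 private:
  uint32_t next_texture_id_ = 1;
};

class FakeVpuDecoder {
 public:
  explicit FakeVpuDecoder(Client* client) : client_(client) {}
  void Initialize() { buffers_ = client_->ProvidePictureBuffers(4); }
  void Decode(int frame_number) {
    // Pretend the VPU decoded into the next buffer in round-robin order.
    const PictureBuffer& buf =
        buffers_[static_cast<size_t>(frame_number) % buffers_.size()];
    client_->PictureReady({buf.id});
  }

 private:
  Client* client_;
  std::vector<PictureBuffer> buffers_;
};
```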

Andrew

Aubin REBILLAT

unread,
Jan 8, 2015, 12:16:00 PM
to Andrew Scherkus, graphi...@chromium.org
Oh, OK. I did not understand that I should use the NATIVE_TEXTURE format with a VideoDecodeAccelerator.
I was trying to use it inside a VideoDecoder, which was quite difficult to do.

So I implemented a VideoDecodeAccelerator, and it is indeed easier.
I am now facing OpenGL issues, but I'm pretty sure there are some mistakes in my code.

Thank you very much for your help.

kenz kiran

unread,
Jan 8, 2015, 1:33:53 PM
to Aubin REBILLAT, Andrew Scherkus, graphics-dev
Aubin,

I found the best way to move from FFmpeg (software video decoding) to hardware-accelerated video decoding (where software doesn't create or manipulate video frames) was using StreamTextures. Check out how Android renders HTML video.

Basically, if you have a GPU that supports the OES_EGL_image_external extension (https://www.khronos.org/registry/gles/extensions/OES/OES_EGL_image_external.txt) and a VPU spitting out any YUV format, then you get zero-copy video rendering.

Chrome already has the necessary fragment shaders for video textures provided in NV12 format (if I remember correctly).
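For reference, the per-pixel conversion such an NV12 fragment shader performs can be written out on the CPU like this. BT.601 limited-range coefficients are assumed here; real shaders may use BT.709 or full-range variants depending on the stream:

```cpp
#include <algorithm>
#include <cstdint>

struct Rgb { uint8_t r, g, b; };

// Limited-range BT.601 YUV -> RGB, the math a video fragment shader applies
// per pixel after sampling the Y plane and the interleaved UV plane.
Rgb Nv12PixelToRgb(uint8_t y, uint8_t u, uint8_t v) {
  const float yf = 1.164f * (y - 16);   // Expand limited range [16, 235].
  const float uf = u - 128.0f;          // Center chroma around zero.
  const float vf = v - 128.0f;
  auto clamp = [](float x) {
    return static_cast<uint8_t>(std::min(255.0f, std::max(0.0f, x)));
  };
  return {clamp(yf + 1.596f * vf),
          clamp(yf - 0.392f * uf - 0.813f * vf),
          clamp(yf + 2.017f * uf)};
}
```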

If it is of any help, here is a diagram for Chrome Android HTML5 video (albeit a year old; things could have changed).



regards,
Ravi


Aubin REBILLAT

unread,
Jan 9, 2015, 11:39:20 AM
to kenz kiran, Andrew Scherkus, graphics-dev
Hi Kenz,

I managed to make my VideoDecodeAccelerator work correctly, so for now I will not attempt another method.
Nevertheless, this is really interesting and I will take a closer look at it in the near future.

Thank you for your help.

Andrew Scherkus

unread,
Jan 9, 2015, 1:46:30 PM
to Aubin REBILLAT, kenz kiran, graphics-dev
Glad to hear you got it working!

Andrew

abhiji...@gmail.com

unread,
Feb 20, 2018, 9:19:50 AM
to Graphics-dev, kira...@gmail.com, sche...@chromium.org, rebilla...@gmail.com
Hi Aubin,

I am trying to hardware-accelerate video decoding for the Chromium browser. As far as I know, there are three components in Chromium for video rendering:
1. Pipeline
2. FFmpeg
3. WebKit

For hardware acceleration, I am contemplating changing FFmpeg to use the VPU. Can you please let me know what steps you carried out for hardware acceleration? Does FFmpeg expect standard APIs from the underlying decoder?
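As a toy illustration of why the usual swap point is the decoder rather than FFmpeg itself: the pipeline only sees an abstract decoder interface, so a VPU-backed implementation can replace the software one without touching the rest. All names below are made up for the sketch:

```cpp
#include <cstdint>
#include <string>
#include <vector>

struct EncodedPacket { std::vector<uint8_t> data; };
struct DecodedFrame { std::string produced_by; };

// The pipeline programs against this interface only.
class VideoDecoderInterface {
 public:
  virtual ~VideoDecoderInterface() = default;
  virtual DecodedFrame Decode(const EncodedPacket& packet) = 0;
};

// Stand-in for the default software path (FFmpeg in Chromium).
class SoftwareDecoder : public VideoDecoderInterface {
 public:
  DecodedFrame Decode(const EncodedPacket&) override { return {"software"}; }
};

// Stand-in for a VPU-backed decoder behind the same interface.
class VpuDecoder : public VideoDecoderInterface {
 public:
  DecodedFrame Decode(const EncodedPacket&) override { return {"vpu"}; }
};

// The "pipeline" never knows which decoder it is driving.
DecodedFrame RunPipeline(VideoDecoderInterface& decoder) {
  EncodedPacket packet;  // Demuxing elided.
  return decoder.Decode(packet);
}
```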

Dale Curtis

unread,
Feb 20, 2018, 1:17:29 PM
to abhiji...@gmail.com, Graphics-dev, kira...@gmail.com, Andrew Scherkus, rebilla...@gmail.com
You won't be able to use ffmpeg for hardware acceleration unless you're running without a sandbox (not recommended). Instead you'll want to look at implementing a MojoVideoDecoder; you can see an example of this being done on Android with MediaCodecVideoDecoder:

abhiji...@gmail.com

unread,
Feb 21, 2018, 1:21:32 AM
to Graphics-dev, abhiji...@gmail.com, kira...@gmail.com, sche...@chromium.org, rebilla...@gmail.com
Hi Dale,

Thank you very much for your reply.

Actually, I am trying to use the stand-alone Chromium browser on Linux. However, I see that there is a media/gpu/v4l2 directory in the Chromium browser source.

Can I use media/gpu/v4l2 for hardware acceleration of video codecs?

Dale Curtis

unread,
Feb 21, 2018, 2:06:29 PM
to abhijit naik, Graphics-dev, kenz kiran, Andrew Scherkus, Aubin REBILLAT
Ah, that is not formally supported in Chrome, but some folks maintain a patch here you can check out: https://chromium-review.googlesource.com/c/chromium/src/+/532294

- dale