OpenMAX Integration in Chromium Browser


Arun M

Mar 14, 2011, 8:59:06 AM
to Chromium-dev
Hi

I am trying to test OpenMAX video decoding in the Chromium browser.
The libOmxCore.so library gets loaded only when I disable the sandbox
(--no-sandbox).

Is there any way to initialize OpenMAX in the Chromium browser without
disabling the sandbox?

Regards
Arun

Evan Martin

Mar 14, 2011, 1:11:51 PM
to arunm....@gmail.com, Chromium-dev

Your message is pretty vague. You should try describing the problem
in more detail if you would like a useful answer.

Arun M

Mar 14, 2011, 11:33:54 PM
to Chromium-dev

I am trying to use the OpenMAX video decoder instead of the FFmpeg
video decoder.
As the first step, I added --enable-openmax in
/sbin/session_manager_setup.sh.
I get the following error in the log:
FATAL:media_posix.cc(105)] Cannot load libOmxCore.so. Make sure it
exists for OpenMAX.

But when I disable the sandbox (add --no-sandbox in
/sbin/session_manager_setup.sh),
the OpenMAX library initializes successfully.




Lei Zhang

Mar 14, 2011, 11:55:09 PM
to arunm....@gmail.com, Chromium-dev
Initializing OpenMAX in the renderer after it has been sandboxed will
not work, since the renderer no longer has access to the file system.
You may want to try initializing it in PreSandboxInit() in
content/browser/zygote_main_linux.cc, right before you enter the
sandbox.

> --
> Chromium Developers mailing list: chromi...@chromium.org
> View archives, change email options, or unsubscribe:
>    http://groups.google.com/a/chromium.org/group/chromium-dev
>

Andrew Scherkus

Mar 25, 2011, 2:28:13 AM
to the...@chromium.org, arunm....@gmail.com, Chromium-dev
You'll notice we do a similar library loading trick for FFmpeg before engaging the sandbox.

media::InitializeMediaLibrary()

Perhaps the code that loads libOmxCore.so needs to be moved inside InitializeMediaLibrary() or put in a similar place.

Feel free to add me as a reviewer,
Andrew

Arun M

Apr 15, 2011, 7:10:36 AM
to Chromium-dev
Hi

I moved InitializeOpenmaxlibrary to the same place.
But then when I call OmxInit(), it is not able to open the .omxregister
file.
Also, in my case, some libraries are opened inside OpenMAX when the
GetOmxhandle function is called.

So some changes have to be made in our OpenMAX library.

Thanks & Regards
Arun



Arun M

Jun 13, 2011, 7:57:44 AM
to Chromium-dev
Hi

In the latest version of the Chromium browser, the
OmxVideoDecodeAccelerator class has been added.
Currently I see this class being used for hardware-accelerated
decoding for Pepper-based plugins.
Will this class replace the OmxVideoDecodeEngine class in the media
pipeline?

Thanks & Regards
Arun

Ami Fischman

Jun 13, 2011, 12:50:14 PM
to arunm....@gmail.com, Chromium-dev
OmxVideoDecodeEngine was only ever used in player_x11, and even so has bit-rotted for a while (b/c it's not triggered by any bots).  Re-reading this thread I'm not surprised you had trouble getting it to run inside the sandbox (since this would be the first time anyone's tried that).  Andrew tried nuking this code in r87790, but that had to be rolled back b/c of some collateral damage.  I expect this code to go away shortly unless someone speaks up with a compelling reason.

OmxVideoDecodeAccelerator is meant to be the "right" way to do OMX HW decode in chromium's architecture, by delegating the HW interface to the GPU process and putting in place an IPC interface to use it from renderers (and pepper plugins).  

You said earlier in the thread that you want to use OMX instead of FFmpeg for video decode.  You mean for HTML5 video in chrome, or some other scenario?  Can you talk about the HW/SW platform you're targeting?
A goal of ours is to tie OmxVideoDecodeAccelerator into the media pipeline so html5 video decode can happen in HW when there is support for it.  But the APIs involved are still evolving so it'd be good to know if you plan to develop against them as well, and what your plans are, to avoid unnecessary conflict.

Cheers,
-a


Arun M

Jun 14, 2011, 8:14:07 AM
to Chromium-dev
Thanks for the reply.

We are working on porting Chromium OS to an ARM platform,
and we want to use the HW decoder (if the codec is supported in HW).
The scenarios we are targeting are HTML5 video and the Chrome OS media
player.
Currently we are using the OmxVideoDecodeEngine class with the sandbox
disabled.
But the rendering performance is very poor because of the copy of
decoded data from the renderer process to the GPU process.

I think using OmxVideoDecodeAccelerator will solve all of the above
issues.
So we are also interested in integrating OmxVideoDecodeAccelerator
into the media pipeline.
As you explained, this also requires putting an IPC interface in place
between the renderer and GPU processes.
In an earlier version of Chromium, I had seen an IpcVideoDecoder
class, but this has been removed in the latest (14.0.791.0).

So if you can share with us any details regarding changes in the media
pipeline, we can develop against them.
We are also interested in improving the rendering performance when the
FFmpeg decoder is used (falling back to SW when OMX doesn't support
the codec).
Please let me know if you have any plans for the same.

Thanks & Regards
Arun

Ami Fischman

Jun 14, 2011, 11:07:50 AM
to arunm.chrome, Chromium-dev
> As you explained, this also requires putting in place an IPC interface
> between renderer and GPU process.

To be clear, there is already an (in-progress) IPC interface in place.

> So if you can share with us any details regarding changes in the media
> pipeline, we can develop against them.

I recommend you set up a watchlist (http://dev.chromium.org/developers/contributing-code/watchlists) to monitor the parts of the codebase you care about (sounds like maybe that's media/, content/common/gpu, content/gpu).
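
For reference, watchlists are configured in the src/WATCHLISTS file, which is a Python-style dict literal; a hypothetical entry covering the paths Ami mentions might look like the following. The 'media_gpu' name and the address are placeholders, and 'filepath' is a regular expression matched against changed paths.

```
{
  'WATCHLIST_DEFINITIONS': {
    'media_gpu': {
      'filepath': 'media/|content/common/gpu/|content/gpu/',
    },
  },
  'WATCHLISTS': {
    'media_gpu': ['you@example.com'],
  },
}
```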

> We are also interested to improve the rendering performance when
> FFMpeg(fallback to SW when OMX doesn't support the codec) decoder is
> used.  Please let me know if you have any plans for the same.

I don't think I know of anyone specifically working on making FFmpeg faster on ARM, but it'd be great if you were able to contribute improvements to upstream!

Cheers,
-a

Mykola Ostrovskyy

Jun 17, 2011, 9:16:12 AM
to Chromium-dev
Hello Ami,

> To be clear, there is already an (in-progress) IPC interface in place

So is there a plan to connect GpuVideoServiceHost to the media
pipeline? Or are you only targeting Pepper at the moment?

What about vendor-specific OMX features? Since you are doing OMX "for
real" this time, do you have a plan to support them (for example, via
configuration vectors or ifdefs), or are you going to stick to vanilla
Khronos OpenMAX IL?

Are there any plans to move HW accelerated video rendering to GPU
process as well?


> FFMpeg(fallback to SW when OMX doesn't support the codec) decoder is
> used. Please let me know if you have any plans for the same.

BTW, is there a plan to have a video decoder factory in the media
pipeline? There was an attempt[1] to do FFmpeg/OMX switch based on
codec id, but somehow it didn't fly far.

[1] http://codereview.chromium.org/5612004/

Regards,
Mykola

Ami Fischman

Jun 17, 2011, 1:59:15 PM
to ostr...@gmail.com, Chromium-dev
Hi Mykola,
 
> > To be clear, there is already an (in-progress) IPC interface in place
> So is there a plan to connect GpuVideoServiceHost to the media
> pipeline? Or you only targeting Pepper at the moment?

The goal is to support both.
 
> What about vendor-specific OMX features? Since you are doing OMX "for
> real" this time, do you have a plan to support them (for example, via
> configuration vectors or ifdefs), or are you going to stick to vanilla
> Khronos OpenMAX IL?

The API currently includes config negotiation, but that's pretty much a place-holder so far.
Are you lobbying for anything in particular to be included?
 
> Are there any plans to move HW accelerated video rendering to GPU
> process as well?

HW GL rendering is already implemented (in webkit, for chromium).  I.e.
http://codesearch.google.com/codesearch#OAMlx_jo-ck/src/third_party/W...
Are you looking for something else?

> BTW, is there a plan to have a video decoder factory in the media
> pipeline?

Not that I'm aware of.

Cheers,
-a

Mykola Ostrovskyy

Jun 20, 2011, 5:21:57 AM
to Chromium-dev
> The goal is to support both.

And I guess there is no time frame attached to this feature, right? ;)


> The API currently includes config negotiation, but that's pretty much a
> place-holder so far.
> Are you lobbying for anything in particular to be included?

Nothing in particular. It's just that the OMX spec has things like
OMX_IndexVendorStartUnused, and the implementation I'm currently
working with uses it to extend OMX_INDEXTYPE. I didn't investigate
this extension in detail, so I'm not even sure if there is something
worth exposing outside of OmxVideoDecodeAccelerator. On the other
hand, I do have a few headers that I need to include in order to use
OMX. So I was just wondering if SOME_VENDOR_OMX_IMPLEMENTATION
defines will be allowed in the Chromium code base.


> HW GL rendering is already implemented (in webkit, for chromium).  I.e.
> http://codesearch.google.com/codesearch#OAMlx_jo-ck/src/third_party/W...
> Are you looking for something else?

At first look it appears that VideoDecodeAccelerator pushes
bitstream buffers and decoded frames back and forth between the
renderer and GPU processes. I believe there are some cases where the
video frame should travel back to the renderer to go through some SW
processing (e.g. some canvas effects, like in Destructive Video[1]).
On the other hand, for "plain" video playback the GPU could consume
the bitstream and put the video on the screen without passing the
pixels back to the renderer. Is this somehow taken care of?

Currently we use a V4L2 renderer that replaces VideoRendererImpl.
Having that in the renderer process gives us some sandboxing issues,
so we would like to move it into the GPU process. Basically, I'm
looking for something that will allow us to keep the decoded frame in
the GPU process and render it through a custom renderer.


[1] http://www.chromeexperiments.com/detail/destructive-video/

Regards,
Mykola

Ami Fischman

Jun 20, 2011, 1:24:10 PM
to ostr...@gmail.com, Chromium-dev
> And I guess there is no time frame attached to this feature, right? ;)

Right.

> So I was just wondering if SOME_VENDOR_OMX_IMPLEMENTATION
> defines will be allowed in Chromium code base.

I don't know enough about the state of the OMX world to make a statement on this.

> Basically, I'm looking for something that will allow to keep the decoded frame in the
> GPU process, and render it through a custom renderer.

The current work avoids copying the frame from GPU to renderer, by passing texture handles over IPC.  See the ppapi/examples/gles2.cc plugin for an example where we use PPAPI to request textures, pass those textures to the GPU process as the decode target, get them back, and then pass them back for rendering.

-a

Mykola Ostrovskyy

Jun 21, 2011, 3:43:48 AM
to Chromium-dev
> The current work avoids copying the frame from GPU to renderer, by passing
> texture handles over IPC.  See the ppapi/examples/gles2.cc plugin for an
> example where we use PPAPI to request textures, pass those textures to the
> GPU process as the decode target, get them back, and then pass them back for
> rendering.

Okay, thanks. I'll see if I can figure out how to make use of this :)


Regards,
Mykola

Arun M

Jun 21, 2011, 9:38:51 AM
to Chromium-dev
Hi

In ppapi/examples/gles2.cc, the actual buffer allocation is done using
glTexImage2D.
Then in OmxVideoDecodeAccelerator, TranslateToEglImage is called to
create an EGL Image from this texture.

On our platform, a texture allocated using glTexImage2D cannot be
shared between the GPU and the HW decoder.
We use XCreatePixmap to allocate a buffer, create an EGLImage from
this pixmap, and then use this image as a texture.

This will require passing an EGLImage handle to the decoder instead
of a texture id.
What about such vendor-specific requirements? Will these be accepted
as well?

Regards
Arun

Ami Fischman

Jun 21, 2011, 11:16:28 AM
to arunm....@gmail.com, Chromium-dev
In principle there's no reason GL textures are more blessed than any other medium (e.g. there is some stubbed-out support for using system memory as the decode target instead of textures, but this will probably be removed unless we find a use for it).  I imagine we'll look at proposals on a case-by-case basis.

Though note that Pixmap handles can't be shared across processes (nor even across threads, unless you're on a magical platform where Xlib is thread-safe; does such a platform even exist?).  So the XCreatePixmap would have to happen in the GPU process, meaning that to make this happen you'll need to add IPC for the renderer to request pixmap creation, handing back an opaque handle, plus lifecycle management of the pixmap in the GPU process.

I'm not sure, but once you have a Pixmap in the GPU process, it's possible that wrapping it with a GL texture that is then wrapped with an EGL image will be no less efficient than wrapping the pixmap with an EGL image directly.

Cheers,
-a
