Use of 3D Hardware Accelerator


Pivotian

Nov 25, 2008, 12:58:54 AM
to android-porting
From the source code I learned that Android has its own software
codecs for both audio and video, supporting many formats such as
H.263, MPEG-4, H.264, MP3, AAC, etc. But what if the device itself has a 3D
hardware accelerator with built-in support for formats
including MPEG-4, H.263, H.264, etc.? In that case, how do we disable
Android's software codecs and make it work with the device's 3D
hardware accelerator? I want all of Android's video-related functionality
to use this 3D hardware accelerator; for example, for
video recording I also want to use this hardware to encode the video.

Pivotian

Nov 25, 2008, 7:31:39 AM
to android-porting
I learned that in Android, OpenGL ES is configured for software by
default, which is why it uses Android's software packages. How can I
configure OpenGL ES for hardware? Is it just a case of replacing
"android/out/target/product/generic/system/lib/libGLES_CM.so" with
a new one built with a hardware configuration, or is there more to it
than that? If I replace the shared object file "libGLES_CM.so" with a new
one configured for hardware, will it work fine? How will I know
whether it is working correctly with the top-level applications to which it
provides the interface? Please shed some light on this if anyone has any
idea.

Markus

Nov 25, 2008, 4:55:00 PM
to android-porting
Hi,

are you sure that you need 3D acceleration to enable those codecs? Or
does your platform offer specialized chips for doing that codec job? I
do not know of any OpenGL-related mechanism that provides access to hardware
codecs, but I am not familiar with OpenGL ES. Do you know how you
access those codecs at a low level on your original platform? Are
there special libs which you could recompile for Android?
If you really need libGLES_CM.so, you must find a way to replace it with
your vendor-specific implementation, which must also be ported to
Android and might require a kernel module to get fast hardware access.
So you have to port that kernel module and all the OpenGL-related
stuff that is vendor-dependent in your case. Maybe it is enough just
to integrate hardware support into your kernel, as OpenGL is something
like a client/server system. Did you already check which libraries/
devices libGLES_CM.so accesses? At the moment, I cannot give you
more details about hardware 3D on Android; it is something I will
investigate again in the future.

bye
Markus

Mathias Agopian

Nov 25, 2008, 5:03:12 PM
to android...@googlegroups.com
Hi,

You need to supply "libhgl.so", which must be a regular OpenGL ES
library. libGLES_CM.so will load it at runtime and defer to it for 3D
h/w acceleration.

Implementing your own h/w accelerated libhgl.so right now is very
difficult because Android is not ready for it just yet. This is an
area we're working on as we speak.

mathias

Pivotian

Nov 26, 2008, 12:04:30 AM
to android-porting
Thanks Mathias & Markus for your quick responses.
Mathias, currently the system/lib folder doesn't contain the
"libhgl.so" you mentioned. Do you mean that we have to add
this file to that folder so that libGLES_CM.so will load it
at runtime and defer to it for 3D h/w acceleration? If that's the case I
will try it. One more thing: how am I going to make
sure that Android uses the hardware codecs rather than the software codecs?
I currently have no board, so without the real hardware, how
am I going to know? As I mentioned earlier, do I have to make any
changes anywhere in the Android code to support hardware acceleration
apart from adding the "libhgl.so" file?

On Nov 26, 3:03 am, Mathias Agopian <pixelflin...@google.com> wrote:
> Hi,
>
> You need to supply "libhgl.so", which must be a regular OpenGL ES
> library. libGLES_CM.so will load it at runtime and defer to it for 3D
> h/w acceleration.
>
> implementing your own h/w accelerated libhgl.so right now is very
> difficult because Android is not ready for this just yet. this is an
> area we're working on as we speak.
>
> mathias
>

Pivotian

Nov 26, 2008, 12:16:13 AM
to android-porting
Also, Mathias, I was unable to find any information related to
"libhgl.so"; could you please guide me on that too? Where can I
get that file?

Dianne Hackborn

Nov 26, 2008, 12:37:29 AM
to android...@googlegroups.com
You are trying to write a driver for hardware you don't have?  That seems...  challenging.
--
Dianne Hackborn
Android framework engineer
hac...@android.com

Note: please don't send private questions to me, as I don't have time to provide private support.  All such questions should be posted on public forums, where I and others can see and answer them.

Mathias Agopian

Nov 26, 2008, 12:42:43 AM
to android...@googlegroups.com
On Tue, Nov 25, 2008 at 9:16 PM, Pivotian <suji...@gmail.com> wrote:
>
> Also Mathias, i was unable to find any information related to
> "libhgl.so", could you please guide me regarding that too. Where i can
> get that file?

There is no information about it at the moment because, as I said,
Android cannot handle arbitrary OpenGL h/w accelerators. This is being
worked on, though.

I should have been clearer; the short answer is "it is not possible
to use h/w accelerated GL with version 1.0".

Mathias

Pivotian

Nov 26, 2008, 1:26:06 AM
to android-porting
Hi Mathias,

The point is that the processor features a built-in, state-of-the-art
3D hardware accelerator (4M triangles/sec) with OpenGL ES 1.1/2.0 and
D3DM API support. It also has built-in multi-format hardware codecs to
enhance the multimedia experience: it supports Standard Definition (SD)
level encoding/decoding of multiple content formats including MPEG-4,
H.263, H.264, and VC-1.

I didn't know it was so complicated to use the capabilities of
our processor, which contains built-in encoding/decoding of multiple
formats, instead of the OpenCore framework provided by Android.
So you mean that with the above hardware configuration it is not
currently possible to make Android use the hardware accelerator
instead of its software codecs?



Pivotian

Nov 26, 2008, 1:54:06 AM
to android-porting
Sorry for some confusing questions. I meant to ask about the 3D
accelerator and the codecs separately but ended up mixing the two
together. Still, it is clear now that Android currently
doesn't support 3D hardware acceleration and that work on it is in
progress.

The next thing is the question about the video codecs, which I raised
in my previous message.

Mathias Agopian

Nov 26, 2008, 3:01:14 AM
to android...@googlegroups.com
On Tue, Nov 25, 2008 at 10:54 PM, Pivotian <suji...@gmail.com> wrote:
>
> sorry for some confusing questions. I was supposed to ask the 3D
> accelerator stuffs and codec stuffs separately but got mixed up both
> the things together in the end. but the things is clear that Currently
> Android won't support 3D hardware acceleration and its work is in
> progress.

Android can support h/w accelerated 3D (the G1 does); however, it is
not easy to integrate an h/w accelerated OpenGL ES driver at the
moment, because there is no "hardware abstraction layer" for it.

The first thing you need to do is write an OpenGL ES driver. Nobody
will do that for you; it is h/w dependent. This driver consists of a
kernel module and a libhgl.so library, which is a full OpenGL
implementation (with EGL); *you* must provide this (or your h/w
vendor must). Once you have that, you need to integrate it with Android's
EGL, but there is no documentation for this (and frankly no clear way
to do it other than "hacking" in various places, mainly libui and
surfaceflinger).

In the hopefully near future, there will be a framework to let you
integrate your own OpenGL drivers with Android (more) easily; it
is simply not ready now.

mathias

Pivotian

Nov 26, 2008, 3:41:57 AM
to android-porting
One more thing, Mathias:

I understand that Android uses OpenCore from PacketVideo to do all
the video encoding and decoding. I want to use the
built-in capabilities of my processor, which provides SD-level
encoding/decoding of multiple content formats including MPEG-4, H.263,
and H.264. How can I use the built-in hardware-based encoding/decoding
instead of Android's software-based OpenCore?




Phil HUXLEY

Nov 26, 2008, 4:11:46 AM
to android...@googlegroups.com
Hi Mathias,

- With respect to integrating OpenGL ES -
- What forms of OpenGL ES surfaces does Android use? (Window? PBuffer?
NativePixmap?).
- Does Android expect to allow accelerated rendering to a UI component AND
to be able to draw to it directly?
- Which Android UI components translate into being EGL Surfaces of one
shape or another?
- Do you have a list of where the integration 'hacks' need to be made in
the Android code to ease integration issues? (This would be *really* useful
- a driver writer knows and understands the EGL/integration issues, but will
be much less aware of any 'engineering modifications' that need applying to
different parts of the Android code).
- Is the source to libhgl around - or is this a third-party entity?
- Does pixelflinger always use OpenGL for surface composition?

- I imagine that hooking up to a 'generic' OpenGL ES driver (requiring no
integration) would result in ugly uploads/downloads being performed (unless
all of the surfaces used for rendering were derived from EGL surfaces).







Mathias Agopian

Nov 26, 2008, 4:38:57 AM
to android...@googlegroups.com
Hi Phil,

As I said, all these questions will be answered when we do have the
framework to do this. I can try to answer some:

> - With respect to integrating OpenGL ES -
> - What forms of OpenGL ES surfaces does Android use? (Window? PBuffer?
> NativePixmap?).

Window only for now. Of course, a 3rd party GL application is free to
use whatever it wants.

> - Does Android expect to allow accelerated rendering to a UI component AND
> to be able to draw to it directly?

SurfaceFlinger expects to be able to do this. If this feature is not
available, it will revert to making copies. There are no clear GL
extensions to do this, so we expect "some" integration work to be
needed for each GL implementation.

> - Which Android UI components translate into being EGL Surfaces of one
> shape or another?

All windows in the system.

> - Do you have a list of where the integration 'hacks' need to be made in
> the Android code to ease integration issues? (This would be *really* useful

Yes I do. However, these integration points will change and/or go away.

> - a driver writer knows and understands the EGL/integration issues, but will
> be much less aware of any 'engineering modifications' that need applying to
> different parts of the Android code).

There are three main issues: first, the EGL implementation needs to know
about the platform's native types (NativeWindowType, defined
in egl_natives.h); second, there is the question of GPU management
(when to power it down, when to reset it if an app goes berserk); and
finally there is the question of passing surfaces across processes.
These are the "hard" problems that are currently not resolved in a
generic way.

> - Is the source to libhgl around - or is this a third-party entity?

No -- libhgl *is* the h/w accelerated GL. This is the library that
Android doesn't want to know about and shouldn't care about; it just
needs to follow a certain protocol (namely EGL 1.0 at the bare
minimum). By definition this cannot be part of Android; it is a "board"-
dependent component (a user-space driver, if you will).

> - Does pixelflinger always use OpenGL for surface composition?

pixelflinger never uses h/w acceleration -- it is just a software
renderer used *when* there is no h/w acceleration. SurfaceFlinger, on
the other hand, Android's composition engine, uses OpenGL ES.


> - I imagine that hooking up to a 'generic' OpenGL ES driver (requiring no
> integration) would result in ugly uploads/downloads being performed (unless
> all of the surfaces used for rendering were derived from EGL surfaces).

No. It would just not work at all. What would a "generic" driver
(whatever that means) do when eglCreateWindowSurface() is called on
it?


Now if you *reallllllllly* want to use your h/w driver now, there is a
(non-efficient, but probably good enough) way to do it. You could do
all the rendering in hardware in "private" buffers and, when
eglSwapBuffers() is called, make a copy (with memcpy).

Have a look at libagl.so (it's part of core Android and therefore
open source); it is our *software* OpenGL ES implementation. In
particular, look at egl.cpp, which is its EGL implementation. Yours
should look something like it. If your GPU is able to render
into "regular" memory you're all set -- it will "just work". If it
doesn't (like most GPUs), you'll need to make a copy.

Note that what I describe here is a STOP GAP until we get the HAL put
together. You won't get good performance with this. Also, be *assured*
that your driver will break when we roll out the new GPU framework and
that no effort will be made to maintain source or binary compatibility
-- you've been warned :-)


I would strongly encourage you to try to make your h/w work with the
"trick" above, where you make a copy of the framebuffer upon
eglSwapBuffers(); it will require you to do at least half of the
work needed for the "real thing".

Mathias

Pivotian

Nov 26, 2008, 4:46:36 AM
to android-porting
One more thing, Mathias:

I understand that Android uses OpenCore from PacketVideo to do all
the video encoding and decoding. I want to use the
built-in capabilities of my processor, which provides SD-level
encoding/decoding of multiple content formats including MPEG-4, H.263,
and H.264. How can I use the built-in hardware-based encoding/decoding
instead of Android's software-based OpenCore?


Phil HUXLEY

Nov 26, 2008, 5:10:35 AM
to android...@googlegroups.com
Thanks Mathias,

- So in the world of enabling GL rendering and software rendering to the
same surface (and related copies), what are the points that require the
buffer to be copied, and does the copy need to go both ways? For example,
if the following happens...

GL rendering
SW rendering          -> implies that the GL buffer is uploaded.
More GL rendering     -> implies that the buffer is downloaded again
(so that the render target contains both the previously rendered GL stuff
and the SW stuff).
More SW rendering     -> implies another upload of the GL buffer.
Update the screen     -> assorted buffers are composited onto the screen.

- There is an EGL extension called EGL_lock_surface, a requirement
for OpenKODE, that enables SW access to a GL buffer (this doesn't mean that
no copies will be going on). I'm not sure how widely supported it is. You
could base your implementation on this extension (OpenKODE is gaining
momentum); driver writers will either strive to support it
efficiently or use copies (which would probably be needed anyway). The
Lock/Unlock of a surface demarcates when it's possible to use one over the
other.

- Are there any optimisations that kick in if the GL window is full-screen,
so that compositing becomes nothing more than a buffer swap? In the
quality-games world, this is actually what you want.

- Thanks for the pointer w.r.t. getting going if we reallllly want to; the
performance might be ugly though.

- Do you have an ETA for the OpenGL ES HAL? It's not on the published
roadmap.

Thanks,
Phil.








Pivotian

Nov 26, 2008, 5:20:28 AM
to android-porting
Sorry, Mathias, for repeating my question, but every time I post it,
it gets buried under other questions and escapes your attention. I
have some doubts regarding the video codecs:

Mathias Agopian

Nov 26, 2008, 5:25:55 AM
to android...@googlegroups.com
Hi,

I don't know the answer to these questions.

Mathias

Pivotian

Nov 26, 2008, 5:30:08 AM
to android-porting
No problem, Mathias; you already provided some valuable information
regarding the 3D accelerator, thanks for that.
Since you work at Google, perhaps you could answer these questions with
the help of someone else there who is strong in the area of
video codecs ;)


Mathias Agopian

Nov 26, 2008, 5:30:12 AM
to android...@googlegroups.com
On Wed, Nov 26, 2008 at 2:10 AM, Phil HUXLEY <phil....@3dlabs.com> wrote:
>
> Thanks Mathias,
>
> - So in the world of enabling GL rendering and software rendering to the
> same surface (and related copies), what are the points that require the
> buffer to be copied and does the copy need to go both ways - for example if
> the following happens...
>
> GL rendering
> SW rendering -> Implies that the GL buffer is uploaded.
> More GL rendering.. -> Implies that the Buffer is downloaded
> again (so that the render target contains both Previously rendered GL stuff
> and SW stuff)
> More SW rendering -> Implies another upload of the GL buffer
> Update the screen -> Assorted buffers are composited onto the screen.

Basically you'd need to hook up eglWaitGL() and eglWaitNative()
properly. As I said, it gets ugly.

Android requires that you be able to draw into your surfaces with
the CPU, and that these surfaces can be used as textures (in another
process). Generally, we don't need to be able to draw into a given
surface with both the CPU and the GPU, but the current API doesn't allow
us not to support this. The new API will.

> - There is an EGL extension called EGL_lock_surface which is a requirement
> for OpenKODE, that enables sw access to a GL buffer (this doesn't mean that
> no copies will be going on). Not sure how widely supported this is. You
> could base your implementation on using this extension (OpenKODE is gaining
> momentumn) - and driver writers will either strive to support it
> efficiently, or use copies (which would probably be needed anyway). The
> Lock/Unlock of a surface demarks when it's possible to use one over the
> other.

This sounds like a good idea.

> - Are there any optimisations that kick in if the GL window is full-screen
> so that compositing becomes nothing more than a buffer swap? In the
> quality games world, this is actually what you want.

There are not at the moment, but such an optimization is planned. It
will be implemented in SurfaceFlinger.

> - Thanks for the pointer w.r.t. getting going if we reallllly want to, the
> performance might be ugly though.

It may not be that bad, depending on your h/w. If your screen is not
too big (QVGA or HVGA) and you have good bandwidth between the
GPU and main memory, the copy can be cheap; it really depends on your
h/w. A 2 ms overhead, for instance, would be acceptable; when you're
approaching 5 or 6 ms, it starts getting too high.


> - Do you have an ETA for the OpenGL ES HAL ? It's not on the published
> roadmap.


No ETA, but it is high on the list of priorities.

Mathias

Phil HUXLEY

Nov 26, 2008, 8:44:28 AM
to android...@googlegroups.com
An API that doesn't require mixed-mode rendering gets my vote (ask anyone
who had to wrestle with JSR-184 in the Java world: there was one instance
of a benchmark where drawing the 'fps' value using Java effectively halved
the frame rate achieved; not many pixels touched, but ugly
uploads/downloads). Presumably an application would get a special GL
window that could not then be drawn to using software? GL programmers are
well used to avoiding mixed-mode rendering; it has always been a
significant performance issue. It would be better to composite UI stuff on
top of GL stuff using different windows. The issue of making a GL app go
fast is separate from making the composition work well.

That, coupled with a full-screen optimisation, will make a significant
performance difference to an OpenGL ES game.

ETA - does high priority mean sometime in H1 2009?







Dave Sparks

Nov 26, 2008, 11:19:01 AM
to android-porting
You have two choices for taking advantage of your h/w acceleration:

1. Integrate your codecs into the OpenCore framework. You can do this
using the existing OpenMAX decoder node, or you can adapt one of PV's
native decoder nodes to work with your hardware.

2. Implement your own media player (MediaPlayerInterface.h). If your
hardware vendor already has hardware codecs integrated into another
media framework (e.g. gstreamer), you could write an adapter class
that sits on top of the media framework.

Nimit Manglick

Feb 9, 2009, 1:31:48 AM
to android...@googlegroups.com
Hi,

I have Android working on a 2.6.24 kernel on a TI OMAP 3530 EVM; now I
want to enable hardware acceleration on it.

As this is a fairly old thread (more than 2 months old), what are the
current steps to enable hardware acceleration on my OMAP 3530?

As I understood from this thread, my vendor (TI here) has to provide the
hardware implementation of OpenGL as libhgl.so plus the driver. Other
than this, what else do I need to do?

Thanks & Regards
Nimit

Mathias Agopian

Feb 9, 2009, 1:38:06 AM
to android...@googlegroups.com
Android is not ready at the moment to work with a different GPU than
that of the G1. We're working on it :-)

Mathias

Nimit Manglick

Feb 9, 2009, 2:21:05 AM
to android...@googlegroups.com
Hi Mathias,

Thanks for the quick response.

So the bottom line is:

Currently Android can't use the hardware acceleration of any hardware
other than the G1's. Is that right?

And work is going on to make it portable, just like any other component,
through a HAL interface? Right?

If yes, when will such code be available as open source?

Thanks & Regards
Nimit

Mathias Agopian

Feb 9, 2009, 2:42:41 AM
to android...@googlegroups.com
On Sun, Feb 8, 2009 at 11:21 PM, Nimit Manglick <nimita...@gmail.com> wrote:
> Hi Mathias,
>
> Thanx for the quick response.
>
> So bottom line is :-
>
> Currently android cant use hardware acceleration of any other hardware other
> than G1.
> Is it ??

Not without a lot of pain, and actually modifying several elements of
the framework.

> And work is going on to make it portable just like any other component like
> its HAL interface ?? Right ??

Correct.

> If yes then when can such code be available to open source?

That's one of my top priorities; unfortunately I don't know when it'll
be ready. "As soon as possible" is the best answer I can give.

Mathias

Nimit Manglick

Feb 9, 2009, 3:13:53 AM
to android...@googlegroups.com
OK.

So can you give me a brief idea of which components / layers /
frameworks need to be modified in Android to support h/w acceleration
(if I decide to modify it myself)?

Nimit Manglick

Feb 9, 2009, 3:59:18 AM
to android...@googlegroups.com
One more point.

If I somehow make it work now by tweaking here and there in Android,
will this work become obsolete with future releases of Android?

Or will Android maintain backward compatibility even after the HAL is
released?

Mathias Agopian

Feb 9, 2009, 4:03:59 AM
to android...@googlegroups.com
On Mon, Feb 9, 2009 at 12:59 AM, Nimit Manglick <nimita...@gmail.com> wrote:
> One more point.
>
> If somehow i made it work now by tweaking here and there in android,
>
> will this work of mine become obsolete with future releases of android ??

Most definitely yes.

> or will Android maintain its backward compatibility even after it
> releases the HAL?

No, Android doesn't maintain any form of backward compatibility for
non-public APIs and HAL modules.

Mathias

Nimit Manglick

Feb 9, 2009, 4:31:03 AM
to android...@googlegroups.com
Thanks, Mathias.

Is there any document which explains the graphics architecture /
design of Android? I mean the complete flow from the application down
to the kernel, with OpenGL in between.

I have struggled with it a lot, but the complete picture is still not
clear to me :(

Regards
Nimit

Mathias Agopian

Feb 9, 2009, 4:40:02 AM
to android...@googlegroups.com
On Mon, Feb 9, 2009 at 1:31 AM, Nimit Manglick <nimita...@gmail.com> wrote:
> Thanx Mathias.
>
> is there any document which explains the graphical architecture / design of
> android.?

Unfortunately no. There will be one once the new architecture is in place.

> I mean complete flow from Application till the kernel obviously OpenGl in
> between. ?
>
> I have struggled with it a lot but still the complete picture is not yet
> clear to me :(

Mathias