SurfaceFlinger, Frame buffer device, Overlays


Ryan

Feb 25, 2010, 9:24:56 PM
to android-porting
Hi all,
I am trying to understand the different components of Android's
display system and how they work together. I have been sifting
through the source, but I still have a few fundamental questions.
Here is my current understanding of a couple of these components:

SurfaceFlinger: It composes the various Surfaces, or layers,
together. As of 1.6, it seems that EGLDisplaySurface is working with
the frame buffer device (/dev/graphics/fb0) in order to output to the
screen.

Framebuffer device: It's an abstraction of the graphics hardware.
The most direct way to access the display is through the frame buffer
device (e.g. read from it to take a screenshot).
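
To be concrete about what I mean by "read from it": the most bare-bones
version I can think of would be something like the sketch below. I'm
assuming the panel really is behind /dev/graphics/fb0, that the visible
frame starts at offset 0, and that rows are tightly packed; error
handling is trimmed.

#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <sys/mman.h>
#include <linux/fb.h>
#include <cstdio>

// Bare-bones "screenshot" by dumping the raw framebuffer contents.
// Assumes no panning/double-buffer offset and tightly packed rows;
// real code should also consult FBIOGET_FSCREENINFO for line_length.
int main() {
    int fb = open("/dev/graphics/fb0", O_RDONLY);
    if (fb < 0) { perror("open fb0"); return 1; }

    fb_var_screeninfo vinfo;
    if (ioctl(fb, FBIOGET_VSCREENINFO, &vinfo) < 0) { perror("vscreeninfo"); return 1; }

    size_t bytes = vinfo.xres * vinfo.yres * vinfo.bits_per_pixel / 8;
    void* pixels = mmap(nullptr, bytes, PROT_READ, MAP_SHARED, fb, 0);
    if (pixels == MAP_FAILED) { perror("mmap"); return 1; }

    FILE* out = fopen("/data/screenshot.raw", "wb");
    fwrite(pixels, 1, bytes, out);   // raw pixels; convert offline using vinfo
    fclose(out);

    munmap(pixels, bytes);
    close(fb);
    return 0;
}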

Hardware Overlays: The SurfaceFlinger punches a hole in the window
surface in order to let the hardware overlay compose its frame data
directly to the screen. They are used with image capture and hardware
acceleration devices.

With this in mind, I am unclear about the following:

1. What is responsible for composing the overlay image with the main
surface? From what I've read it seems like the overlay implementation
would, but I've yet to see anything directly address this.

2. If so, does the overlay driver write to the frame buffer device,
or does it output to the screen in a different way? Is this
implementation-dependent?

3. My ultimate concern: if hardware overlays are in use and I take a
screenshot by reading from the frame buffer device, will I see the
"hole" punched out by the SurfaceFlinger, or will I see the same
image as seen on my screen?

4. What are the use-cases for overlays? I assume they are used to
allow hardware to handle the frame data manipulation (rather than
software), but I still don't know why/when an overlay would be
necessary.

Answers to these questions and any more information as to how
surfaceflinger, hardware overlays, and the frame buffer device
interact would be very much appreciated!

Thanks,
Ryan

Ryan

Mar 1, 2010, 9:55:57 AM
to android-porting
Is there a different group I should be posting this to?

Gilad Ben-Yossef

Mar 3, 2010, 3:59:44 AM
to ry...@scudellari.com, android-porting

Ryan wrote:

Hi all,
I am trying to understand the different components of Android's
display system and how they work together.  I have been sifting
through the source, but I still have a few fundamental questions.
Here is my current understanding of a couple of these components:

SurfaceFlinger:  It composes the various Surfaces, or layers,
together.  As of 1.6, it seems that EGLDisplaySurface is working with
the frame buffer device (/dev/graphics/fb0) in order to output to the
screen.

Framebuffer device:  It's an abstraction of the graphics hardware.
The most direct way to access the display is through the frame buffer
device (e.g. read from it to take a screenshot).

Hardware Overlays:  The SurfaceFlinger punches a hole in the window
surface in order to let the hardware overlay compose its frame data
directly to the screen.  They are used with image capture and hardware
acceleration devices.

With this in mind, I am unclear about the following:

1.  What is responsible for composing the overlay image with the main
surface?  From what I've read it seems like the overlay implementation
would, but I've yet to see anything directly address this.
  
Well, check out the copybits usage in the SurfaceFlinger source code (not technically an overlay, but it uses the same mechanism), or look at how the camera app displays a video preview of the camera view.
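
To give a feel for that interface, a stretch blit through the copybit HAL
looks roughly like the sketch below. This is only illustrative: the struct
fields, parameter names and call order are paraphrased from the copybit.h
of that era, so please check the header in your own tree before trusting
any of it.

#include <hardware/hardware.h>
#include <hardware/copybit.h>

// The HAL iterates a "region" of destination rectangles; this one hands
// back a single rect covering the whole destination, then stops.
struct SingleRectRegion {
    copybit_region_t region;   // must be first: the HAL only sees this part
    copybit_rect_t rect;
    bool done;
};

static int next_rect(copybit_region_t const* self, copybit_rect_t* out) {
    SingleRectRegion* r = (SingleRectRegion*)self;
    if (r->done) return 0;
    *out = r->rect;
    r->done = true;
    return 1;
}

// Sketch only: stretch-blit one already-allocated image into another via
// the copybit HAL (the 2D engine the copybits path drives).
int stretch_preview(copybit_image_t const* src, copybit_rect_t const* src_rect,
                    copybit_image_t const* dst, copybit_rect_t const* dst_rect) {
    hw_module_t const* module = nullptr;
    if (hw_get_module(COPYBIT_HARDWARE_MODULE_ID, &module) != 0)
        return -1;                                  // no 2D blitter on this device

    copybit_device_t* dev = nullptr;
    if (copybit_open(module, &dev) != 0)
        return -1;

    SingleRectRegion clip = { { next_rect }, *dst_rect, false };

    dev->set_parameter(dev, COPYBIT_ROTATION_DEG, 0);    // parameter names may differ per release
    dev->set_parameter(dev, COPYBIT_PLANE_ALPHA, 255);   // fully opaque
    int err = dev->stretch(dev, dst, src, dst_rect, src_rect, &clip.region);

    copybit_close(dev);
    return err;
}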

2.  If so, does the overlay driver write to the frame buffer device,
or does it output to the screen in a different way?  Is this
implementation-dependent?
  
AFAIK, it writes to the frame buffer memory via DMA, although I guess an alternate implementation is possible.

3.  My ultimate concern: if hardware overlays are in use and I take a
screenshot by reading from the frame buffer device, will I see the
"hole" punched out by the SurfaceFlinger, or will I see the same
image as seen on my screen?
  
If it's not in the frame buffer, it's not on the screen, at least on the devices I have looked through so far.

4.  What are the use-cases for overlays?  I assume they are used to
allow hardware to handle the frame data manipulation (rather than
software), but I still don't know why/when an overlay would be
necessary.
  
Take, for example, translating the data coming from the camera device and
displaying the preview on the screen. The pixel formats of the camera and
the frame buffer do not match, so you need to convert, and probably also
stretch, the image.

You can do it in software (in fact, there is a software implementation),
but a hardware device can do it much faster and without eating valuable
CPU time, hence the use of an overlay.
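
To get a feel for the software cost, a naive per-pixel conversion looks
something like the sketch below. I'm assuming NV21 camera output and an
RGB565 frame buffer here, which is common but by no means universal.

#include <cstdint>
#include <algorithm>

static inline uint8_t clamp_u8(int v) { return static_cast<uint8_t>(std::min(255, std::max(0, v))); }

// Naive CPU conversion of an NV21 (YUV 4:2:0, VU interleaved) camera frame
// to RGB565. Every preview frame touches every pixel, which is exactly the
// work an overlay or 2D engine would do for "free".
void nv21_to_rgb565(const uint8_t* yuv, uint16_t* rgb, int width, int height) {
    const uint8_t* y_plane  = yuv;
    const uint8_t* vu_plane = yuv + width * height;   // V and U interleaved, half resolution

    for (int row = 0; row < height; ++row) {
        for (int col = 0; col < width; ++col) {
            int y = y_plane[row * width + col];
            int v = vu_plane[(row / 2) * width + (col & ~1)]     - 128;
            int u = vu_plane[(row / 2) * width + (col & ~1) + 1] - 128;

            // Integer approximation of the usual YUV->RGB matrix.
            int r = clamp_u8(y + ((v * 359) >> 8));
            int g = clamp_u8(y - ((u * 88 + v * 183) >> 8));
            int b = clamp_u8(y + ((u * 454) >> 8));

            rgb[row * width + col] =
                static_cast<uint16_t>(((r & 0xF8) << 8) | ((g & 0xFC) << 3) | (b >> 3));
        }
    }
}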

Answers to these questions and any more information as to how
surfaceflinger, hardware overlays, and the frame buffer device
interact would be very much appreciated!

  
I have a question: what are you trying to do? It sounds interesting.

Gilad


-- 
Gilad Ben-Yossef
Chief Coffee Drinker & CTO
Codefidence Ltd.

Web:   http://codefidence.com
Cell:  +972-52-8260388
Skype: gilad_codefidence
Tel:   +972-8-9316883 ext. 201
Fax:   +972-8-9316884
Email: gi...@codefidence.com

Check out our Open Source technology and training blog - http://tuxology.net

	"That is not dead which can eternal lie.
	 And with strange aeons even death may die."

steve2641

Apr 1, 2010, 6:08:11 PM
to android-porting
In general terms, an "overlay" is part of a display controller
hardware block, used primarily to combine multiple streams of
image frames into a single output frame that is sent to a display
panel of some sort. The Android abstraction for this is a
hardware-specific library that implements the interface defined in
the overlay.h file, nested deep within the root hardware folder. In
this abstraction, the surface flinger provides positional control
information and some other entity provides the frame data. The
typical frame data providers are the Camera HAL and the Video
Playback "HAL".
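
Roughly, that split looks like the sketch below. I'm paraphrasing the
2.x-era overlay.h from memory, so the exact function names, types and
format constants may not match your tree; treat it as an illustration of
who calls what rather than something to copy verbatim.

#include <hardware/hardware.h>
#include <hardware/overlay.h>
#include <cstring>

// SurfaceFlinger side: owns the "control" half of the overlay HAL.
overlay_t* flinger_creates_overlay(overlay_control_device_t* ctl) {
    // Ask the hardware for an overlay plane of the right size/format.
    overlay_t* ov = ctl->createOverlay(ctl, 640, 480, OVERLAY_FORMAT_YCbCr_420_SP);
    if (ov == nullptr) return nullptr;

    // SurfaceFlinger only supplies position/size; it also punches a
    // transparent hole in the UI layer at the same coordinates.
    ctl->setPosition(ctl, ov, /*x=*/0, /*y=*/80, /*w=*/640, /*h=*/480);
    return ov;
}

// Producer side (camera HAL or video playback): owns the "data" half and
// never goes through SurfaceFlinger for the pixels themselves.
void producer_side(overlay_data_device_t* data, overlay_handle_t handle,
                   const void* yuv_frame, size_t frame_bytes) {
    // Done once when the stream starts: bind to the overlay created above
    // (the handle travels from SurfaceFlinger to the producer via binder).
    data->initialize(data, handle);

    // Then, per frame:
    overlay_buffer_t buf;
    if (data->dequeueBuffer(data, &buf) != 0)   // grab a buffer not being scanned out
        return;
    void* dst = data->getBufferAddress(data, buf);
    memcpy(dst, yuv_frame, frame_bytes);        // or the camera/decoder DMAs here directly
    data->queueBuffer(data, buf);               // display controller composes it with the UI plane
}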

To answer your questions directly:

1) The hardware performs the frame composition, and the overlay
library implementation performs the hardware setup.
2) This is implementation-dependent. Some overlay implementations are
based on a secondary frame buffer device; others have more
proprietary means.
3) More than likely you would see a black hole where the overlay data
was supposed to show up. This occurs because many display controllers
don't provide access to the hardware-composed output; it goes directly
to the display and never hits a memory buffer you can read from. So
the screenshot code generally will only read from the UI input buffer,
and the overlay contents are missing. Not all hardware works this
way, but much of it does.
4) The main use cases are camera preview and video playback. On most
hardware, it is more power-efficient to render the higher-frame-rate
data to the display via hardware overlays.

Steve.


Girish

Sep 27, 2011, 4:06:03 PM
to steve2641, android...@googlegroups.com
Hi Steve,

Is it not possible to tap the composed screen data at the framework level?
If yes, where exactly in the framework can I tap this data?

Regards
Girish
