Hi,
I am also trying to understand Tunneled Video Playback and have some views on the queries below. I am still working toward a comprehensive understanding, so my response has some assumptions built into it.
My first assumption is that this setup, i.e. tunneled video playback, is suited to an end HW device that can support hardware A/V synchronization. The device that first comes to mind is a TV. Some smartphone or tablet platforms could possibly also support such a feature, but in the absence of examples, I am assuming that the TV is the targeted device/application.
An OMX component has to advertise that it supports the advanced feature "feature-tunneled-playback", as shown here.
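As a concrete illustration, an application can check for this advertisement through the public Java API. Here is a minimal sketch (my assumptions: an API 21+ device; the class/method names and the choice of H.264 are mine, purely for illustration):

```java
import android.media.MediaCodecInfo;
import android.media.MediaCodecList;

public class TunnelSupportCheck {
    // Returns the name of the first H.264 decoder that advertises tunneled
    // playback (the Java-side view of "feature-tunneled-playback"), or null.
    public static String findTunneledAvcDecoder() {
        MediaCodecList list = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
        for (MediaCodecInfo info : list.getCodecInfos()) {
            if (info.isEncoder()) continue;
            for (String type : info.getSupportedTypes()) {
                if (!type.equalsIgnoreCase("video/avc")) continue;
                MediaCodecInfo.CodecCapabilities caps = info.getCapabilitiesForType(type);
                if (caps.isFeatureSupported(
                        MediaCodecInfo.CodecCapabilities.FEATURE_TunneledPlayback)) {
                    return info.getName();
                }
            }
        }
        return null;
    }
}
```

On devices where no decoder reports this feature, the regular (non-tunneled) path with framework-side A/V sync is the only option, which fits my TV assumption above.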
Before proceeding with the setting of the configureVideoTunnelMode parameters, let's consider the "audio-hw-sync" parameter. This is retrieved from AudioFlinger through AudioSystem, i.e. getAudioHwSyncForSession(sessionId). The comments/implementation in AudioFlinger indicate that an underlying HW mechanism is required for A/V sync. One more reference on source.android.com can be found here.
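At the application level, the entry point into this machinery appears to be an audio session id plus an AudioTrack flagged for hardware A/V sync, which the framework can then map to the HW sync source via getAudioHwSyncForSession. A sketch under that assumption (API 23+ for AudioTrack.Builder; the helper names are mine):

```java
import android.content.Context;
import android.media.AudioAttributes;
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;

public class HwSyncAudioSetup {
    // Creates an AudioTrack tied to a session that AudioFlinger can map to a
    // hardware sync source; the same session id is later given to the video codec.
    public static AudioTrack createHwSyncTrack(Context context) {
        AudioManager am = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
        int sessionId = am.generateAudioSessionId(); // shared A/V session id
        AudioAttributes attrs = new AudioAttributes.Builder()
                .setUsage(AudioAttributes.USAGE_MEDIA)
                .setContentType(AudioAttributes.CONTENT_TYPE_MOVIE)
                .setFlags(AudioAttributes.FLAG_HW_AV_SYNC) // request the HW A/V sync path
                .build();
        AudioFormat fmt = new AudioFormat.Builder()
                .setSampleRate(48000)
                .setEncoding(AudioFormat.ENCODING_PCM_16BIT)
                .setChannelMask(AudioFormat.CHANNEL_OUT_STEREO)
                .build();
        return new AudioTrack.Builder()
                .setAudioAttributes(attrs)
                .setAudioFormat(fmt)
                .setSessionId(sessionId)
                .build();
    }
}
```

The same session id (track.getAudioSessionId()) would then be handed to the video decoder, as in the next sketch.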
If the OMX component supports video tunneling, ACodec skips the allocation of output buffers from the native window. AwesomePlayer doesn't play any further role in the allocation of the output buffers or in A/V synchronization. In other words, the synchronization happens purely through a private channel and doesn't involve any of the framework components.
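Putting the pieces together, the app-side decoder configuration might look roughly as follows. This is a sketch under my assumption that the "feature-tunneled-playback" and "audio-session-id" format keys are the way to request tunneling from the Java layer, not a verified implementation:

```java
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;

public class TunneledDecoderSetup {
    // Configures a video decoder in tunneled mode. Once started, decoded frames
    // travel over the private/sideband channel to the display hardware and are
    // released against the HW audio clock; the app never touches output buffers.
    public static MediaCodec configureTunneled(String decoderName, Surface surface,
            int width, int height, int audioSessionId) throws Exception {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
        format.setInteger("feature-tunneled-playback", 1);      // request tunneling
        format.setInteger("audio-session-id", audioSessionId);  // shared A/V session
        MediaCodec codec = MediaCodec.createByCodecName(decoderName);
        codec.configure(format, surface, null /* crypto */, 0 /* flags */);
        return codec;
    }
}
```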
pSideBandStream is a native_handle to the sideband stream, i.e. a video stream that is communicated directly, or tunneled, from the OMX component to the video sink. In the rendering framework, i.e. Surface / SurfaceFlinger / HwComposer, it becomes more apparent that the handling of the sideband stream is completely device dependent.
The BufferQueueProducer and Surface implementations have some interesting comments. In HwComposer, the layer's composition type is set to HWC_SIDEBAND. If the HWComposer for some reason can't support HWC_SIDEBAND, it can revert the layer back to HWC_FRAMEBUFFER, where only a solid color is displayed, as indicated. Finally, in SurfaceFlinger's doComposeSurfaces method, we can observe that there is no handling for HWC_SIDEBAND.
To summarize, when tunneled video playback is active:
- The OMX component has advertised support for the feature
- The complete feature is a vendor-specific implementation; the framework doesn't come into the picture for resource allocation or synchronization
- A/V sync is handled at the hardware level; support is required in the Audio HAL to retrieve references to the HW sync source
- The video decoder pushes the current frame directly to the display hardware
- The display hardware "probably" composes this picture on top of the current picture being displayed and outputs directly to the screen
- If the underlying HW is unable to support this, a solid color is displayed
It would be nice if other experts could confirm my understanding above, or help to correct and augment it.
--GV