What does "tunnelled video" mean?

einmus

Nov 9, 2014, 9:21:24 PM
to android-...@googlegroups.com
Hi all,
    I'm an Android BSP developer. Since the Android L source code was released a few days ago, we've been busy reviewing its new features and what's required for the BSP. There's a new multimedia feature that I couldn't figure out from the source code and comments alone.

    State of the art video technology with support for HEVC main profile to allow for UHD 4K 10-bit video playback, tunneled hardware video decoding to save power and improved HLS support for streaming

    In the source code, I see there's a new extension added to OMX components: OMX.google.android.index.configureVideoTunnelMode. Its setup parameter is ConfigureVideoTunnelModeParams.
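
    For reference, here is the struct as I read it in frameworks/native/include/media/hardware/HardwareAPI.h:

        // Passed to OMX_SetParameter for the extension index
        // "OMX.google.android.index.configureVideoTunnelMode".
        struct ConfigureVideoTunnelModeParams {
            OMX_U32 nSize;              // IN
            OMX_VERSIONTYPE nVersion;   // IN
            OMX_U32 nPortIndex;         // IN
            OMX_BOOL bTunneled;         // IN
            OMX_U32 nAudioHwSync;       // IN
            OMX_PTR pSidebandWindow;    // OUT
        };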

    What does pSidebandWindow mean? Since it is an 'OUT' parameter, does it mean the OMX component should create it? Something like an overlay device on an external display? If so, how is the system's default display affected, and how does it interact with it?

    What does nAudioHwSync mean? Does it indicate which audio device media playback should be synced to? If so, how is the system's default (AwesomePlayer's) A/V sync process affected?

    Who is going to use this extension? An external program? I cannot find anywhere in the source code that uses it.

   
Any answers would be greatly appreciated!

Ganesh V

Dec 7, 2014, 9:05:59 AM
to android-...@googlegroups.com
Hi,

I am also trying to understand tunneled video playback and have some views on the queries above. I am still trying to get a comprehensive understanding, so my response has some assumptions built into it.

My first assumption is that this setup, i.e. tunneled video playback, is suited to an end hardware device that can support A/V synchronization in hardware. The device that comes to mind is a TV. It is possible that some smartphone or tablet platforms could also support such a feature, but in the absence of examples, I am assuming that the TV is the targeted device/application.

An OMX component has to advertise that it supports the advanced feature "feature-tunneled-playback", as sketched below.
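
From what I can tell, a codec declares this per-entry in the device's media_codecs.xml, and the "feature-" prefix appears when the flag is requested through the codec format. The codec name below is made up; the Feature tag is my reading of what MediaCodecList parses:

    <!-- hypothetical media_codecs.xml entry -->
    <MediaCodec name="OMX.vendor.video.decoder.hevc" type="video/hevc">
        <Feature name="tunneled-playback" />
    </MediaCodec>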

Before proceeding to the setting of the configureVideoTunnelMode parameters, let's consider the "audio-hw-sync" parameter. This is retrieved from AudioFlinger through AudioSystem, i.e. getAudioHwSyncForSession(sessionId). The comments and implementation in AudioFlinger indicate that an underlying hardware mechanism is required for A/V sync. There is one more reference on source.android.com.
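
A minimal sketch of how I imagine the caller wires this up; the API names are from the L sources, but the surrounding player code is invented for illustration:

    // Fetch the HW A/V sync handle for the audio session and hand it to
    // the video decoder through the codec format (sketch only).
    audio_hw_sync_t hwSync =
            AudioSystem::getAudioHwSyncForSession((audio_session_t) sessionId);

    sp<AMessage> format = new AMessage();
    // ... other video format fields (mime, width, height) ...
    format->setInt32("feature-tunneled-playback", 1);     // request tunneled mode
    format->setInt32("audio-hw-sync", (int32_t) hwSync);  // pair with the audio output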

If the OMX component supports video tunneling, ACodec skips allocating output buffers from the native window. AwesomePlayer plays no further role in output-buffer allocation or in A/V synchronization. In other words, synchronization happens purely through a private channel and doesn't involve any of the framework components.
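
Paraphrasing ACodec::configureTunneledVideoPlayback from the L sources (trimmed, error handling shortened):

    // ACodec asks the component for a sideband handle instead of
    // allocating output buffers from the native window...
    native_handle_t *sidebandHandle;
    status_t err = mOMX->configureVideoTunnelMode(
            mNode, kPortIndexOutput, OMX_TRUE, audioHwSync, &sidebandHandle);
    if (err != OK) {
        return err;  // component does not actually support tunneling
    }
    // ...and attaches that handle to the Surface, which forwards it to
    // SurfaceFlinger/HWComposer as a sideband stream.
    return native_window_set_sideband_stream(nativeWindow.get(), sidebandHandle);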

pSidebandWindow is a native_handle to the sideband stream, i.e. a video stream that is communicated directly, or tunneled, from the OMX component to the video sink. In the rendering framework, i.e. Surface / SurfaceFlinger / HWComposer, it becomes more apparent that handling of the sideband stream is completely device dependent.
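
On the component side, I would expect the vendor implementation to fill in the OUT field roughly as follows. This is purely hypothetical vendor code; allocateSidebandHandle() is a made-up name:

    // Hypothetical vendor OMX component handling the tunnel-mode extension.
    ConfigureVideoTunnelModeParams *tunnel =
            (ConfigureVideoTunnelModeParams *) params;
    if (tunnel->bTunneled == OMX_TRUE) {
        mAudioHwSync = tunnel->nAudioHwSync;   // sync source for HW A/V sync
        // Device-specific handle describing the tunneled stream; only the
        // HWComposer on the same device knows how to interpret it.
        tunnel->pSidebandWindow = allocateSidebandHandle();
    }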

BufferQueueProducer and Surface implementations have some interesting comments. In HWComposer, the layer is set to HWC_SIDEBAND. If the HWComposer for some reason can't support HWC_SIDEBAND, it can revert the layer back to HWC_FRAMEBUFFER, in which case only a solid color is displayed. Finally, in SurfaceFlinger's doComposeSurfaces method, we can observe that there is no handling for HWC_SIDEBAND.
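
In HWC terms, I picture the prepare() pass going roughly like this. This is a sketch assuming an HWC v1.4 device; deviceCanScanOutSideband() is a made-up capability check:

    // Sketch of a hypothetical HWC prepare() pass for sideband layers.
    for (size_t i = 0; i < list->numHwLayers; ++i) {
        hwc_layer_1_t *layer = &list->hwLayers[i];
        if (layer->compositionType == HWC_SIDEBAND) {
            // layer->sidebandStream is the handle the OMX component created.
            if (!deviceCanScanOutSideband(layer->sidebandStream)) {
                // Fall back; SurfaceFlinger then shows a solid color here.
                layer->compositionType = HWC_FRAMEBUFFER;
            }
        }
    }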

To summarize, when tunneled video playback is active:
  • The OMX component has advertised support for the feature
  • The complete feature is a vendor-specific implementation; the framework doesn't come into the picture for resource allocation or synchronization
  • A/V sync is handled at the hardware level; support is required in the audio HAL to retrieve the references for it
  • The video decoder pushes each frame directly to the display hardware
  • The display hardware "probably" composes this picture on top of the current picture being displayed and outputs directly to the screen
  • If the underlying hardware cannot support this, a solid color is displayed
It would be nice if another expert could confirm, correct, or augment my understanding above.

--GV