Video using H.264 on iOS

DavidW

Mar 18, 2014, 1:58:13 PM
to bigblueb...@googlegroups.com
After attempting many different scenarios, I've not been able to successfully display an H.264 video stream on iOS. I was curious how/whether anyone else has been successful doing this.

If I turn off H.264 on the server, and attempt this using the same app(s), it works great.

I've tried this using the latest bbb-air-client code and also using a custom app I've written just to subscribe to the specific video stream.  I've added the H264VideoStreamSettings object to the NetStream, using:

var streamSettings:H264VideoStreamSettings = new H264VideoStreamSettings();
streamSettings.setProfileLevel(H264Profile.BASELINE, H264Level.LEVEL_3_1);
_recvStream.videoStreamSettings = streamSettings;
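
For reference, the full subscribe path in my test app looks roughly like this (the connection URI, stream name, and video size below are placeholders, not the real values):

import flash.events.NetStatusEvent;
import flash.media.H264Level;
import flash.media.H264Profile;
import flash.media.H264VideoStreamSettings;
import flash.media.Video;
import flash.net.NetConnection;
import flash.net.NetStream;

private var _conn:NetConnection;
private var _recvStream:NetStream;

private function connect():void {
    _conn = new NetConnection();
    _conn.addEventListener(NetStatusEvent.NET_STATUS, onNetStatus);
    _conn.connect("rtmpt://my-bbb-server/video/meeting-id");   // placeholder URI
}

private function onNetStatus(e:NetStatusEvent):void {
    if (e.info.code == "NetConnection.Connect.Success") {
        var streamSettings:H264VideoStreamSettings = new H264VideoStreamSettings();
        streamSettings.setProfileLevel(H264Profile.BASELINE, H264Level.LEVEL_3_1);

        _recvStream = new NetStream(_conn);
        _recvStream.client = {};                 // ignore onMetaData/onPlayStatus callbacks
        _recvStream.videoStreamSettings = streamSettings;

        var video:Video = new Video(320, 240);   // placeholder size
        video.attachNetStream(_recvStream);
        addChild(video);                         // assumes this code lives in a Sprite

        _recvStream.play("streamName");          // placeholder stream name
    }
}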

Settings in config.xml:

<module name="VideoconfModule"

      …

      enableH264="true"

      h264Level="3.1"

      h264Profile="baseline"

      …

</module>

I've attempted baseline/3.1, baseline/3.0, and main/2.1 (the default for BBB), but none of them display the video. I've tried using both flash.media.Video and StageVideo in the client. The Apple guidelines "seem" to state these H.264 settings are valid for the device I'm testing on (iPhone 5), but nothing works. I'm using the Flex 4.12 SDK.
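
The StageVideo variant I tried looks roughly like this (the viewport size is a placeholder, and it assumes the same class and _recvStream as the snippet above):

import flash.events.StageVideoAvailabilityEvent;
import flash.geom.Rectangle;
import flash.media.StageVideo;
import flash.media.StageVideoAvailability;

private function useStageVideo():void {
    stage.addEventListener(StageVideoAvailabilityEvent.STAGE_VIDEO_AVAILABILITY,
                           onStageVideoAvailability);
}

private function onStageVideoAvailability(e:StageVideoAvailabilityEvent):void {
    if (e.availability == StageVideoAvailability.AVAILABLE && stage.stageVideos.length > 0) {
        var sv:StageVideo = stage.stageVideos[0];
        sv.viewPort = new Rectangle(0, 0, 320, 240);   // placeholder viewport
        sv.attachNetStream(_recvStream);               // same NetStream as above
    }
}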

Again, with H.264 turned off in BBB through config.xml (enableH264="false"), the clients I'm testing with work fine. Just to make sure the client is delivering the right video settings, I've traced into the client using the FB debugger (in VideoProxy.as), and it is definitely reading the configuration correctly and setting the appropriate H.264 settings.

Has anyone gotten an H.264 video stream to work on iOS?

Also, just curious, as I beat my head against the wall on this: in the mobile use case for BBB, it seems only one stream will be subscribed to by the mobile client. Assuming only one stream is viable (i.e. the presenter's) in the mobile use case, it seems feasible that 1) the stream doesn't need to be H.264 (RTMPT works to iOS/Android), and 2) separate streams could easily be published: one in H.264 for web-only clients (i.e. non-Flash-based clients), and one over RTMPT for real application clients (i.e. Flash-based PC clients and mobile app clients). I also believe you could combine all the streams on the Red5 side into one stream, which would make the video-conference scenario viewable to all on mobile (I believe someone is working on this). Anyway, just curious what the thoughts are here.

Chad Pilkey

Mar 18, 2014, 2:04:20 PM
to bigblueb...@googlegroups.com
Adobe AIR doesn't support H.264 on iOS.

The problem with providing multiple versions of a stream is that the secondary streams have to be created, and that takes both computational resources and time, so the stream will be delayed. The mobile client also allows you to view any user's stream by going to their profile from the participants list and selecting "Show camera", so every stream would need to be duplicated.

Combining the streams into one is also a possibility, but creating the combined stream adds delay and uses up resources. You would also lose the ability to lay the streams out so that they use as much space as possible, because you are then stuck at a specific aspect ratio.

DavidW

Mar 18, 2014, 2:41:04 PM
to bigblueb...@googlegroups.com

Chad:
Thanks for posting. I have read over Adobe's comment regarding this before:

For H.264 video, the iOS APIs for video playback accept only a URL to a file or
stream. You cannot pass in a buffer of H264 video data to be decoded. Depending
on your video source, pass the appropriate argument to NetStream.play() as
follows:   

- For progressive playback: Pass the URL of the file (local or remote).

- For streaming video: Pass the URL of a playlist in Apple's HTTP Live Streaming
(HLS) format. This file can be hosted by any server; Flash Media Server 4.5 and
higher has the advantage of being able to encode streams in HLS format.

But I guess I was just not interpreting the Adobe docs correctly, as in "...accept a stream...", which, to me, says it will "accept a stream" if you use the "appropriate argument"! ;-)  Guess I was being overly optimistic.  It just seems very strange to have APIs for H.264 and not be able to decode only on iOS.  And if you break down RTMPT, it's just a series of HTTP requests, similar to HLS, so I don't get the logic of how Apple could detect the difference.  The Adobe API is the one that is closing the stream, not the Apple device.
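
(For completeness, the only H.264 path those docs actually seem to describe on iOS is handing NetStream.play() an HLS playlist URL, something like the sketch below; the URL is made up, since Red5/BBB doesn't publish HLS today.)

import flash.media.Video;
import flash.net.NetConnection;
import flash.net.NetStream;

var nc:NetConnection = new NetConnection();
nc.connect(null);                         // a null connection is used for HTTP/HLS playback
var ns:NetStream = new NetStream(nc);
ns.client = {};                           // ignore onMetaData/onPlayStatus callbacks
var video:Video = new Video();
video.attachNetStream(ns);
addChild(video);                          // assumes a Sprite/UIComponent parent
ns.play("http://example.com/live/presenter.m3u8");   // hypothetical HLS playlist URL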

Regarding the single stream scenario, seems like delay wouldn't be that much of an issue if the audio were also delayed.  (Sorry, I'm still trying to understand the whole audio/video streaming space, so, I definitely don't know all the implications!)  But, I get it regarding your point about being stuck at a specific aspect ratio, was just trying to think how to be a minimum acceptable use case, given the constraints.

Thanks for the feedback!




Calvin Walton

Mar 18, 2014, 4:04:46 PM
to bigblueb...@googlegroups.com
On Tue, 2014-03-18 at 11:41 -0700, DavidW wrote:


> For H.264 video, the iOS APIs for video playback accept only a URL to a file or
> stream. You cannot pass in a buffer of H264 video data to be decoded. Depending
> on your video source, pass the appropriate argument to NetStream.play() as
> follows:
>
> - For progressive playback: Pass the URL of the file (local or remote).
>
> - For streaming video: Pass the URL of a playlist in Apple's HTTP Live Streaming
> (HLS) format. This file can be hosted by any server; Flash Media Server 4.5 and
> higher has the advantage of being able to encode streams in HLS format.
> but, guess I was just not interpreting the Adobe docs correctly, as in
> "..accept a stream…", which, to me, says it will "accept a stream", if
> you use "appropriate arguments…"! ;-) Guess I was being overly
> optimistic. It just seems very strange to have API's for H264 and not
> be able to decode only on iOS. And, if you break down rtmpt, it's
> just a series of http requests, similar to HLS, so, I don't get the
> logic why Apple could detect the difference. The Adobe API is the one
> that is closing the stream, not the Apple device.

The issue is that you don't provide a "stream" to the Apple APIs; you
only provide a remote URL or local filename. The Apple API is then
responsible for connecting to the server and streaming the video; the
details are all hidden from the application. The Apple API only supports
basic HTTP and HLS protocols.

Some of the Red5 developers have been looking into adding support
for producing HLS streams. If that becomes a possibility, then the
Android application could keep using RTMP, but on Apple devices we
could have it use the URL of the HLS stream instead.
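
If that ever materializes, the client-side switch could be something
like the rough sketch below (the HLS URL pattern is purely hypothetical,
since Red5 doesn't produce these streams yet):

import flash.system.Capabilities;

// Pick a playback source per platform. Capabilities.version starts with
// "IOS" on Apple devices and "AND" on Android.
function playbackSourceFor(streamName:String):String {
    if (Capabilities.version.indexOf("IOS") == 0) {
        // Apple's player only accepts a URL, so point it at an HLS
        // playlist (hypothetical endpoint).
        return "http://bbb-server.example.com/hls/" + streamName + ".m3u8";
    }
    // Everything else keeps subscribing to the RTMP(T) stream by name.
    return streamName;
}

// Note: the HLS case needs a NetStream created on a NetConnection that
// was connected with connect(null), not on the existing RTMP video
// connection.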

I have no idea about the current state of that project, and it will
probably require some amount of work to get BigBlueButton running on a
newer version of Red5.

--
Calvin Walton <calvin...@kepstin.ca>

DavidW

Mar 19, 2014, 9:35:25 AM
to bigblueb...@googlegroups.com
> The issue is that you don't provide a "stream" to the Apple APIs; you
> only provide a remote URL or local filename. The Apple API is then
> responsible for connecting to the server and streaming the video; the
> details are all hidden from the application. The Apple API only supports
> basic HTTP and HLS protocols.

I don't understand "only supports basic HTTP and HLS protocols". For a live stream, you can use RTMPT with a non-H.264 (Flash Video) stream on iOS from Red5/BBB, and it works well.

And if you look at the raw bytes being transferred in those HTTP requests, it's ActionScript making the calls, not Xcode/Apple calls. I assume you mean this scenario (RTMPT/non-H.264) works because the video is simply delivered as a chain of HTTP POST requests.

Calvin Walton

Mar 19, 2014, 9:43:25 AM
to bigblueb...@googlegroups.com
My understanding is that for codecs other than H.264, AIR is doing
software decoding in the AIR runtime, rather than using the Apple video
playback APIs.

I don't know why they aren't doing software decoding of H.264; a few
possibilities are that:
* Apple's App Store policies might not allow applications that do
software decoding of H.264 (I'm not a registered iOS developer, so I
can't check whether this is true).
* They're worried about battery life (H.264 decoding can be CPU
intensive).
* Adobe simply doesn't want to do H.264 licensing on iOS? That probably
isn't the case, since Flash on other platforms includes a software
decoder.

In any case, Adobe being Adobe, it's not like we can really do anything
about it...

--
Calvin Walton <calvin...@kepstin.ca>
