Play Audio and Video on Android using FFmpeg


Yadnesh

Jun 18, 2010, 6:16:49 AM
to android-ndk
Hi All,

I have built FFmpeg with the Android NDK, with help from the
information available on Google Groups and other resources on the
net.

I have the following binaries available in NDK_ROOT/out/apps/ffmpeg/:
libavcodec.a
libavformat.a
libavutil.a
libpostproc.a
libswscale.a

I have the following questions:
1) Where should I look for the interfaces of these libraries, so that
I can use them in my native application for playing A/V?

2) Is there any open source code that I can use to render audio and
video on an Android device from native code?

3) Does ffserver work on Android? Is it possible to stream an A/V
file using ffserver on Android and play the resulting link with the
MediaPlayer class in the SDK?

Probably too many, or the wrong, questions asked here. But I am a
novice on the Android platform and not sure of the intricacies
involved in accessing buffers from a native application for display
on the device UI.

All suggestions and pointers are welcome :)

Thanks and Regards,
Yadnesh

Onur Cinar

Jun 18, 2010, 1:29:32 PM
to android-ndk

Hi Yadnesh,

> 1) Where should I look for the interfaces of these libraries, so that
> I can use them in my native application for playing A/V?

Probably the best example source code for this would be the code for
ffplay which should be part of the same package I believe.

> 2) Is there any open source code that I can use to render audio and
> video on an Android device from native code?

There is no official NDK way of doing this currently. However, if you
get a copy of the platform/framework package, you can find the
AudioRecord.h and AudioTrack.h interfaces which will allow you to
capture and playback PCM on the device. They are similar to the Java
AudioRecord and AudioTrack, so if you would like to make your
application more portable, then you can rely on the Java interface
(which is part of the public API) to do the capture and playback, and
then you can communicate with the FFmpeg libraries through JNI.
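The Java side of that AudioTrack approach could be sketched roughly as follows. The JNI method name (nativeDecodeFrame) and the mono/16-bit PCM format are assumptions for illustration, standing in for whatever your own JNI bridge into the FFmpeg libraries exposes:

```java
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;

public class PcmPlayer {
    // Hypothetical JNI entry point: fills 'buffer' with 16-bit PCM
    // decoded by libavcodec, returns the number of bytes written,
    // or -1 at end of stream.
    private native int nativeDecodeFrame(byte[] buffer);

    public void play(int sampleRate) {
        int minSize = AudioTrack.getMinBufferSize(sampleRate,
                AudioFormat.CHANNEL_OUT_MONO,
                AudioFormat.ENCODING_PCM_16BIT);
        AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC,
                sampleRate,
                AudioFormat.CHANNEL_OUT_MONO,
                AudioFormat.ENCODING_PCM_16BIT,
                minSize, AudioTrack.MODE_STREAM);
        track.play();
        byte[] buffer = new byte[minSize];
        int n;
        while ((n = nativeDecodeFrame(buffer)) > 0) {
            track.write(buffer, 0, n);  // blocks until the track consumes it
        }
        track.stop();
        track.release();
    }
}
```

MODE_STREAM is the right mode here since the decoded PCM arrives incrementally rather than all at once.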

For the video recording, you will need to look for the Camera.h in the
same package. However, again, this is not part of the official/public
API, so the alternative would be using the Java Camera interface, and
passing the received frames to FFmpeg through JNI. This approach used
to be very slow on Droid devices, but on HTC devices I had really good
frame rates.
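The Java side of that Camera path might look roughly like this sketch; nativeEncodeFrame is a hypothetical JNI bridge handing each preview frame to FFmpeg:

```java
import android.hardware.Camera;

public class FrameGrabber implements Camera.PreviewCallback {
    // Hypothetical JNI bridge: hands one NV21 preview frame to
    // FFmpeg on the native side (e.g. for encoding).
    private native void nativeEncodeFrame(byte[] nv21, int width, int height);

    private int width, height;

    public void start() {
        Camera camera = Camera.open();
        Camera.Size size = camera.getParameters().getPreviewSize();
        width = size.width;
        height = size.height;
        camera.setPreviewCallback(this);  // frames arrive in onPreviewFrame
        camera.startPreview();
    }

    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        // The default preview format is NV21; copying each frame
        // across JNI is the step that can be slow on some devices.
        nativeEncodeFrame(data, width, height);
    }
}
```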

> 3) Does ffserver work on Android? Is it possible to stream an A/V
> file using ffserver on Android and play the resulting link with the
> MediaPlayer class in the SDK?

I believe that ffserver should work fine. As long as you can serve
HTTP or RTSP streams, MediaPlayer should be able to play them, but
please double check that your codecs are supported by OpenCore in that
case. MediaPlayer does some buffering initially, so it's not suitable
if you would like to build a realtime communication application.
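For reference, the SDK-side client could look roughly like this; the stream URL is a placeholder, and playback still depends on OpenCore supporting the stream's codecs:

```java
import android.media.MediaPlayer;

import java.io.IOException;

public class StreamClient {
    // Minimal sketch: hand an ffserver URL to MediaPlayer. The URL
    // below is a placeholder for wherever your ffserver is reachable.
    public void playStream() throws IOException {
        MediaPlayer player = new MediaPlayer();
        player.setDataSource("rtsp://example.com:5454/test.sdp");
        player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
            @Override
            public void onPrepared(MediaPlayer mp) {
                mp.start();  // start only once prepare/buffering completes
            }
        });
        player.prepareAsync();  // network sources should prepare asynchronously
    }
}
```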

Regards,

-onur



---
www.zdo.com



nayan

Jul 13, 2010, 12:11:55 AM
to android-ndk
Hi all,

I am confused about how to build ffserver/ffmpeg for Android. Can you
give the steps to do this?
Then, after ffmpeg is built, I want to play an RTSP stream on Android.
How can I do this?

Thanks,
NBK

Angus Lees

Jul 13, 2010, 7:58:53 PM
to andro...@googlegroups.com
On Tue, Jul 13, 2010 at 14:11, nayan <kapadi...@gmail.com> wrote:
> Hi all,
>
> I am confused about how to build ffserver/ffmpeg for Android. Can you
> give the steps to do this?
> Then, after ffmpeg is built, I want to play an RTSP stream on Android.
> How can I do this?

Android doesn't provide any of the normal Linux low-level audio methods (OSS, ALSA, etc), so you will have to write some code to shovel the PCM audio out of ffmpeg and into Android's AudioTrack class.

 - Gus

--
You received this message because you are subscribed to the Google Groups "android-ndk" group.
To post to this group, send email to andro...@googlegroups.com.
To unsubscribe from this group, send email to android-ndk...@googlegroups.com.
For more options, visit this group at http://groups.google.com/group/android-ndk?hl=en.


Yadnesh

Jul 20, 2010, 1:42:11 AM
to android-ndk
I am streaming MPEG-4 video and AAC audio over RTP using ffmpeg and
ffserver, but I do not see any playback in the Android 1.6 VideoView.
I have posted a query on the android-developers forum, but there is
no solution yet. Could anyone on this forum provide any hints?

http://groups.google.com/group/android-developers/browse_thread/thread/8173097661d44b1d/d74cf719e80b2727?lnk=gst&q=mpeg4#

I did not understand the comment "you will have to write some code to
shovel the PCM audio out of ffmpeg and into Android's AudioTrack
class". I always thought that if ffserver streams a media format that
is listed in "Android supported formats", then the stream should be
playable on Android. Could you please elaborate?

I have transcoded media to an MP4 file using ffmpeg, and such files
play well on Android. But I want to play back streaming media over
RTSP.

Format of the MP4 file that plays on Android (as shown by ffprobe):
=======================================================================================
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'test.mp4':
  Metadata:
    major_brand      : isom
    minor_version    : 0
    compatible_brands: mp41
    encoder          : vlc 1.0.1 stream output
    encoder-eng      : vlc 1.0.1 stream output
  Duration: 00:00:09.80, start: 0.000000, bitrate: 456 kb/s
    Stream #0.0(eng): Audio: aac, 44100 Hz, mono, s16, 18 kb/s
    Stream #0.1(eng): Video: mpeg4, yuv420p, 192x144 [PAR 3:4 DAR 1:1], 432 kb/s, 14.99 fps, 25 tbr, 1001 tbn, 25 tbc
=======================================================================================
Regards,
Yadnesh


Angus Lees

Jul 20, 2010, 2:07:50 AM
to andro...@googlegroups.com
I'm confused. Does the version of ffmpeg you are running have any
Android-specific display code in it? How is the video content
supposed to be jumping from your ffmpeg code and into the VideoView?

Oh - are you running this ffmpeg code on the Android (ie: writing your
own video client), or is this server-side and then you are asking why
the normal Android media streamer can't view this stream? If
server-side then this is not a question for android-ndk;
android-developers is probably a better place to ask.

 - Gus


Yadnesh

Jul 20, 2010, 3:27:48 AM
to android-ndk
Agreed, my issue is not really something to do with the NDK, so I had
posted on 'Android developers'. Sorry to cross-link the posts... just
desperate for a solution.

Using the Android NDK I have built ffmpeg, and I use FFmpeg as a
media streamer over RTSP.

I intend to use VideoView or MediaPlayer as the client for that
stream.

Regards,
Yadnesh

On Jul 20, 11:07 am, Angus Lees <al...@google.com> wrote:
> I'm confused.  Does the version of ffmpeg you are running have any
> Android-specific display code in it?  How is the video content
> supposed to be jumping from your ffmpeg code and into the VideoView?
>
> Oh - are you running this ffmpeg code on the Android (ie: writing your
> own video client), or is this server-side and then you are asking why
> the normal Android media streamer can't view this stream?  If
> server-side then this is not a question for android-ndk;
> android-developers is probably a better place to ask.
>
>  - Gus
>
> On Tue, Jul 20, 2010 at 15:42, Yadnesh <yadn...@gmail.com> wrote:
> > I am streaming mpeg4 Video and AAC audio over RTP using ffmpeg and
> > ffserver.  But I do not see any playback in Android 1.6 VideoView.
> > I have posted a query on android developers forum but no solution
> > yet.  Could anyone on this forum provide any hints?
>
> >http://groups.google.com/group/android-developers/browse_thread/threa...