Hi Yadnesh,
> 1) Where may I look to refer to the interfaces of these libraries for
> using these libraries in my Native application for playing A/V?
Probably the best example source code for this is ffplay, which I believe
is part of the same package.
> 2) Is there any open source that I can use to be able to render the
> Audio and Video onto Android device using Native code?
There is currently no official NDK way of doing this. However, if you
get a copy of the platform/framework package, you can find the
AudioRecord.h and AudioTrack.h interfaces, which will allow you to
capture and play back PCM on the device. They are similar to the Java
AudioRecord and AudioTrack classes, so if you would like to make your
application more portable, you can rely on the Java interfaces
(which are part of the public API) to do the capture and playback, and
communicate with the FFmpeg libraries through JNI.
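A minimal sketch of that Java-side path is below. The nativeEncode/nativeDecode
methods and the "ffmpegbridge" library name are placeholders for whatever JNI
bridge you write around FFmpeg, and the sample rate is just an example:

    import android.media.AudioFormat;
    import android.media.AudioManager;
    import android.media.AudioRecord;
    import android.media.AudioTrack;
    import android.media.MediaRecorder;

    public class PcmLoop {
        // Hypothetical JNI bridges into your FFmpeg-based native code.
        private native void nativeEncode(byte[] pcm, int length);
        private native int nativeDecode(byte[] pcmOut);

        static { System.loadLibrary("ffmpegbridge"); } // hypothetical library name

        private static final int SAMPLE_RATE = 8000;

        public void run() {
            int inSize = AudioRecord.getMinBufferSize(SAMPLE_RATE,
                    AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
            int outSize = AudioTrack.getMinBufferSize(SAMPLE_RATE,
                    AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);

            AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                    SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO,
                    AudioFormat.ENCODING_PCM_16BIT, inSize);
            AudioTrack player = new AudioTrack(AudioManager.STREAM_MUSIC,
                    SAMPLE_RATE, AudioFormat.CHANNEL_OUT_MONO,
                    AudioFormat.ENCODING_PCM_16BIT, outSize, AudioTrack.MODE_STREAM);

            byte[] inBuf = new byte[inSize];
            byte[] outBuf = new byte[outSize];

            recorder.startRecording();
            player.play();
            while (!Thread.interrupted()) {
                // Capture raw PCM and hand it to FFmpeg for encoding.
                int read = recorder.read(inBuf, 0, inBuf.length);
                if (read > 0) nativeEncode(inBuf, read);

                // Pull decoded PCM back from FFmpeg and play it.
                int decoded = nativeDecode(outBuf);
                if (decoded > 0) player.write(outBuf, 0, decoded);
            }
            recorder.stop(); recorder.release();
            player.stop(); player.release();
        }
    }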
For video recording, you will need to look for Camera.h in the same
package. However, again, this is not part of the official/public API,
so the alternative is to use the Java Camera interface and pass the
received frames to FFmpeg through JNI (see the sketch below). This
approach used to be very slow on Droid devices, but on HTC devices I
got really good frame rates.
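Roughly, the Java side of that could look like the following. Again,
nativeEncodeFrame and the library name are placeholders for your own JNI
layer, and the preview size is only an example; pick one the device supports:

    import android.hardware.Camera;

    public class PreviewToFfmpeg implements Camera.PreviewCallback {
        // Hypothetical JNI entry point that feeds one NV21 preview frame to FFmpeg.
        private native void nativeEncodeFrame(byte[] nv21, int width, int height);

        static { System.loadLibrary("ffmpegbridge"); } // hypothetical library name

        private Camera camera;
        private int width = 320, height = 240;

        public void start(android.view.SurfaceHolder holder) throws java.io.IOException {
            camera = Camera.open();
            Camera.Parameters params = camera.getParameters();
            params.setPreviewSize(width, height);
            camera.setParameters(params);
            camera.setPreviewDisplay(holder);  // preview must be attached to a surface
            camera.setPreviewCallback(this);   // frames arrive in NV21 by default
            camera.startPreview();
        }

        @Override
        public void onPreviewFrame(byte[] data, Camera cam) {
            // Each callback delivers one raw preview frame; push it through JNI.
            nativeEncodeFrame(data, width, height);
        }

        public void stop() {
            camera.setPreviewCallback(null);
            camera.stopPreview();
            camera.release();
        }
    }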
> 3) Does ffserver work on Android? Is it possible to stream a A/V file
> using ffserver on android and use the link to Play using the
> MediaPlayer class in SDK?
I believe ffserver should work fine. As long as you can stream over
HTTP or RTSP, MediaPlayer should be able to play the stream, but
please double-check that your codecs are supported by OpenCore in that
case. MediaPlayer buffers initially, so it is not suitable if you
would like to build a real-time communication application.
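Something along these lines should be enough to try a stream on the SDK
side; the URL is of course a placeholder for whatever address your
ffserver instance exposes:

    import android.media.MediaPlayer;

    public class StreamPlayer {
        public MediaPlayer play(String url) throws java.io.IOException {
            MediaPlayer mp = new MediaPlayer();
            mp.setDataSource(url);  // e.g. an HTTP or RTSP URL served by ffserver
            mp.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
                public void onPrepared(MediaPlayer player) {
                    player.start();  // starts once the initial buffering is done
                }
            });
            mp.prepareAsync();       // prepare asynchronously for network streams
            return mp;
        }
    }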
Regards,
-onur
---
www.zdo.com