Capturing RTSP stream using javacv on Android


Ashish Nangia

unread,
Nov 7, 2013, 6:07:14 PM11/7/13
to jav...@googlegroups.com
The aim is to capture RTSP streams (H.264 encoding, frame rate 30) from an IP camera and display them in an ImageView in my Android application.

Specs of video stream
Encoding: H.264 (the IP camera also supports MPEG-4). At this stage I do not mind which, but H.264 would be preferred.
Frame rate: 30
Frame size: 384x216


I have done the following so far:
1. Created a new project in Eclipse.
2. Followed instructions from JAVACV readme.txt
    • Downloaded javacv-0.6-bin.zip
    • Downloaded javacv-0.6-cppjars.zip
    • Copied 'javacpp.jar' and 'javacv.jar' to libs folder
    • Extract all the `*.so` files from `javacv-android-arm.jar`, `opencv-2.4.6.1-android-arm.jar`, and `ffmpeg-2.0.1-android-arm.jar` directly into the newly created "libs/armeabi" folder, without creating any of the subdirectories found in the JAR files.
    • Navigate to Project > Properties > Java Build Path > Libraries and click "Add JARs...".
    • Select both `javacpp.jar` and `javacv.jar` from the newly created "libs" folder.
3. Verified that the RTSP stream from my IP Camera is working correctly. I used VLC on Win7 to play the RTSP stream. I also used an Android application called 'IP Cam Viewer Basic v4.8.7.apk' and it was able to play the RTSP stream.
4. Wrote a very simple Android application as follows:

                try
                {
                    //OpenCVFrameGrabber grabber = new OpenCVFrameGrabber("rtsp://192.168.8.122:554/live2.sdp");
                    FrameGrabber grabber = new FFmpegFrameGrabber("rtsp://192.168.8.122:554/live2.sdp");
                    grabber.setFormat("rtp");
                    grabber.setFrameRate(30);
                    grabber.start();

                    IplImage frame;
                    while ((frame = grabber.grab()) != null)
                    {
                        // TODO: display the frame (on desktop this would be canvasFrame.showImage(frame))
                    }
                    grabber.stop();
                }
                catch (Exception e)
                {
                    Log.e(LOGTAG, "Exception MyRunnable: " + e.getMessage());
                }


Initially I was using FFmpegFrameGrabber, but logcat reported:
Exception MyRunnable: avformat_open_input() error -1330794744: Could not open input "rtsp://192.168.8.122:554/live2.sdp". (Has setFormat() been called?)

I have tried the following, with no luck:
grabber.setFormat("rtsp");
grabber.setFormat("rtp");
grabber.setFormat("h264");

After some googling I came across a link suggesting the use of OpenCVFrameGrabber instead, but still no luck.

If I use OpenCVFrameGrabber, then I get the following exception in logcat:
Exception MyRunnable: cvCreateFileCapture() Error: Could not create camera capture.

Can someone please advise on what I can do in order to get RTSP packets captured?
The final step (once I get the above working) would be to convert the IplImage to a Bitmap.
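For reference, that last step could be sketched roughly as follows. This is only a sketch, and it assumes the grabbed IplImage has first been converted to a 4-channel RGBA layout (e.g. with cvCvtColor), since Bitmap.copyPixelsFromBuffer() expects the buffer's pixel layout to match the Bitmap's ARGB_8888 config; the variable names are placeholders:

// Hypothetical IplImage-to-Bitmap conversion (frame assumed 8-bit, 4-channel RGBA)
Bitmap bitmap = Bitmap.createBitmap(frame.width(), frame.height(), Bitmap.Config.ARGB_8888);
bitmap.copyPixelsFromBuffer(frame.getByteBuffer());
imageView.setImageBitmap(bitmap);  // must be called on the UI thread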

Many thanks.

Samuel Audet

unread,
Nov 17, 2013, 7:29:38 AM11/17/13
to jav...@googlegroups.com
On 11/08/2013 08:07 AM, Ashish Nangia wrote:
> Can someone please advise on what I can do in order to get RTSP packets captured?
> The final step (once I get the above working) would be to convert the IplImage to a Bitmap.

If your RTSP stream uses TCP, it seems like we need to set the
rtsp_transport format option accordingly:
https://trac.ffmpeg.org/wiki/StreamingGuide
This interface isn't exposed by FFmpegFrameGrabber, but I'm guessing
that calling
av_dict_set(options, "rtsp_transport", "tcp", 0);
just before the call to `avformat_open_input()` should do the trick.
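Concretely, a patch to FFmpegFrameGrabber.start() might look something like the fragment below. This is only a sketch based on FFmpeg's C API as wrapped by JavaCPP; the AVDictionary wrapper class name and the exact call site inside start() are assumptions:

// Hypothetical patch inside FFmpegFrameGrabber.start(), before opening the input:
AVDictionary options = new AVDictionary(null);
av_dict_set(options, "rtsp_transport", "tcp", 0);
// then pass `options` as the last argument to avformat_open_input(),
// and free it with av_dict_free(options) afterwards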

Please post any patch to FFmpegFrameGrabber that you create as a new
"issue" here, so that other users can benefit:
http://code.google.com/p/javacv/issues/
Thanks!

Samuel