JavaCV+FFmpeg+RTP: Stream WebCam from Server to Client with RTP


Lu Totò

Feb 21, 2013, 5:55:55 AM
to jav...@googlegroups.com
Hello,

how to send the stream of webcam from Server (that captures the stream) to client (that receives it)?

I have the following code (to capture the webcam and send it with rtp):

package server;

/** This class captures the webcam's frame stream and saves it to a file in *.mpeg format. */

import com.googlecode.javacpp.Loader;

import com.googlecode.javacv.CanvasFrame;
import com.googlecode.javacv.FFmpegFrameRecorder;
import com.googlecode.javacv.FrameGrabber;
import com.googlecode.javacv.OpenCVFrameGrabber;
import com.googlecode.javacv.cpp.avutil;
import com.googlecode.javacv.cpp.opencv_core.CvMemStorage;
import com.googlecode.javacv.cpp.opencv_core.IplImage;
import com.googlecode.javacv.cpp.opencv_objdetect;

import static com.googlecode.javacv.cpp.opencv_core.cvClearMemStorage;

public class WebCam {
    public static void main(String[] args) throws Exception {
        Loader.load(opencv_objdetect.class);
        CanvasFrame frame = new CanvasFrame("Video recording");
        FrameGrabber grabber = new OpenCVFrameGrabber(0);
        grabber.setImageHeight(1024);
        grabber.setImageWidth(1024);
        grabber.start();
        IplImage grabbedImage = grabber.grab();
        CvMemStorage storage = CvMemStorage.create();
        FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(
                "rtp://192.168.1.2:15000", grabber.getImageWidth(), grabber.getImageHeight());
        recorder.setVideoCodec(13); // 13 = CODEC_ID_MPEG4 in this version of FFmpeg
        recorder.setFormat("flv");
        recorder.setPixelFormat(avutil.PIX_FMT_YUV420P);
        recorder.setFrameRate(30);
        recorder.setVideoBitrate(10 * 1024 * 1024);
        recorder.start();

        int i = 0;
        while (frame.isVisible() && (grabbedImage = grabber.grab()) != null) {
            System.out.println("(" + i++ + ") Sending. . .");
            frame.showImage(grabbedImage);
            recorder.record(grabbedImage);
        }
        cvClearMemStorage(storage);
        recorder.stop();
        grabber.stop();
        frame.dispose();
    }
}


Samuel Audet

Feb 23, 2013, 9:48:00 PM
to jav...@googlegroups.com
On 02/21/2013 07:55 PM, Lu Totò wrote:
>
> how to send the stream of webcam from Server (that captures the stream)
> to client (that receives it)?
>
> I have the following code (to capture the webcam and send it with rtp):

AFAIK, calling recorder.setFormat("flv") won't work for RTP. I think we
need to call recorder.setFormat("rtp") instead. In any case, please check
the error messages from FFmpeg by running your program on the command line.

Samuel

Samuel Audet

Feb 23, 2013, 9:53:24 PM
to jav...@googlegroups.com
On 02/23/2013 08:12 PM, Giuseppe Desolda wrote:
> I've tried many solutions, for example changing recorder.setFormat("flv")
> to recorder.setFormat("rtp"), but in this case there is the fatal JVM
> error reported below:

Are you using the version of FFmpeg for Windows specified in the
README.txt file? It's not guaranteed to work with any other binaries...

Samuel


Samuel Audet

Mar 3, 2013, 3:20:12 AM
to jav...@googlegroups.com
Hello,

On 02/24/2013 05:54 PM, Lu Totò wrote:
> Hello Samuel,
>
> I found the solution to the problem.
> Now I can send the stream from server to client using the RTP protocol and the H264 video codec.

Great! Thanks for letting us know how you did it

> Everything works, but the client receives the stream in black and white,
> with the image mirrored.

You're getting a lot of error messages.. Do you get the same thing when
doing it on the command line with the ffmpeg and ffplay executables? If
you do, then this isn't a problem with JavaCV, but with FFmpeg... So let
me know if it works fine without JavaCV, post it as an issue on the
site, and I'll look into it, thanks

Samuel

Samuel Audet

Mar 3, 2013, 9:27:01 AM
to jav...@googlegroups.com
On 02/26/2013 08:08 PM, Lu Totò wrote:
> How do I set the "tune" parameter of ffmpeg using javacv?
> I was able to set the "profile" and "preset" parameters but I cannot
> find a method to set the "tune" parameter.

I haven't exported that, no, but it should not be too hard to modify
FFmpegFrameRecorder to accommodate any of those use cases. And if you
would like to contribute your modifications, feel free to upload a patch
as an issue on the site. I'm sure others would very much appreciate it,
thank you!

Samuel

Lu Totò

Mar 4, 2013, 12:06:49 PM
to jav...@googlegroups.com
Hello,
I'm an Italian student of Computer Science.
I'm working on a server/client system that sends a stream from the server to the client over the RTP protocol, using the javacv+ffmpeg framework on a LAN.
The stream is captured from the webcam and encoded with H264 on the server side.
Unfortunately the stream received by the client is very disturbed (see the attached picture).
Has anybody had similar problems?
The code of the classes used is this:

*** Server Side
...
webCam.start();
grabbedImage = webCam.grab();
...
stream = new FFmpegFrameRecorder("rtp://192.168.1.2:8086", 640, 480);
stream.setVideoBitrate(716800);
stream.setPixelFormat(PIX_FMT_YUV420P);
stream.setVideoCodec(AV_CODEC_ID_H264);
stream.setFrameRate(30);
stream.setProfile("high444");
stream.setPreset("ultrafast");
stream.setFormat("h264");
...
stream.start();
...
frameWebCam.showImage(grabbedImage);

stream.record(grabbedImage); }
...
stream.stop();
webCam.stop();


*** Client Side
FFmpegFrameGrabber stream = new FFmpegFrameGrabber("rtp://192.168.1.2:8086");
stream.setFormat("h264");
...
stream.start();

Help me, please.
Salvatore.
[Attachment: Immagine.png]

Samuel Audet

Mar 9, 2013, 1:44:56 AM
to jav...@googlegroups.com
Hello,

Thanks for asking your question on the mailing list, but please refrain
from attaching large files to your messages next time, thank you.

As for the RTP transfer, does it work well when you try it on the
command line with ffmpeg and ffplay? If you get the same kind of results
on the command line without JavaCV, then we'll know this isn't an issue
related to JavaCV... Let us know, thanks
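For anyone wanting to run that check, a rough sketch of the command-line test follows. The device name, address, port, and flags are placeholders for your own setup and FFmpeg build; they are not taken from the thread.

```shell
# Sender: capture the webcam and stream it over RTP (Linux/v4l2 shown;
# the device, address, and port below are placeholders for your setup).
ffmpeg -f video4linux2 -i /dev/video0 \
       -vcodec libx264 -preset ultrafast -tune zerolatency \
       -f rtp rtp://192.168.1.2:8086

# Receiver: play the incoming stream. Depending on the codec, ffplay may
# need the SDP description that ffmpeg prints, instead of a bare rtp:// URL.
ffplay rtp://192.168.1.2:8086
```

If the picture is also degraded here, the issue lies in FFmpeg or the network rather than JavaCV; if it is clean, the problem is on the JavaCV side.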

Samuel

Samuel Audet

Mar 9, 2013, 1:45:53 AM
to jav...@googlegroups.com
Hello,

I'll repeat myself, but does it work well when you try it on the command
line with ffmpeg and ffplay? If you get the same kind of results on the
command line without JavaCV, then we'll know this isn't an issue related
to JavaCV... Let us know, thanks

Samuel

On 03/05/2013 02:36 AM, Giuseppe Desolda wrote:
> Hi Samuel,
> I have the same problem as user Lu Totò; I've been trying for some days
> to stream using javacv as Lu Totò did, but I have the same problem.
> What do you think? I know that FFMPEG is one of the best frameworks for
> encoding and streaming, and I was frustrated by this problem and thought
> that the problem was me... but now I've seen that my problem is
> quite common. Do you think you are able to help me and Lu Totò?
> Thanks

Samuel Audet

Mar 9, 2013, 2:00:04 AM
to jav...@googlegroups.com
On 03/07/2013 07:34 PM, Lu Totò wrote:
> In addition to my problem, not yet resolved: I tried to send the stream
> with my java class (server side) and receive it with ffplay. This way,
> the reception is almost perfect, but if I try to receive with my client I
> get low quality. From this, I realized that the problem is not on the
> server side but on the client side. Any help is crucial for me.

Ok, maybe you're not calling grab() fast enough. Have you tried running
FFmpegFrameGrabber in a dedicated thread?

Samuel

Samuel Audet

Mar 9, 2013, 9:35:48 AM
to jav...@googlegroups.com
On 03/09/2013 09:53 PM, Lu Totò wrote:
> Sending the stream with my server and receiving with ffplay, the result is perfect.
> I'm almost certain that the problem is on my client side, but I don't understand it.
> I've been working on it for many days, but I cannot solve this problem.

I'm afraid I don't understand any more than you do. ffplay and
FFmpegFrameGrabber are obviously doing something differently. If we find
the difference, we can then fix it so it works the same way. However,
given that FFmpegFrameGrabber works fine with reliable protocols such as
HTTP, my guess is that there is some timing issue..

Samuel

Samuel Audet

Mar 10, 2013, 8:12:09 AM
to jav...@googlegroups.com
On 03/10/2013 05:28 PM, Lu Totò wrote:
> while (true) {
>     image = stream.grab();
>     frameWebCam.setCanvasSize(640, 480);
>     if (image != null) { frameWebCam.showImage(image); }
> }

As I explained earlier, you should try to create a high-priority thread
that calls FrameGrabber.grab() in a very tight loop and that does
absolutely *nothing else*. AFAIK, that should help..
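The threading structure Samuel describes can be sketched as follows. This is only an illustration of the pattern: the `Grabber` interface and its stub stand in for `FFmpegFrameGrabber` (and `String` stands in for `IplImage`), since the real grabber needs a live stream; the display loop is where `showImage()` would go.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class GrabLoop {
    // Stand-in for FFmpegFrameGrabber; real code would call grabber.grab()
    // and pass IplImage objects instead of Strings.
    interface Grabber { String grab() throws InterruptedException; }

    /** Runs the grab thread plus the display loop; returns the frames "displayed". */
    static List<String> run(Grabber grabber) throws InterruptedException {
        BlockingQueue<String> frames = new ArrayBlockingQueue<>(4);

        Thread grabThread = new Thread(() -> {
            try {
                String frame;
                // The grab loop does *nothing else*: grab, hand off, repeat.
                while ((frame = grabber.grab()) != null) {
                    frames.put(frame);
                }
                frames.put("EOF"); // sentinel so the display loop can stop
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        grabThread.setPriority(Thread.MAX_PRIORITY); // high priority, as advised
        grabThread.start();

        // Display loop runs in a different thread; showImage() would go here.
        List<String> displayed = new ArrayList<>();
        String frame;
        while (!(frame = frames.take()).equals("EOF")) {
            displayed.add(frame);
        }
        grabThread.join();
        return displayed;
    }

    public static void main(String[] args) throws Exception {
        Grabber stub = new Grabber() { // fake source producing 5 frames
            int n = 0;
            public String grab() { return n < 5 ? "frame-" + n++ : null; }
        };
        System.out.println(run(stub)); // prints [frame-0, frame-1, frame-2, frame-3, frame-4]
    }
}
```

In a real-time client one might prefer a drop-oldest policy over the blocking `put()`, so that a slow display never stalls `grab()`.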

Samuel

Samuel Audet

Mar 10, 2013, 10:25:24 AM
to jav...@googlegroups.com
On 03/10/2013 10:36 PM, Lu Totò wrote:
> I tried to create a thread just to call grab(), but the result is the
> same. The thread has the highest priority.
> I don't know what to do.

Find the differences between FFmpegFrameGrabber.java and ffplay.c that
make your application work, and apply them to FFmpegFrameGrabber.java.

Samuel Audet

Mar 23, 2013, 8:44:25 AM
to jav...@googlegroups.com
On 03/18/2013 03:12 PM, Lu Totò wrote:
> I analyzed ffplay.c and FFmpegFrameGrabber.java but I cannot find any important differences.
> Do you have any more specific advice?

Like I said, since FFmpegFrameGrabber works fine with reliable protocols
like HTTP, it's probably a timing issue. Make sure that
avcodec_decode_video2() and avcodec_decode_audio4() get called at the
same time, at the same speed, as inside ffplay.c.

Samuel

Samuel Audet

Mar 23, 2013, 10:33:49 PM
to jav...@googlegroups.com
On 03/23/2013 10:44 PM, Lu Totò wrote:
> The problem is not related to javacv but to ffmpeg. In fact, when streaming with RTP, there is no ffmpeg setting that improves the stream received by the client.
> At this point I decided to try another protocol, and I chose UDP. With the UDP protocol, transmission and reception are perfect.

It's still strange that RTP works fine with ffplay... Let me know if you
figure out how to make it work better, thanks!

Samuel

Samuel Audet

Mar 31, 2013, 7:34:28 AM
to jav...@googlegroups.com
On 03/30/2013 06:44 PM, Lu Totò wrote:
> Hello Samuel, can I encrypt the data stream captured by the webcam using
> the AES encryption algorithm in JavaCV or FFMPEG?

There seems to be support for that in FFmpeg, but I haven't taken time
to look into that.. If you figure out what we need to do, please create
a new "issue" and post a patch, thanks!

Samuel

David Tang

Oct 16, 2013, 12:42:59 AM
to jav...@googlegroups.com
Hi! Samuel and Lu,

First post here.

I just want to say that after your investigation, and following your path, I would like to comment that the disturbance of the image is likely caused by memory management in the client.
The code in ffplay and FFmpegFrameGrabber is not the issue. From Lu's code, I would suggest:

1. moving setCanvasSize() out of the loop, or even deleting the line, as showImage() already calls it.
2. if showImage() involves a large amount of instantiation, forgoing this high-level function and implementing the lower-level lifting to minimise instantiation inside the loop. (This could be a consideration for improvement in the next major version.)
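Point 1 can be illustrated with a small sketch. The `Canvas` class below is a counting stub standing in for JavaCV's `CanvasFrame` (the real library is not used here); the point is simply that the invariant call is hoisted out of the loop so the loop body only displays.

```java
public class DisplayLoop {
    // Stub standing in for JavaCV's CanvasFrame; it counts the work done.
    static class Canvas {
        int resizes = 0, shows = 0;
        void setCanvasSize(int w, int h) { resizes++; }
        void showImage(Object image)     { shows++;   }
    }

    public static void main(String[] args) {
        Canvas frame = new Canvas();

        // Before: the invariant call was repeated on every iteration:
        //     while (true) {
        //         image = stream.grab();
        //         frame.setCanvasSize(640, 480);   // same arguments every time
        //         if (image != null) frame.showImage(image);
        //     }

        // After: do the invariant work once, keep the loop body minimal.
        frame.setCanvasSize(640, 480); // hoisted out of the loop
        for (int i = 0; i < 100; i++) {   // stands in for the grab() loop
            Object image = new Object();  // stands in for a grabbed frame
            if (image != null) frame.showImage(image);
        }

        System.out.println(frame.resizes + " resize, " + frame.shows + " frames shown");
        // prints: 1 resize, 100 frames shown
    }
}
```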




Samuel Audet

Oct 26, 2013, 6:18:31 AM
to jav...@googlegroups.com
Hello,

On 10/16/2013 01:42 PM, David Tang wrote:
> Hi! Samuel and Lu,
>
> First post here.
>
> I just want to say that after your investigation, and following your
> path, I would like to comment that the disturbance of the image is
> likely caused by *memory management in the client*.
> The code in ffplay and FFmpegFrameGrabber is not the issue. From Lu's
> code, I would suggest:
>
> 1. moving setCanvasSize() out of the loop, or even deleting the line, as
> showImage() already calls it.
> 2. if showImage() involves a large amount of instantiation, forgoing
> this high-level function and implementing the lower-level lifting to
> minimise instantiation inside the loop. (This could be a consideration
> for improvement in the next major version.)

Yes, it would be nice to enhance FFmpegFrameGrabber in various ways. Let
me know if you have concrete changes in mind, thanks!

Samuel

Abhishek Pandey

Dec 30, 2013, 12:38:15 PM
to jav...@googlegroups.com
Can anyone tell me how to live stream from a server to an Android phone?
Any sample program will do... please help.

Samuel Audet

Jan 4, 2014, 8:05:19 AM
to jav...@googlegroups.com
On 12/31/2013 02:38 AM, Abhishek Pandey wrote:
> Can anyone tell me how to live stream from a server to an Android phone?
> Any sample program will do... please help.

We could use FFmpegFrameGrabber to receive frames from the RTP stream,
and display the images with Android's API...

No one has contributed a sample for that yet, so if you would like to
make a contribution, it would be welcome! Thank you

Samuel