RTMP desktop streaming


Kotik

Dec 13, 2014, 8:23:31 PM
to jav...@googlegroups.com
Hello everyone. I'm trying to stream a part of the desktop using javacv.
javacv looks like a working solution, but I'm facing a lot of problems.
My main method using the javacv API looks like this:


    public static final String URL = "rtmp://example.com.ip:1935/muve/mystream";

    public static void main(String[] args) throws FrameGrabber.Exception, IOException, FrameRecorder.Exception, AWTException, InterruptedException, LineUnavailableException {
        FFmpegFrameRecorder recorder = null;

        try (MicrophoneAudioSource audioSource = new MicrophoneAudioSource();
             VideoSource videoSource = new VideoSource(new Rectangle(x, y, width, height));
        ) {


            recorder = new FFmpegFrameRecorder(URL, width, height, audioSource.getChannels());
            recorder.setFormat("flv");
            recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
            recorder.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);
            recorder.setVideoBitrate(500_000);
            recorder.setVideoOption("preset", "ultrafast");
            recorder.setVideoOption("tune", "zerolatency");
            recorder.setVideoOption("fflags", "nobuffer");



//         recorder.setFrameRate(5f);
            recorder.setAudioCodec(avcodec.AV_CODEC_ID_MP3);
            recorder.setAudioBitrate(audioSource.getBitrate());
            recorder.setSampleRate(audioSource.getSampleRate());
//            recorder.setSampleFormat(avutil.AV_SAMPLE_FMT_U8P);

            recorder.setGopSize(100);
            recorder.setInterleaved(false);

            CyclicBarrier latch = new CyclicBarrier(3);
            audioSource.start(latch);
            videoSource.start(latch);
            recorder.start();
            latch.await();



            int i = 0;
            long startTime = System.currentTimeMillis();
            long prevTimestamp = startTime;
            while (i < framesToEncode) {
                //grab the screenshot


                long now = System.currentTimeMillis();
                long timestamp = now - startTime;
                recorder.setTimestamp(timestamp * 1000);

                long duration = now - prevTimestamp;
                prevTimestamp = now;
                logger.debug("i {} timestamp {}", i, timestamp);
//                    recorder.setFrameNumber(i);

                opencv_core.IplImage img = videoSource.getNextChunk();
                Buffer nextChunk = audioSource.getNextChunk(timestamp); // here the chunk is either null or 1+ seconds of audio

                float ratio = (float) timestamp / audioSource.getCurrentTimestamp();
                System.out.println("timestampratio= "+ ratio);
                org.bytedeco.javacv.Frame frame = new Frame();
                if (ratio < 1.01 && (i % 2 == 0)) {
                    logger.debug("sending video");

                    frame.image = img;
                }
                if (nextChunk != null) {
                    frame.samples = new Buffer[]{nextChunk};
                    frame.sampleRate = audioSource.getSampleRate();
                    frame.audioChannels = audioSource.getChannels();
                }
                recorder.record(frame);

                logger.debug("sent\n");

                i++;
            }
        } catch (Exception exc) {
            exc.printStackTrace();
        } finally {
            if (recorder != null) recorder.stop();
        }
    }

It works, but not for too long.

The main problem is that it hangs while writing a frame (see attached screenshot).
With setInterleaved(true) it hangs on line 890.

It looks like the audio and video are not getting synchronized, but I'm not sure how to do this correctly for a live stream.
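For context, the ratio check in the loop above boils down to comparing the video wall-clock timestamp against how much audio has actually been captured, and skipping video frames when video runs ahead. A minimal pure-Java sketch of that idea (the class and method names are hypothetical; the 1.01 tolerance is the one used in the loop above):

```java
public class SyncCheck {
    // Returns true when it is safe to send a video frame: the video clock
    // must not be more than ~1% ahead of the audio clock (both in ms).
    // Pushing video far ahead of audio makes the muxer's interleaving
    // queue wait on audio packets that have not arrived yet.
    public static boolean shouldSendVideo(long videoTimestampMs, long audioTimestampMs) {
        float ratio = (float) videoTimestampMs / audioTimestampMs;
        return ratio < 1.01f;
    }
}
```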

Samuel Audet

Dec 14, 2014, 6:56:23 AM
to jav...@googlegroups.com
On 12/14/2014 10:23 AM, Kotik wrote:
> recorder.setInterleaved(false);

You shouldn't set that to `false`, unless you really know what you are
doing.

What happens when you leave it set to `true`?

Samuel

Kotik

Dec 15, 2014, 1:56:15 AM
to jav...@googlegroups.com
It hangs too.
I'm thinking it might be missing some data?

I wanted to try the latest 0.9.1 snapshot, but it seems only the Mac version is available (though it looks like it resolved another problem I had before with dylib dependencies).

On Sunday, December 14, 2014 at 5:56:23 AM UTC-6, Samuel Audet wrote:

Samuel Audet

Dec 19, 2014, 8:55:27 PM
to jav...@googlegroups.com
On 12/15/2014 03:56 PM, Kotik wrote:
> It hangs too.
> I'm thinking it might be missing some data?
>
> I wanted to try the latest 0.9.1 snapshot, but seems only mac version is
> available(but looks like it resolved another problem I had before with
> dylib dependencies)

Are you saying that it works fine on your Mac with 0.9.1-SNAPSHOT?

Samuel

Kotik

Dec 20, 2014, 3:08:26 AM
to jav...@googlegroups.com
yes

On Friday, December 19, 2014 at 7:55:27 PM UTC-6, Samuel Audet wrote:

Kotik

Dec 23, 2014, 1:09:45 AM
to jav...@googlegroups.com
Looks like what had to be done is 

            recorder.setVideoOption("preset", "ultrafast");
            recorder.setVideoOption("tune", "zerolatency");

That is, if it's for streaming. Without these options it hangs even with the latest snapshot.

On Friday, December 19, 2014 at 7:55:27 PM UTC-6, Samuel Audet wrote:

Samuel Audet

Dec 23, 2014, 2:37:07 AM
to jav...@googlegroups.com
On 12/23/2014 03:09 PM, Kotik wrote:
> Looks like what had to be done is
>
> recorder.setVideoOption("preset", "ultrafast");
> recorder.setVideoOption("tune", "zerolatency");
>
> this is if for streaming. Without these options it hangs even with the
> latest snapshot

Good to know! Thanks for letting us know.

Samuel

Richard Alam

Jan 16, 2015, 3:53:41 PM
to jav...@googlegroups.com
Hi,

Are you able to share your code?

I'm trying to do the same thing. I thought I'd ask if you are able to share before I go implement one myself.

Thank you very much.

Richard
 

Richard Alam

Jan 17, 2015, 2:51:34 PM
to jav...@googlegroups.com
Hi,

I posted yesterday, but it looks like it did not get sent.


On Tuesday, December 23, 2014 at 1:09:45 AM UTC-5, Kotik wrote:
Looks like what had to be done is 

            recorder.setVideoOption("preset", "ultrafast");
            recorder.setVideoOption("tune", "zerolatency");

this is if for streaming. Without these options it hangs even with the latest snapshot

Do you mind sharing your working code to capture the desktop and send using RTMP?

I'm trying to do the same thing and was wondering if I could look into how you were doing it before
I go exploring on my own.

Thanks.

Richard

Kotik

Jan 18, 2015, 9:11:39 PM
to jav...@googlegroups.com
Hi. My code related to javacv looks like this:


   public void run() {
        FFmpegFrameRecorder recorder = null;
        FrameRectangleSource frameRectangleSource = settings.getFrameRectangleSource();

        try {
            audioSource = new SyncMicrophoneAudioSource(fps, settings.getMicMixer());
            audioSource.start();

            videoSource = new RobotVideoSource(frameRectangleSource, fps);
            Rectangle frameRectangle = frameRectangleSource.getFrameRectangle();
            recorder = new FFmpegFrameRecorder(settings.getUrl(),(int) frameRectangle.getWidth(), (int)frameRectangle.getHeight(), audioSource.getChannels());
            recorder.setFormat("flv");
            recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
            recorder.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);
            recorder.setVideoBitrate(1000_000);
            recorder.setVideoQuality(0);
            recorder.setVideoOption("preset", "ultrafast");
            recorder.setVideoOption("tune", "zerolatency");
            recorder.setFrameRate(fps);

            recorder.setAudioCodec(avcodec.AV_CODEC_ID_AAC);
            recorder.setAudioBitrate(audioSource.getBitrate());
            recorder.setSampleRate(audioSource.getSampleRate());
            recorder.setAudioQuality(1);
            recorder.setSampleFormat(avutil.AV_SAMPLE_FMT_S16);
            recorder.setGopSize(10);
            recorder.setInterleaved(true);

            videoSource.start();
            recorder.start();

            listener.onStart();

            int i = 0;
            long startTime = System.currentTimeMillis();
            long prevTimestamp = startTime;
            while (running) {


                long now = i == 0 ? startTime : System.currentTimeMillis();
                long timestamp = now - startTime;

                long duration = now - prevTimestamp;
                prevTimestamp = now;
                logger.debug("i {} timestamp {}", i, timestamp);
                recorder.setFrameNumber(i);

                opencv_core.IplImage img = videoSource.getNextChunk(timestamp);
                byte[] nextChunk = audioSource.getNextChunk(timestamp);
//                recorder.setTimestamp(timestamp*1000);

                long currentAudioTimestamp = audioSource.getCurrentTimestamp();
                float ratio = (float) timestamp / currentAudioTimestamp;
                logger.debug("timestamp {} audioTimestamp {} ratio = {}", timestamp, currentAudioTimestamp, ratio);
                org.bytedeco.javacv.Frame frame = new org.bytedeco.javacv.Frame();
//                if (ratio < 1.01 || (i % 2 == 0)) {

                    frame.image = img;
//                }
                if (nextChunk != null) {
                    frame.samples = new Buffer[]{ByteBuffer.wrap(nextChunk)};
                    frame.sampleRate = audioSource.getSampleRate();
                    frame.audioChannels = audioSource.getChannels();
                } else {
                    System.out.println("nextChunk=null");
                }
                logger.debug("recording...");
                recorder.record(frame);
                logger.debug("sent keyframe {}\n", frame.keyFrame);

                i++;
            }
        } catch (FrameRecorder.Exception | AWTException e) {
            e.printStackTrace();
        } finally {
            if (recorder != null) try {
                recorder.stop();
            } catch (FrameRecorder.Exception e) {
                e.printStackTrace();
            }
            if (audioSource != null) {
                audioSource.close();
            }
            if (videoSource != null) {
                videoSource.close();
            }
            listener.onStop();
            logger.info("Stopped streamer");
        }
    } 
Sometimes it works pretty well, but sometimes it hangs when trying to write a frame.

On Friday, January 16, 2015 at 2:53:41 PM UTC-6, Richard Alam wrote:

Kotik

Jan 18, 2015, 9:16:27 PM
to jav...@googlegroups.com
The only thing you'll need to change is to use FFmpegFrameRecorder from the develop branch. You can just copy the code from GitHub and create a new class inside your package.

On Sunday, January 18, 2015 at 8:11:39 PM UTC-6, Kotik wrote:

Richard Alam

Jan 19, 2015, 2:42:33 PM
to jav...@googlegroups.com
Thanks for sharing what you have.

I'll only be implementing desktop sharing, so no audio. From what I read, the audio is what's giving you grief.

Below is my code. I've tried different things but can't make RTMP work. Either it's not publishing
properly, or the video can't be viewed using Flowplayer.
 
When I save the video to file, it's working fine.

Do you see anything wrong with my code that prevents it from working with RTMP?

Thanks.

Richard

===============

import java.awt.AWTException;
import java.awt.Dimension;
import java.awt.Rectangle;
import java.awt.Robot;
import java.awt.Toolkit;
import java.awt.image.BufferedImage;
import java.io.IOException;
import org.bytedeco.javacpp.opencv_core.IplImage;
//import org.bytedeco.javacv.FFmpegFrameRecorder;
import org.bytedeco.javacv.FrameGrabber;
import org.bytedeco.javacv.FrameRecorder;
import static org.bytedeco.javacpp.avcodec.*;
import static org.bytedeco.javacpp.avutil.*;

public class DesktopDemo {
  //public static final String URL = "rtmp://192.168.23.23/live/foo/room2";
  public static final String URL = "out.mp4";
  private static int framesToEncode = 160;
  private static int x = 0;
  private static int y = 0;
  
  public static void main(String[] args) throws FrameGrabber.Exception, IOException, 
                                                FrameRecorder.Exception, 
                                                AWTException, InterruptedException {
    
      MyFFmpegFrameRecorder recorder = null;
      int height = 480;
      int width = 640;
      Dimension screenBounds;
      Double frameRate = 12.0;

      screenBounds = Toolkit.getDefaultToolkit().getScreenSize();
      width = screenBounds.width;
      height = screenBounds.height;
          
      System.out.println("Capturing width=[" + width + "] height=[" + height + "]");

      recorder = new MyFFmpegFrameRecorder(URL, width, height);
      recorder.setFormat("flv");
      recorder.setVideoCodec(AV_CODEC_ID_H264);
      recorder.setPixelFormat(AV_PIX_FMT_YUV420P);
      recorder.setVideoBitrate(1000_000);
      recorder.setFrameRate(frameRate);
      recorder.setVideoQuality(0);
//    recorder.setVideoOption("f", "gdigrab");
//    recorder.setVideoOption("i", "desktop");
//    recorder.setVideoOption("g", "24");
      recorder.setVideoOption("preset", "ultrafast");
      recorder.setVideoOption("tune", "zerolatency");
//      recorder.setVideoOption("rtmp_buffer", "0");
//      recorder.setVideoOption("rtmp_live", "live");
//    recorder.setVideoOption("fflags", "nobuffer");
          
      recorder.setGopSize(10);
//    recorder.setInterleaved(true);
      
      recorder.start();

      int i = 0;
      long startTime = System.currentTimeMillis();

          
      Robot robot = new Robot();
          
      while (i < framesToEncode) {
        long now = System.currentTimeMillis();
           
        // grab the screenshot
        BufferedImage image = robot.createScreenCapture(new Rectangle(x, y, width, height));
        BufferedImage currentScreenshot = new BufferedImage(image.getWidth(), image.getHeight(), BufferedImage.TYPE_3BYTE_BGR);
        currentScreenshot.getGraphics().drawImage(image, 0, 0, null);
        
        IplImage rgbimage = IplImage.createFrom(currentScreenshot);
            
        long timestamp = now - startTime;
        recorder.setTimestamp(timestamp * 1000);

        System.out.println("i=[" + i + "] timestamp=[" + timestamp + "]");
        recorder.setFrameNumber(i);

        org.bytedeco.javacv.Frame frame = new org.bytedeco.javacv.Frame();              
        // if (i % 2 == 0) {
          frame.image = rgbimage;
//        frame.keyFrame = false;
        // }
         if (i % 4 == 0) {
           frame.keyFrame = true;
         } 

         //recorder.record(rgbimage, AV_PIX_FMT_YUV420P);
         recorder.record(frame);

         System.out.println("[ENCODER] encoded image " + i + " in " + (System.currentTimeMillis() - now));
         i++;
              
         try {
            // sleep for framerate milliseconds
            Thread.sleep(Math.max((long) (1000 / frameRate) - (System.currentTimeMillis() - now), 0));
         } catch (InterruptedException e) {
            e.printStackTrace();
         }
       }

       if (recorder != null) {
         recorder.stop();
         recorder.release();
       }

  }
}

Richard Alam

Jan 19, 2015, 2:49:27 PM
to jav...@googlegroups.com
I'm trying to replicate this command with my code below.

ffmpeg -f gdigrab -framerate 30 -video_size 1920x1080 -i desktop -pix_fmt yuv420p -c:v libx264 -preset veryfast -tune zerolatency -g 30 -b:v 1000k -g 24

-f flv rtmp://192.168.23.23/live/foo/room2

Am I on the right track? Can I just pass the parameters through setVideoOption()?

Thanks.

Richard

Kotik

Jan 19, 2015, 5:25:03 PM
to jav...@googlegroups.com
It looks good, except for

    if (i % 4 == 0) {
        frame.keyFrame = true;
    }

You don't need to set keyFrame, but you can read it back after recording.
Also, you need to set either the frame number or the timestamp; both of them do the same thing and each is derived from the other (see the source code).
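The relationship between the two is just the frame rate; a sketch of the conversion (assuming the recorder derives one from the other roughly like this, in microseconds; check FFmpegFrameRecorder's source for the exact rounding):

```java
public class FrameClock {
    // Frame number N at a given frame rate corresponds to a timestamp of
    // N / frameRate seconds; the recorder keeps timestamps in microseconds,
    // so setFrameNumber() and setTimestamp() are two views of one clock.
    public static long frameNumberToTimestampMicros(int frameNumber, double frameRate) {
        return Math.round(1_000_000L * frameNumber / frameRate);
    }
}
```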

On Monday, January 19, 2015 at 1:42:33 PM UTC-6, Richard Alam wrote:

Kotik

Jan 19, 2015, 10:27:13 PM
to jav...@googlegroups.com
I've also just faced the problem of not being able to play the video in Flash players. It looks like the issue is the codec. Changing it to flv helps... until I started getting this error:
org.bytedeco.javacv.FrameRecorder$Exception: avcodec_encode_video2() error -1: Could not encode video packet.




On Monday, January 19, 2015 at 1:49:27 PM UTC-6, Richard Alam wrote:

Samuel Audet

Jan 24, 2015, 11:51:04 PM
to jav...@googlegroups.com
On 01/20/2015 04:49 AM, Richard Alam wrote:
> I'm trying to replicate this command with my code below.
>
> ffmpeg -f gdigrab -framerate 30 -video_size 1920x1080 -i desktop -pix_fmt yuv420p -c:v libx264 -preset veryfast -tune zerolatency -g 30 -b:v 1000k -g 24
> -f flv rtmp://192.168.23.23/live/foo/room2
>
> Am I on the right track? Can I just pass the parameters through setVideoOption()?

Sure, we can pass options like "preset" and "tune" via `setVideoOption()`.
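For what it's worth, here's a rough, untested sketch of how those CLI flags could map onto the recorder API (the `-f gdigrab -i desktop` input side belongs to a grabber, not the recorder, and is not covered here; the URL and sizes are taken from the command above):

```java
// Sketch: mapping the ffmpeg CLI flags onto FFmpegFrameRecorder calls.
//   -f flv + rtmp://...   -> format and URL
//   -video_size 1920x1080 -> width/height in the constructor
//   -c:v libx264          -> setVideoCodec(AV_CODEC_ID_H264)
//   -pix_fmt yuv420p      -> setPixelFormat(AV_PIX_FMT_YUV420P)
//   -b:v 1000k            -> setVideoBitrate(1_000_000)
//   -framerate 30         -> setFrameRate(30)
//   -g 24                 -> setGopSize(24)
//   -preset / -tune       -> setVideoOption(...)
FFmpegFrameRecorder recorder =
        new FFmpegFrameRecorder("rtmp://192.168.23.23/live/foo/room2", 1920, 1080);
recorder.setFormat("flv");
recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
recorder.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);
recorder.setVideoBitrate(1_000_000);
recorder.setFrameRate(30);
recorder.setGopSize(24);
recorder.setVideoOption("preset", "veryfast");
recorder.setVideoOption("tune", "zerolatency");
```

Treat this as configuration only; it needs javacv on the classpath and a frame-grabbing loop around it to actually stream.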

Samuel

Samuel Audet

Jan 24, 2015, 11:53:21 PM
to jav...@googlegroups.com
On 01/20/2015 12:27 PM, Kotik wrote:
> I've just also faced problem with not being able to play video in flash
> players. Looks like the issue is the codec. Changing it to flv helps...
> until I started getting error:
> org.bytedeco.javacv.FrameRecorder$Exception: avcodec_encode_video2()
> error -1: Could not encode video packet.

Are you talking about this https://github.com/bytedeco/javacv/issues/83 ?

Samuel
