Hello everyone. I'm trying to stream a part of the desktop using JavaCV.
JavaCV looks like a workable solution, but I'm running into a lot of problems.
public static void main(String[] args) throws FrameGrabber.Exception, IOException, FrameRecorder.Exception, AWTException, InterruptedException, LineUnavailableException {
    FFmpegFrameRecorder recorder = null;
    try (MicrophoneAudioSource audioSource = new MicrophoneAudioSource();
         VideoSource videoSource = new VideoSource(new Rectangle(x, y, width, height))) {
        recorder = new FFmpegFrameRecorder(URL, width, height, audioSource.getChannels());
        recorder.setFormat("flv");
        recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
        recorder.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);
        recorder.setVideoBitrate(500_000);
        recorder.setVideoOption("preset", "ultrafast");
        recorder.setVideoOption("tune", "zerolatency");
        recorder.setVideoOption("fflags", "nobuffer");
        // recorder.setFrameRate(5f);
        recorder.setAudioCodec(avcodec.AV_CODEC_ID_MP3);
        recorder.setAudioBitrate(audioSource.getBitrate());
        recorder.setSampleRate(audioSource.getSampleRate());
        // recorder.setSampleFormat(avutil.AV_SAMPLE_FMT_U8P);
        recorder.setGopSize(100);
        recorder.setInterleaved(false);

        // Frame.image is a Buffer[], so an IplImage cannot be assigned to it
        // directly; it has to go through a converter.
        OpenCVFrameConverter.ToIplImage converter = new OpenCVFrameConverter.ToIplImage();

        CyclicBarrier latch = new CyclicBarrier(3);
        audioSource.start(latch);
        videoSource.start(latch);
        recorder.start();
        latch.await();

        int i = 0;
        long startTime = System.currentTimeMillis();
        long prevTimestamp = startTime;
        while (i < framesToEncode) {
            // grab the screenshot
            long now = System.currentTimeMillis();
            long timestamp = now - startTime;
            recorder.setTimestamp(timestamp * 1000); // milliseconds -> microseconds
            long duration = now - prevTimestamp;
            prevTimestamp = now;
            logger.debug("i {} timestamp {}", i, timestamp);
            // recorder.setFrameNumber(i);
            opencv_core.IplImage img = videoSource.getNextChunk();
            // the chunk is either null or 1+ seconds of audio
            Buffer nextChunk = audioSource.getNextChunk(timestamp);
            float ratio = (float) timestamp / audioSource.getCurrentTimestamp();
            System.out.println("timestamp ratio = " + ratio);
            Frame frame = new Frame();
            if (ratio < 1.01 && (i % 2 == 0)) {
                logger.debug("sending video");
                frame = converter.convert(img);
            }
            if (nextChunk != null) {
                frame.samples = new Buffer[]{nextChunk};
                frame.sampleRate = audioSource.getSampleRate();
                frame.audioChannels = audioSource.getChannels();
            }
            recorder.record(frame);
            logger.debug("sent\n");
            i++;
        }
    } catch (Exception exc) {
        exc.printStackTrace();
    } finally {
        if (recorder != null) recorder.stop();
    }
}
It looks like the audio and video are not being synchronised, but I'm not sure how to do this correctly for a live stream.
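For what it's worth, one idea I'm considering is to stop stamping frames with wall-clock time and instead derive the timestamp from the number of audio samples actually handed to the recorder, so the audio stream acts as the master clock. A minimal sketch of that timestamp bookkeeping (the `AudioClock` class and its method names are my own illustration, not part of the JavaCV API):

```java
// Sketch: derive recorder timestamps from the audio samples written so far,
// instead of System.currentTimeMillis(). The value returned by
// timestampMicros() would be what gets passed to recorder.setTimestamp().
// AudioClock is a hypothetical helper, not a JavaCV class.
public class AudioClock {
    private final int sampleRate;   // e.g. 44100 Hz
    private final int channels;     // e.g. 2 for stereo
    private long framesWritten;     // per-channel sample frames recorded so far

    public AudioClock(int sampleRate, int channels) {
        this.sampleRate = sampleRate;
        this.channels = channels;
    }

    /** Call after recording an audio buffer of the given total sample count. */
    public void advance(int totalSamplesInBuffer) {
        framesWritten += totalSamplesInBuffer / channels;
    }

    /** Current audio time in microseconds, the unit setTimestamp() expects. */
    public long timestampMicros() {
        return framesWritten * 1_000_000L / sampleRate;
    }

    public static void main(String[] args) {
        AudioClock clock = new AudioClock(44100, 2);
        clock.advance(88_200); // one second of stereo samples at 44.1 kHz
        System.out.println(clock.timestampMicros()); // 1000000
    }
}
```

The intent is that video frames get stamped against this audio-derived time, so any drift between wall-clock time and the sound card's actual sample rate no longer pulls the two streams apart.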