FFmpegFrameGrabber streamGrabber = new FFmpegFrameGrabber(feed.getStream(), 0);
Config.logInfo("Got stream!!!!");
streamGrabber.setFrameRate(headers.getFrameRate());
streamGrabber.setVideoBitrate(headers.getBitRate());
streamGrabber.setImageWidth(headers.getWidth());
streamGrabber.setImageHeight(headers.getHeight());
streamGrabber.setPixelFormat(MpegEncContext.PIX_FMT_YUV420P);
streamGrabber.setVideoCodecName("h264_mediacodec");
streamGrabber.setImageMode(FrameGrabber.ImageMode.COLOR);
Java2DFrameConverter converter = new Java2DFrameConverter();
Config.logInfo("CALLED STREAM GRABBER");
streamGrabber.start();
Config.logInfo("Called STREAM GRABBER's start method");
Exception in thread "Thread-14" java.lang.NoClassDefFoundError: Could not initialize class org.bytedeco.ffmpeg.global.avutil
at java.base/java.lang.Class.forName0(Native Method)
at java.base/java.lang.Class.forName(Class.java:415)
at org.bytedeco.javacpp.Loader.load(Loader.java:1109)
at org.bytedeco.javacpp.Loader.load(Loader.java:1042)
at org.bytedeco.ffmpeg.avformat.Read_packet_Pointer_BytePointer_int.<clinit>(Read_packet_Pointer_BytePointer_int.java:44)
at org.bytedeco.javacv.FFmpegFrameGrabber.<clinit>(FFmpegFrameGrabber.java:340)
at com.itis.liveservice.test.PlayStream.decodeToImage(PlayStream.java:47)
at com.itis.liveservice.core.data.models.Feed$1.run(Feed.java:205)
at java.base/java.lang.Thread.run(Thread.java:835)
System.setProperty("org.bytedeco.javacpp.logger.debug", "true");
Loading class org.bytedeco.ffmpeg.global.avutil
Loading class org.bytedeco.ffmpeg.global.avutil
Loading library avutil
Failed to load for avutil@.56: java.lang.UnsatisfiedLinkError: no avutil in java.library.path: [/Users/gbemirojiboye/Library/Java/Extensions, /Library/Java/Extensions, /Network/Library/Java/Extensions, /System/Library/Java/Extensions, /usr/lib/java, .]
Loading library jniavutil
Failed to load for jniavutil: java.lang.UnsatisfiedLinkError: no jniavutil in java.library.path: [/Users/gbemirojiboye/Library/Java/Extensions, /Library/Java/Extensions, /Network/Library/Java/Extensions, /System/Library/Java/Extensions, /usr/lib/java, .]
Loading class org.bytedeco.ffmpeg.global.avutil
Exception in thread "main" java.lang.NoClassDefFoundError: Could not initialize class org.bytedeco.ffmpeg.global.avutil
<dependency>
    <groupId>org.jogamp.jocl</groupId>
    <artifactId>jocl-main</artifactId>
    <version>2.3.2</version>
    <optional>true</optional>
</dependency>
<dependency>
    <groupId>org.bytedeco</groupId>
    <artifactId>javacpp</artifactId>
    <version>1.5</version>
    <type>jar</type>
</dependency>
<dependency>
    <groupId>org.bytedeco</groupId>
    <artifactId>javacpp-presets</artifactId>
    <version>1.5</version>
    <type>pom</type>
</dependency>
<dependency>
    <groupId>org.bytedeco</groupId>
    <artifactId>javacv</artifactId>
    <version>1.5</version>
    <type>jar</type>
</dependency>
<dependency>
    <groupId>org.bytedeco</groupId>
    <artifactId>opencv-platform</artifactId>
    <version>4.0.1-1.5</version>
    <type>jar</type>
</dependency>
<dependency>
    <groupId>org.bytedeco</groupId>
    <artifactId>ffmpeg-platform</artifactId>
    <version>4.1.3-1.5</version>
    <type>jar</type>
</dependency>
The errors went away, but the code went back to hanging on the start() method. My custom InputStream blocks to read byte array inputs from the websocket endpoint. Should I instead just wrap each byte array (frame), as it is delivered, in a new InputStream and give that to FFmpegFrameGrabber? Thanks.
public void onMessage(byte[] payload, ...) {}
import java.io.IOException;
import java.io.InputStream;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.logging.Level;
import java.util.logging.Logger;

public class H264Stream extends InputStream {

    private boolean socketOpen;
    private final ArrayBlockingQueue<byte[]> dataQueue = new ArrayBlockingQueue<>(100);
    private int currentIndex;
    private byte[] currentArray;

    public H264Stream() {
        this.socketOpen = true;
    }

    public void feed(byte[] in) {
        try {
            this.dataQueue.put(in);
        } catch (InterruptedException ex) {
            Logger.getLogger(H264Stream.class.getName()).log(Level.SEVERE, null, ex);
        }
    }

    @Override
    public int read() throws IOException {
        if (socketOpen) {
            if (currentArray == null || currentIndex == currentArray.length) {
                // Reset the index to zero in preparation for indexing through the new array
                currentIndex = 0;
                // Wait until we have data
                try {
                    currentArray = dataQueue.take();
                } catch (InterruptedException ex) {
                    Logger.getLogger(H264Stream.class.getName()).log(Level.SEVERE, null, ex);
                    return -1;
                }
            }
            // We surely have data at this point
            int ind = currentIndex;
            currentIndex++;
            return byteToInt(currentArray[ind]);
        }
        return -1;
    }

    private static int byteToInt(byte b) {
        return (b & 0xff);
    }

    @Override
    public void close() throws IOException {
        super.close();
        socketOpen = false;
    }
}
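For context, this is roughly how these pieces fit together: the websocket callback feeds each payload into the stream, and the same stream is handed to FFmpegFrameGrabber. A minimal sketch, assuming a javax.websocket style endpoint; the class and method names here are illustrative, not the exact code from this project.

import javax.websocket.Session;
import org.bytedeco.javacv.FFmpegFrameGrabber;

public class FeedEndpoint {

    // The stream that onMessage() feeds and FFmpegFrameGrabber reads from.
    private final H264Stream remoteCameraStream = new H264Stream();

    // Websocket callback: push each incoming chunk into the blocking queue.
    public void onMessage(byte[] payload, Session session) {
        remoteCameraStream.feed(payload);
    }

    // Decoder side: hand the same stream to the grabber (maximumSize = 0, as above).
    public FFmpegFrameGrabber createGrabber() {
        return new FFmpegFrameGrabber(remoteCameraStream, 0);
    }
}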
If your InputStream hangs on read, FFmpegFrameGrabber will also hang; that's normal. But you're probably asking something else. Could you rephrase the question?
import java.io.IOException;
import java.io.PipedInputStream;
import java.io.PipedOutputStream;

public class BytePipeStream extends PipedInputStream {

    private PipedOutputStream outputStream;
    private boolean socketOpen;

    public BytePipeStream() {
        super(12000 * PIPE_SIZE);
        try {
            this.outputStream = new PipedOutputStream(this);
            this.socketOpen = true;
        } catch (IOException ex) {
            ex.printStackTrace();
        }
    }

    public void write(byte[] in) {
        try {
            this.outputStream.write(in);
            this.outputStream.flush();
        } catch (IOException ex) {
            ex.printStackTrace();
        }
    }

    public void setSocketOpen(boolean open) {
        this.socketOpen = open;
    }

    @Override
    public int read() throws IOException {
        return super.read();
    }

    @Override
    public synchronized int read(byte[] b, int off, int len) throws IOException {
        return socketOpen ? super.read(b, off, len) : -1;
    }

    @Override
    public void close() throws IOException {
        this.outputStream.close();
        super.close();
        socketOpen = false;
    }
}
BytePipeStream remoteCameraStream = new BytePipeStream(); // the stream you handed to javacv

public void onMessage(byte[] bin, Session s) {
    remoteCameraStream.write(bin); // write the media bytes coming in over the websocket to the stream
}

FFmpegFrameGrabber frameGrabber = new FFmpegFrameGrabber(remoteCameraStream, 0);
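For completeness, the decode side (the FFmpegFrameGrabber plus Java2DFrameConverter configured in the first snippet) would then run on its own thread, since start() and grab() block while the pipe is waiting for data. A rough sketch under that assumption; the worker class and loop are illustrative:

import java.awt.image.BufferedImage;
import org.bytedeco.javacv.FFmpegFrameGrabber;
import org.bytedeco.javacv.Frame;
import org.bytedeco.javacv.FrameGrabber;
import org.bytedeco.javacv.Java2DFrameConverter;

public class DecodeWorker implements Runnable {

    private final FFmpegFrameGrabber grabber;
    private final Java2DFrameConverter converter = new Java2DFrameConverter();

    public DecodeWorker(FFmpegFrameGrabber grabber) {
        this.grabber = grabber;
    }

    @Override
    public void run() {
        try {
            // Blocks until enough data has been written into the pipe
            // for FFmpeg to probe the stream.
            grabber.start();
            Frame frame;
            // grab() also blocks while the pipe is empty.
            while ((frame = grabber.grab()) != null) {
                if (frame.image != null) {
                    BufferedImage image = converter.convert(frame);
                    // ... hand the image to the UI / encoder here ...
                }
            }
        } catch (FrameGrabber.Exception ex) {
            ex.printStackTrace();
        } finally {
            try {
                grabber.stop();
                grabber.release();
            } catch (FrameGrabber.Exception ex) {
                ex.printStackTrace();
            }
        }
    }
}

It would be started with something like new Thread(new DecodeWorker(frameGrabber)).start(), so the websocket callback thread is never the one blocking on the pipe.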
Good morning, and I hope you are safe and doing well.
Thanks for your outstanding help so far.
I wanted to let you know that I got both the audio and video feeds from their InputStreams to work.
How do I get FFmpegFrameGrabber to stop grabbing from an InputStream after my users have closed their streaming apps?
When they close the streaming app, no data is sent anymore, but frameGrabber.grab() is still stuck trying to grab the next frame from the InputStream in the while loop's condition.
I tried to call stop, release and releaseUnsafe from a separate thread, but it didn't work.
Based on my research into some of the issues on GitHub, I noted that you said that FrameRecorder's stop and record methods cannot be called from separate threads.
I believe that may apply to FrameGrabber's grab and stop methods too.
If that is so, how do I stop frameGrabber.grab() when the InputStream is being fed over a network and data stops being fed into it?
Thanks a lot.
I captured a few frames (three, actually) during streaming, and then, when streaming ends, I send those frames through the grabber, with a condition to break out of the while loop when streaming ends, of course.
It worked excellently.
The right way for everyone would be to have an interruptible stream, though (roughly like the sketch below).
Thanks
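A rough sketch of what such an interruptible stream could look like: instead of blocking forever on take(), read() polls the queue with a timeout and returns -1 (end of stream) once the socket has been marked closed, so grab() can return null and the decode loop can exit. The timeout value and the names are assumptions, not something confirmed in this thread.

import java.io.IOException;
import java.io.InputStream;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.TimeUnit;

public class InterruptibleH264Stream extends InputStream {

    private final ArrayBlockingQueue<byte[]> dataQueue = new ArrayBlockingQueue<>(100);
    private volatile boolean socketOpen = true;
    private byte[] currentArray;
    private int currentIndex;

    public void feed(byte[] in) throws InterruptedException {
        dataQueue.put(in);
    }

    // Call this when the websocket closes; read() will then return -1.
    public void closeSocket() {
        socketOpen = false;
    }

    @Override
    public int read() throws IOException {
        while (currentArray == null || currentIndex == currentArray.length) {
            if (!socketOpen && dataQueue.isEmpty()) {
                return -1; // signal end of stream to FFmpegFrameGrabber
            }
            try {
                // Poll with a timeout instead of blocking forever, so the
                // socketOpen flag is re-checked periodically.
                currentArray = dataQueue.poll(500, TimeUnit.MILLISECONDS);
                currentIndex = 0;
            } catch (InterruptedException ex) {
                Thread.currentThread().interrupt();
                return -1;
            }
        }
        return currentArray[currentIndex++] & 0xff;
    }
}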
I trust you are doing well and I hope you are safe.
I am currently muxing the streams from my devices on my desktop app.
I have a fixed frame rate for my video streams and a fixed sample rate (44100 Hz) for my audio.
Each feed saves an H.264 video and an AAC audio file from the devices.
I mux both files into an MP4 file at the end of the live stream with FFmpegFrameRecorder.
Muxing works, but I noticed that the audio ends faster than the video.
Also, the frame rate set for the video from the device is 30 frames per second, but on the desktop, FFmpegFrameGrabber reports it as 25 frames per second. I am not sure if this could be the cause of the issue.
I would be grateful for any advice or recommendations on how to properly mux the files in Java SE so that the audio and the video stay in sync.
I saw the code for RecordActivity and I tried a couple of other examples, but I can't seem to get it right.
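For reference, here is a bare-bones sketch of the two-grabber / one-recorder setup being described, muxing a saved H.264 file and a saved AAC file into an MP4 with timestamps copied from the grabbers. The file names are hypothetical and this is not the exact code from the thread, just one way to arrange it.

import org.bytedeco.ffmpeg.global.avcodec;
import org.bytedeco.javacv.FFmpegFrameGrabber;
import org.bytedeco.javacv.FFmpegFrameRecorder;
import org.bytedeco.javacv.Frame;

public class MuxToMp4 {

    public static void main(String[] args) throws Exception {
        // Hypothetical per-feed recordings saved by each device.
        FFmpegFrameGrabber video = new FFmpegFrameGrabber("feed.h264");
        FFmpegFrameGrabber audio = new FFmpegFrameGrabber("feed.aac");
        video.start();
        audio.start();

        FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(
                "feed.mp4", video.getImageWidth(), video.getImageHeight(), audio.getAudioChannels());
        recorder.setFormat("mp4");
        recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
        recorder.setAudioCodec(avcodec.AV_CODEC_ID_AAC);
        // Use the rates the grabbers actually report rather than assumed constants,
        // since the device's nominal 30 fps may not be what FFmpeg sees.
        recorder.setFrameRate(video.getFrameRate());
        recorder.setSampleRate(audio.getSampleRate());
        recorder.start();

        Frame videoFrame = video.grab();
        Frame audioFrame = audio.grab();
        // Interleave by timestamp so audio and video advance together in the output.
        while (videoFrame != null || audioFrame != null) {
            boolean takeVideo = audioFrame == null
                    || (videoFrame != null && video.getTimestamp() <= audio.getTimestamp());
            if (takeVideo) {
                recorder.setTimestamp(video.getTimestamp());
                recorder.record(videoFrame);
                videoFrame = video.grab();
            } else {
                recorder.setTimestamp(audio.getTimestamp());
                recorder.record(audioFrame);
                audioFrame = audio.grab();
            }
        }

        recorder.stop();
        audio.stop();
        video.stop();
    }
}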
this.bufferSize = AudioRecord.getMinBufferSize(this.bitRate, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT) * channelCount;
This line evidently determines the size of the PCM data in the audio frames, and the value stored in bufferSize has to match what javacv is using.
Once I set the buffer size on Android to 2048 bytes to match what javacv was working with, muxing worked smoothly. All the discrepancies being reported by ffmpeg in the recording length and the number of audio frames also went away, mostly.
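To make the arithmetic concrete: AAC encodes 1024 samples per frame, and 1024 samples of 16-bit mono PCM is exactly 2048 bytes, which is presumably the frame size javacv was expecting. Below is a hedged sketch of the capture side along the lines of RecordActivity, reading one 1024-sample chunk at a time from AudioRecord and handing it to FFmpegFrameRecorder.recordSamples(); the class name, loop structure and streaming flag are illustrative, not the exact code from this project.

import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import java.nio.ShortBuffer;
import org.bytedeco.javacv.FFmpegFrameRecorder;

public class AudioCaptureSketch {

    static final int SAMPLE_RATE = 44100;
    // 1024 16-bit mono samples = 2048 bytes, one AAC frame's worth of PCM.
    static final int SAMPLES_PER_FRAME = 1024;

    private volatile boolean streaming = true;

    public void capture(FFmpegFrameRecorder recorder) throws Exception {
        int minBuffer = AudioRecord.getMinBufferSize(
                SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        AudioRecord audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, SAMPLE_RATE,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT,
                Math.max(minBuffer, SAMPLES_PER_FRAME * 2));
        audioRecord.startRecording();

        short[] buffer = new short[SAMPLES_PER_FRAME]; // 2048 bytes per read
        while (streaming) {
            int read = audioRecord.read(buffer, 0, buffer.length);
            if (read > 0) {
                // Hand exactly one AAC frame's worth of samples to javacv.
                recorder.recordSamples(SAMPLE_RATE, 1, ShortBuffer.wrap(buffer, 0, read));
            }
        }
        audioRecord.stop();
        audioRecord.release();
    }
}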
I hope my experience here can also help others.
For your help, I say God bless you, and I wish you the very best in your work, and thanks for being so patient.