Reading H264 videos with JavaCV


Ronnie Pottayya

Jul 27, 2016, 5:22:48 AM
to javacv
Hi everyone,

Can anyone please tell me how to read and display (for example, in a JFrame) a video recorded with FrameRecorder in FLV format using JavaCV?
The video codec used is H264.

Thanks for your help.
Ronnie 

Samuel Audet

Jul 27, 2016, 5:43:42 AM
to jav...@googlegroups.com
Just use the Demo class in the README.md file:
https://github.com/bytedeco/javacv#sample-usage
but use FFmpegFrameGrabber("filename.flv") as the FrameGrabber.
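
In outline, it would look something like this (a minimal sketch; the class name and window title here are just for illustration):

    import org.bytedeco.javacv.CanvasFrame;
    import org.bytedeco.javacv.FFmpegFrameGrabber;
    import org.bytedeco.javacv.Frame;

    public class FlvPlayer {
        public static void main(String[] args) throws Exception {
            // Grab frames from the FLV file and show them in a Swing window
            FFmpegFrameGrabber grabber = new FFmpegFrameGrabber("filename.flv");
            grabber.start();
            CanvasFrame canvas = new CanvasFrame("FLV playback");
            Frame frame;
            while (canvas.isVisible() && (frame = grabber.grab()) != null) {
                canvas.showImage(frame);
            }
            grabber.stop();
            canvas.dispose();
        }
    }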

Samuel

Ronnie Pottayya

Jul 28, 2016, 10:12:59 AM
to javacv
Hi Samuel,

Thank you for your answer.
Can you tell me whether it is possible to feed the FFmpegFrameGrabber directly with H264 frames (for example, a byte array) rather than a video file?

Thanks for your help.
Ronnie

Samuel Audet

Jul 29, 2016, 7:30:17 AM
to jav...@googlegroups.com
It's possible, but we would need to add support for InputStream first:
https://github.com/bytedeco/javacv/issues/95
If you'd like to make a contribution, that would be a good one!

Samuel

Ronnie Pottayya

Aug 26, 2016, 6:05:47 AM
to javacv
Thanks Samuel, I'll take a look at it.

Actually, I'm trying to encode frames in H264, send them over the network, receive them, and decode them.

Rest assured that if I succeed, I'll make a contribution. I think this will be of great help to some.

Thanks.
Ronnie

Samuel Audet

Aug 28, 2016, 7:48:13 AM
to jav...@googlegroups.com
Awesome! Thanks in advance :)

Samuel

Gbenro Jiboye

Mar 17, 2020, 4:15:55 AM
to javacv
Hi Ronnie, Hi Saudet.

I was wondering if any progress was made on this?

I have Ronnie's exact use case. I am encoding the camera output from MediaCodec (using video/avc), along with the audio output, which I send in a separate, suitably tagged byte stream, and sending it over websockets to a desktop application written in standard Java (JavaSE).

I noticed that the FFmpegFrameGrabber now has an InputStream constructor, so I wrote an InputStream subclass to feed the byte arrays to the constructor. Unfortunately, it hangs on the start() method:



       
FFmpegFrameGrabber streamGrabber = new FFmpegFrameGrabber(feed.getStream(), 0);
Config.logInfo("Got stream!!!!");
streamGrabber.setFrameRate(headers.getFrameRate());
streamGrabber.setVideoBitrate(headers.getBitRate());
streamGrabber.setImageWidth(headers.getWidth());
streamGrabber.setImageHeight(headers.getHeight());
streamGrabber.setPixelFormat(MpegEncContext.PIX_FMT_YUV420P);
streamGrabber.setVideoCodecName("h264_mediacodec");
streamGrabber.setImageMode(FrameGrabber.ImageMode.COLOR);
Java2DFrameConverter converter = new Java2DFrameConverter();
Config.logInfo("CALLED STREAM GRABBER");
streamGrabber.start();
Config.logInfo("Called STREAM GRABBER's start method");


Were you able to solve this? How do I deal with this, please?

Samuel Audet

Mar 17, 2020, 4:36:59 AM
to jav...@googlegroups.com, Gbenro Jiboye
Most codecs don't support streaming very well and it's probably just waiting for the end of the stream.
To force it to make do without trying to seek, simply set the maximumSize to 0 as explained here:

Samuel

Gbenro Jiboye

Mar 17, 2020, 5:04:31 AM
to javacv
I am glad you responded, thanks!

I used the FFmpegFrameGrabber with maximumSize = 0 in the original code that I posted above. In fact, the issue happens both for maximumSize = 0 and for values above 0.

In the issue link that you posted, though, you seem to suggest that the behaviour can be mitigated with maximumSize < 0 for JavaCV 1.5.2? Am I correct in my understanding of your submission in that post?
I would have tried it out, but the latest JavaCV on Maven seems to be 1.5, not 1.5.2 (the version that contains the fix).

Samuel Audet

Mar 17, 2020, 5:28:36 AM
to jav...@googlegroups.com, Gbenro Jiboye

Gbenro Jiboye

Mar 17, 2020, 5:30:56 AM
to javacv
I will definitely try it out.

Thanks for the help.

Gbenro Jiboye

Mar 17, 2020, 9:07:02 PM
to javacv
Unfortunately, for some weird reason, when I installed the new libs using the Maven link, the code kept failing with this exception:

Exception in thread "Thread-14" java.lang.NoClassDefFoundError: Could not initialize class org.bytedeco.ffmpeg.global.avutil
    at java.base/java.lang.Class.forName0(Native Method)
    at java.base/java.lang.Class.forName(Class.java:415)
    at org.bytedeco.javacpp.Loader.load(Loader.java:1109)
    at org.bytedeco.javacpp.Loader.load(Loader.java:1042)
    at org.bytedeco.ffmpeg.avformat.Read_packet_Pointer_BytePointer_int.<clinit>(Read_packet_Pointer_BytePointer_int.java:44)
    at org.bytedeco.javacv.FFmpegFrameGrabber.<clinit>(FFmpegFrameGrabber.java:340)
    at com.itis.liveservice.test.PlayStream.decodeToImage(PlayStream.java:47)
    at com.itis.liveservice.core.data.models.Feed$1.run(Feed.java:205)
    at java.base/java.lang.Thread.run(Thread.java:835)

Gbenro Jiboye

Mar 17, 2020, 9:13:05 PM
to javacv
When I re-ran it with    

System.setProperty("org.bytedeco.javacpp.logger.debug", "true");

set, it gave:



Loading class org.bytedeco.ffmpeg.global.avutil
Loading class org.bytedeco.ffmpeg.global.avutil
Loading library avutil
Failed to load for avutil@.56: java.lang.UnsatisfiedLinkError: no avutil in java.library.path: [/Users/gbemirojiboye/Library/Java/Extensions, /Library/Java/Extensions, /Network/Library/Java/Extensions, /System/Library/Java/Extensions, /usr/lib/java, .]
Loading library jniavutil
Failed to load for jniavutil: java.lang.UnsatisfiedLinkError: no jniavutil in java.library.path: [/Users/gbemirojiboye/Library/Java/Extensions, /Library/Java/Extensions, /Network/Library/Java/Extensions, /System/Library/Java/Extensions, /usr/lib/java, .]
Loading class org.bytedeco.ffmpeg.global.avutil
Exception in thread "main" java.lang.NoClassDefFoundError: Could not initialize class org.bytedeco.ffmpeg.global.avutil

Samuel Audet

Mar 18, 2020, 2:21:20 AM
to javacv
Make sure to add a dependency on "javacv-platform", not just "javacv".
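
For example, in your pom.xml (1.5.2 being the latest release as of this writing):

    <dependency>
        <groupId>org.bytedeco</groupId>
        <artifactId>javacv-platform</artifactId>
        <version>1.5.2</version>
    </dependency>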


Gbenro Jiboye

Mar 18, 2020, 2:23:37 AM
to javacv
Hi Samuel, good morning (by my time here).
I will do that and let you know how it goes. Thanks a lot.

Gbenro Jiboye

Mar 18, 2020, 3:16:36 AM
to javacv
A little observation, please.

I noticed that there are many dependencies in the link for javacv which you shared that I didn't need when I was running javacv 1.5. For example:


       
<dependency>
    <groupId>org.jogamp.jocl</groupId>
    <artifactId>jocl-main</artifactId>
    <version>2.3.2</version>
    <optional>true</optional>
</dependency>

Do I really need those?

When I was using javacv 1.5, all I needed was:

<dependency>
    <groupId>org.bytedeco</groupId>
    <artifactId>javacpp</artifactId>
    <version>1.5</version>
    <type>jar</type>
</dependency>
<dependency>
    <groupId>org.bytedeco</groupId>
    <artifactId>javacpp-presets</artifactId>
    <version>1.5</version>
    <type>pom</type>
</dependency>
<dependency>
    <groupId>org.bytedeco</groupId>
    <artifactId>javacv</artifactId>
    <version>1.5</version>
    <type>jar</type>
</dependency>
<dependency>
    <groupId>org.bytedeco</groupId>
    <artifactId>opencv-platform</artifactId>
    <version>4.0.1-1.5</version>
    <type>jar</type>
</dependency>
<dependency>
    <groupId>org.bytedeco</groupId>
    <artifactId>ffmpeg-platform</artifactId>
    <version>4.1.3-1.5</version>
    <type>jar</type>
</dependency>

and they came with a whole lot of jars.

If you could please advise on whether I can ignore some of the entries in the link that you shared, I would be most grateful.

Samuel Audet

Mar 18, 2020, 3:23:27 AM
to jav...@googlegroups.com, Gbenro Jiboye
Yes, of course, you can exclude anything you don't need, see this issue:
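
As a rough sketch of the standard Maven mechanism (assuming jocl-main is one of the transitive artifacts your build pulls in and you want to drop):

    <dependency>
        <groupId>org.bytedeco</groupId>
        <artifactId>javacv-platform</artifactId>
        <version>1.5.2</version>
        <exclusions>
            <exclusion>
                <groupId>org.jogamp.jocl</groupId>
                <artifactId>jocl-main</artifactId>
            </exclusion>
        </exclusions>
    </dependency>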

Samuel Audet

Mar 18, 2020, 9:00:13 PM
to Gbenro Jiboye, javacv
If your InputStream hangs on read, FFmpegFrameGrabber will also hang, that's normal, but you're probably asking something else. Could you rephrase the question?

On Thu, Mar 19, 2020 at 0:41, Gbenro Jiboye <gbenro...@gmail.com> wrote:
The errors went away, but the code went back to hanging on the start() method.

My custom InputStream blocks to read byte array inputs from the websocket endpoint.

Should I instead just wrap each byte array (frame), as it is delivered, in a new InputStream and give that to FFmpegFrameGrabber?

Thanks.

Gbenro Jiboye

Mar 18, 2020, 10:41:55 PM
to javacv
My original InputStream sort of waits on the websocket callback:

public void onMessage(byte[] payload, ...) {}

That means I feed the custom InputStream with each byte array as it comes in, which is then available for reading in the implemented read() method.

For example:



import java.io.IOException;
import java.io.InputStream;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.logging.Level;
import java.util.logging.Logger;

public class H264Stream extends InputStream {

    private boolean socketOpen;

    private final ArrayBlockingQueue<byte[]> dataQueue = new ArrayBlockingQueue<>(100);

    private int currentIndex;
    private byte[] currentArray;

    public H264Stream() {
        this.socketOpen = true;
    }

    public void feed(byte[] in) {
        try {
            this.dataQueue.put(in);
        } catch (InterruptedException ex) {
            Logger.getLogger(H264Stream.class.getName()).log(Level.SEVERE, null, ex);
        }
    }

    @Override
    public int read() throws IOException {
        if (socketOpen) {
            if (currentArray == null || currentIndex == currentArray.length) {
                // Reset the index to zero in preparation for indexing through the new array
                currentIndex = 0;

                // Wait till we have data
                try {
                    currentArray = dataQueue.take();
                } catch (InterruptedException ex) {
                    Logger.getLogger(H264Stream.class.getName()).log(Level.SEVERE, null, ex);
                    return -1;
                }
            }
            // You surely have data
            int ind = currentIndex;
            currentIndex++;
            return byteToInt(currentArray[ind]);
        }

        return -1;
    }

    private static int byteToInt(byte b) {
        return (b & 0xff);
    }

    @Override
    public void close() throws IOException {
        super.close();
        socketOpen = false;
    }
}


This stream is initialized once the stream starts and is handed over to FFmpegFrameGrabber.

The second approach takes each byte array frame that comes over the websocket connection, wraps it in a ByteArrayInputStream, and hands it over to the FFmpegFrameGrabber. So whenever a byte[] frame comes in, a new InputStream is created.

I was wondering which of the two approaches would be the correct one when using the FFmpegFrameGrabber(InputStream) constructor?



Samuel Audet

Mar 18, 2020, 11:36:23 PM
to jav...@googlegroups.com, Gbenro Jiboye
I don't understand what you're trying to achieve, but just wrapping an InputStream with another InputStream isn't going to change or solve anything, no.

With JavaCV 1.5.2, you just need to set maximumSize to 0 and it should work.

Gbenro Jiboye

Mar 24, 2020, 6:07:03 AM
to javacv
Hi Samuel! 

I hope you are safe!

I wanted to thank you for your help. I finally got it working.
I discovered some bugs in my InputStream implementation that were causing it to drop bytes.
After fixing them, the streaming worked (though it takes a few seconds to load up).

Here are some of the things I discovered that might be helpful to people who have my use case. (I hope you don't mind me saying all this here...)

My use case was:
An Android client sending the camera output, encoded via MediaCodec, over websocket to a websocket server (Jetty) embedded in a J2SE Swing desktop application.

The app is integrated with JavaCV.

The media bytes come in through the onMessage(byte[] b, Session s) callback of the websocket server.

The bytes need to be forwarded to the FFmpegFrameGrabber which can only accept an InputStream. 

I created a custom stream to handle this. My tests showed that the stream was capable (based on my Mac's hardware resources) of transmitting at 21.1-21.5 MB/sec.
Instead of this, you can use piped I/O to deliver the bytes as a stream, if you do not mind pre-allocating buffers and tinkering with the buffer size.

I did tinker with the buffer size in case someone needs the info, and I discovered that piped I/O came close to my custom stream's implementation when the buffer size was about 12 MB; it did about 20.x MB/sec.

A good example of piped-I/O functionality usable with JavaCV would be:



import java.io.IOException;
import java.io.PipedInputStream;
import java.io.PipedOutputStream;

public class BytePipeStream extends PipedInputStream {

    private PipedOutputStream outputStream;

    private boolean socketOpen;

    public BytePipeStream() {
        super(12000 * PIPE_SIZE); // roughly a 12 MB pipe buffer (PIPE_SIZE is 1024)
        try {
            this.outputStream = new PipedOutputStream(this);
            this.socketOpen = true;
        } catch (IOException ex) {
            ex.printStackTrace();
        }
    }

    public void write(byte[] in) {
        try {
            this.outputStream.write(in);
            this.outputStream.flush();
        } catch (IOException ex) {
            ex.printStackTrace();
        }
    }

    public void setSocketOpen(boolean open) {
        this.socketOpen = open;
    }

    @Override
    public int read() throws IOException {
        return super.read();
    }

    @Override
    public synchronized int read(byte[] b, int off, int len) throws IOException {
        return socketOpen ? super.read(b, off, len) : -1;
    }

    @Override
    public void close() throws IOException {
        this.outputStream.close();
        super.close();
        socketOpen = false;
    }
}




Usage would be:

    BytePipeStream remoteCameraStream = new BytePipeStream(); // the stream you handed to JavaCV

    public void onMessage(byte[] bin, Session s) {
        remoteCameraStream.write(bin); // write the media bytes coming in over websocket to the stream
    }

Finally, make sure the maximum size is set to zero, e.g.:

    FFmpegFrameGrabber frameGrabber = new FFmpegFrameGrabber(remoteCameraStream, 0);


I want to work on playing the audio from the stream now.

Thanks again!

Samuel Audet

Mar 24, 2020, 10:39:21 PM
to jav...@googlegroups.com, Gbenro Jiboye
Glad to hear you got it working and thanks for all the information!
I'm sure it is going to be useful to others. Also feel free to add code for this here:

BTW, FFmpeg supports many protocols: https://ffmpeg.org/ffmpeg-protocols.html
We don't necessarily need to use InputStream for this use case,
although it doesn't look like WebSocket or WebRTC is part of the lot there...

Samuel

Gbenro Jiboye

Apr 12, 2020, 1:19:40 AM
to javacv
Hi Samuel,

Good morning and I hope you are safe and doing well.

Thanks for your outstanding help so far.
I wanted to let you know that I got both the audio and video feeds from their InputStreams to work.

How do I get FFmpegFrameGrabber to stop grabbing from an InputStream after my users have closed their streaming apps?

When they close the streaming app, no more data is sent, but frameGrabber.grab() is still stuck trying to grab the next frame from the InputStream in the while loop's condition.

I tried to call stop(), release(), and releaseUnsafe() from a separate thread, but it didn't work.

Based on my research into some of the issues on GitHub, I noted that you said that FrameRecorder's stop and record methods cannot be called from separate threads.

I believe that may apply to FrameGrabber's grab and stop methods too.

If that is so, how do I stop FrameGrabber.grab() when the InputStream is being fed over a network and data stops coming in?

Thanks a lot.

Samuel Audet

Apr 12, 2020, 4:04:04 AM
to javacv
I think the best thing we can do for that is to wrap the InputStream with a custom implementation that is interruptible. Sounds good? 
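
Something along these lines, perhaps (just a sketch, not part of JavaCV; the idea is to poll the queue with a timeout instead of blocking forever, and return end of stream once cancelled):

    import java.io.IOException;
    import java.io.InputStream;
    import java.util.concurrent.ArrayBlockingQueue;
    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.TimeUnit;

    // A byte-queue-backed InputStream whose read() can be cancelled from
    // another thread: it polls with a timeout and re-checks a volatile flag,
    // returning -1 (end of stream) once cancel() has been called.
    public class InterruptibleH264Stream extends InputStream {

        private final BlockingQueue<byte[]> queue = new ArrayBlockingQueue<>(100);
        private volatile boolean cancelled;
        private byte[] current;
        private int pos;

        public void feed(byte[] data) {
            if (!cancelled) {
                queue.offer(data); // drops the chunk if the queue is full
            }
        }

        // Call from any thread (e.g. when the client disconnects) to make
        // pending and future reads return end of stream.
        public void cancel() {
            cancelled = true;
        }

        @Override
        public int read() throws IOException {
            while (current == null || pos == current.length) {
                if (cancelled) {
                    return -1; // signals EOF, letting grab() unwind
                }
                try {
                    current = queue.poll(100, TimeUnit.MILLISECONDS);
                    pos = 0;
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                    return -1;
                }
            }
            return current[pos++] & 0xff;
        }
    }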


Gbenro Jiboye

Apr 12, 2020, 6:36:03 AM
to javacv
True. Though I went with a hack for now.

I captured a few frames (3, actually) during streaming, and when streaming ends, I send those frames through the grabber, with a condition to break out of the while loop when streaming ends, of course.

It worked excellently.

The right way for everyone would be to have an interruptible stream, though.

Thanks

Gbenro Jiboye

Apr 13, 2020, 9:54:30 AM
to javacv
Hi Samuel,

I trust you are doing well and I hope you are safe.

I am currently muxing the streams from my devices on my desktop app.

I have a fixed frame rate for my video streams and a fixed sample rate(44100Hz) for my audio.

Each feed saves an h264 video and an aac audio from the devices.

I mux both files into an mp4 file at the end of the live stream with FFmpegFrameRecorder.

Muxing works but I noticed that the audio ends faster than the video.

Also, the frame rate set for the video from the device is 30 frames per second, but on the desktop, FFmpegFrameGrabber reports it as 25 frames per second. I am not sure if this could be the cause of the issue.

I would be grateful for any advice or recommendations on how I can properly mux the files in JavaSE so that the audio and the video match appropriately.

I saw the code for RecordActivity and I tried a couple of other examples, but I can't seem to get it right.

Samuel Audet

Apr 13, 2020, 7:51:23 PM
to jav...@googlegroups.com
You'll need to call FFmpegFrameRecorder.setFrameRate(25) before start() to get a video file at 25 FPS.
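
That is, roughly (the file name and dimensions here are placeholders):

    FFmpegFrameRecorder recorder = new FFmpegFrameRecorder("out.mp4", 640, 480, 1);
    recorder.setFrameRate(25); // must be set before start()
    recorder.start();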

Gbenro Jiboye

May 15, 2020, 1:39:49 AM
to javacv
Hi Samuel,

Had to revisit the code again.

I can't achieve sync (lip-sync) at all in the muxer output of JavaCV on the desktop.

I have an H264 file and an AAC file streamed from MediaCodec on Android.

To make things easier, I have the timestamps of the individual frames for both the H264 and the AAC files.

I really need to achieve lip-sync in the muxed MP4 output.

I felt that having the timestamps would have made things easier, but it has not been so.

What is the right way to go about muxing H264 and AAC into MP4 when I have the timestamps of the individual frames available?

Thanks a lot

Samuel Audet

May 16, 2020, 4:07:59 AM
to jav...@googlegroups.com, Gbenro Jiboye
For that, we can call setTimestamp() with the correct time values in microseconds before each frame that you record. For audio, and sometimes for video too, we have to pad any missing frames with empty ones.
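
In rough code form (a sketch only; the helper name is made up, and the recorder, frame, and timestamps are assumed to come from your muxing loop):

    import org.bytedeco.javacv.FFmpegFrameRecorder;
    import org.bytedeco.javacv.Frame;
    import org.bytedeco.javacv.FrameRecorder;

    // Stamp each frame with its capture time in microseconds before
    // recording it, never moving the recorder's clock backwards.
    static void recordAt(FFmpegFrameRecorder recorder, Frame frame, long timestampMicros)
            throws FrameRecorder.Exception {
        if (timestampMicros > recorder.getTimestamp()) {
            recorder.setTimestamp(timestampMicros);
        }
        recorder.record(frame);
    }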

Samuel Audet

May 16, 2020, 8:38:22 AM
to Gbenro Jiboye, javacv
Like I told Martin, there's probably an example somewhere, but nothing super easy to understand. In the end, though, he got it working well pretty easily:
https://groups.google.com/forum/#!topic/javacv/OaBmv5mYJrw

Let me know if his example doesn't help though.

On 5/16/20 7:53 PM, Gbenro Jiboye wrote:
> Thanks for responding, Samuel.
>
> Please can you point me in the direction of a good example that shows
> how I can pad the missing frames, because I am not sure I am creating
> the empty frames right.
> Also, when I call setTimestamp(durationInMicros), I keep getting:
>
> org.bytedeco.javacv.FrameRecorder$Exception: av_interleaved_write_frame() error -22 while writing interleaved video packet.
>     at org.bytedeco.javacv.FFmpegFrameRecorder.writePacket(FFmpegFrameRecorder.java:1236)
>     at org.bytedeco.javacv.FFmpegFrameRecorder.recordImage(FFmpegFrameRecorder.java:1031)
>     at org.bytedeco.javacv.FFmpegFrameRecorder.record(FFmpegFrameRecorder.java:918)
>     at org.bytedeco.javacv.FFmpegFrameRecorder.record(FFmpegFrameRecorder.java:911)
>     at com.itis.liveservice.utils.MuxProcessor.muxMeNow(MuxProcessor.java:245)
>     at com.itis.liveservice.utils.MuxProcessor$1.run(MuxProcessor.java:110)
>     at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
>     at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
>     at java.base/java.lang.Thread.run(Thread.java:830)
>
> The timestamps for both audio and video are in microseconds relative to
> the start of the video recording, so the first video frame is at zero
> while all the other frames are relative to that.
>
> Thanks a lot

Gbenro Jiboye

May 16, 2020, 11:14:25 AM
to Samuel Audet, javacv
Thanks, I'll check it out. 

Gbenro Jiboye

May 19, 2020, 4:23:02 PM
to javacv
Hi Samuel, thanks for all the help.

I have been doing a lot of research on this topic and I seem to have made a lot of progress compared to before. However, I have to clear this up:

Question:

Does padding the audio with frames alter the quality a bit, even when properly done?

I noticed this when I implemented some sort of padding based on my use case, since I didn't find any good example to use, and Martin's case totally didn't work for me even though I tried to adapt it to my use case.

I wrote something that seems to be working, but I am still testing. It worked for two muxing operations on H264/AAC sets and their timestamp files, and I am planning to test on more.

If you can confirm that a little degradation in audio quality cannot be avoided when padding with audio frames, then I can quit trying to debug how to restore the original audio quality and test my implementation on more recordings.

Once again, here is my use case:

An H264 file and an AAC file from MediaCodec, each with their timestamp files (in byte array format), being muxed using FFmpegFrameRecorder on desktop Java.

Thanks so much once again.

Samuel Audet

May 19, 2020, 8:49:28 PM
to jav...@googlegroups.com, Gbenro Jiboye
Well, inserting empty audio frames will most certainly severely degrade audio quality! That's why it should be done only as a last resort, but it's better than corrupting the stream and/or crashing your application.


Gbenro Jiboye

May 23, 2020, 1:07:15 AM
to javacv
Hi Samuel,

I would like to know what the format of the audio data in an audio frame is. I am sending AAC audio from the mobile device (encoded as AAC from 16-bit PCM).

When I decode it on the desktop using the audio codec AV_CODEC_ID_AAC, is it decoded back into PCM data in the audio Frame object?
I have to ask because the original PCM data on the Android device comes in 2406-byte blocks before being encoded into AAC and streamed.

The audio data decoded in the frames, however, is only 2048 bytes per frame.

What do you make of this, please?

Samuel Audet

May 23, 2020, 3:01:27 AM
to javacv
The size of each frame depends on the codec; AAC, for instance, typically decodes in blocks of 1024 samples per channel, which for 16-bit mono PCM comes to exactly 2048 bytes. That all sounds perfectly normal.


Gbenro Jiboye

May 23, 2020, 3:22:35 AM
to javacv
I assume it is decoded into PCM data then.

I am trying to calculate the number of audio samples that play for each video frame, so I can determine the padding to apply to the audio frames.

Can I use the normal formula for calculating samples from duration?

e.g. (duration * sample_rate * audio_channels) / (sample_size / 16)

If not, any advice on calculating the padding to apply would be happily received.

Thanks



Samuel Audet

May 23, 2020, 10:01:10 PM
to javacv, Gbenro Jiboye
Yes, decoded frames are PCM, and yes, we need to consider the number of samples for such calculations. The number of audio frames isn't reliable information.
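
As a rough worked example with the numbers from this thread (assuming 16-bit mono PCM at 44100 Hz and 30 FPS video): one video frame spans 1/30 s, i.e. 44100 / 30 = 1470 samples, which is 1470 * 2 = 2940 bytes of decoded audio per video frame.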


Gbenro Jiboye

May 24, 2020, 11:33:05 AM
to javacv
Hi Samuel,

I am really excited to tell you that all the muxing issues I was having have been resolved by God's grace.

I thank you for your help and for your time spent making javacv. I also thank you for how you support so many developers all over the world.
Thanks for your passion.

The fix needed no hack at the end of the day, so my audio quality is unaffected, which is really great for my use case.

The issue was that the AAC codec (decoder) in JavaCV was producing samples in 2048-byte chunks while the buffer size in my Android app (the client) was different; you know, the code that calculates the buffer to be used by AudioRecord:

i.e.:

this.bufferSize = AudioRecord.getMinBufferSize(this.bitRate, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT) * channelCount

This line evidently determines the size of the PCM data in the audio frames, and the value stored in bufferSize has to match what JavaCV is using.

Once I set the buffer size on Android to 2048 bytes to match what JavaCV was working with, muxing worked smoothly. The discrepancies being reported by ffmpeg in the recording length and the number of audio frames also went away, mostly.

I hope my experience here can also help others.

For your help, I say God bless you. I wish you the very best in your work, and thanks for being so patient.



