Re: [javacv] Applying fade in fade out effects while recording a video


Samuel Audet

unread,
Jan 8, 2016, 8:33:20 PM1/8/16
to jav...@googlegroups.com
On 01/09/2016 04:15 AM, 404error wrote:
> I have created a video from a set of images. But currently the
> transition between two images is quite abrupt. I want to add some
> effects like fade in fade out, 3D rotation etc. to make the transition
> look less abrupt. I know this can be done in ffmpeg. But I want to know
> how can we do it in android using javacv?

Should be possible. Could you provide the graph as well as the rest of
the command for ffmpeg that works on the command line but that doesn't
work with FFmpegFrameFilter? Thanks!

Samuel

404error

unread,
Jan 10, 2016, 6:07:23 AM1/10/16
to javacv
Actually, I am facing the same issue as mentioned here: https://groups.google.com/forum/#!topic/javacv/N7TDcsn9JJs
Can you please give a short example of how to use FFmpegFrameFilter to produce a fade-in effect like:
fade=t=in:st=2.5:d=1  
Thanks

Samuel Audet

unread,
Jan 10, 2016, 6:36:57 AM1/10/16
to jav...@googlegroups.com
Could you also provide the full command to the ffmpeg executable that
you are using on the command line? Thanks

Samuel

404error

unread,
Jan 10, 2016, 8:24:38 AM1/10/16
to javacv
I am not using ffmpeg on the command line; rather, I am using the JavaCV library in an Android app. However, below is the command that produces the
desired effect for a video:
ffmpeg -i slide.mp4 -y -vf fade=t=in:st=2.5:d=1 slide_fade_in.mp4

1. I want to know, what exactly is the usage of FFmpegFrameFilter?
2. Does it process the video as a whole or frame by frame?
Thanks.

Samuel Audet

unread,
Jan 10, 2016, 10:25:42 PM1/10/16
to jav...@googlegroups.com
On 01/10/2016 10:24 PM, 404error wrote:
> I am not using ffmpeg on the command line, rather I am using the javacv library in an android app. However, below is the command to produce the
> desired effect for a video.
> ffmpeg -i slide.mp4 -y -vf fade=t=in:st=2.5:d=1 slide_fade_in.mp4

Ok, then according to the source code of the ffmpeg program at
https://github.com/FFmpeg/FFmpeg/blob/n2.8.4/ffmpeg_filter.c#L773
that command automatically appends a "setpts=N" filter to the graph. So,
we can do that with FFmpegFrameFilter as well this way:

filter = new FFmpegFrameFilter("setpts=N,fade=t=in:st=2.5:d=1", ...);

And that works as expected.
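
For reference, a complete constructor call could look roughly like this (just a sketch; the 320x240 size is a placeholder, in practice you would take it from your grabber):

FFmpegFrameFilter filter = new FFmpegFrameFilter("setpts=N,fade=t=in:st=2.5:d=1", 320, 240);
filter.start();
// push() decoded frames in, pull() the faded frames back out, record them, then filter.stop()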

> 1. I want to know what exactly is the usage of FFmpegFrameFilter ?

Well, since I'm not the author of FFmpeg, I would first have to read all
of its source code to answer that question. I don't have time for that.
On the other hand, if you have the time to do that and document
everything, that would be a great contribution! Interested? It would be
helpful to a lot of people I'm sure. :)

> 2. Does it process the video as a whole or frame by frame.

Both: It processes the video as a whole, frame by frame. But you're
probably asking for something else. You'll need to make your question
clearer if you want an answer!

Samuel

404error

unread,
Jan 11, 2016, 3:24:45 PM1/11/16
to javacv
Here is the relevant code:
try {
    FrameGrabber grabber1 = new FFmpegFrameGrabber(paths.get(0));
    FrameGrabber grabber2 = new FFmpegFrameGrabber(recordings.get(0));
    grabber1.start();
    grabber2.start();
    FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(video, 320,
            240, grabber2.getAudioChannels());
    recorder.setVideoCodec(avcodec.AV_CODEC_ID_MPEG4);
    recorder.setFrameRate(10);
    recorder.setVideoBitrate(10 * 1024 * 1024);
    recorder.setFormat("mp4");
    recorder.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);
    recorder.start();
    Frame frame1, frame2;

    for (int i = 0; i < paths.size(); i++) {
        frame1 = grabber1 == null ? null : grabber1.grabFrame();
        frame2 = grabber2 == null ? null : grabber2.grabFrame();
        long duration = grabber2.getLengthInTime();
        FFmpegFrameFilter fFmpegFrameFilter = new FFmpegFrameFilter(
                "setpts=N,fade=t=in:st=0.5:d=1,fade=t=out:st=" + ((duration / 1000000) - 1) + ":d=1",
                grabber1.getImageWidth(), grabber1.getImageHeight());
        fFmpegFrameFilter.start();
        long startTime = System.currentTimeMillis();
        boolean first = true;
        recorder.setTimestamp(1000 * startTime);
        while ((System.currentTimeMillis() - startTime) < (grabber2.getLengthInTime() / 1000)) {
            fFmpegFrameFilter.push(frame1);
            // while((frame1 = fFmpegFrameFilter.pull()) != null){
            frame1 = fFmpegFrameFilter.pull();
            Log.d("frame1", "" + grabber2.getLengthInTime() + " " + (System.currentTimeMillis() - startTime));
            recorder.record(frame1);
        }
        System.out.println(recorder.getTimestamp());
        while (frame2 != null) {
            if (first) {
                recorder.setTimestamp(1000 * startTime);
            }
            recorder.record(frame2);
            System.out.println(recorder.getTimestamp());
            frame2 = grabber2.grabFrame();
            first = false;
        }

        if (i < paths.size() - 1) {
            if (paths.get(i + 1) != null) {
                grabber1.stop();
                grabber1 = new FFmpegFrameGrabber(paths.get(i + 1));
                grabber1.start();
            } else
                grabber1 = null;
            if (recordings.get(i + 1) != null) {
                grabber2.stop();
                grabber2 = new FFmpegFrameGrabber(recordings.get(i + 1));
                grabber2.start();
            } else
                grabber2 = null;
        }
    }
    recorder.stop();
    grabber1.stop();
    grabber2.stop();
}

This still gives me an all-black video. The code works fine when the filter is not added. What is wrong?

1. Sure, I would love to contribute to this wonderful library. Please guide me how to do that. Is there any sample documentation format to follow?

2. Since I am creating a slide show from images and applying fade-in/fade-out effects during the image transitions, I am creating a new filter for every image (please see the code), i.e. it is being applied to different parts of a single video. So I just wanted to know if this can be done, or if the filter can only be applied to the whole video at once.

Samuel Audet

unread,
Jan 14, 2016, 7:59:38 AM1/14/16
to jav...@googlegroups.com
On 01/12/2016 05:24 AM, 404error wrote:
> This still gives me an all-black video. The code works fine when filter was not added. What is wrong?

Let's start with something a bit simpler. The following works for me.
Does it work at your end too?

FFmpegFrameGrabber grabber = new FFmpegFrameGrabber("input.mp4");
grabber.start();

FFmpegFrameRecorder recorder = new FFmpegFrameRecorder("output.mp4",
        grabber.getImageWidth(), grabber.getImageHeight(), grabber.getAudioChannels());
recorder.start();

FFmpegFrameFilter filter = new FFmpegFrameFilter("setpts=N,fade=t=in:st=2.5:d=1",
        grabber.getImageWidth(), grabber.getImageHeight());
filter.start();

Frame frame;
while ((frame = grabber.grabImage()) != null) {
    filter.push(frame, grabber.getPixelFormat());
    Frame frame2;
    while ((frame2 = filter.pull()) != null) {
        recorder.record(frame2, grabber.getPixelFormat());
    }
}
recorder.stop();
grabber.stop();

> 1. Sure, I would love to contribute to this wonderful library. Please
> guide me how to do that. Is there any sample documentation format to follow?

If you'd like to contribute code documentation for say
FFmpegFrameFilter, FFmpegFrameGrabber, or FFmpegFrameRecorder, simply
follow the usual Javadoc conventions:
http://www.oracle.com/technetwork/articles/java/index-137868.html

If you'd like to contribute sample code, just send a pull request for
stand-alone .java files here:
https://github.com/bytedeco/javacv/tree/master/samples
Or for complete sample projects here:
https://github.com/bytedeco/sample-projects

The wiki is write accessible to all, so you could also post a guide
about how to do XYZ to help other people do the same thing:
https://github.com/bytedeco/javacv/wiki

That's about all there is when it comes to "format"... Interested?

> 2. Since I am creating a slide show from images, and applying fade in -
> fade out effects during the image transitions, I am creating a new
> filter for every image (please see the code) i.e. it is being applied to
> different parts of a single video. So I just wanted to know if this can
> be done or the filter can only be applied to the whole video at once.

Creating one filter for each image isn't going to get you fading. You
need to create only one filter for the whole video.
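
For instance, here is a minimal sketch of that structure (the file name, size, frame rate, and fade timings below are placeholders, not taken from your project):

// One filter instance for the entire output, created before the loop over images
FFmpegFrameFilter filter = new FFmpegFrameFilter("setpts=N,fade=t=in:st=0:d=1,fade=t=out:st=9:d=1", 320, 240);
filter.setFrameRate(10);
filter.start();

FFmpegFrameRecorder recorder = new FFmpegFrameRecorder("slideshow.mp4", 320, 240);
recorder.setFrameRate(10);
recorder.start();

for (String path : paths) { // your list of per-image clips
    FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(path);
    grabber.start();
    Frame frame;
    while ((frame = grabber.grabImage()) != null) {
        filter.push(frame, grabber.getPixelFormat());
        Frame filtered;
        while ((filtered = filter.pull()) != null) {
            recorder.record(filtered, grabber.getPixelFormat());
        }
    }
    grabber.stop();
}
recorder.stop();
filter.stop();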

Samuel

404error

unread,
Jan 16, 2016, 12:41:42 PM1/16/16
to javacv
// apply filter
FrameGrabber grabber = new FFmpegFrameGrabber(temp.getAbsolutePath());
grabber.start();

FFmpegFrameRecorder recorder1 = new FFmpegFrameRecorder(video, 320,
        240, grabber.getAudioChannels());

Frame frame;
recorder1.setVideoCodec(avcodec.AV_CODEC_ID_MPEG4);
recorder1.setFrameRate(10);
recorder1.setVideoBitrate(10 * 1024 * 1024);
recorder1.setFormat("mp4");
recorder1.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);
recorder1.start();
FFmpegFrameFilter fFmpegFrameFilter = new FFmpegFrameFilter("setpts=N,fade=t=in:st=2.5:d=4", 320, 240);
// "setpts=N,fade=t=in:st=0.5:d=1,fade=t=out:st="+((duration/1000000)-1)+":d=1"
fFmpegFrameFilter.setFrameRate(10);
fFmpegFrameFilter.setFilters("setpts=N,fade=t=in:st=2.5:d=4");
fFmpegFrameFilter.start();
while ((frame = grabber.grabFrame()) != null) {
    fFmpegFrameFilter.push(frame, grabber.getPixelFormat());
    Frame frame3;
    while ((frame3 = fFmpegFrameFilter.pull()) != null) {
        recorder1.record(frame3, grabber.getPixelFormat());
    }
}

recorder1.stop();
grabber.stop();

Now I am getting an error on this line:
while((frame = grabber.grabFrame()) != null)
The error is: java.lang.NullPointerException: Attempt to invoke virtual method 'java.nio.Buffer java.nio.Buffer.limit(int)' on a null object reference
The grabber is not null (I have checked it) and the path to the video is correct, but even grabber.getImageWidth()/getImageHeight() returns 0, which was causing a problem with the initialisation of the FFmpegFrameFilter before.

> That's about all there is when it comes to "format"... Interested?

I will surely contribute as soon as I am done with this project :)

Samuel Audet

unread,
Jan 16, 2016, 7:33:57 PM1/16/16
to jav...@googlegroups.com
On 01/17/2016 02:41 AM, 404error wrote:

> Now I am getting an error on this line:
> while((frame = grabber.grabFrame()) != null)
> the error is: java.lang.NullPointerException: Attempt to invoke virtual method 'java.nio.Buffer java.nio.Buffer.limit(int)' on a null object reference
> the grabber is not null (I have checked it), the path for the video is correct, but even grabber.getImageWidth/Height is returning 0 which was causing a problem with the initialisation of the FFmpegFrameFilter before.

Try to call grab() instead of grabFrame(). The latter is only there for backward compatibility.
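
For example, a minimal sketch based on your snippet (same grabber, filter, and recorder variables assumed):

Frame frame;
while ((frame = grabber.grab()) != null) { // grab() instead of grabFrame()
    fFmpegFrameFilter.push(frame, grabber.getPixelFormat());
    Frame filtered;
    while ((filtered = fFmpegFrameFilter.pull()) != null) {
        recorder1.record(filtered, grabber.getPixelFormat());
    }
}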

Samuel

404error

unread,
Jan 17, 2016, 1:23:11 AM1/17/16
to javacv
I tried replacing grabFrame() with grab(), but the error is still there.
The error is at line 612 in https://github.com/bytedeco/javacv/blob/b2c22914d5fe6454ec90696a340a3e91af1d996f/src/main/java/org/bytedeco/javacv/FFmpegFrameGrabber.java#L612 : frame.image[0] seems to be null. Any hints as to why it could be null?
Thanks

Samuel Audet

unread,
Jan 17, 2016, 2:16:50 AM1/17/16
to javacv

Well, it can't allocate an image of width and height 0, so if your stream doesn't provide that info, try to call setImageWidth() and setImageHeight() before start().
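
For example, something like this (a sketch only; the 320x240 values are just an assumed size):

FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(temp.getAbsolutePath());
// If the stream itself doesn't report a size, set one before start():
grabber.setImageWidth(320);
grabber.setImageHeight(240);
grabber.start();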

Samuel


404error

unread,
Jan 17, 2016, 2:20:11 PM1/17/16
to javacv
It seems the stream is not providing most of the required info, even for the audio part. I tried all of the following, but the generated video is black and not even of the duration it should be.
grabber.setImageWidth(240);
grabber.setImageHeight(320);
grabber.setFormat("mp4");
grabber.setFrameRate(10);
grabber.setVideoCodec(avcodec.AV_CODEC_ID_MPEG4);
grabber.setVideoBitrate(10 * 1024 * 1024);
grabber.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);
grabber.start();

The same settings work for FFmpegFrameRecorder when I generate the video and store it to 'temp', i.e. the following works and generates a video:

FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(temp, 320,
        240, grabber2.getAudioChannels());
recorder.setVideoCodec(avcodec.AV_CODEC_ID_MPEG4);
recorder.setFrameRate(10);
recorder.setVideoBitrate(10 * 1024 * 1024);
recorder.setFormat("mp4");
recorder.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);
recorder.start();

Why is the grabber not able to extract this information from the 'temp' file? Is there any way other than setting each of the parameters for the grabber explicitly?
Thanks

Samuel Audet

unread,
Jan 17, 2016, 9:12:49 PM1/17/16
to jav...@googlegroups.com
On 01/18/2016 04:20 AM, 404error wrote:
> Why is grabber not able to extract this information from the 'temp'
> variable? Is there any other way than to set each of the parameters for
> grabber explicitly?

Let's see, could you try to run ffprobe on that "temp" stream to see
what it says?

Samuel

404error

unread,
Jan 18, 2016, 8:00:47 AM1/18/16
to javacv
ffprobe temp.mp4 produced the following output:
ffprobe version N-77455-g4707497 Copyright (c) 2007-2015 the FFmpeg developers
  built with gcc 4.8 (Ubuntu 4.8.4-2ubuntu1~14.04)
  configuration: --extra-libs=-ldl --prefix=/opt/ffmpeg --mandir=/usr/share/man --enable-avresample --disable-debug --enable-nonfree --enable-gpl --enable-version3 --enable-libopencore-amrnb --enable-libopencore-amrwb --disable-decoder=amrnb --disable-decoder=amrwb --enable-libpulse --enable-libdcadec --enable-libfreetype --enable-libx264 --enable-libx265 --enable-libfdk-aac --enable-libvorbis --enable-libmp3lame --enable-libopus --enable-libvpx --enable-libspeex --enable-libass --enable-avisynth --enable-libsoxr --enable-libxvid --enable-libvo-aacenc --enable-libvidstab
  libavutil      55. 11.100 / 55. 11.100
  libavcodec     57. 20.100 / 57. 20.100
  libavformat    57. 20.100 / 57. 20.100
  libavdevice    57.  0.100 / 57.  0.100
  libavfilter     6. 21.101 /  6. 21.101
  libavresample   3.  0.  0 /  3.  0.  0
  libswscale      4.  0.100 /  4.  0.100
  libswresample   2.  0.101 /  2.  0.101
  libpostproc    54.  0.100 / 54.  0.100
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x3697c00] Could not find codec parameters for stream 0 (Video: mpeg4 (mp4v / 0x7634706D), none, 11 kb/s): unspecified size
Consider increasing the value for the 'analyzeduration' and 'probesize' options
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'temp.mp4':
  Metadata:
    major_brand     : isom
    minor_version   : 512
    compatible_brands: isomiso2mp41
    encoder         : Lavf56.40.101
  Duration: 00:00:18.60, start: 0.023242, bitrate: 50 kb/s
    Stream #0:0(und): Video: mpeg4 (mp4v / 0x7634706D), none, 11 kb/s, 1.45 fps, 10 tbr, 10240 tbn, 10240 tbc (default)
    Metadata:
      handler_name    : VideoHandler
    Stream #0:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, mono, fltp, 40 kb/s (default)
    Metadata:
      handler_name    : SoundHandler


Thanks

Samuel Audet

unread,
Jan 18, 2016, 8:45:39 AM1/18/16
to jav...@googlegroups.com
On 01/18/2016 10:00 PM, 404error wrote:
> [mov,mp4,m4a,3gp,3g2,mj2 @ 0x3697c00] Could not find codec parameters for stream 0 (Video: mpeg4 (mp4v / 0x7634706D), none, 11 kb/s): unspecified size
> Consider increasing the value for the 'analyzeduration' and 'probesize' options

FFmpeg isn't able to read your file. This isn't an issue with JavaCV. Please report upstream! Thanks

Samuel

404error

unread,
Jan 18, 2016, 11:47:14 AM1/18/16
to javacv
Okay. Thanks for taking time to help me out!

Samuel Audet

unread,
Jan 18, 2016, 7:20:46 PM1/18/16
to javacv

BTW, with what player are you able to play back that file?


404error

unread,
Jan 19, 2016, 8:16:44 AM1/19/16
to javacv
Well, it works with the Google Photos app for Android, but I think I will have to change the codecs, etc. to make it work with FFmpeg.
Thanks

404error

unread,
Jan 19, 2016, 3:55:24 PM1/19/16
to javacv
Hi,
I tried this:
FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(temp, 960,
        540, grabber2.getAudioChannels());
recorder.setInterleaved(true);
recorder.setVideoOption("tune", "zerolatency");
recorder.setVideoOption("preset", "ultrafast");
recorder.setVideoBitrate(68 * 1024 * 8);
recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
recorder.setFormat("mp4");
recorder.setFrameRate(30);
recorder.setAudioBitrate(125 * 1024 * 8);
recorder.setAudioCodec(avcodec.AV_CODEC_ID_AAC);
recorder.setSampleRate(44100);
recorder.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);
recorder.start();

but ffprobe is giving this output:
Could not find codec parameters for stream 0 (Video: h264 (avc1 / 0x31637661), none, 368x240, 25 kb/s): unspecified pixel format
Consider increasing the value for the 'analyzeduration' and 'probesize' options
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'temp.mp4':
  Metadata:
    major_brand     : isom
    minor_version   : 512
    compatible_brands: isomiso2avc1mp41
    encoder         : Lavf56.40.101
  Duration: 00:00:18.47, start: 0.023242, bitrate: 61 kb/s
    Stream #0:0(und): Video: h264 (avc1 / 0x31637661), none, 368x240, 25 kb/s, 1.35 fps, 30 tbr, 15360 tbn, 30720 tbc (default)
    Metadata:
      handler_name    : VideoHandler
    Stream #0:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, mono, fltp, 37 kb/s (default)
    Metadata:
      handler_name    : SoundHandler


It says unspecified pixel format although I have specified it. Why is this happening now?
Thanks

Samuel Audet

unread,
Jan 23, 2016, 7:59:21 PM1/23/16
to jav...@googlegroups.com
Ok, then maybe you could try to set the 'analyzeduration' and
'probesize' options like it tells you to (with FrameGrabber.setOptions()
in JavaCV).
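
For example, something along these lines (a sketch; the values are just guesses to make FFmpeg probe more of the stream):

FFmpegFrameGrabber grabber = new FFmpegFrameGrabber("temp.mp4");
// analyzeduration is in microseconds, probesize in bytes; both are passed through to FFmpeg
grabber.setOption("analyzeduration", "100000000");
grabber.setOption("probesize", "10000000");
grabber.start();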

Samuel

Samuel Audet

unread,
Jan 23, 2016, 8:00:51 PM1/23/16
to jav...@googlegroups.com
On 01/20/2016 05:55 AM, 404error wrote:
> It says unspecified pixel format although I have specified it. Why is this happening now?

Are you calling recorder.record() and recorder.stop() somewhere in your
code?

Samuel

404error

unread,
Jan 26, 2016, 7:13:46 AM1/26/16
to javacv


> Are you calling recorder.record() and recorder.stop() somewhere in your code?

Yes, I am calling both of these methods.
Could it be due to a mismatch between the pixel formats of the source images (PNG files) and the recorder (AV_PIX_FMT_YUV420P)? I have tried converting those PNG files from RGB to YUV, but either I am not doing it the right way or the problem is something else.
Thanks

404error

unread,
Jan 28, 2016, 3:06:48 PM1/28/16
to javacv
Hi,
I tried the following command and it worked:
ffprobe -analyzeduration 100000000 temp.mp4
The problem is that if analyzeduration is greater than the duration of the video, it correctly detects the pixel format, but if it is less than that, it gives me the 'unspecified pixel format' error.
temp.mp4 is recorded using FFmpegFrameRecorder from a set of images. So using setOption with analyzeduration in the JavaCV Android code is not helping, as the video is still black in the VLC player.
I should be able to play these videos on most players, including VLC - where do you think the problem is?
Thanks

404error

unread,
Jan 28, 2016, 3:13:31 PM1/28/16
to javacv
In case this helps:
This is the output of ffprobe for a video that plays in VLC:
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'test.mp4':
  Metadata:
    major_brand     : mp42
    minor_version   : 0
    compatible_brands: isommp42
    creation_time   : 2016-01-08 22:19:59
  Duration: 00:00:38.20, start: 0.000000, bitrate: 3416 kb/s
    Stream #0:0(eng): Video: h264 (Baseline) (avc1 / 0x31637661), yuv420p, 1280x720, 3293 kb/s, SAR 1:1 DAR 16:9, 30 fps, 30 tbr, 90k tbn, 180k tbc (default)
    Metadata:
      creation_time   : 2016-01-08 22:19:59
      handler_name    : VideoHandle
    Stream #0:1(eng): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 128 kb/s (default)
    Metadata:
      creation_time   : 2016-01-08 22:19:59
      handler_name    : SoundHandle

while this is from my video file:
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'temp.mp4':
  Metadata:
    major_brand     : isom
    minor_version   : 512
    compatible_brands: isomiso2avc1mp41
    encoder         : Lavf56.40.101
  Duration: 00:00:17.14, start: 0.023220, bitrate: 72 kb/s
    Stream #0:0(und): Video: h264 (Constrained Baseline) (avc1 / 0x31637661), yuv420p, 960x540, 586 kb/s, 30 fps, 30 tbr, 15360 tbn, 60 tbc (default)
    Metadata:
      handler_name    : VideoHandler
    Stream #0:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, mono, fltp, 37 kb/s (default)
    Metadata:
      handler_name    : SoundHandler

Samuel Audet

unread,
Jan 30, 2016, 1:36:30 AM1/30/16
to jav...@googlegroups.com
On 01/26/2016 09:13 PM, 404error wrote:
>> Are you calling recorder.record() and recorder.stop() somewhere in your code?
>
> Yes I am calling both these methods.
> Can it be due to mismatch of the pixel formats of the source images (png
> files) and the recorder (AV_PIX_FMT_YUV420P)? I have tried changing those
> png files from rgb format to yuv but either I am not doing it the right way
> or the problem is something else.

The pixel format usually needs to be converted anyway, so it's not
something that should cause any problems.

How about video files generated using the Demo class from the README.md
file or with the RecordActivity sample? Are you having any problems
playing those back as well?

Samuel

404error

unread,
Feb 1, 2016, 4:06:26 PM2/1/16
to javacv
Hi,
I tried playing the video in VLC from the command line and this error was thrown: Could not convert timestamp 21940473053061
So it's a timestamp problem. I wonder how other players play it correctly. Is there something specific about VLC timestamps that needs to be taken care of? How?
Thanks

Samuel Audet

unread,
Feb 2, 2016, 8:55:45 AM2/2/16
to jav...@googlegroups.com
On 02/02/2016 06:06 AM, 404error wrote:
> Hi,
> I tried playing the video in vlc using command line and this error was
> thrown: Could not convert timestamp 21940473053061
> So its a timestamp problem. I wonder how other players are playing it
> correctly. Is it something specific about vlc timestamps that needs to
> be taken care of? How?

Maybe it's just your version of VLC that is buggy. It happens often
enough, so try to download the latest version! :)

Samuel

404error

unread,
Feb 4, 2016, 3:31:42 PM2/4/16
to javacv
Hi

I got it working now. It was a timestamp problem. I removed the code that sets timestamps, and now it plays in VLC and is recognised by FFmpegFrameFilter as well :)
Thanks for this wonderful library! I would love to contribute to the JavaCV source code in a short while.

Thanks

Samuel Audet

unread,
Feb 6, 2016, 2:29:15 AM2/6/16
to jav...@googlegroups.com
I see, so it's an issue with variable frame rates? That shouldn't happen
with MP4 and H.264 though... If you could prepare a small self-contained
example to reproduce the issue, that would be a good contribution! It
might end up being some issue with FFmpeg itself. :) Thanks!

Samuel