Record audio and video at the same time


Federico Sendra

Aug 27, 2014, 8:16:35 PM
to jav...@googlegroups.com
So there is this app I'm building that needs to be recording video (with audio) at all times; when the user presses the "stop" button, it should save the last 5 seconds (could be 10, 15, or any number of seconds).

So I'm using JavaCV, FFmpeg and all those libraries (https://github.com/bytedeco/javacv).

I'm using my own simple circular buffer where I store the data from the camera and the audio. When the button is pressed I record whatever is in the buffer.
It works fine when recording only the video; when I try to also add the audio, the file gets all messy: the audio is just random noise, the total duration is way more than 5 seconds (anywhere between 7 and 14 seconds), and sometimes the video freezes while the audio keeps playing, etc.
I'm not sure what I'm doing wrong, or whether I should keep trying this approach or just save both files (video and audio) separately and then merge them (please provide code for this if it is the best way).

I have very little experience using these libraries, so it's probably something easy to fix.

ALSO: As a plus, I would like to add a little watermark to the resulting video; I have no idea how to do this.

Some variables I'm using:
    private int sampleAudioRateInHz = 44100;
    private int imageWidth = 320;
    private int imageHeight = 240;
    private int frameRate = 30;
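For reference, the sizes a 5-second window implies with these settings can be worked out directly. A plain-Java sketch of the arithmetic (SECS_TO_BUFFER = 5 is assumed from the rest of the post, and the 12-bits-per-pixel figure assumes the default Android NV21 preview format):

```java
public class BufferMath {
    // Values from the post above (SECS_TO_BUFFER assumed to be 5)
    static final int SECS_TO_BUFFER = 5;
    static final int FRAME_RATE = 30;
    static final int WIDTH = 320, HEIGHT = 240;
    static final int SAMPLE_RATE = 44100;

    // Video ring buffer: one slot per frame
    static int videoSlots() {
        return SECS_TO_BUFFER * FRAME_RATE; // 150 slots
    }

    // NV21 preview frames are 12 bits per pixel: full-res Y plane plus quarter-res interleaved VU
    static int bytesPerFrame() {
        return WIDTH * HEIGHT * 3 / 2; // 115200 bytes
    }

    // Mono 16-bit PCM: how many samples 5 seconds of audio holds
    static int audioSamples() {
        return SECS_TO_BUFFER * SAMPLE_RATE; // 220500 samples
    }

    public static void main(String[] args) {
        System.out.println(videoSlots() + " frames, " + bytesPerFrame()
                + " bytes/frame, " + audioSamples() + " audio samples");
    }
}
```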

Here is my main code.

initRecorder() is called from onCreate():

private void initRecorder() {

    Log.w(LOG_TAG, "init recorder");

    if (yuvIplimage == null) {
        yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_8U, 2);
    }

    Log.i(LOG_TAG, "ffmpeg_url: " + ffmpeg_link);
    recorder = new FFmpegFrameRecorder(ffmpeg_link, imageWidth, imageHeight, 1);
    recorder.setFormat("flv");

    recorder.setSampleRate(sampleAudioRateInHz);
    // Set in the surface changed method
    recorder.setFrameRate(frameRate);

    Log.i(LOG_TAG, "recorder initialize success");

    audioRecordRunnable = new AudioRecordRunnable();
    audioThread = new Thread(audioRecordRunnable);
    runAudioThread = true;
}


 
private MediaFrame[] mediaFrames = new MediaFrame[SECS_TO_BUFFER * frameRate];
private int currentMediaFrame = 0;

class MediaFrame {
    long timestamp;
    byte[] videoFrame;
    short[] audioFrame;
}

// For now called from another button, but will be called from onCreate()
public void startRecording() {
    recording = true;
    audioThread.start();
}


// Called from the stop button; should record only what's inside the buffer (5 secs)
public void stopRecording() {

    runAudioThread = false;
    try {
        audioThread.join();
    } catch (InterruptedException e) {
        e.printStackTrace();
    }
    audioRecordRunnable = null;
    audioThread = null;

    if (recorder != null && recording) {
        recording = false;
        Log.v(LOG_TAG, "Finishing recording, calling stop and release on recorder");

        try {
            recorder.start();

            // Assuming we recorded at least a full buffer of frames,
            // currentMediaFrame % length will be the oldest one
            startTime = mediaFrames[currentMediaFrame % mediaFrames.length].timestamp;

            for (int f = 0; f < mediaFrames.length; f++) {
                if (mediaFrames[(currentMediaFrame + f) % mediaFrames.length].videoFrame != null) {
                    long t = mediaFrames[(currentMediaFrame + f) % mediaFrames.length].timestamp - startTime;
                    if (t > recorder.getTimestamp()) {
                        recorder.setTimestamp(t);
                    }

                    // Video
                    Log.v(LOG_TAG, "Adding in frame: " + ((currentMediaFrame + f) % mediaFrames.length));
                    yuvIplimage.getByteBuffer().put(mediaFrames[(currentMediaFrame + f) % mediaFrames.length].videoFrame);
                    recorder.record(yuvIplimage);

                    // Audio
                    short[] audioData = mediaFrames[currentMediaFrame % mediaFrames.length].audioFrame;
                    if (audioData != null) {
                        int bufferReadResult = audioRecord.read(audioData, 0, audioData.length);
                        if (bufferReadResult > 0) {
                            recorder.record(ShortBuffer.wrap(audioData, 0, bufferReadResult));
                        }
                    }
                }
            }

            Log.v(LOG_TAG, "AudioThread Finished, release audioRecord");

            /* encoding finished, release recorder */
            if (audioRecord != null) {
                audioRecord.stop();
                audioRecord.release();
                audioRecord = null;
                Log.v(LOG_TAG, "audioRecord released");
            }

            recorder.stop();
            recorder.release();

        } catch (FFmpegFrameRecorder.Exception e) {
            e.printStackTrace();
        }

        recorder = null;
    }
}


////AUDIO THREAD

//---------------------------------------------
// audio thread, gets and encodes audio data
//---------------------------------------------
class AudioRecordRunnable implements Runnable {

    @Override
    public void run() {
        android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);

        // Audio
        int bufferSize;
        short[] audioData;
        int bufferReadResult;

        bufferSize = AudioRecord.getMinBufferSize(sampleAudioRateInHz,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleAudioRateInHz,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);

        audioData = new short[bufferSize];

        Log.d(LOG_TAG, "audioRecord.startRecording()");
        audioRecord.startRecording();

        /* ffmpeg_audio encoding loop */
        while (runAudioThread) {
            bufferReadResult = audioRecord.read(audioData, 0, audioData.length);

            if (bufferReadResult > 0) {
                if (recording) {
                    mediaFrames[currentMediaFrame % mediaFrames.length].audioFrame = new short[audioData.length];
                    System.arraycopy(audioData, 0, mediaFrames[currentMediaFrame % mediaFrames.length].audioFrame, 0, audioData.length);

                    // This is commented out; I tested overwriting the timestamp and advancing
                    // currentMediaFrame, but it does not work, the video gets crazy.
                    // Maybe save separate audio and video timestamps?
                    //mediaFrames[currentMediaFrame % mediaFrames.length].timestamp = 1000 * System.currentTimeMillis();
                    //currentMediaFrame++;
                }
            }
        }
    }
}

Method in class CameraView extends SurfaceView implements SurfaceHolder.Callback, PreviewCallback:

    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        if (recording) {
            mediaFrames[currentMediaFrame % mediaFrames.length].timestamp = 1000 * System.currentTimeMillis();
            mediaFrames[currentMediaFrame % mediaFrames.length].videoFrame = data;
            Log.v(LOG_TAG, "Buffered " + currentMediaFrame + " " + (currentMediaFrame % mediaFrames.length));
            currentMediaFrame++;
        }
    }
}


Sorry for the formatting, and thank you very much.

Federico Sendra

Aug 28, 2014, 6:40:49 PM
to jav...@googlegroups.com
I found out that if I record the audio with its own timestamp, after the whole video is recorded, the result seems to have the audio "better" than it was before.
But the video and audio are still not in complete sync: the video is only 5 seconds, while the audio sometimes runs up to 9 seconds, which makes the whole file 9 seconds long.
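A quick way to sanity-check that mismatch: the audio track's duration is determined purely by how many samples get written, not by timestamps, so buffering more chunks than 5 seconds' worth will always stretch the file. A plain-Java sketch (the chunk count and size here are made-up example values, not from the post):

```java
public class AudioDuration {
    static final int SAMPLE_RATE = 44100; // Hz, 16-bit mono, from the post

    // Duration in milliseconds that a given number of mono samples plays for
    static long durationMs(long totalSamples) {
        return totalSamples * 1000 / SAMPLE_RATE;
    }

    public static void main(String[] args) {
        // Hypothetical: 110 buffered chunks of 3584 samples each
        long samples = 110L * 3584;
        System.out.println(durationMs(samples) + " ms"); // 8939 ms, i.e. ~9 s of audio
        // A 5-second clip should contain exactly this many samples:
        System.out.println(5L * SAMPLE_RATE + " samples"); // 220500
    }
}
```

If the recorded file is longer than the video, counting the samples actually handed to recorder.record() is the first thing to check.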

This is the code for recording it separately, in the stop recorder:

startTime = mediaFrames[currentMediaFrame % mediaFrames.length].audioTimestamp;

Log.v(LOG_TAG, "Adding audio");

for (int f = 0; f < mediaFrames.length; f++) {
    if (mediaFrames[(currentMediaFrame + f) % mediaFrames.length].audioFrame != null) {
        // Audio
        short[] audioData = mediaFrames[currentMediaFrame % mediaFrames.length].audioFrame;
        if (audioData != null) {
            long t = mediaFrames[(currentMediaFrame + f) % mediaFrames.length].audioTimestamp - startTime;
            if (t > recorder.getTimestamp()) {
                recorder.setTimestamp(t);
            }
...

Samuel Audet

Aug 30, 2014, 1:25:50 AM
to jav...@googlegroups.com
On 08/29/2014 07:40 AM, Federico Sendra wrote:
> I found out that recording the audio with its own timestamp and after
> the whole video is recorded, the result video seems to have the audio
> "better" than it was before.
> But still the video and audio are not in complete sync, as the video is
> only 5 seconds and the audio sometimes has up to 9 seconds which makes
> the whole file 9 seconds long.

First things first: Does the RecordActivity.java sample work properly?
If that sample works properly, you should try to base your own code on
that. It should work better then.

Samuel

Federico Sendra

Aug 30, 2014, 8:53:02 PM
to jav...@googlegroups.com
Hi Samuel, 
Thank you very much for answering.

The RecordActivity sample works perfectly, and I indeed based my code on it.
However, what I am trying to accomplish is different.

I need to record indefinitely and, when I press stop, save the last 10 seconds.
The sample code just records between your start and stop.

That is why I implemented a circular buffer.
I'm not recording the data as soon as I get it; I store it in an array, and after the user presses STOP I take that data and record it. This is what is causing my problems.
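A minimal, plain-Java sketch of that ring-buffer idea (generic, with made-up names; the real code would hold MediaFrame objects): writers overwrite the oldest slot and, on stop, the reader drains from the oldest slot forward.

```java
import java.util.ArrayList;
import java.util.List;

public class RingBuffer<T> {
    private final Object[] slots;
    private long writeCount = 0; // total writes ever made

    public RingBuffer(int capacity) {
        slots = new Object[capacity];
    }

    // Store a frame, overwriting the oldest one once the buffer is full
    public synchronized void put(T frame) {
        slots[(int) (writeCount % slots.length)] = frame;
        writeCount++;
    }

    // Drain oldest-to-newest: this order is what stopRecording() should feed the recorder
    @SuppressWarnings("unchecked")
    public synchronized List<T> drain() {
        List<T> out = new ArrayList<>();
        long start = Math.max(0, writeCount - slots.length);
        for (long i = start; i < writeCount; i++) {
            out.add((T) slots[(int) (i % slots.length)]);
        }
        return out;
    }

    public static void main(String[] args) {
        RingBuffer<Integer> rb = new RingBuffer<>(5);
        for (int i = 0; i < 8; i++) rb.put(i); // writes 0..7 into a capacity-5 buffer
        System.out.println(rb.drain()); // [3, 4, 5, 6, 7]: the last 5 frames, oldest first
    }
}
```

Tracking a total write count (rather than starting every drain at currentMediaFrame) also handles the case where fewer frames than the capacity have been buffered.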

Also, if you could give me a hint of how to add a watermark (small png image) to the video, that would be great.
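Not the JavaCV route, but as an illustration of the watermarking idea itself: overlaying a small PNG is just an alpha-blended draw. A self-contained sketch using java.awt.image.BufferedImage (all names here are mine; with JavaCV you would still need the frame in RGB first):

```java
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

public class Watermark {
    // Draw `mark` onto a copy of `frame` at (x, y), respecting the watermark's alpha channel
    public static BufferedImage apply(BufferedImage frame, BufferedImage mark, int x, int y) {
        BufferedImage out = new BufferedImage(frame.getWidth(), frame.getHeight(),
                BufferedImage.TYPE_INT_ARGB);
        Graphics2D g = out.createGraphics();
        g.drawImage(frame, 0, 0, null); // base video frame
        g.drawImage(mark, x, y, null);  // watermark composited over it
        g.dispose();
        return out;
    }

    public static void main(String[] args) {
        BufferedImage frame = new BufferedImage(320, 240, BufferedImage.TYPE_INT_ARGB);
        BufferedImage mark = new BufferedImage(32, 32, BufferedImage.TYPE_INT_ARGB);
        for (int i = 0; i < 32; i++)
            for (int j = 0; j < 32; j++)
                mark.setRGB(i, j, 0xFFFF0000); // opaque red square as a stand-in watermark
        BufferedImage out = apply(frame, mark, 280, 200);
        System.out.println(Integer.toHexString(out.getRGB(285, 205))); // ffff0000
    }
}
```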

Thanks again Samuel

Federico Sendra

Sep 1, 2014, 8:57:14 PM
to jav...@googlegroups.com
Still stuck on this one.

Could it be an encoding problem? I'm not sure how to check this.


On Saturday, August 30, 2014 at 02:25:50 UTC-3, Samuel Audet wrote:

Samuel Audet

Sep 7, 2014, 1:36:53 AM
to jav...@googlegroups.com
On 08/31/2014 09:53 AM, Federico Sendra wrote:
> Hi Samuel,
> Thank you very much for answering.
>
> The RecordActivity sample works perfectly, and I indeed based my code on it.
> However, what i am trying to accomplish is different.
>
> I need to record indefinitely and when i press stop, save the last 10
> seconds.
> The sample code just records between your start and stop.
>
> That is why i implemented a circular buffer.
> Im not recording the data as soon as i get it, i store it in an array
> and after the user presses STOP, then i get that data and record it.
> Which is what is bringing me problems.

Well, if RecordActivity works fine, then there is probably something
wrong with the rest of your code... If you think there is something
wrong with FFmpegFrameRecorder, can you be more specific about what is
going wrong with the calls to FFmpegFrameRecorder? And if possible
provide 5~10 lines of code that can reproduce the issue. Thanks!

> Also, if you could give me a hint of how to add a watermark (small png
> image) to the video, that would be great.

There is an example of that here:
https://code.google.com/p/javacv/wiki/OpenCV2_Cookbook_Examples_Chapter_2#Example:_Watermarking

Samuel

Federico Sendra

Sep 7, 2014, 1:36:12 PM
to jav...@googlegroups.com
Hi Samuel.

I am not sure you have read what I posted.
Yes, I know there is something wrong with my code; that is why I am asking here for help.

Which way would you recommend to record audio and video into a circular buffer? What should I take into account when then recording it into the FFmpegFrameRecorder? I've been stuck on this for some time now. I've posted all the code.


*****WATERMARK*****
I have followed that example and this is my code

public static IplImage addOverlayImage(IplImage bg, String overlayPath){
   IplImage mask = cvLoadImage(overlayPath, CV_LOAD_IMAGE_GRAYSCALE);
   IplImage overlay = cvLoadImage(overlayPath, CV_LOAD_IMAGE_COLOR);
   //IplImage image = bg.clone();

   IplImage image = cvCloneImage(bg);
   IplROI roi = new IplROI();
   roi.xOffset(385);
   roi.yOffset(270);
   roi.width(overlay.width());
   roi.height(overlay.height());
   
   IplImage backImageWithRoi = image.roi(roi);
   
   //cvSetImageCOI(backImageWithRoi, roi.coi());

   cvCopy(overlay, backImageWithRoi, mask );
   image.roi(null);
   roi = null;
   backImageWithRoi = null;
   return image;
}

Which throws the following exception

java.lang.RuntimeException: /home/saudet/projects/bytedeco/javacpp-presets/opencv/cppbuild/android-arm/opencv-2.4.9/modules/core/src/copy.cpp:574: error: (-215) src.channels() == dst.channels() in function void cvCopy(const void*, void*, const void*)
at org.bytedeco.javacpp.opencv_core.cvCopy(Native Method)

I am not sure how to fix it. While debugging I could see that in
   cvCopy(overlay, backImageWithRoi, mask);
overlay has 3 channels,
backImageWithRoi has 2 channels,
and mask has 1 channel.

I do not know what this means, nor whether it is relevant.
Do I need to convert/change the channels? If so, how do I do this?

Thank you

Federico Sendra

Sep 10, 2014, 5:50:12 PM
to jav...@googlegroups.com
I am writing these questions here because I have not found any other place where I could ask questions about JavaCV.

thanks


On Sunday, September 7, 2014 at 02:36:53 UTC-3, Samuel Audet wrote:

Samuel Audet

Sep 14, 2014, 2:03:16 AM
to jav...@googlegroups.com
On 09/08/2014 02:36 AM, Federico Sendra wrote:
> Which way would you recomend me to record audio and video into a
> circular buffer? What things should i take into account when then
> recording it into the FFMPEGRecorder? Because im stuck with this for
> some time now. I've posted all the code.

As long as the same frames are given as input, the output is going to be
the same. As for how to do this, well that shouldn't be too hard to
figure out!

> I am not sure how to fix it, while debuging i could see that in
> cvCopy(overlay, backImageWithRoi, mask );
> overlay has 3 channels
> backImageWithRoi 2 channels
> mask 1 channel.
>
> I do not know what it means, nor if this is relevant.
> Do i need to convert/change the channels? If so, how do i do this?

Yes, OpenCV doesn't really support YUV, except for the cvCvtColor()
function that we can use to convert images to RGB.
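For intuition about what that conversion involves, here is the standard BT.601 full-range YUV-to-RGB formula for a single pixel, in plain Java (a sketch of the math only, not the cvCvtColor() API):

```java
public class YuvToRgb {
    // Convert one YUV pixel (BT.601, full-range, 0..255 components) to packed 0xRRGGBB
    public static int toRgb(int y, int u, int v) {
        int r = clamp((int) Math.round(y + 1.402 * (v - 128)));
        int g = clamp((int) Math.round(y - 0.344 * (u - 128) - 0.714 * (v - 128)));
        int b = clamp((int) Math.round(y + 1.772 * (u - 128)));
        return (r << 16) | (g << 8) | b;
    }

    private static int clamp(int c) {
        return Math.max(0, Math.min(255, c));
    }

    public static void main(String[] args) {
        // Neutral chroma (U = V = 128) must give a gray pixel
        System.out.println(Integer.toHexString(toRgb(128, 128, 128))); // 808080
        System.out.println(Integer.toHexString(toRgb(255, 128, 128))); // ffffff
    }
}
```

This is also why the channel counts above disagree: the mask and ROI are still in a YUV-style layout, while the overlay is 3-channel BGR, so everything has to be brought to the same color space before cvCopy() can work.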

Samuel