Using FFmpegFrameRecorder on Android


Shawn Van Every

Dec 13, 2012, 5:42:48 PM12/13/12
to jav...@googlegroups.com
Hi All,

I have been working with some sample code provided by Qianliang Zhang in the comments of Issue 160 to capture live video and audio from the camera and microphone and encode it to a file. https://code.google.com/p/javacv/issues/detail?id=160

I wasn't quite certain what was happening with the audio encoding in the original code, and I couldn't get it to work, so I simplified it quite a bit.  Right now, the audio samples are handed to FFmpegFrameRecorder as shorts received from the AudioRecord at 44100 Hz, 16-bit.
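(To put a number on the audio load the encoder has to sustain: mono 16-bit PCM at 44100 Hz is a fixed throughput. A quick back-of-the-envelope check, plain Java with no Android APIs:)

```java
public class AudioBandwidth {
    public static void main(String[] args) {
        int sampleRate = 44100;   // Hz, as configured on the AudioRecord
        int bytesPerSample = 2;   // 16-bit PCM
        int channels = 1;         // mono
        // Bytes per second the encoder must keep up with or audio will drop.
        int bytesPerSecond = sampleRate * bytesPerSample * channels;
        System.out.println(bytesPerSecond); // 88200
    }
}
```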

The video is being handed to the FFmpegFrameRecorder in the onPreviewFrame callback.

My issue is that I don't believe my device can keep up with the default 30 fps frame rate in FFmpegFrameRecorder, but every time I change the frame rate (to, say, 15 frames per second) I am left with a file that has no audio (or no audio that can be played).  I have similar issues if I change the sample rate of the audio encoding (to something like 22050 or 11025).

Are there limitations in the FFmpegFrameRecorder class on frame rate and sample rate?

Additionally, my audio is very glitchy as is.  It is there, but it sounds like it is chopped up.
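(In case it helps others hitting the same glitch: one likely culprit is wrapping the whole audioData array for the recorder instead of only the bufferReadResult samples that AudioRecord.read() actually returned. A minimal, Android-free sketch of the trimming, with an illustrative stand-in for read():)

```java
import java.nio.ShortBuffer;

public class AudioChunking {
    // Hypothetical stand-in for audioRecord.read(): pretends the recorder
    // returned fewer samples than the buffer can hold, as read() often does.
    static int fakeRead(short[] buffer) { return buffer.length / 2; }

    public static void main(String[] args) {
        short[] audioData = new short[4096];
        int bufferReadResult = fakeRead(audioData);
        // Wrap only the samples actually read; passing audioData.length here
        // would hand stale or zeroed samples to the encoder.
        ShortBuffer valid = ShortBuffer.wrap(audioData, 0, bufferReadResult);
        System.out.println(valid.remaining()); // 2048
    }
}
```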

Here is my current code.  Any suggestions would be welcome.

package com.example.javacv.stream.test2;

import static com.googlecode.javacv.cpp.avutil.*;
import static com.googlecode.javacv.cpp.opencv_core.*;
import static com.googlecode.javacv.cpp.opencv_core.IplImage;

import com.googlecode.javacv.FFmpegFrameRecorder;
import com.googlecode.javacv.FrameRecorder.Exception;
import com.googlecode.javacv.cpp.avcodec;

import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.ShortBuffer;

import android.app.Activity;
import android.content.Context;
import android.content.pm.ActivityInfo;
import android.hardware.Camera;
import android.hardware.Camera.PreviewCallback;
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.os.Bundle;
import android.os.PowerManager;
import android.util.Log;
import android.view.Display;
import android.view.KeyEvent;
import android.view.LayoutInflater;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import android.view.View.OnClickListener;
import android.view.WindowManager;
import android.widget.Button;
import android.widget.LinearLayout;
import android.widget.RelativeLayout;
import android.widget.Toast;

public class MainActivity extends Activity implements OnClickListener {
private final static String LOG_TAG = "RecordActivity";
private PowerManager.WakeLock mWakeLock;
    
private String ffmpeg_link = "/mnt/sdcard/stream.mp4";

public boolean recording = false;
private volatile FFmpegFrameRecorder recorder;
    private boolean isPreviewOn = false;
    
    private int sampleAudioRateInHz = 44100;
    private int imageWidth = 176;
    private int imageHeight = 144;
    private int frameRate = 30;
   
private AudioRecord audioRecord;
private AudioRecordRunnable audioRecordRunnable;
private Thread audioThread;
private Camera cameraDevice;
private CameraView cameraView;
private IplImage yuvIplimage = null;
private final int bg_screen_bx = 232;
private final int bg_screen_by = 128;
private final int bg_screen_width = 700;
private final int bg_screen_height = 500;
private final int bg_width = 1123;
private final int bg_height = 715;
private final int live_width = 640;
private final int live_height = 480;
private int screenWidth, screenHeight;
private Button btnRecorderControl;
@Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);

        setContentView(R.layout.activity_main);

        PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE); 
        mWakeLock = pm.newWakeLock(PowerManager.SCREEN_BRIGHT_WAKE_LOCK, "XYTEST"); 
        mWakeLock.acquire(); 
        
        initLayout();
        initRecorder();
    }
    

@Override
protected void onResume() {
super.onResume();
if (mWakeLock == null) {
  PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE);
  mWakeLock = pm.newWakeLock(PowerManager.SCREEN_BRIGHT_WAKE_LOCK, "XYTEST");
  mWakeLock.acquire();
}
}

@Override
protected void onPause() {
super.onPause();
if (mWakeLock != null) {
mWakeLock.release();
mWakeLock = null;
}
}
    
@Override
protected void onDestroy() {
super.onDestroy();
recording = false;
if (cameraView != null) {
cameraView.stopPreview();
cameraDevice.release();
cameraDevice = null;
}
if (mWakeLock != null) {
mWakeLock.release();
mWakeLock = null;
}
}
private void initLayout() {
Display display = ((WindowManager) getSystemService(Context.WINDOW_SERVICE)).getDefaultDisplay();
screenWidth = display.getWidth();
screenHeight = display.getHeight();
RelativeLayout.LayoutParams layoutParam = null; 
LayoutInflater myInflate = null; 
myInflate = (LayoutInflater) getSystemService(Context.LAYOUT_INFLATER_SERVICE);
RelativeLayout topLayout = new RelativeLayout(this);
setContentView(topLayout);
LinearLayout preViewLayout = (LinearLayout) myInflate.inflate(R.layout.activity_main, null);
layoutParam = new RelativeLayout.LayoutParams(screenWidth, screenHeight);
topLayout.addView(preViewLayout, layoutParam);
btnRecorderControl = (Button) findViewById(R.id.recorder_control);
btnRecorderControl.setOnClickListener(this);
int display_width_d = (int) (1.0 * bg_screen_width * screenWidth / bg_width);
int display_height_d = (int) (1.0 * bg_screen_height * screenHeight / bg_height);
int prev_rw, prev_rh;
if (1.0 * display_width_d / display_height_d > 1.0 * live_width / live_height) {
prev_rh = display_height_d;
prev_rw = (int) (1.0 * display_height_d * live_width / live_height);
} else {
prev_rw = display_width_d;
prev_rh = (int) (1.0 * display_width_d * live_height / live_width);
}
layoutParam = new RelativeLayout.LayoutParams(prev_rw, prev_rh);
layoutParam.topMargin = (int) (1.0 * bg_screen_by * screenHeight / bg_height);
layoutParam.leftMargin = (int) (1.0 * bg_screen_bx * screenWidth / bg_width);
cameraDevice = Camera.open();
Log.i(LOG_TAG, "camera open");
    cameraView = new CameraView(this, cameraDevice);
topLayout.addView(cameraView, layoutParam);
        Log.i(LOG_TAG, "camera preview start: OK");
}
public void stopRecording() {
audioRecordRunnable.runAudioThread = false;

if (recorder != null && recording) {
recording = false;
Log.v(LOG_TAG,"Finishing recording, calling stop and release on recorder");
try {
recorder.stop();
recorder.release();
} catch (Exception e) {
e.printStackTrace();
}
recorder = null;
}
}
@Override
public boolean onKeyDown(int keyCode, KeyEvent event) {
if (keyCode == KeyEvent.KEYCODE_BACK) {
if (recording) {
Toast.makeText(MainActivity.this, "Can't Finish Yet", Toast.LENGTH_SHORT).show();
stopRecording();
} else {
MainActivity.this.finish();
}
return true;
}
return super.onKeyDown(keyCode, event);
}
//---------------------------------------
// initialize ffmpeg_recorder
//---------------------------------------
    private void initRecorder() {
   
    Log.w(LOG_TAG,"init recorder");

if (yuvIplimage == null) {
yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_8U, 2);
Log.i(LOG_TAG, "create yuvIplimage");
}
   
Log.i(LOG_TAG, "ffmpeg_url: " + ffmpeg_link);
    recorder = new FFmpegFrameRecorder(ffmpeg_link, imageWidth, imageHeight, 1);
   
recorder.setFormat("flv");
    recorder.setAudioCodec(avcodec.AV_CODEC_ID_AAC);
recorder.setSampleRate(sampleAudioRateInHz);
recorder.setVideoCodec(avcodec.AV_CODEC_ID_FLV1);
// Set in the surface changed method
//recorder.setFrameRate(frameRate);
recorder.setPixelFormat(PIX_FMT_YUV420P);

Log.i(LOG_TAG, "recorder initialize success");
   
audioRecordRunnable = new AudioRecordRunnable();
audioThread = new Thread(audioRecordRunnable);
audioThread.start();
}
    
    
    //---------------------------------------------
// audio thread, gets and encodes audio data
//---------------------------------------------
class AudioRecordRunnable implements Runnable {
boolean runAudioThread = true;

@Override
public void run() {
int bufferSize;
short[] audioData;
int bufferReadResult;

try {
bufferSize = AudioRecord.getMinBufferSize(sampleAudioRateInHz, 
AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT);
/* set audio recorder parameters, and start recording */
audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleAudioRateInHz, 
AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);
audioData = new short[bufferSize];
audioRecord.startRecording();
Log.d(LOG_TAG, "audioRecord.startRecording()");
/* ffmpeg_audio encoding loop */
while (runAudioThread) {
bufferReadResult = audioRecord.read(audioData, 0, audioData.length);
if (recording) {
/* Should I be handing both the image and the audio to FFmpegFrameRecorder at this point ??? */

    //recorder.record(yuvIplimage);
    recorder.record(ShortBuffer.wrap(audioData, 0, audioData.length));
}
}
Log.v(LOG_TAG,"AudioThread Finished, release audioRecord");
/* encoding finish, release recorder */
if (audioRecord != null) {
audioRecord.stop();
audioRecord.release();
audioRecord = null;
Log.v(LOG_TAG,"audioRecord released");
}
} catch (Exception e) {
Log.e(LOG_TAG, "get audio data failed");
}
}
}
    //---------------------------------------------
// camera thread, gets and encodes video data
//---------------------------------------------
    class CameraView extends SurfaceView implements SurfaceHolder.Callback, PreviewCallback {
   
    private SurfaceHolder mHolder;
    private Camera mCamera;
    public CameraView(Context context, Camera camera) {
    super(context);
    Log.w("camera","camera view");
    mCamera = camera;
            mHolder = getHolder();
            mHolder.addCallback(CameraView.this);
            mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
        mCamera.setPreviewCallback(CameraView.this);
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
            try {
            stopPreview();
            mCamera.setPreviewDisplay(holder);
            } catch (IOException exception) {
                mCamera.release();
                mCamera = null;
            }
    }

    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
    Log.v(LOG_TAG,"Setting imageWidth: " + imageWidth + " imageHeight: " + imageHeight + " frameRate: " + frameRate);
            Camera.Parameters camParams = mCamera.getParameters();
        camParams.setPreviewSize(imageWidth, imageHeight);

/* This kills things if the frameRate is anything but 30 */
        //camParams.setPreviewFrameRate(frameRate);

            mCamera.setParameters(camParams);
            startPreview();

/* I would like to be able to do this to set the framerate accurately, it seems I am not able to though and keep the audio */
        //frameRate = camParams.getPreviewFrameRate();
        //recorder.setFrameRate(frameRate);
   
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
    mHolder.addCallback(null);
    mCamera.setPreviewCallback(null);
    }
   
        public synchronized void startPreview() {
            if (!isPreviewOn && mCamera != null) {
        isPreviewOn = true;
        mCamera.startPreview();
        }
        }
        
        public synchronized void stopPreview() {
            if (isPreviewOn && mCamera != null) {
        isPreviewOn = false;
        mCamera.stopPreview();
        }
        }
   
    @Override
    public synchronized void onPreviewFrame(byte[] data, Camera camera) {
    /* get video data */
if (yuvIplimage != null && recording) {
yuvIplimage.getByteBuffer().put(data);
    try {
    //Log.v(LOG_TAG,"Recording Frame");

/*  Handing the image to FFmpegFrameRecorder */
recorder.record(yuvIplimage);

} catch (Exception e) {
e.printStackTrace();
}
//Log.i(LOG_TAG, "yuvIplimage put data");
}
/*
Log.i(LOG_TAG, "onPreviewFrame - wrote bytes: " + data.length + "; " + 
camera.getParameters().getPreviewSize().width +" x " + 
camera.getParameters().getPreviewSize().height + "; frameRate: " +
camera.getParameters().getPreviewFrameRate());
*/
    }
    }

@Override
public void onClick(View v) {
if (!recording) {
    try {
recorder.start();
} catch (Exception e) {
e.printStackTrace();
}
    recording = true;
Log.w(LOG_TAG, "Start Button Pushed");
    btnRecorderControl.setText("stop");
} else {
// This will trigger the audio recording loop to stop, then stop and release the recorder
stopRecording();
Log.w(LOG_TAG, "Stop Button Pushed");
btnRecorderControl.setText("start");
}
}
}

Attached is a clip; it is playable in VLC.

Thanks!

-shawn
short-44100.mp4

Shawn Van Every

Dec 14, 2012, 8:13:18 PM12/14/12
to jav...@googlegroups.com
What you describe sounds right to me. It shouldn't be that large, though. My apk is around 15 MB; large, but not 60 MB. I didn't copy in the armeabi-v7a files, I only grabbed the armeabi files, so that might make a size difference. I am sure my performance is suffering because of it as well.

-shawn

On Dec 14, 2012, at 3:57 PM, mayan...@gmail.com wrote:

> Hi Shawn,
>
> My question is more about setup; can you please help me? To get all this to work, did you do the following?
>
> Download following zip files from http://code.google.com/p/javacv/downloads/list
> javacv-0.3-bin.zip
> opencv-2.4.3-android-arm.zip
> ffmpeg-1.0-android-arm.zip
>
> Copy all .so files into libs/armeabi and libs/armeabi-v7a and also copy javacpp.jar and javacv.jar in class path.
>
> My question is: is what I'm doing the right way to do it? Secondly, this causes the apk size to increase by 60 MB?
>
>
> Thanks,
> Mayank

Samuel Audet

Dec 15, 2012, 7:22:03 AM12/15/12
to jav...@googlegroups.com
If your code isn't running fast enough to record all audio frames, then
Android can't do anything else but drop audio frames, and that will
cause that kind of noise. That's just how it is. You have to make sure
that all audio frames get recorded, no choice.
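(One common way to keep the capture loop fast enough is to decouple it from encoding with a bounded queue: the AudioRecord.read() loop only enqueues buffers, and a separate thread feeds the recorder. A plain-Java sketch of the idea; no Android or JavaCV calls, and the chunk sizes are illustrative:)

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class AudioQueueSketch {
    public static void main(String[] args) throws InterruptedException {
        // Bounded queue between the AudioRecord.read() loop (producer)
        // and the encoder (consumer), so slow encoding never stalls capture.
        BlockingQueue<short[]> queue = new ArrayBlockingQueue<>(16);

        Thread capture = new Thread(() -> {
            for (int i = 0; i < 8; i++) {
                short[] chunk = new short[1024]; // stands in for one read() result
                if (!queue.offer(chunk)) {
                    // Queue full: encoder is too slow; drop (and log) rather
                    // than block the capture loop.
                }
            }
        });
        capture.start();
        capture.join();

        // Drain the queue as the encoder thread would.
        int encoded = 0;
        while (queue.poll() != null) {
            encoded++; // the real code would call recorder.record(...) here
        }
        System.out.println(encoded);
    }
}
```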

FFmpegFrameRecorder doesn't limit anything, but depending on the codec
you choose, FFmpeg does impose varying kinds of limits, yes. Please
refer to the FFmpeg documentation for more info.

Samuel

Samuel Audet

Dec 15, 2012, 7:54:55 AM12/15/12
to jav...@googlegroups.com
On 12/15/2012 01:05 PM, mayan...@gmail.com wrote:
> Ok, but that is what confuses me: the Readme.txt under ffmpeg-1.0-android-arm.zip says armeabi-v7a is for ARMv7 and armeabi is for ARMv6.
>
> I'm testing my code on an HTC myTouch, which won't work if I don't create the armeabi folder, even though it's running an ARMv7 Processor rev 2.

Then copy the files from libs/armeabi-v7a into libs/armeabi, and it
should work.
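(Sketched in code, the copy step looks like this; done here with the JDK's file APIs against a scratch directory, with a placeholder library name standing in for the real .so files unpacked from the archives:)

```java
import java.io.IOException;
import java.nio.file.*;

public class MirrorAbiLibs {
    public static void main(String[] args) throws IOException {
        // Scratch layout standing in for an Android project's libs/ directory.
        Path root = Files.createTempDirectory("project");
        Path v7a = Files.createDirectories(root.resolve("libs/armeabi-v7a"));
        Path abi = Files.createDirectories(root.resolve("libs/armeabi"));
        // Placeholder for a native library from ffmpeg-1.0-android-arm.zip:
        Files.createFile(v7a.resolve("libjniavcodec.so"));

        // Mirror every .so from armeabi-v7a into armeabi, since some devices
        // only load native libraries from the armeabi folder.
        try (DirectoryStream<Path> libs = Files.newDirectoryStream(v7a, "*.so")) {
            for (Path so : libs) {
                Files.copy(so, abi.resolve(so.getFileName()),
                        StandardCopyOption.REPLACE_EXISTING);
            }
        }
        System.out.println(Files.exists(abi.resolve("libjniavcodec.so")));
    }
}
```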

Samuel Audet

Dec 24, 2012, 10:51:28 PM12/24/12
to Shawn Van Every, jav...@googlegroups.com
Hi Shawn,

If you agree, I could include your sample code as part of the samples here:
http://code.google.com/p/javacv/source/browse/#git%2Fsamples
A lot of people are asking for working code, so I thought it would be a
good idea to put something in there.

Let me know, thanks!

Samuel

On 12/14/2012 07:42 AM, Shawn Van Every wrote:

Shawn Van Every

Dec 25, 2012, 10:35:13 AM12/25/12
to Samuel Audet, Shawn Van Every, jav...@googlegroups.com
Hi Samuel,

Sure.. I actually have an improved version which I'll send over.

Best,
Shawn
--
mobile shawn
--

Samuel Audet

Dec 25, 2012, 8:37:31 PM12/25/12
to Shawn Van Every, Shawn Van Every, jav...@googlegroups.com
On 12/26/2012 12:35 AM, Shawn Van Every wrote:
> Hi Samuel,
>
> Sure.. I actually have an improved version which I'll send over.

Sounds great! Thanks and Happy New Year!

Samuel

Shawn Van Every

Jan 3, 2013, 12:26:14 AM1/3/13
to Samuel Audet, jav...@googlegroups.com
Oops..  Attachment was too large for the list.  Here it is again, minus the libraries.


On Thu, Jan 3, 2013 at 12:23 AM, Shawn Van Every <savan...@gmail.com> wrote:
Hi Samuel,

Sorry for the delay.  It's not perfect but it should prove a good start for those who are looking.

Full Eclipse project archive attached.

Below is the source of the main activity.

Best,
shawn

package com.example.javacv.stream.test2;

import static com.googlecode.javacv.cpp.avutil.*;
import static com.googlecode.javacv.cpp.opencv_core.*;
import static com.googlecode.javacv.cpp.opencv_core.IplImage;

import com.googlecode.javacv.FFmpegFrameRecorder;
import com.googlecode.javacv.FrameRecorder.Exception;
import com.googlecode.javacv.cpp.avcodec;

import java.io.IOException;
import java.nio.Buffer;
import java.nio.ShortBuffer;

import android.app.Activity;
import android.content.Context;
import android.content.pm.ActivityInfo;
import android.hardware.Camera;
import android.hardware.Camera.PreviewCallback;
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.os.Bundle;
import android.os.PowerManager;
import android.util.Log;
import android.view.Display;
import android.view.KeyEvent;
import android.view.LayoutInflater;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import android.view.View.OnClickListener;
import android.view.WindowManager;
import android.widget.Button;
import android.widget.LinearLayout;
import android.widget.RelativeLayout;

public class MainActivity extends Activity implements OnClickListener {
private final static String CLASS_LABEL = "RecordActivity";
private final static String LOG_TAG = CLASS_LABEL;
private PowerManager.WakeLock mWakeLock;
    
private String ffmpeg_link = "/mnt/sdcard/stream.flv";
boolean recording = false;
private volatile FFmpegFrameRecorder recorder;
    private boolean isPreviewOn = false;
    
    private int sampleAudioRateInHz = 44100;
    private int imageWidth = 176;
    private int imageHeight = 144;
    private int frameRate = 5;
       
    /* audio data getting thread */
private AudioRecord audioRecord;
private AudioRecordRunnable audioRecordRunnable;
private Thread audioThread;
volatile boolean runAudioThread = true;
/* video data getting thread */
private Camera cameraDevice;
private CameraView cameraView;
private IplImage yuvIplimage = null;
/* layout setting */
private final int bg_screen_bx = 232;
private final int bg_screen_by = 128;
private final int bg_screen_width = 700;
private final int bg_screen_height = 500;
private final int bg_width = 1123;
private final int bg_height = 715;
private final int live_width = 640;
private final int live_height = 480;
private int screenWidth, screenHeight;
private Button btnRecorderControl;
@Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);

        setContentView(R.layout.activity_main);

        PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE); 
        mWakeLock = pm.newWakeLock(PowerManager.SCREEN_BRIGHT_WAKE_LOCK, CLASS_LABEL); 
        mWakeLock.acquire(); 
        
        initLayout();
        initRecorder();
    }
    

@Override
protected void onResume() {
super.onResume();
if (mWakeLock == null) {
  PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE);
  mWakeLock = pm.newWakeLock(PowerManager.SCREEN_BRIGHT_WAKE_LOCK, CLASS_LABEL);
  mWakeLock.acquire();
}
}

@Override
protected void onPause() {
super.onPause();
if (mWakeLock != null) {
mWakeLock.release();
mWakeLock = null;
}
}
    
@Override
protected void onDestroy() {
super.onDestroy();
recording = false;
if (cameraView != null) {
cameraView.stopPreview();
cameraDevice.release();
cameraDevice = null;
}
if (mWakeLock != null) {
mWakeLock.release();
mWakeLock = null;
}
}
private void initLayout() {
/* get size of screen */
Display display = ((WindowManager) getSystemService(Context.WINDOW_SERVICE)).getDefaultDisplay();
screenWidth = display.getWidth();
screenHeight = display.getHeight();
RelativeLayout.LayoutParams layoutParam = null; 
LayoutInflater myInflate = null; 
myInflate = (LayoutInflater) getSystemService(Context.LAYOUT_INFLATER_SERVICE);
RelativeLayout topLayout = new RelativeLayout(this);
setContentView(topLayout);
LinearLayout preViewLayout = (LinearLayout) myInflate.inflate(R.layout.activity_main, null);
layoutParam = new RelativeLayout.LayoutParams(screenWidth, screenHeight);
topLayout.addView(preViewLayout, layoutParam);
/* add control button: start and stop */
btnRecorderControl = (Button) findViewById(R.id.recorder_control);
btnRecorderControl.setOnClickListener(this);
/* add camera view */
int display_width_d = (int) (1.0 * bg_screen_width * screenWidth / bg_width);
int display_height_d = (int) (1.0 * bg_screen_height * screenHeight / bg_height);
int prev_rw, prev_rh;
if (1.0 * display_width_d / display_height_d > 1.0 * live_width / live_height) {
prev_rh = display_height_d;
prev_rw = (int) (1.0 * display_height_d * live_width / live_height);
} else {
prev_rw = display_width_d;
prev_rh = (int) (1.0 * display_width_d * live_height / live_width);
}
layoutParam = new RelativeLayout.LayoutParams(prev_rw, prev_rh);
layoutParam.topMargin = (int) (1.0 * bg_screen_by * screenHeight / bg_height);
layoutParam.leftMargin = (int) (1.0 * bg_screen_bx * screenWidth / bg_width);
cameraDevice = Camera.open();
Log.i(LOG_TAG, "camera open");
    cameraView = new CameraView(this, cameraDevice);
topLayout.addView(cameraView, layoutParam);
        Log.i(LOG_TAG, "camera preview start: OK");
}
//---------------------------------------
// initialize ffmpeg_recorder
//---------------------------------------
    private void initRecorder() {
   
    Log.w(LOG_TAG,"init recorder");

if (yuvIplimage == null) {
yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_8U, 2);
Log.i(LOG_TAG, "create yuvIplimage");
}
   
Log.i(LOG_TAG, "ffmpeg_url: " + ffmpeg_link);
    recorder = new FFmpegFrameRecorder(ffmpeg_link, imageWidth, imageHeight, 1);
    //recorder.interleaved = false;
   
recorder.setFormat("flv");
    recorder.setAudioCodec(avcodec.AV_CODEC_ID_AAC);
recorder.setSampleRate(sampleAudioRateInHz);
recorder.setVideoCodec(avcodec.AV_CODEC_ID_FLV1);
// Set in the surface changed method
recorder.setFrameRate(frameRate);
recorder.setPixelFormat(PIX_FMT_YUV420P);

Log.i(LOG_TAG, "recorder initialize success");
   
audioRecordRunnable = new AudioRecordRunnable();
audioThread = new Thread(audioRecordRunnable);
}
public void startRecording() {
try {
recorder.start();
recording = true;
audioThread.start();

} catch (Exception e) {
e.printStackTrace();
}
}
public void stopRecording() {
runAudioThread = false;
if (recorder != null && recording) {
recording = false;
Log.v(LOG_TAG,"Finishing recording, calling stop and release on recorder");
try {
recorder.stop();
recorder.release();
} catch (Exception e) {
e.printStackTrace();
}
recorder = null;
}
}
@Override
public boolean onKeyDown(int keyCode, KeyEvent event) {
if (keyCode == KeyEvent.KEYCODE_BACK) {
if (recording) {
stopRecording();
}
finish();
return true;
}
return super.onKeyDown(keyCode, event);
}
    
    //---------------------------------------------
// audio thread, gets and encodes audio data
//---------------------------------------------
class AudioRecordRunnable implements Runnable {

@Override
public void run() {
// Audio
int bufferSize;
short[] audioData, copiedAudioData;
int bufferReadResult;
int minBufferSize = AudioRecord.getMinBufferSize(sampleAudioRateInHz, 
AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT);
bufferSize = minBufferSize;
audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleAudioRateInHz, 
AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);

audioData = new short[bufferSize];
Log.d(LOG_TAG, "audioRecord.startRecording()");
audioRecord.startRecording();
/* ffmpeg_audio encoding loop */
while (runAudioThread) {
//Log.v(LOG_TAG,"recording? " + recording);
bufferReadResult = audioRecord.read(audioData, 0, audioData.length);
if (bufferReadResult > 0) {
Log.v(LOG_TAG,"bufferReadResult: " + bufferReadResult);
// If "recording" isn't true when this thread starts, it never gets set, according to this if statement...!!!
// Why?  Good question...
if (recording == true) {
int times = bufferReadResult/1024;
//Log.v(LOG_TAG,times + " 1024");
for (int i = 0; i < times; i++) {
// ShortBuffer.wrap takes (array, offset, length), so each chunk is 1024 samples long
Buffer realAudioData = ShortBuffer.wrap(audioData, 1024*i, 1024);
try {
recorder.record(realAudioData);
//Log.v(LOG_TAG,"recording " + 1024*i + " to " + (1024*i + 1024));
} catch (Exception e) {
Log.v(LOG_TAG,e.getMessage());
e.printStackTrace();
}
}
}
}
}
Log.v(LOG_TAG,"AudioThread Finished, release audioRecord");
/* encoding finish, release recorder */
if (audioRecord != null) {
audioRecord.stop();
audioRecord.release();
audioRecord = null;
Log.v(LOG_TAG,"audioRecord released");
}
}
}
    //---------------------------------------------
// camera thread, gets and encodes video data
//---------------------------------------------
    class CameraView extends SurfaceView implements SurfaceHolder.Callback, PreviewCallback {

    private SurfaceHolder mHolder;
    private Camera mCamera;

    public CameraView(Context context, Camera camera) {
        super(context);
        Log.w("camera","camera view");
        mCamera = camera;
        mHolder = getHolder();
        mHolder.addCallback(CameraView.this);
        mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
        mCamera.setPreviewCallback(CameraView.this);
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        try {
            stopPreview();
            mCamera.setPreviewDisplay(holder);
        } catch (IOException exception) {
            mCamera.release();
            mCamera = null;
        }
    }

    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
        Camera.Parameters camParams = mCamera.getParameters();
        camParams.setPreviewSize(imageWidth, imageHeight);
        Log.v(LOG_TAG,"Preview Framerate: " + camParams.getPreviewFrameRate());
        camParams.setPreviewFrameRate(frameRate);
        mCamera.setParameters(camParams);
        startPreview();
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
    mHolder.addCallback(null);
    mCamera.setPreviewCallback(null);
    }
   
        public void startPreview() {
            if (!isPreviewOn && mCamera != null) {
        isPreviewOn = true;
        mCamera.startPreview();
        }
        }
        
        public void stopPreview() {
            if (isPreviewOn && mCamera != null) {
        isPreviewOn = false;
        mCamera.stopPreview();
        }
        }
   
    long lastTime = System.currentTimeMillis();
    long currentTime = System.currentTimeMillis();
    long minTime = 1000/frameRate;        
        
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
    /* get video data */
if (yuvIplimage != null && recording) {
yuvIplimage.getByteBuffer().put(data);
    currentTime = System.currentTimeMillis();
    if (currentTime - lastTime >= minTime) {
    Log.v(LOG_TAG,"Writing Frame");
    try {
    recorder.record(yuvIplimage);
    } catch (Exception e) {
    Log.v(LOG_TAG,e.getMessage());
    e.printStackTrace();
    }
    lastTime = currentTime;
}
}
    }
    }

@Override
public void onClick(View v) {
if (!recording) {
startRecording();
Log.w(LOG_TAG, "Start Button Pushed");
    btnRecorderControl.setText("stop");
} else {
// This will trigger the audio recording loop to stop, then stop and release the recorder
stopRecording();
Log.w(LOG_TAG, "Stop Button Pushed");
btnRecorderControl.setText("start");
}
}
}
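(As a sanity check on the timing logic in onPreviewFrame above, the minTime throttle can be run in isolation; plain Java, no Android APIs, with simulated callback timestamps:)

```java
public class FrameThrottle {
    public static void main(String[] args) {
        int frameRate = 5;
        long minTime = 1000 / frameRate; // 200 ms between recorded frames
        long lastTime = 0;
        int recorded = 0;
        // Simulated preview callbacks arriving every 50 ms (~20 fps input).
        for (long currentTime = 50; currentTime <= 1000; currentTime += 50) {
            if (currentTime - lastTime >= minTime) {
                recorded++; // the real code calls recorder.record(yuvIplimage) here
                lastTime = currentTime;
            }
        }
        // 1 second of 20 fps input throttled to the 5 fps target.
        System.out.println(recorded);
    }
}
```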
JavaCVAndroidWriteVideoExample.zip

Samuel Audet

Jan 4, 2013, 5:52:30 AM1/4/13
to javacv, Shawn Van Every
On 01/03/2013 02:26 PM, Shawn Van Every wrote:
> Oops.. Attachment was too large for the list. Here it is again, minus
> the libraries.

No, Google Groups accepts messages up to 25 megs AFAIK, but there is
apparently no way to reduce it to something more reasonable... Bah, what
can we expect from a free service? At least we get the disk space to
hold all the junk we wish to upload :)

> On Thu, Jan 3, 2013 at 12:23 AM, Shawn Van Every <savan...@gmail.com> wrote:
>
> Hi Samuel,
>
> Sorry for the delay. It's not perfect but it should prove a good
> start for those who are looking.
>
> Full Eclipse project archive attached.
>
> Below is the source of the main activity.

Great! Thanks! Do you mind if I include only the .java file (+ the
config in comments) in the samples directory? (What license, BTW? Or is
it public domain?) I've found that Eclipse Android projects are neither
backward nor forward compatible (what can we expect from a free
platform? Or maybe this is just a Google thing :) so I don't see the
point of including the overhead it represents.

Samuel

Shawn Van Every

Jan 7, 2013, 1:40:45 PM1/7/13
to Samuel Audet, javacv, Shawn Van Every, zhangqi...@gmail.com
Hi Samuel,

Please feel free to publish it however you like. Just the Java file is fine. I consider the code to be public domain, but it is based on code from Qianliang Zhang provided here: https://code.google.com/p/javacv/issues/detail?id=160&can=1&start=100 so perhaps we should check with him.

-shawn

Samuel Audet

Jan 13, 2013, 2:13:33 AM1/13/13
to Shawn Van Every, javacv, Shawn Van Every, zhangqi...@gmail.com
Hello,

On 01/08/2013 03:40 AM, Shawn Van Every wrote:
> Hi Samuel,
>
> Please feel free to publish it however you like. Just the Java file is fine. I consider the code to be public domain, but it is based on code from Qianliang Zhang provided here: https://code.google.com/p/javacv/issues/detail?id=160&can=1&start=100 so perhaps we should check with him.

Ah, I see... Well, it's not public domain unless all the authors say as
much. For now, I'll just leave the license as unspecified and only
indicate the copyright. Anyway, I've fixed a few things and added it to
the samples:
https://code.google.com/p/javacv/source/detail?r=3fbddde33152b1059649aae6526358e6d2e504c3
It works much better now I think :)

Samuel