Capturing video through OpenCV JavaCameraView and getting an error


Jitendra Ramoliya

Apr 21, 2015, 6:17:51 AM
to jav...@googlegroups.com


Hi Samuel, 
          I am trying to capture video through the OpenCV JavaCameraView, but I am getting the following error while running the application.

 FATAL EXCEPTION: Thread-9091
 java.nio.BufferOverflowException
at java.nio.Buffer.checkPutBounds(Buffer.java:189)
at java.nio.DirectByteBuffer.put(DirectByteBuffer.java:307)
at java.nio.ByteBuffer.put(ByteBuffer.java:704)
at com.opencameravideo.VideoMainActivity.onFrame(VideoMainActivity.java:319)
at com.opencameravideo.VideoMainActivity.onCameraFrame(VideoMainActivity.java:256)
at org.opencv.android.CameraBridgeViewBase.deliverAndDrawFrame(CameraBridgeViewBase.java:387)
at org.opencv.android.JavaCameraView$CameraWorker.run(JavaCameraView.java:346)
at java.lang.Thread.run(Thread.java:841)
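
For context on the trace above: `ByteBuffer.put(byte[])` throws `BufferOverflowException` whenever the source array is larger than the buffer's remaining capacity. A minimal stand-alone sketch of the size arithmetic, assuming the 320x240 default and the RGBA frames and 2-channel `IplImage` that appear in the code posted in this thread:

```java
// Size arithmetic behind the BufferOverflowException above (assumed numbers:
// a 320x240 RGBA frame written into an IplImage created with 2 channels).
public class BufferSizes {
    public static void main(String[] args) {
        int width = 320, height = 240;
        int frameBytes = width * height * 4; // Mat.get() on an RGBA frame yields 4 bytes/pixel
        int imageBytes = width * height * 2; // IplImage.create(w, h, IPL_DEPTH_8U, 2) capacity
        System.out.println("frame bytes:  " + frameBytes);  // 307200
        System.out.println("buffer bytes: " + imageBytes);  // 153600
        // put(byte[]) overflows because the source exceeds the destination
        System.out.println("overflows: " + (frameBytes > imageBytes)); // true
    }
}
```

Matching the allocation to the frame's actual channel count and preview resolution avoids the overflow.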

Waiting for your reply

Jitendra Ramoliya

Apr 22, 2015, 2:46:02 AM
to jav...@googlegroups.com
Hi Samuel, 
          I am trying to capture video through the OpenCV JavaCameraView, but I am getting this error while running the application.

This is my code. Please help me out with this problem.
package com.opencameravideo;

import static org.bytedeco.javacpp.opencv_core.IPL_DEPTH_8U;
import static org.bytedeco.javacpp.opencv_core.CV_8UC1;

import java.io.File;
import java.io.IOException;
import java.nio.ShortBuffer;
import java.util.List;

import org.bytedeco.javacpp.BytePointer;
import org.bytedeco.javacpp.opencv_core;
import org.bytedeco.javacpp.opencv_core.IplImage;
import org.bytedeco.javacv.FFmpegFrameRecorder;
import org.bytedeco.javacv.Frame;
import org.opencv.android.BaseLoaderCallback;
import org.opencv.android.CameraBridgeViewBase.CvCameraViewFrame;
import org.opencv.android.CameraBridgeViewBase.CvCameraViewListener2;
import org.opencv.android.LoaderCallbackInterface;
import org.opencv.android.OpenCVLoader;
import org.opencv.core.Mat;
import org.opencv.core.Scalar;
import org.opencv.imgproc.Imgproc;

import android.app.Activity;
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.content.IntentFilter;
import android.hardware.Camera;
import android.hardware.Camera.PictureCallback;
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.media.MediaScannerConnection;
import android.os.Bundle;
import android.support.v4.content.LocalBroadcastManager;
import android.util.Log;
import android.view.SurfaceView;
import android.view.View;
import android.view.View.OnClickListener;
import android.view.WindowManager;
import android.widget.Button;
import android.widget.Toast;

public class VideoMainActivity extends Activity implements
CvCameraViewListener2 {
// , PictureCallback {

private JavaOpenCvCameraView javaOpenCvCameraView;
private Mat edgesMat;
private final Scalar greenScalar = new Scalar(0, 255, 0);
private int resolutionIndex = 0;
Button startVideo, stopVideo;

private opencv_core.IplImage videoImage = null;

boolean recording = false;
private volatile FFmpegFrameRecorder recorder;

// default
// private int sampleAudioRateInHz = 44100;
// private int imageWidth = 320;
// private int imageHeight = 240;
// private int frameRate = 30;
private int sampleAudioRateInHz = 44100;
private int imageWidth = 320;
private int imageHeight = 240;
private int frameRate = 60;
private String RECIEVE_BYTE_BUFFER = "";
private Thread audioThread;
volatile boolean runAudioThread = true;
private AudioRecord audioRecord;
private AudioRecordRunnable audioRecordRunnable;

private String ffmpeg_link;

long startTime = 0;

private String LOG_TAG = "VideoTest";

private BaseLoaderCallback mLoaderCallback = new BaseLoaderCallback(this) {
@Override
public void onManagerConnected(int status) {
switch (status) {
case LoaderCallbackInterface.SUCCESS:
Log.i("VideoTest", "OpenCV loaded successfully");
javaOpenCvCameraView.enableView();
break;
default:
super.onManagerConnected(status);
break;
}
}
};

public void onCreate(Bundle savedInstanceState) {

super.onCreate(savedInstanceState);
getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);

setContentView(R.layout.activity_main);

startVideo = (Button) findViewById(R.id.startVideo);
stopVideo = (Button) findViewById(R.id.stopVideo);

javaOpenCvCameraView = (JavaOpenCvCameraView) findViewById(R.id.surface_view);
javaOpenCvCameraView.setVisibility(SurfaceView.VISIBLE);
javaOpenCvCameraView.setCvCameraViewListener(this);

LocalBroadcastManager.getInstance(VideoMainActivity.this)
.registerReceiver(recieverByteBuffere,
new IntentFilter(RECIEVE_BYTE_BUFFER));

startVideo.setOnClickListener(new OnClickListener() {

@Override
public void onClick(View v) {
startVideo(startVideo);
}
});

stopVideo.setOnClickListener(new OnClickListener() {

@Override
public void onClick(View v) {
stopRecording();
}
});
initRecorder();
}

private void initRecorder() {
Log.w(LOG_TAG, "initRecorder");

// int depth = com.googlecode.javacv.cpp.opencv_core.IPL_DEPTH_8U;
int channels = 4;

// if (yuvIplimage == null) {
// Recreated after frame size is set in surface change method
// videoImage = IplImage.create(imageWidth, imageHeight, depth,
// channels);
// yuvIplimage = IplImage
// .create(imageWidth, imageHeight, IPL_DEPTH_32S, 2);
// RGBA preview frames carry 4 bytes per pixel, so this image needs 4
// channels (and must match the actual preview size); creating it with
// only 2 channels is what overflows in onFrame() below.
videoImage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_8U, channels);

Log.v(LOG_TAG, "IplImage.create");
// }

File videoFile = new File("/mnt/sdcard",
"VideoTest/images/video.mp4");
boolean mk = videoFile.getParentFile().mkdirs();
Log.v(LOG_TAG, "Mkdir: " + mk);

boolean del = videoFile.delete();
Log.v(LOG_TAG, "del: " + del);

try {
boolean created = videoFile.createNewFile();
Log.v(LOG_TAG, "Created: " + created);
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}

ffmpeg_link = videoFile.getAbsolutePath();
recorder = new FFmpegFrameRecorder(ffmpeg_link, imageWidth,
imageHeight, 1);
Log.v(LOG_TAG, "FFmpegFrameRecorder: " + ffmpeg_link + " imageWidth: "
+ imageWidth + " imageHeight " + imageHeight);

recorder.setFormat("mp4");
Log.v(LOG_TAG, "recorder.setFormat(\"mp4\")");

recorder.setSampleRate(sampleAudioRateInHz);
Log.v(LOG_TAG, "recorder.setSampleRate(sampleAudioRateInHz)");

// re-set in the surface changed method as well
recorder.setFrameRate(frameRate);
Log.v(LOG_TAG, "recorder.setFrameRate(frameRate)");

// Create audio recording thread
audioRecordRunnable = new AudioRecordRunnable();
audioThread = new Thread(audioRecordRunnable);
}

@Override
public void onPause() {
super.onPause();
if (javaOpenCvCameraView != null) {
javaOpenCvCameraView.disableView();
}
}

@Override
public void onResume() {
super.onResume();
OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_2_4_3, this,
mLoaderCallback);
}

public void onDestroy() {
super.onDestroy();
if (javaOpenCvCameraView != null)
javaOpenCvCameraView.disableView();

if (recieverByteBuffere != null) {
LocalBroadcastManager.getInstance(VideoMainActivity.this)
.unregisterReceiver(recieverByteBuffere);
}

}

public void onCameraViewStarted(int width, int height) {
edgesMat = new Mat();
}

public void onCameraViewStopped() {
if (edgesMat != null)
edgesMat.release();

edgesMat = null;
}

public Mat onCameraFrame(CvCameraViewFrame inputFrame) {
Log.e("", "onCameraFrame is call");

Mat rgba = inputFrame.rgba();
org.opencv.core.Size sizeRgba = rgba.size();

// int rows = (int) sizeRgba.height;
// int cols = (int) sizeRgba.width;
//
// int left = cols / 8;
// int top = rows / 8;
// int width = cols * 3 / 4;
// int height = rows * 3 / 4;
//
// // get sub-image
// Mat rgbaInnerWindow = rgba
// .submat(top, top + height, left, left + width);
//
// // create edgesMat from sub-image
// Imgproc.Canny(rgbaInnerWindow, edgesMat, 100, 100);
//
// Mat colorEdges = new Mat();
// Mat killMe = colorEdges;
// edgesMat.copyTo(colorEdges);
// Imgproc.cvtColor(colorEdges, colorEdges, Imgproc.COLOR_GRAY2BGRA);
//
// colorEdges = colorEdges.setTo(greenScalar, edgesMat);
// colorEdges.copyTo(rgbaInnerWindow, edgesMat);
//
// killMe.release();
// colorEdges.release();
//
// rgbaInnerWindow.release();

if (recording) {
byte[] byteFrame = new byte[(int) (rgba.total() * rgba.channels())];
rgba.get(0, 0, byteFrame);
onFrame(byteFrame);
}

return rgba;
}

public void stopRecording() {
// This should stop the audio thread from running
runAudioThread = false;

if (recorder != null) {
Log.v(LOG_TAG,
"Finishing recording, calling stop and release on recorder");
try {
recorder.stop();
recorder.release();

} catch (FFmpegFrameRecorder.Exception e) {
e.printStackTrace();
}
Toast.makeText(VideoMainActivity.this,
"saved ffmpeg_link::" + ffmpeg_link, Toast.LENGTH_SHORT)
.show();
recorder = null;
recording = false;
}

MediaScannerConnection.scanFile(VideoMainActivity.this,
new String[] { ffmpeg_link }, null, null);
}

public void changeResolution(View v) {
List<android.hardware.Camera.Size> cameraResolutionList = javaOpenCvCameraView
.getResolutionList();
resolutionIndex++;
if (resolutionIndex >= cameraResolutionList.size()) {
resolutionIndex = 0;
}

android.hardware.Camera.Size resolution = cameraResolutionList
.get(resolutionIndex);
javaOpenCvCameraView.setResolution(resolution.width, resolution.height);
resolution = javaOpenCvCameraView.getResolution();
String caption = Integer.valueOf(resolution.width).toString() + "x"
+ Integer.valueOf(resolution.height).toString();
Toast.makeText(this, caption, Toast.LENGTH_SHORT).show();

imageWidth = resolution.width;
imageHeight = resolution.height;

// frameRate = cameraView.getFrameRate();

initRecorder();
}

int frames = 0;

private void onFrame(byte[] data) {

Log.e("", "data frame::" + data.length);

if (videoImage != null && recording) {
long videoTimestamp = 1000 * (System.currentTimeMillis() - startTime);
// Put the camera preview frame right into the yuvIplimage object
videoImage.getByteBuffer().put(data);
// videoImage = IplImage.createFrom(data);
// videoImage = cvDecodeImage(cvMat(1, data.length, CV_8UC1,
// new BytePointer(data)));
try {

if (recorder != null) {
// Get the correct time
recorder.setTimestamp(videoTimestamp);

// Record the image into FFmpegFrameRecorder
// recorder.record(videoImage);
recorder.record(videoImage);

frames++;

Log.i(LOG_TAG, "Wrote Frame: " + frames);
}

} catch (FFmpegFrameRecorder.Exception e) {
Log.v(LOG_TAG, e.getMessage());
e.printStackTrace();
}
}

}

public void startVideo(View v) {

recording = !recording;

Log.i(LOG_TAG, "Recording: " + recording);

if (recording) {
startTime = System.currentTimeMillis();
try {
recorder.start();

Log.i(LOG_TAG, "STARTED RECORDING.");

} catch (Exception e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
} else {
stopRecording();
}
}

class AudioRecordRunnable implements Runnable {

@Override
public void run() {
// Set the thread priority
android.os.Process
.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);

// Audio
int bufferSize;
short[] audioData;
int bufferReadResult;

bufferSize = AudioRecord.getMinBufferSize(sampleAudioRateInHz,
AudioFormat.CHANNEL_CONFIGURATION_MONO,
AudioFormat.ENCODING_PCM_16BIT);
audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC,
sampleAudioRateInHz,
AudioFormat.CHANNEL_CONFIGURATION_MONO,
AudioFormat.ENCODING_PCM_16BIT, bufferSize);

audioData = new short[bufferSize];

Log.d(LOG_TAG, "audioRecord.startRecording()");
audioRecord.startRecording();

// Audio Capture/Encoding Loop
while (runAudioThread) {
// Read from audioRecord
bufferReadResult = audioRecord.read(audioData, 0,
audioData.length);
if (bufferReadResult > 0) {
// Log.v(LOG_TAG,"audioRecord bufferReadResult: " +
// bufferReadResult);

// Changes in this variable may not be picked up despite it
// being "volatile"
if (recording) {
try {
// Write to FFmpegFrameRecorder
recorder.record(ShortBuffer.wrap(audioData, 0,
bufferReadResult));
} catch (FFmpegFrameRecorder.Exception e) {
Log.v(LOG_TAG, e.getMessage());
e.printStackTrace();
}
}
}
}
Log.v(LOG_TAG, "AudioThread Finished");

/* Capture/Encoding finished, release recorder */
if (audioRecord != null) {
audioRecord.stop();
audioRecord.release();
audioRecord = null;

MediaScannerConnection.scanFile(VideoMainActivity.this,
new String[] { ffmpeg_link }, null, null);

Log.v(LOG_TAG, "audioRecord released");
}
}
}

BroadcastReceiver recieverByteBuffere = new BroadcastReceiver() {

@Override
public void onReceive(Context context, Intent intent) {
System.out.println("recieverByteBuffere is call");
if (intent != null && intent.getExtras() != null
&& intent.getExtras().containsKey("byte_data_arrays")) {
byte[] data = intent.getExtras().getByteArray(
"byte_data_arrays");
Log.e("", "data size::" + data.length);
onFrame(data);
}

}
};

// @Override
// public void onPictureTaken(byte[] data, Camera camera) {
// onFrame(data);
// }

}

Samuel Audet

Apr 22, 2015, 9:20:55 AM
to jav...@googlegroups.com
On 04/22/2015 03:46 PM, Jitendra Ramoliya wrote:
> I am trying to capture video through the OpenCV JavaCameraView, but I am getting this error while running the application.

Do you get the same error when running the RecordActivity sample found
below?
https://github.com/bytedeco/javacv/blob/master/samples/RecordActivity.java

Samuel

Jitendra Ramoliya

Apr 28, 2015, 12:41:44 AM
to jav...@googlegroups.com
 Hi Samuel,
 I am using that RecordActivity in my code, but it does not record with this code. Please provide a solution quickly if possible.

package com.recordactivity;

import static org.bytedeco.javacpp.opencv_core.IPL_DEPTH_8U;

import java.io.IOException;
import java.nio.ShortBuffer;
import java.util.List;

import org.bytedeco.javacpp.opencv_core.IplImage;
import org.bytedeco.javacv.FFmpegFrameRecorder;
import org.opencv.android.BaseLoaderCallback;
import org.opencv.android.LoaderCallbackInterface;
import org.opencv.android.OpenCVLoader;
import org.opencv.android.CameraBridgeViewBase.CvCameraViewFrame;
import org.opencv.android.CameraBridgeViewBase.CvCameraViewListener2;
import org.opencv.core.Mat;
import org.opencv.imgproc.Imgproc;

import com.recordactivity.RecordOriginalActivity.CameraView;

import android.app.Activity;
import android.content.Context;
import android.content.pm.ActivityInfo;
import android.hardware.Camera;
import android.hardware.Camera.PreviewCallback;
import android.hardware.Camera.Size;
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.os.Bundle;
import android.os.PowerManager;
import android.util.Log;
import android.view.Display;
import android.view.KeyEvent;
import android.view.LayoutInflater;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import android.view.View.OnClickListener;
import android.view.WindowManager;
import android.widget.Button;
import android.widget.RelativeLayout;

public class RecordActivity extends Activity implements OnClickListener,
CvCameraViewListener2 {

private final static String CLASS_LABEL = "RecordActivity";
private final static String LOG_TAG = CLASS_LABEL;

private PowerManager.WakeLock mWakeLock;

private String ffmpeg_link = "/mnt/sdcard/streamTest.mp4";

long startTime = 0;
boolean recording = false;

private volatile FFmpegFrameRecorder recorder;

private boolean isPreviewOn = false;

private int sampleAudioRateInHz = 44100;
private int imageWidth = 320;
private int imageHeight = 240;
private int frameRate = 30;

/* audio data getting thread */
private AudioRecord audioRecord;
private AudioRecordRunnable audioRecordRunnable;
private Thread audioThread;
volatile boolean runAudioThread = true;

/* video data getting thread */
// private Camera cameraDevice;
// private CameraView cameraView;
private JavaOpenCvCameraView javaOpenCvCameraView;
private Mat edgesMat;
private boolean isCameraStarted = false;

private IplImage yuvIplimage = null;

/* layout setting */
private final int bg_screen_bx = 232;
private final int bg_screen_by = 128;
private final int bg_screen_width = 700;
private final int bg_screen_height = 500;
private final int bg_width = 1123;
private final int bg_height = 715;
private final int live_width = 640;
private final int live_height = 480;
private int screenWidth, screenHeight;
private Button btnRecorderControl;

/**
* The number of seconds in the continuous record loop (or 0 to disable
* loop).
*/
final int RECORD_LENGTH = 15;
IplImage[] images;
long[] timestamps;
ShortBuffer[] samples;
int imagesIndex, samplesIndex;

@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);

setContentView(R.layout.main);

PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE);
mWakeLock = pm.newWakeLock(PowerManager.SCREEN_BRIGHT_WAKE_LOCK,
CLASS_LABEL);
mWakeLock.acquire();

initLayout();
}

@Override
protected void onResume() {
super.onResume();

if (mWakeLock == null) {
PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE);
mWakeLock = pm.newWakeLock(PowerManager.SCREEN_BRIGHT_WAKE_LOCK,
CLASS_LABEL);
mWakeLock.acquire();
}

OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_2_4_3, this,
mLoaderCallback);

}

@Override
protected void onPause() {
super.onPause();

if (mWakeLock != null) {
mWakeLock.release();
mWakeLock = null;
}

if (javaOpenCvCameraView != null) {
javaOpenCvCameraView.disableView();
}
}

@Override
protected void onDestroy() {
super.onDestroy();

recording = false;

// if (cameraView != null) {
// cameraView.stopPreview();
// }
//
// if (cameraDevice != null) {
// cameraDevice.stopPreview();
// cameraDevice.release();
// cameraDevice = null;
// }

if (javaOpenCvCameraView != null) {
javaOpenCvCameraView.disableView();
}

if (mWakeLock != null) {
mWakeLock.release();
mWakeLock = null;
}
}

private void initLayout() {

/* get size of screen */
Display display = ((WindowManager) getSystemService(Context.WINDOW_SERVICE))
.getDefaultDisplay();
screenWidth = display.getWidth();
screenHeight = display.getHeight();
Log.e("", "screenWidth:::" + screenWidth);
Log.e("", "screenHeight:::" + screenHeight);
RelativeLayout.LayoutParams layoutParam = null;
LayoutInflater myInflate = null;
myInflate = (LayoutInflater) getSystemService(Context.LAYOUT_INFLATER_SERVICE);
RelativeLayout topLayout = new RelativeLayout(this);
setContentView(topLayout);
RelativeLayout preViewLayout = (RelativeLayout) myInflate.inflate(
R.layout.main, null);
layoutParam = new RelativeLayout.LayoutParams(screenWidth, screenHeight);
// topLayout.addView(preViewLayout, layoutParam);
topLayout.addView(preViewLayout);

/* add camera view */

// layoutParam = new RelativeLayout.LayoutParams(prev_rw, prev_rh);
layoutParam = new RelativeLayout.LayoutParams(1280, 720);
// layoutParam.topMargin = (int) (1.0 * bg_screen_by * screenHeight /
// bg_height);
// layoutParam.leftMargin = (int) (1.0 * bg_screen_bx * screenWidth /
// bg_width);

Button preButtonLayout = (Button) myInflate.inflate(
R.layout.custom_btn, null);
btnRecorderControl = (Button) preButtonLayout
.findViewById(R.id.recorder_control);
topLayout.addView(btnRecorderControl);

javaOpenCvCameraView = (JavaOpenCvCameraView) findViewById(R.id.surface_view);
javaOpenCvCameraView.setVisibility(SurfaceView.VISIBLE);
javaOpenCvCameraView.setCvCameraViewListener(this);

/* add control button: start and stop */
btnRecorderControl = (Button) findViewById(R.id.recorder_control);
btnRecorderControl.setText("Start");
btnRecorderControl.setOnClickListener(this);

// cameraView = new CameraView(this, cameraDevice);
// topLayout.addView(cameraView, layoutParam);
Log.i(LOG_TAG, "cameara preview start: OK");
}

private BaseLoaderCallback mLoaderCallback = new BaseLoaderCallback(this) {
@Override
public void onManagerConnected(int status) {
switch (status) {
case LoaderCallbackInterface.SUCCESS:
Log.i("VideoTest", "OpenCV loaded successfully");
javaOpenCvCameraView.enableView();
break;
default:
super.onManagerConnected(status);
break;
}
}
};

// ---------------------------------------
// initialize ffmpeg_recorder
// ---------------------------------------
private void initRecorder() {

Log.w(LOG_TAG, "init recorder");

if (RECORD_LENGTH > 0) {
imagesIndex = 0;
images = new IplImage[RECORD_LENGTH * frameRate];
timestamps = new long[images.length];
for (int i = 0; i < images.length; i++) {
// images[i] = IplImage.create(imageWidth, imageHeight,
// IPL_DEPTH_8U, 2);
images[i] = IplImage.create(720, 1280, IPL_DEPTH_8U, 2);
timestamps[i] = -1;
}
} else if (yuvIplimage == null) {
// yuvIplimage = IplImage.create(imageWidth, imageHeight,
// IPL_DEPTH_8U, 2);
yuvIplimage = IplImage.create(720, 1280, IPL_DEPTH_8U, 2);
Log.i(LOG_TAG, "create yuvIplimage");
}

Log.i(LOG_TAG, "ffmpeg_url: " + ffmpeg_link);
// recorder = new FFmpegFrameRecorder(ffmpeg_link, imageWidth,
// imageHeight, 1);
recorder = new FFmpegFrameRecorder(ffmpeg_link, 720, 1280, 2);
// recorder.setFormat("flv");
recorder.setFormat("mp4");
recorder.setSampleRate(sampleAudioRateInHz);
// Set in the surface changed method
recorder.setFrameRate(frameRate);
// recorder.setVideoCodec(13);
// recorder.setVideoQuality(1.0D);

Log.i(LOG_TAG, "recorder initialize success");

audioRecordRunnable = new AudioRecordRunnable();
audioThread = new Thread(audioRecordRunnable);
runAudioThread = true;
}

public void startRecording() {

initRecorder();

try {
recorder.start();
startTime = System.currentTimeMillis();
recording = true;
audioThread.start();

} catch (FFmpegFrameRecorder.Exception e) {
e.printStackTrace();
}
}

public void stopRecording() {

runAudioThread = false;
try {
audioThread.join();
} catch (InterruptedException e) {
e.printStackTrace();
}
audioRecordRunnable = null;
audioThread = null;
// recording stays true until the buffered frames below have been
// flushed; clearing it before this check would make the flush dead code.

if (recorder != null && recording) {
if (RECORD_LENGTH > 0) {
Log.v(LOG_TAG, "Writing frames in recording");
try {
int firstIndex = imagesIndex % images.length;
int lastIndex = (imagesIndex - 1) % images.length;
if (imagesIndex <= images.length) {
firstIndex = 0;
lastIndex = imagesIndex - 1;
}
if ((startTime = timestamps[lastIndex] - RECORD_LENGTH
* 1000000L) < 0) {
startTime = 0;
}
if (lastIndex < firstIndex) {
lastIndex += images.length;
}
for (int i = firstIndex; i <= lastIndex; i++) {
long t = timestamps[i % timestamps.length] - startTime;
if (t >= 0) {
if (t > recorder.getTimestamp()) {
recorder.setTimestamp(t);
}
recorder.record(images[i % images.length]);
}
}

firstIndex = samplesIndex % samples.length;
lastIndex = (samplesIndex - 1) % samples.length;
if (samplesIndex <= samples.length) {
firstIndex = 0;
lastIndex = samplesIndex - 1;
}
if (lastIndex < firstIndex) {
lastIndex += samples.length;
}
for (int i = firstIndex; i <= lastIndex; i++) {
recorder.record(samples[i % samples.length]);
}
} catch (FFmpegFrameRecorder.Exception e) {
Log.v(LOG_TAG, e.getMessage());
e.printStackTrace();
}
}

recording = false;
Log.v(LOG_TAG,
"Finishing recording, calling stop and release on recorder");
try {
recorder.stop();
recorder.release();
} catch (FFmpegFrameRecorder.Exception e) {
e.printStackTrace();
}
recorder = null;

}
}
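
The index arithmetic in the RECORD_LENGTH flush loop above can be sketched on its own. This is an illustrative stand-alone version (the class and method names are made up), assuming a single ring of `length` slots that has been written `writtenCount` times:

```java
// Minimal sketch of the circular-buffer flush logic in stopRecording():
// given how many frames were written into a fixed-size ring, compute the
// oldest and newest indices to replay in chronological order.
public class RingFlush {
    // Returns {firstIndex, lastIndex}; lastIndex may exceed length - 1,
    // so callers read slot (i % length), exactly as the flush loop does.
    static int[] flushRange(int writtenCount, int length) {
        int first = writtenCount % length;
        int last = (writtenCount - 1) % length;
        if (writtenCount <= length) { // ring never wrapped: replay from 0
            first = 0;
            last = writtenCount - 1;
        }
        if (last < first) {           // wrapped: unroll so the loop runs forward
            last += length;
        }
        return new int[] { first, last };
    }

    public static void main(String[] args) {
        // ring of 5 slots, 3 frames written: replay slots 0..2
        System.out.println(java.util.Arrays.toString(flushRange(3, 5))); // [0, 2]
        // ring of 5 slots, 8 frames written: oldest is slot 3, replay 3..7 (mod 5)
        System.out.println(java.util.Arrays.toString(flushRange(8, 5))); // [3, 7]
    }
}
```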

@Override
public boolean onKeyDown(int keyCode, KeyEvent event) {

if (keyCode == KeyEvent.KEYCODE_BACK) {
if (recording) {
stopRecording();
}

finish();

return true;
}

return super.onKeyDown(keyCode, event);
}

// ---------------------------------------------
// audio thread, gets and encodes audio data
// ---------------------------------------------
class AudioRecordRunnable implements Runnable {

@Override
public void run() {
android.os.Process
.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);

// Audio
int bufferSize;
ShortBuffer audioData;
int bufferReadResult;

bufferSize = AudioRecord
.getMinBufferSize(sampleAudioRateInHz,
AudioFormat.CHANNEL_IN_MONO,
AudioFormat.ENCODING_PCM_16BIT);
audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC,
sampleAudioRateInHz, AudioFormat.CHANNEL_IN_MONO,
AudioFormat.ENCODING_PCM_16BIT, bufferSize);

if (RECORD_LENGTH > 0) {
samplesIndex = 0;
samples = new ShortBuffer[RECORD_LENGTH * sampleAudioRateInHz
* 2 / bufferSize + 1];
for (int i = 0; i < samples.length; i++) {
samples[i] = ShortBuffer.allocate(bufferSize);
}
} else {
audioData = ShortBuffer.allocate(bufferSize);
}

Log.d(LOG_TAG, "audioRecord.startRecording()");
audioRecord.startRecording();

/* ffmpeg_audio encoding loop */
while (runAudioThread) {
if (RECORD_LENGTH > 0) {
audioData = samples[samplesIndex++ % samples.length];
audioData.position(0).limit(0);
}
// Log.v(LOG_TAG,"recording? " + recording);
bufferReadResult = audioRecord.read(audioData.array(), 0,
audioData.capacity());
audioData.limit(bufferReadResult);
if (bufferReadResult > 0) {
Log.v(LOG_TAG, "bufferReadResult: " + bufferReadResult);
// If "recording" isn't true when start this thread, it
// never get's set according to this if statement...!!!
// Why? Good question...
if (recording) {
if (RECORD_LENGTH <= 0)
try {
recorder.record(audioData);
// Log.v(LOG_TAG,"recording " + 1024*i + " to "
// + 1024*i+1024);
} catch (FFmpegFrameRecorder.Exception e) {
Log.v(LOG_TAG, e.getMessage());
e.printStackTrace();
}
}
}
}
Log.v(LOG_TAG, "AudioThread Finished, release audioRecord");

/* encoding finish, release recorder */
if (audioRecord != null) {
audioRecord.stop();
audioRecord.release();
audioRecord = null;
Log.v(LOG_TAG, "audioRecord released");
}
}
}
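
The sizing of the `samples` ring in AudioRecordRunnable above can be checked in isolation. A sketch of the arithmetic, with 4096 as an assumed `getMinBufferSize(...)` result (the real value varies by device); the factor of 2 is taken from the original expression and likely accounts for the minimum buffer size being reported in bytes while each sample is a 16-bit short:

```java
// Sketch of the audio ring sizing: enough ShortBuffer slots to cover
// RECORD_LENGTH seconds of 16-bit mono audio. 4096 is an assumed
// getMinBufferSize(...) result, not a value from the original post.
public class AudioRingSize {
    public static void main(String[] args) {
        int recordLength = 15;  // seconds kept in the loop (RECORD_LENGTH)
        int sampleRate = 44100; // Hz, mono (sampleAudioRateInHz)
        int bufferSize = 4096;  // assumed minimum buffer size
        int slots = recordLength * sampleRate * 2 / bufferSize + 1;
        System.out.println(slots); // 323
    }
}
```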

// ---------------------------------------------
// camera thread, gets and encodes video data
// ---------------------------------------------
// class CameraView extends SurfaceView implements SurfaceHolder.Callback,
// PreviewCallback {
//
// private SurfaceHolder mHolder;
// private Camera mCamera;
//
// public CameraView(Context context, Camera camera) {
// super(context);
// Log.w("camera", "camera view");
// mCamera = camera;
// mHolder = getHolder();
// mHolder.addCallback(CameraView.this);
// mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
// mCamera.setPreviewCallback(CameraView.this);
// }
//
// @Override
// public void surfaceCreated(SurfaceHolder holder) {
// try {
// stopPreview();
// mCamera.setPreviewDisplay(holder);
// } catch (IOException exception) {
// mCamera.release();
// mCamera = null;
// }
// }
//
// public void surfaceChanged(SurfaceHolder holder, int format, int width,
// int height) {
// Log.v(LOG_TAG, "Setting imageWidth: " + imageWidth
// + " imageHeight: " + imageHeight + " frameRate: "
// + frameRate);
// Camera.Parameters camParams = mCamera.getParameters();
// camParams.setPreviewSize(imageWidth, imageHeight);
//
// Log.v(LOG_TAG,
// "Preview Framerate: " + camParams.getPreviewFrameRate());
//
// camParams.setPreviewFrameRate(frameRate);
// mCamera.setParameters(camParams);
// startPreview();
// }
//
// @Override
// public void surfaceDestroyed(SurfaceHolder holder) {
// try {
// mHolder.addCallback(null);
// mCamera.setPreviewCallback(null);
// } catch (RuntimeException e) {
// // The camera has probably just been released, ignore.
// }
// }
//
// public void startPreview() {
// if (!isPreviewOn && mCamera != null) {
// isPreviewOn = true;
// mCamera.startPreview();
// }
// }
//
// public void stopPreview() {
// if (isPreviewOn && mCamera != null) {
// isPreviewOn = false;
// mCamera.stopPreview();
// }
// }
//
// @Override
// public void onPreviewFrame(byte[] data, Camera camera) {
// if (audioRecord == null
// || audioRecord.getRecordingState() != AudioRecord.RECORDSTATE_RECORDING)
// {
// startTime = System.currentTimeMillis();
// return;
// }
// if (RECORD_LENGTH > 0) {
// int i = imagesIndex++ % images.length;
// yuvIplimage = images[i];
// timestamps[i] = 1000 * (System.currentTimeMillis() - startTime);
// }
// /* get video data */
// if (yuvIplimage != null && recording) {
// yuvIplimage.getByteBuffer().put(data);
//
// if (RECORD_LENGTH <= 0)
// try {
// Log.v(LOG_TAG, "Writing Frame");
// long t = 1000 * (System.currentTimeMillis() - startTime);
// if (t > recorder.getTimestamp()) {
// recorder.setTimestamp(t);
// }
// recorder.record(yuvIplimage);
// } catch (FFmpegFrameRecorder.Exception e) {
// Log.v(LOG_TAG, e.getMessage());
// e.printStackTrace();
// }
// }
// }
// }

@Override
public void onClick(View v) {
if (!recording) {
isCameraStarted = true;
startRecording();
Log.w(LOG_TAG, "Start Button Pushed");
btnRecorderControl.setText("Stop");
} else {
// This will trigger the audio recording loop to stop and then set
// isRecorderStart = false;
isCameraStarted = false;
stopRecording();
Log.w(LOG_TAG, "Stop Button Pushed");
btnRecorderControl.setText("Start");
}
}

int frames = 0;

private void onFrame(byte[] data) {

Log.e("", "data frame::" + data.length);

if (isCameraStarted) {

if (audioRecord == null
|| audioRecord.getRecordingState() != AudioRecord.RECORDSTATE_RECORDING) {
startTime = System.currentTimeMillis();
return;
}
if (RECORD_LENGTH > 0) {
int i = imagesIndex++ % images.length;
yuvIplimage = images[i];
timestamps[i] = 1000 * (System.currentTimeMillis() - startTime);
}
/* get video data */
if (yuvIplimage != null && recording) {
yuvIplimage.getByteBuffer().put(data);

Log.v("", "recording::" + recording);
if (RECORD_LENGTH <= 0 && recording)
try {
Log.v(LOG_TAG, "Writing Frame");
long t = 1000 * (System.currentTimeMillis() - startTime);
if (t > recorder.getTimestamp()) {
recorder.setTimestamp(t);
}
recorder.record(yuvIplimage);
} catch (FFmpegFrameRecorder.Exception e) {
Log.v(LOG_TAG, e.getMessage());
e.printStackTrace();
}
} else {
Log.e("", "yuvIplimage is null");
}

}

// if (isCameraStarted) {
//
// if (audioRecord == null
// || audioRecord.getRecordingState() !=
// AudioRecord.RECORDSTATE_RECORDING) {
// startTime = System.currentTimeMillis();
// return;
// }
//
// try {
// /* get video data */
// if (yuvIplimage != null && recording) {
// yuvIplimage.getByteBuffer().put(data);
//
// long videoTimestamp = 1000 * (System.currentTimeMillis() -
// startTime);
//
// if (recorder != null) {
// // Get the correct time
// recorder.setTimestamp(videoTimestamp);
//
// // Record the image into FFmpegFrameRecorder
// // recorder.record(videoImage);
// recorder.record(yuvIplimage);
//
// frames++;
//
// Log.i(LOG_TAG, "Wrote Frame: " + frames);
// }
//
// } else {
// Log.e("", "yuvIplimage is null");
// }
//
// } catch (FFmpegFrameRecorder.Exception e) {
// Log.v(LOG_TAG, e.getMessage());
// e.printStackTrace();
// }
//
// }

}
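
A note on the timestamp expression in onFrame() above: FFmpegFrameRecorder timestamps are in microseconds, so multiplying the elapsed milliseconds by 1000 converts units, and the `t > recorder.getTimestamp()` guard keeps timestamps monotonic. A stand-alone sketch with made-up clock values:

```java
// Sketch of the millisecond-to-microsecond conversion used above.
// The clock readings are invented for illustration.
public class TimestampDemo {
    public static void main(String[] args) {
        long startTime = 10_000; // ms, captured when recording starts
        long now = 10_033;       // ms, one ~30 fps frame later
        long t = 1000 * (now - startTime); // elapsed time in microseconds
        System.out.println(t); // 33000
        // setTimestamp(t) is only called when t exceeds the recorder's
        // current timestamp, so timestamps never move backwards.
    }
}
```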

@Override
public void onCameraViewStarted(int width, int height) {
edgesMat = new Mat();
// if (javaOpenCvCameraView != null) {
// javaOpenCvCameraView.setResolution(480, 800);
// }
}

@Override
public void onCameraViewStopped() {
if (edgesMat != null)
edgesMat.release();

edgesMat = null;

}

@Override
public Mat onCameraFrame(CvCameraViewFrame inputFrame) {

Mat rgba = inputFrame.rgba();
// org.opencv.core.Size sizeRgba = rgba.size();
//
// int rows = (int) sizeRgba.height;
// int cols = (int) sizeRgba.width;
//
// int left = cols / 8;
// int top = rows / 8;
// int width = cols * 3 / 4;
// int height = rows * 3 / 4;
//
// // get sub-image
// Mat rgbaInnerWindow = rgba
// .submat(top, top + height, left, left + width);
//
// // create edgesMat from sub-image
// Imgproc.Canny(rgbaInnerWindow, edgesMat, 100, 100);
//

if (recording) {
byte[] byteFrame = new byte[(int) (rgba.total() * rgba.channels())];
rgba.get(0, 0, byteFrame);
onFrame(byteFrame);
}

return rgba;
}
}

Samuel Audet

Apr 29, 2015, 9:29:50 AM
to jav...@googlegroups.com
On 04/28/2015 01:41 PM, Jitendra Ramoliya wrote:
> Hi Samuel,
> I am using that RecordActivity in my code, but it does not
> record with this code. Please provide a solution quickly if possible.

This isn't the code of the original RecordActivity sample. Could you
please first try the original code, and let us know if you have any
issue with that? Thanks!

Samuel

Jitendra Ramoliya

Apr 30, 2015, 12:35:21 AM
to jav...@googlegroups.com
I tried the original RecordActivity. It runs very well, but I want to capture video using JavaCameraView, and the video is not being saved properly. Could you provide advice on this code?

Samuel Audet

May 2, 2015, 6:28:50 PM
to jav...@googlegroups.com
Why do you want to use JavaCameraView? What does it do that
RecordActivity doesn't do?

Samuel
