Hi,
First post, and I'm not sure this is the right place, but it seemed like the best match I
could find here. Apologies if I'm posting to the wrong group.
I have an issue using the MediaCodec APIs on three devices: a Google Pixel 1 (Qualcomm chipset) running stock Android 8.1, a Samsung Galaxy 6 (Exynos chipset) running stock Android 7, and a DJI CrystalSky (RockChip chipset) running Android 5.1.
I'm trying to encode a set of raw YUV420 images stored in files.
I'm configuring the MediaCodec successfully using the code below on all three platforms using VBR.
I am also saving the encoder output to a file using the fetch method attached below. On the first two devices, the Pixel and the Galaxy 6, the output matches expectations: configuring the bitrate to 200000 and the frame rate to 30 yields roughly the correct bitrate at the correct frame rate.
However, running the same configuration on the DJI CrystalSky produces a bitrate much higher than configured, at half the frame rate. Additionally, switching from VBR to constant quality or constant bitrate made no difference in the output.
I also tried (not shown in the code) updating the encoder parameters with KEY_PRIMARY_BITRATE, to no avail.
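For comparison, the key the platform documents for changing the bitrate of a running encoder is MediaCodec.PARAMETER_KEY_VIDEO_BITRATE (API 19+), passed through setParameters. A minimal sketch, assuming the codec is already started; whether the vendor encoder actually honors the request is implementation-dependent:

```java
import android.media.MediaCodec;
import android.os.Bundle;

/* Sketch: request a new target bitrate from a running encoder.
   PARAMETER_KEY_VIDEO_BITRATE maps to the "video-bitrate" parameter;
   vendor encoders apply it on a best-effort basis. */
static void requestBitrate(MediaCodec codec, int bitrateBps) {
    Bundle params = new Bundle();
    params.putInt(MediaCodec.PARAMETER_KEY_VIDEO_BITRATE, bitrateBps);
    codec.setParameters(params);
}
```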
My understanding is that this device passed CTS, so it should support this configuration just as the Pixel and Galaxy devices do.
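One thing worth checking: CTS verifies the API contract, but rate-control modes are reported per encoder via MediaCodecInfo.EncoderCapabilities, and a requested mode can be silently ignored if the device doesn't claim it. A sketch (API 21+) that dumps what the AVC encoders on the device say they support:

```java
import android.media.MediaCodecInfo;
import android.media.MediaCodecList;
import android.util.Log;

/* Sketch: log which bitrate modes each AVC encoder on this device claims. */
static void dumpAvcBitrateModes() {
    MediaCodecList list = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
    for (MediaCodecInfo info : list.getCodecInfos()) {
        if (!info.isEncoder()) continue;
        for (String type : info.getSupportedTypes()) {
            if (!type.equalsIgnoreCase("video/avc")) continue;
            MediaCodecInfo.EncoderCapabilities caps =
                    info.getCapabilitiesForType(type).getEncoderCapabilities();
            Log.i("Caps", info.getName()
                    + " CBR=" + caps.isBitrateModeSupported(
                            MediaCodecInfo.EncoderCapabilities.BITRATE_MODE_CBR)
                    + " VBR=" + caps.isBitrateModeSupported(
                            MediaCodecInfo.EncoderCapabilities.BITRATE_MODE_VBR)
                    + " CQ=" + caps.isBitrateModeSupported(
                            MediaCodecInfo.EncoderCapabilities.BITRATE_MODE_CQ));
        }
    }
}
```

If the RockChip encoder reports false for the mode you request, the fallback behavior (and the resulting bitrate) is entirely up to the vendor implementation.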
The code is attached below, any thoughts are appreciated.
MediaCodec configuration
/* Set up codec of type AVC (H.264) */
try {
    mediaCodec = MediaCodec.createEncoderByType("video/avc"); // Also tried the string constant from MediaCodec; no difference
} catch (IOException e) {
    logger.log(Severities.ERROR, "prepare failed to create mediaCodec: " + e.toString());
    return;
}
logger.log(Severities.INFO, "Initializing codec with parameters:");
logger.log(Severities.INFO, "width: " + width + " height: " + height + " fps: " +
        fps + " bitrate: " + bitrate + " mode: " + mode + " iframe_rate: " + iframe_rate);
this.fps = fps;
MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc", width, height);
final int KILO = 1000;
final int rate = bitrate * KILO;
/* Target bitrate in bits per second */
mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, rate);
/* Configurable fps */
mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, fps);
/* YUV420 input */
mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Flexible);
/* I-frame interval in seconds */
mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, iframe_rate);
/* Set bitrate mode */
if (mode == CBR)
    mediaFormat.setInteger(MediaFormat.KEY_BITRATE_MODE, MediaCodecInfo.EncoderCapabilities.BITRATE_MODE_CBR);
else if (mode == VBR)
    mediaFormat.setInteger(MediaFormat.KEY_BITRATE_MODE, MediaCodecInfo.EncoderCapabilities.BITRATE_MODE_VBR);
else if (mode == CQ)
    mediaFormat.setInteger(MediaFormat.KEY_BITRATE_MODE, MediaCodecInfo.EncoderCapabilities.BITRATE_MODE_CQ);
/* Configure as an encoder */
mediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
/* Done */
state = States.STATE_PREPARED;
MediaCodec feeding
/* Use this method to provide YUV420 rasterized buffers for encoding */
public void feed(byte[] food) {
    /* State verification */
    if (state != States.STATE_START) {
        logger.log(Severities.ERROR, "feed was called but current state is: " + state.describe());
        return;
    }
    /* Find a free input buffer to store food */
    try {
        int inputBufferIndex = mediaCodec.dequeueInputBuffer(-1);
        if (inputBufferIndex >= 0) {
            /* Get input buffer and fill it with our input */
            ByteBuffer inputBuffer = mediaCodec.getInputBuffer(inputBufferIndex);
            inputBuffer.clear();
            inputBuffer.put(food);
            /* Enqueue buffer */
            logger.log(Severities.EXTRA_VERBOSE, "Enqueued input index: " + inputBufferIndex);
            /* Benchmark in */
            benchmarkDataIn.elementsIn++;
            benchmarkDataIn.elementsWeightIn += food.length;
            /* presentationTimeUs is in microseconds, hence the 1,000,000 scale */
            mediaCodec.queueInputBuffer(inputBufferIndex, 0, food.length, time * 1000000L / fps, 0);
            time++;
        } else {
            /* Benchmark drop. This should always be zero if the dequeueInputBuffer timeout is -1 */
            benchmarkDataIn.elementsInDropped++;
            benchmarkDataIn.elementsWeightInDropped += food.length;
        }
    } catch (Exception e) {
        logger.log(Severities.ERROR, "Get free buffer failed: " + e.toString());
    }
}
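A note on the units in queueInputBuffer: the fourth argument, presentationTimeUs, is in microseconds, so for a constant-fps source frame N belongs at N * 1,000,000 / fps (scaling by 1,000 would yield milliseconds, which can confuse an encoder's rate control). A self-contained sketch of that computation, plain Java with nothing Android-specific:

```java
public final class Timestamps {
    /* presentationTimeUs for MediaCodec.queueInputBuffer is in microseconds.
       For a constant-fps source, frame N is presented at N * 1_000_000 / fps us. */
    public static long presentationTimeUs(long frameIndex, int fps) {
        return frameIndex * 1_000_000L / fps;
    }

    public static void main(String[] args) {
        // At 30 fps, frame 30 lands exactly at one second
        System.out.println(presentationTimeUs(30, 30)); // 1000000
    }
}
```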
MediaCodec fetch
public byte[] fetch() {
    /* Filled in by dequeueOutputBuffer with the offset/size/flags/timestamp of the output */
    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    int outputBufferIndex = mediaCodec.dequeueOutputBuffer(info, -1);
    if (outputBufferIndex >= 0) {
        /* Compressed frame is ready! */
        ByteBuffer compressed = mediaCodec.getOutputBuffer(outputBufferIndex);
        logger.log(Severities.EXTRA_VERBOSE, "Dequeued output index: " + outputBufferIndex);
        /* Copy to byte array for further processing */
        byte[] arr = new byte[compressed.remaining()];
        compressed.get(arr);
        /* Release MediaCodec buffer */
        mediaCodec.releaseOutputBuffer(outputBufferIndex, false);
        benchmarkDataOut.elementsIn++;
        benchmarkDataOut.elementsWeightIn += arr.length;
        return arr;
    } else {
        /* Negative index: -2 (INFO_OUTPUT_FORMAT_CHANGED) is expected once near stream start */
        logger.log(Severities.INFO, "dequeueOutputBuffer returned a negative index: " + outputBufferIndex);
        return null;
    }
}
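For completeness, the negative indices from dequeueOutputBuffer have defined meanings, and handling them by name rather than logging the raw value makes the fetch path easier to debug. A sketch of that handling (assuming mediaCodec and info are in scope, as in fetch() above):

```java
/* Sketch: interpreting the return values of dequeueOutputBuffer. */
int index = mediaCodec.dequeueOutputBuffer(info, 10000); // 10 ms timeout
if (index == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
    /* -2: reported once near stream start; carries the real output format
       (e.g. what you would pass to MediaMuxer.addTrack). */
    MediaFormat actualFormat = mediaCodec.getOutputFormat();
} else if (index == MediaCodec.INFO_TRY_AGAIN_LATER) {
    /* -1: only possible with a finite timeout; nothing ready yet. */
} else if (index >= 0) {
    /* A compressed buffer, handled as in fetch() above. */
}
```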