Hi, Scott!
Thanks for the reply.
I am trying to create a small app for live streaming directly from the back camera of an Android phone to YouTube Live.
But I need to add a simple overlay on top of the image.
The overlay is a simple yellow rectangle with 75% opacity and a small text label: CAM #01 - 20°C
(where 20°C is the current temperature in degrees Celsius).
I fixed the previous problem: I was converting the NV21 image from the back camera to YUV, not RGBA.
I rewrote the function that builds the bitmap to convert from YUV to RGBA, so I can easily draw the overlay on top of it.
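For context, the overlay drawing itself is the easy part once I have an RGBA bitmap; roughly like this (coordinates, sizes and the helper name are placeholders, not my exact code):

    import android.graphics.Bitmap;
    import android.graphics.Canvas;
    import android.graphics.Color;
    import android.graphics.Paint;

    // Draws the yellow 75%-opacity box and the label straight into the
    // converted RGBA bitmap (coordinates/sizes are placeholders).
    private void drawOverlay(Bitmap frame, float temperatureC) {
        Canvas canvas = new Canvas(frame);      // frame must be a mutable bitmap
        Paint box = new Paint();
        box.setColor(Color.YELLOW);
        box.setAlpha(191);                      // ~75% of 255
        canvas.drawRect(16, 16, 336, 76, box);
        Paint label = new Paint(Paint.ANTI_ALIAS_FLAG);
        label.setColor(Color.BLACK);
        label.setTextSize(32);
        canvas.drawText(String.format("CAM #01 - %.0f\u00B0C", temperatureC), 28, 56, label);
    }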
I am using FFmpeg to live stream to YouTube.
It works perfectly, now.
Kind of :)
FFmpeg quits if I try to stream at resolutions above 640 x 480.
It also quits if I try to stream 640 x 480 at any frame rate above 12 FPS.
I believe the problem is that I am not feeding FFmpeg frames at the minimum required rate.
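In case it matters, the FFmpeg invocation I build looks roughly like this (a simplified sketch, not my literal command - the pipe paths and stream key below are placeholders):

    // Sketch of the FFmpeg command string my app assembles (placeholders).
    String videoPipe = "/data/local/tmp/video.pipe";  // placeholder path
    String audioPipe = "/data/local/tmp/audio.pipe";  // placeholder path
    String streamKey = "MY_STREAM_KEY";               // placeholder key
    String cmd = "-f rawvideo -pixel_format rgba -video_size 640x480 -framerate 12"
            + " -i " + videoPipe
            + " -f s16le -ar 44100 -ac 1 -i " + audioPipe
            + " -c:v libx264 -preset ultrafast -tune zerolatency"
            + " -c:a aac -f flv rtmp://a.rtmp.youtube.com/live2/" + streamKey;
    // The declared -framerate has to match what I can actually produce;
    // my conversion loop can't keep up above 12 FPS.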
And the bottleneck seems to be the time it takes to convert NV21 to RGBA in Java.
I wonder if there's a better, faster way to do the conversion.
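One thing I came across while searching (I haven't verified it in my setup yet): newer CameraX versions can apparently hand you RGBA frames directly, doing the YUV conversion natively, which would let me drop my Java loop entirely. Something like:

    import android.util.Size;
    import androidx.camera.core.ImageAnalysis;

    // Sketch only: assumes CameraX 1.1+, where ImageAnalysis can be asked
    // for RGBA_8888 output so the YUV->RGBA conversion happens natively.
    ImageAnalysis analysis = new ImageAnalysis.Builder()
            .setOutputImageFormat(ImageAnalysis.OUTPUT_IMAGE_FORMAT_RGBA_8888)
            .setTargetResolution(new Size(1280, 720))  // illustrative resolution
            .setBackpressureStrategy(ImageAnalysis.STRATEGY_KEEP_ONLY_LATEST)
            .build();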
Another thing I noticed is that the audio is streamed correctly, but the video is not synced with it - the image arrives late, after the audio.
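One idea I want to try for the sync problem (just my reading of the FFmpeg docs, so please correct me): have the demuxer timestamp the raw video input with the wall clock instead of trusting the declared frame rate, so late frames don't get scheduled ahead of the audio:

    // Hedged idea from the FFmpeg docs: wall-clock timestamps on the
    // raw video input instead of timestamps derived from -framerate.
    String videoPipe = "/data/local/tmp/video.pipe";  // same placeholder as above
    String videoInputFlags = "-use_wallclock_as_timestamps 1"
            + " -f rawvideo -pixel_format rgba -video_size 640x480 -i " + videoPipe;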
I wonder if there's a way to do this without using FFmpeg.
Can you point me at an example of how to stream directly from CameraX, without having to use FFmpeg?
What I really need is a way to live stream from the back camera, with a simple overlay.
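For what it's worth, my current understanding from the Android docs is that the FFmpeg-free route would be a hardware MediaCodec H.264 encoder fed from a Surface, plus some RTMP library for the YouTube leg. Roughly (untested, just my reading of the docs):

    import android.media.MediaCodec;
    import android.media.MediaCodecInfo;
    import android.media.MediaFormat;
    import android.view.Surface;
    import java.io.IOException;

    // Untested sketch: hardware H.264 encoder fed from a Surface.
    MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
    format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
            MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
    format.setInteger(MediaFormat.KEY_BIT_RATE, 2_500_000);
    format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
    format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 2);

    MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");  // throws IOException
    encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    Surface inputSurface = encoder.createInputSurface();  // camera + overlay render here
    encoder.start();
    // Encoded H.264 comes out of encoder.dequeueOutputBuffer(...); it still
    // needs an RTMP muxer/library to reach YouTube, which is the part I'm
    // missing an example for.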
Thanks in advance if you (or anyone) can help me.
Cheers!
P.S.: Here's the function I am using to convert the NV21 image that comes from the back camera to RGBA:
private Bitmap nv21ToRgba(ImageProxy image) {
    // Validate input
    if (image == null) {
        return null;
    }
    // Extract image dimensions
    int width = image.getWidth();
    int height = image.getHeight();
    // Extract YUV planes (plane 0 = Y, plane 1 = interleaved U/V)
    ImageProxy.PlaneProxy yPlane = image.getPlanes()[0];
    ImageProxy.PlaneProxy uvPlane = image.getPlanes()[1];
    ByteBuffer yBuffer = yPlane.getBuffer();
    ByteBuffer uvBuffer = uvPlane.getBuffer();
    // Calculate buffer sizes and strides
    int ySize = yBuffer.remaining();
    int uvSize = uvBuffer.remaining();
    int yRowStride = yPlane.getRowStride();
    int uvRowStride = uvPlane.getRowStride();
    int uvPixelStride = uvPlane.getPixelStride();
    // Copy both planes into one byte array
    byte[] nv21Bytes = new byte[ySize + uvSize];
    yBuffer.rewind();
    uvBuffer.rewind();
    yBuffer.get(nv21Bytes, 0, ySize);
    uvBuffer.get(nv21Bytes, ySize, uvSize);
    // Create bitmap and int array for pixel data
    Bitmap bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
    int[] rgbaData = new int[width * height];
    // Convert NV21 to RGBA
    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            // Y index must honor the row stride (rows can be padded past width)
            int yIndex = y * yRowStride + x;
            // UV index: row stride and pixel stride, offset past the Y plane
            int uvIndex = ySize + (y / 2) * uvRowStride + (x / 2) * uvPixelStride;
            // Bounds checking to prevent ArrayIndexOutOfBoundsException
            if (yIndex >= ySize || uvIndex + 1 >= nv21Bytes.length) {
                continue;
            }
            // Extract Y, U, and V values
            int yValue = nv21Bytes[yIndex] & 0xFF;
            int uValue = nv21Bytes[uvIndex] & 0xFF;
            int vValue = nv21Bytes[uvIndex + 1] & 0xFF;
            // Convert YUV to RGB (BT.601 coefficients)
            int r = (int) (yValue + 1.402 * (vValue - 128));
            int g = (int) (yValue - 0.34414 * (uValue - 128) - 0.71414 * (vValue - 128));
            int b = (int) (yValue + 1.772 * (uValue - 128));
            // Clamp RGB values to [0, 255]
            r = Math.max(0, Math.min(255, r));
            g = Math.max(0, Math.min(255, g));
            b = Math.max(0, Math.min(255, b));
            // Combine into ARGB int (full opacity); the output index uses
            // width, not the padded row stride
            rgbaData[y * width + x] = 0xFF000000 | (r << 16) | (g << 8) | b;
        }
    }
    // Copy converted data to bitmap
    bitmap.setPixels(rgbaData, 0, width, 0, 0, width, height);
    return bitmap;
}