
Streaming back camera to YouTube results in color stripes


BRASA space

Dec 15, 2024, 7:53:11 AM
to Android CameraX Discussion Group
I'm trying to live stream from the back camera, from a Samsung S20 FE.
The back camera image appears correctly.
Then I add a simple overlay to the image and stream it to YouTube Live.

But YouTube shows it as running color stripes.
If I cover the back camera, resulting in a black image, I can see the overlay on YouTube, but the colors are all wrong.

Any ideas what I am doing wrong?

Thanks!

Scott Nien

Dec 16, 2024, 10:41:06 AM
to BRASA space, Android CameraX Discussion Group
Hi BRASA, 

Are you developing an app that uses CameraX to do the live streaming? If so, please provide a code snippet showing how you did it so that we can help check what went wrong.
Scott


BRASA space

Dec 16, 2024, 6:45:28 PM
to Android CameraX Discussion Group, scot...@google.com, Android CameraX Discussion Group, BRASA space
Hi, Scott!

Thanks for the reply.

I am trying to create a small app for live streaming directly from the back camera of an Android phone to YouTube Live.
But I need to add a simple overlay on top of the image.
The overlay is a simple yellow rectangle with 75% opacity and a small text: CAM #01 - 20°C
(where 20°C is the current temperature in degrees Celsius).

I fixed the previous problem: I was converting the NV21 image from the back camera to YUV, not RGBA.
I rewrote the function that builds the bitmap to convert from YUV to RGBA, so I can easily draw the overlay on top of it.
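
Roughly, the overlay drawing is just a couple of Canvas calls on top of the converted bitmap (simplified sketch; the position, size and temperature text are hard-coded here):

Canvas canvas = new Canvas(bitmap);

Paint boxPaint = new Paint();
boxPaint.setColor(Color.YELLOW);
boxPaint.setAlpha(191); // ~75% opacity

Paint textPaint = new Paint(Paint.ANTI_ALIAS_FLAG);
textPaint.setColor(Color.BLACK);
textPaint.setTextSize(32f);

canvas.drawRect(16, 16, 336, 72, boxPaint); // small banner in the top-left corner
canvas.drawText("CAM #01 - 20°C", 24, 56, textPaint);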

I am using FFmpeg to live stream to YouTube.

It works perfectly, now.
Kind of :)

FFmpeg quits if I try to stream at resolutions above 640 x 480.
It also quits if I try to stream 640 x 480 at any frame rate above 12 FPS.

I believe the problem is that I am not feeding FFmpeg at the minimum required frame rate,
and the bottleneck seems to be the time it takes to convert NV21 to RGBA in Java.

I wonder if there's a better, faster way to do the conversion.

Another thing I noticed is that the audio is streamed correctly, but the video is not synced with it: the image arrives late, after the audio.

I wonder if there's a way to do this without using FFmpeg.
Can you point me at an example of how to stream directly from CameraX, without having to use FFmpeg?
What I really need is a way to live stream from the back camera, with a simple overlay.
Thanks in advance if you (or anyone) can help me.

Cheers!

P.S.: Here's the function I am using to convert the NV21 image that comes from the back camera to RGBA:

private Bitmap nv21ToRgba(ImageProxy image) {
    // Validate input
    if (image == null) {
        return null;
    }

    // Extract image dimensions
    int width = image.getWidth();
    int height = image.getHeight();

    // Extract YUV planes
    ImageProxy.PlaneProxy yPlane = image.getPlanes()[0];
    ImageProxy.PlaneProxy uvPlane = image.getPlanes()[1];

    ByteBuffer yBuffer = yPlane.getBuffer();
    ByteBuffer uvBuffer = uvPlane.getBuffer();

    // Calculate buffer sizes and strides
    int ySize = yBuffer.remaining();
    int uvSize = uvBuffer.remaining();
    int yRowStride = yPlane.getRowStride();   // note: the Y indexing below assumes yRowStride == width
    int uvRowStride = uvPlane.getRowStride();

    // Create NV21 byte array
    byte[] nv21Bytes = new byte[ySize + uvSize];

    // Reset buffer positions
    yBuffer.rewind();
    uvBuffer.rewind();

    // Copy Y plane
    yBuffer.get(nv21Bytes, 0, ySize);

    // Copy UV plane
    uvBuffer.get(nv21Bytes, ySize, uvSize);

    // Create bitmap and int array for pixel data
    Bitmap bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);

    int[] rgbaData = new int[width * height];

    // Convert NV21 to RGBA
    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            // Calculate Y index
            int yIndex = y * width + x;

            // Calculate UV index with row stride consideration
            int uvIndex = ySize + (y / 2) * uvRowStride + (x / 2) * 2;

            // Bounds checking to prevent ArrayIndexOutOfBoundsException
            if (uvIndex + 1 >= nv21Bytes.length) {
                continue;
            }

            // Extract Y, U, and V values
            int y_value = nv21Bytes[yIndex] & 0xFF;
            int u_value = nv21Bytes[uvIndex] & 0xFF;
            int v_value = nv21Bytes[uvIndex + 1] & 0xFF;

            // Convert YUV to RGB
            int r = (int) (y_value + 1.402 * (v_value - 128));
            int g = (int) (y_value - 0.34414 * (u_value - 128) - 0.71414 * (v_value - 128));
            int b = (int) (y_value + 1.772 * (u_value - 128));

            // Clamp RGB values
            r = Math.max(0, Math.min(255, r));
            g = Math.max(0, Math.min(255, g));
            b = Math.max(0, Math.min(255, b));

            // Combine into ARGB int (full opacity)
            rgbaData[yIndex] = 0xFF000000 | (r << 16) | (g << 8) | b;
        }
    }

    // Copy converted data to bitmap
    bitmap.setPixels(rgbaData, 0, width, 0, 0, width, height);

    return bitmap;
}

Charcoal Chen

Dec 16, 2024, 9:44:21 PM
to Android CameraX Discussion Group, rui...@gmail.com, Scott Nien, Android CameraX Discussion Group
Hi,

For the format conversion part, you can try ImageAnalysis with the OUTPUT_IMAGE_FORMAT_RGBA_8888 output format setting. Then you will receive RGBA_8888 images whose YUV-to-RGBA conversion has already been done by native code.
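
Something along these lines (a rough sketch assuming CameraX 1.1+ and a background executor you already have; note the single RGBA plane's row stride can be larger than width * 4, in which case you need to copy row by row):

ImageAnalysis imageAnalysis = new ImageAnalysis.Builder()
        .setOutputImageFormat(ImageAnalysis.OUTPUT_IMAGE_FORMAT_RGBA_8888)
        .setBackpressureStrategy(ImageAnalysis.STRATEGY_KEEP_ONLY_LATEST)
        .build();

imageAnalysis.setAnalyzer(executor, image -> {
    // Single plane already holding RGBA pixels, converted by CameraX in native code.
    ImageProxy.PlaneProxy plane = image.getPlanes()[0];
    Bitmap bitmap = Bitmap.createBitmap(image.getWidth(), image.getHeight(),
            Bitmap.Config.ARGB_8888);
    // Assumes rowStride == width * 4; otherwise copy row by row.
    bitmap.copyPixelsFromBuffer(plane.getBuffer());
    // ... draw the overlay and feed the frame to the encoder ...
    image.close();
});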

For the A/V sync problem, audio data usually arrives faster than the image frames. You might need to buffer and compare the timestamps of the audio and image data, and then submit the data in order.

For the timestamp comparison, you will need to get the timestamp from the AudioRecord in the same timebase as the camera. Some related info is listed below for your reference (a rough sketch follows the list):

[Camera time]
https://developer.android.com/reference/android/hardware/camera2/CameraCharacteristics#SENSOR_INFO_TIMESTAMP_SOURCE
CameraMetadata.SENSOR_INFO_TIMESTAMP_SOURCE_UNKNOWN   -> System uptime
CameraMetadata.SENSOR_INFO_TIMESTAMP_SOURCE_REALTIME -> System realtime

[Audio Time]
From API 24
https://developer.android.com/reference/android/media/AudioRecord#getTimestamp(android.media.AudioTimestamp,%20int)
TIMEBASE_MONOTONIC -> System uptime
TIMEBASE_BOOTTIME -> System realtime

Before API 24, get by:
System.nanoTime() -> System uptime
SystemClock.elapsedRealtimeNanos() -> System realtime
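
For example, something like this could be used (rough sketch, API 24+, the method name is just illustrative) to read the AudioRecord timestamp in the timebase that matches the camera's timestamp source:

private long audioTimestampNanos(AudioRecord audioRecord,
                                 CameraCharacteristics characteristics) {
    Integer source = characteristics.get(
            CameraCharacteristics.SENSOR_INFO_TIMESTAMP_SOURCE);
    boolean realtime = source != null
            && source == CameraMetadata.SENSOR_INFO_TIMESTAMP_SOURCE_REALTIME;

    AudioTimestamp ts = new AudioTimestamp();
    int timebase = realtime
            ? AudioTimestamp.TIMEBASE_BOOTTIME    // same base as SystemClock.elapsedRealtimeNanos()
            : AudioTimestamp.TIMEBASE_MONOTONIC;  // same base as System.nanoTime()

    if (audioRecord.getTimestamp(ts, timebase) == AudioRecord.SUCCESS) {
        return ts.nanoTime;
    }
    // Fallback: sample the matching clock directly.
    return realtime ? SystemClock.elapsedRealtimeNanos() : System.nanoTime();
}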

BRASA space

Dec 17, 2024, 6:47:21 AM
to Charcoal Chen, Android CameraX Discussion Group, Scott Nien
Thanks, Charcoal!


BRASA space

Dec 22, 2024, 6:57:11 PM
to Charcoal Chen, Android CameraX Discussion Group, Scott Nien
Hi, guys.

Can you help me with something else related to my experiments?

When I configure ffmpeg to live stream 1280 x 720 @ 30 FPS, it works for a couple of minutes, then the streaming buffer reaches 0 (on YouTube's "Stats for Nerds").
If I configure it for 1280 x 720 @ 15 FPS, it works fine, but I get another problem: the video seems to play at half speed on YouTube Live. That's counterintuitive: if anything, it should play at double speed, shouldn't it?

Can you guys help me figure out a fix, or at least a workaround?

Here's my ffmpeg code:

private void startFfmpegProcess() {
    @SuppressLint("DefaultLocale") String cmd = String.format(
            "-y -re -f rawvideo -pixel_format rgba -s %dx%d -r %d -i %s " +
            "-f s16le -ar 44100 -ac 2 -i %s " +
            "-c:v libx264 -preset superfast " +
            "-b:v 3500k -maxrate 3500k -bufsize 14000k " +
            "-profile:v baseline -level:v 4.0 " +
            "-x264-params ref=2:bframes=0:scenecut=30:nal-hrd=cbr " +
            "-pix_fmt yuv420p " +
            "-c:a aac -ar 44100 -b:a 128k " +
            "-threads 2 " +
            "-f flv " +
            "-flvflags no_duration_filesize " +
            "%s",
            width, height, framerate, // framerate here is your input 15 FPS
            videoFifo.getAbsolutePath(),
            audioFifo.getAbsolutePath(),
            rtmpUrl
    );

    FFmpegKit.executeAsync(cmd, session -> {
        if (ReturnCode.isSuccess(session.getReturnCode())) {
            Log.i("FFmpeg", "Streaming finished successfully.");
        } else {
            Log.e("FFmpeg", "FFmpeg error: " + session.getFailStackTrace());
        }
    }, log -> {
        Log.d("FFmpeg", "Log: " + log.getMessage());
    }, statistics -> {
        long timeInMilliseconds = statistics.getTime();
        if (timeInMilliseconds > 0) {
            Log.d("FFmpeg", String.format("Time: %d ms", timeInMilliseconds));
        }
    });
}


And here are the audio and video configurations:

// Video parameters
private int width = 1280; // 640; // 1280; // 640;
private int height = 720; // 480; // 720; // 480;
private int framerate = 30;

// Audio parameters
private int sampleRate = 44100;
private int channelConfig = AudioFormat.CHANNEL_IN_STEREO;
private int audioFormat = AudioFormat.ENCODING_PCM_16BIT;

Rui Barbosa Jr.


