Java Android org.webrtc.VideoRenderer.I420Frame arrays to PreviewCallback.onPreviewFrame byte[]


Stephen Gutknecht

Dec 27, 2013, 10:27:25 AM
to discuss...@googlegroups.com
I keep hoping some code will appear on the internet, but I'm getting nowhere ;)

The incoming I420Frame object seems to hold its yuvPlanes as three separate buffers.

A typical Android camera app gets PreviewCallback.onPreviewFrame as a single byte[].

Can someone help me convert the I420Frame's yuvPlanes into a single byte[] in the YCbCr_420_SP (NV21) layout that PreviewCallback.onPreviewFrame delivers?

Thank you.

Ami Fischman

Dec 31, 2013, 10:43:54 PM
to discuss...@googlegroups.com
If you want efficient color-space conversion (CSC) code in general, your best bet is libyuv, e.g. I420ToNV21 (or, sometimes, a GL shader).
It's unclear to me what you're actually trying to do, though.
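
If you do go the libyuv route, the Java-side plumbing might look roughly like this. To be clear, this is only a hypothetical sketch: the wrapper class, native method, and library name are all made up, and only libyuv's I420ToNV21 itself is real, called from a small C++ shim you'd write yourself.

import java.nio.ByteBuffer;

// Hypothetical JNI bridge: the actual conversion would happen in a small C++
// file that calls libyuv::I420ToNV21 with these plane buffers and strides.
public final class YuvConvert {
  static {
    System.loadLibrary("yuvconvert"); // assumed name for the native wrapper .so
  }

  // Assumed signature; dstNv21 must hold width * height * 3 / 2 bytes.
  public static native void i420ToNv21(
      ByteBuffer y, int yStride,
      ByteBuffer u, int uStride,
      ByteBuffer v, int vStride,
      byte[] dstNv21, int width, int height);
}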

-a



Stephen Gutknecht

Jan 8, 2014, 10:46:51 AM
to discuss...@googlegroups.com
What I'm trying to do is just display the I420Frame ;) or save it to a JPEG. On the Java side, the OpenGL shader code is already there in AppRTCDemo... that's the only code sample that exists.


What if you just want to render to a normal ImageView object? I'm not concerned about performance so much as getting the decoding right at this point. The I420Frame object is in a byte layout that isn't native to Android.




Alex Cohn authored this code:

// Rewind src over its full capacity, copy it into dst, then rewind dst so it can be reused.
private static void copyPlane(ByteBuffer src, ByteBuffer dst) {
  src.position(0).limit(src.capacity());
  dst.put(src);
  dst.position(0).limit(dst.capacity());
}

// Packs the three I420 planes into one byte[] laid out as YV12 (Y, then V, then U)
// and wraps it in an android.graphics.YuvImage.
public static android.graphics.YuvImage ConvertTo(org.webrtc.VideoRenderer.I420Frame src) {
    byte[] bytes = new byte[src.yuvStrides[0]*src.height +
                            src.yuvStrides[1]*src.height/2 + 
                            src.yuvStrides[2]*src.height/2];
    int[] strides = new int[3];

    ByteBuffer tmp = ByteBuffer.wrap(bytes, 0, src.yuvStrides[0]*src.height);
    copyPlane(src.yuvPlanes[0], tmp);
    tmp = ByteBuffer.wrap(bytes, src.yuvStrides[0]*src.height, src.yuvStrides[2]*src.height/2);
    copyPlane(src.yuvPlanes[2], tmp);
    tmp = ByteBuffer.wrap(bytes, src.yuvStrides[0]*src.height+src.yuvStrides[2]*src.height/2, src.yuvStrides[1]*src.height/2);
    copyPlane(src.yuvPlanes[1], tmp);

    strides[0] = src.yuvStrides[0];
    strides[1] = src.yuvStrides[2];
    strides[2] = src.yuvStrides[1];

    android.graphics.YuvImage image = new android.graphics.YuvImage(bytes, android.graphics.ImageFormat.YV12, src.width, src.height, strides);
    return image;
}

public static android.graphics.YuvImage ConvertTo(org.webrtc.VideoRenderer.I420Frame src, int imageFormat) {
    byte[] bytes = new byte[src.yuvStrides[0]*src.height +
                            src.yuvStrides[1]*src.height/2 + 
                            src.yuvStrides[2]*src.height/2];
    int[] strides = new int[3];
    switch (imageFormat) {
    default:
        return null;

    case android.graphics.ImageFormat.YV12: {
        ByteBuffer tmp = ByteBuffer.wrap(bytes, 0, src.yuvStrides[0]*src.height);
        copyPlane(src.yuvPlanes[0], tmp);
        tmp = ByteBuffer.wrap(bytes, src.yuvStrides[0]*src.height, src.yuvStrides[2]*src.height/2);
        copyPlane(src.yuvPlanes[2], tmp);
        tmp = ByteBuffer.wrap(bytes, src.yuvStrides[0]*src.height+src.yuvStrides[2]*src.height/2, src.yuvStrides[1]*src.height/2);
        copyPlane(src.yuvPlanes[1], tmp);
        strides[0] = src.yuvStrides[0];
        strides[1] = src.yuvStrides[2];
        strides[2] = src.yuvStrides[1];
        return new YuvImage(bytes, imageFormat, src.width, src.height, strides);
    }

    case android.graphics.ImageFormat.NV21: {
        // The NV21 path below assumes tightly packed planes, so bail out if the source has stride padding.
        if (src.yuvStrides[0] != src.width)
            return null;
        if (src.yuvStrides[1] != src.width/2)
            return null;
        if (src.yuvStrides[2] != src.width/2)
            return null;

        ByteBuffer tmp = ByteBuffer.wrap(bytes, 0, src.width*src.height);
        copyPlane(src.yuvPlanes[0], tmp);

        byte[] tmparray = new byte[src.width/2*src.height/2];
        tmp = ByteBuffer.wrap(tmparray, 0, src.width/2*src.height/2);

        copyPlane(src.yuvPlanes[2], tmp);
        for (int row=0; row<src.height/2; row++) {
            for (int col=0; col<src.width/2; col++) {
                bytes[src.width*src.height + row*src.width + col*2] = tmparray[row*src.width/2 + col];
            }
        }
        copyPlane(src.yuvPlanes[1], tmp);
        for (int row=0; row<src.height/2; row++) {
            for (int col=0; col<src.width/2; col++) {
                bytes[src.width*src.height + row*src.width + col*2+1] = tmparray[row*src.width/2 + col];
            }
        }
        return new YuvImage(bytes, imageFormat, src.width, src.height, null);
    }

    }
}


So far, the colors aren't right in my testing on several versions of Android.
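
For reference, this is roughly how I'm getting the YuvImage onto the screen — a quick-and-dirty sketch (imageView is my ImageView, frame is the incoming I420Frame, and ConvertTo is the NV21 variant above; compressing to JPEG per frame is slow, but fine for debugging):

android.graphics.YuvImage yuvImage = ConvertTo(frame, android.graphics.ImageFormat.NV21);
if (yuvImage != null) {
    // Compress the NV21 image to JPEG, decode it to a Bitmap, and hand it to the ImageView.
    java.io.ByteArrayOutputStream jpegStream = new java.io.ByteArrayOutputStream();
    yuvImage.compressToJpeg(
        new android.graphics.Rect(0, 0, yuvImage.getWidth(), yuvImage.getHeight()),
        80, jpegStream);
    byte[] jpegBytes = jpegStream.toByteArray();
    final android.graphics.Bitmap bitmap =
        android.graphics.BitmapFactory.decodeByteArray(jpegBytes, 0, jpegBytes.length);
    // renderFrame() is not called on the UI thread, so post the update back to it.
    imageView.post(new Runnable() {
        @Override
        public void run() {
            imageView.setImageBitmap(bitmap);
        }
    });
}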

Ami Fischman

Jan 8, 2014, 12:30:49 PM
to discuss...@googlegroups.com
I'm still a bit confused as to what you're seeing, but does this help?  (warning: untested)
If it doesn't, can you post links to a sample image and a description of what's going wrong in it?

  // Convert (interleave) I420 to NV12 or NV21.
  // Assumes packed, macroblock-aligned frame with no cropping
  // (visible/coded row length == stride).
  private byte[] I420ToNV(
      int width, int height, byte[] frame,
      boolean nv12 /* false ==> nv21 */ ) {
    byte[] out = new byte[frame.length];
    // Y plane we just copy.
    for (int i = 0; i < width * height; ++i)
      out[i] = frame[i];
    // U & V plane we interleave.
    int u_offset = width * height;
    int v_offset = u_offset * 5 / 4;
    int uv_offset = width * height;
    while (uv_offset < frame.length) {
      if (nv12) {
        out[uv_offset++] = frame[u_offset++];
        out[uv_offset++] = frame[v_offset++];
      } else {
        out[uv_offset++] = frame[v_offset++];
        out[uv_offset++] = frame[u_offset++];
      }
    }
    return out;
  }

Stephen Gutknecht

Jan 8, 2014, 1:59:39 PM
to discuss...@googlegroups.com
Your code takes a single byte[] parameter named frame.

In my work with the Android libjingle_peerconnection.jar, the renderFrame callback does not get that data structure. Instead it is given an org.webrtc.VideoRenderer.I420Frame object, which holds the yuvPlanes as separate buffers.

Ami Fischman

Jan 8, 2014, 2:18:41 PM
to discuss...@googlegroups.com
Oops, sorry, copy/paste slip. I meant to change each of the {y/u/v}_offset variables to point to the head of the respective plane in the I420Frame object, and obviously that version is written for a byte[] source instead of a ByteBuffer.
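Something along these lines, maybe — still untested, but it does the same interleave while reading straight from the I420Frame's three ByteBuffers and honoring yuvStrides (field names are taken from the Java binding's I420Frame):

private static byte[] i420FrameToNV21(org.webrtc.VideoRenderer.I420Frame frame) {
    final int width = frame.width;
    final int height = frame.height;
    byte[] out = new byte[width * height * 3 / 2];
    java.nio.ByteBuffer y = frame.yuvPlanes[0];
    java.nio.ByteBuffer u = frame.yuvPlanes[1];
    java.nio.ByteBuffer v = frame.yuvPlanes[2];
    // Y plane: copy row by row so any stride padding is dropped.
    for (int row = 0; row < height; ++row) {
        y.position(row * frame.yuvStrides[0]);
        y.get(out, row * width, width);
    }
    // Chroma: NV21 wants V and U interleaved at half resolution in each dimension, V first.
    int o = width * height;
    for (int row = 0; row < height / 2; ++row) {
        for (int col = 0; col < width / 2; ++col) {
            out[o++] = v.get(row * frame.yuvStrides[2] + col);
            out[o++] = u.get(row * frame.yuvStrides[1] + col);
        }
    }
    return out;
}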
Can you post a sample image with what's going wrong, and/or the code you're using, and link to them here?

-a

Stephen Gutknecht

Jan 8, 2014, 4:52:17 PM
to discuss...@googlegroups.com
Still working on this. I built an independent Android app to demo and publish... and even recompiled libjingle from today's checkout... and now I'm not sure whether the color problem still exists. I'll have to get my other device, which has last week's code, to confirm.

Kushtrim Pacaj

Mar 14, 2019, 4:47:12 AM
to discuss-webrtc