Does anyone know how to convert from YUV to RGB format?


kishore rathnavel

Aug 2, 2011, 12:00:28 AM8/2/11
to android...@googlegroups.com
Consider the following lines, taken from the "Add native opencv" sample that ships with opencv-2.3.1:

JNIEXPORT void JNICALL Java_org_opencv_samples_tutorial3_Sample3View_FindFeatures(JNIEnv* env, jobject thiz, jint width, jint height, jbyteArray yuv, jintArray rgba)
{
    jbyte* _yuv  = env->GetByteArrayElements(yuv, 0);
    jint*  _rgba = env->GetIntArrayElements(rgba, 0);

    Mat myuv(height + height/2, width, CV_8UC1, (unsigned char *)_yuv);
    Mat mrgba(height, width, CV_8UC4, (unsigned char *)_rgba);
    Mat mgray(height, width, CV_8UC1, (unsigned char *)_yuv);

    cvtColor(myuv, mrgba, CV_YUV420i2BGR, 4);
 

I guess that Android passes images to the C code as a byte array, which in turn is in YUV format. Please correct me if I'm wrong.
We then convert it to RGBA format.

I have the following questions:
1) Is there a way to convert YUV to RGB directly? What is the OpenCV code to do that?
I am guessing it would be something like this:
jint* _rgb = env->GetIntArrayElements(rgba, 0);
Mat mrgba(height, width, CV_8UC3, (unsigned char *)_rgb);
cvtColor(myuv, mrgba, CV_YUV420i2BGR, 3);

2) Is there a way to find out in what color format my image is? I am especially concerned as to whether my CvMat is in RGB or BGR format.
3) Is there any advantage of using RGBA over RGB?


Andrey Kamaev

Aug 2, 2011, 3:19:41 AM8/2/11
to android-opencv
Hi,

> I guess that Android passes images to the C code as a byte array, which in
> turn is in YUV format. Please correct me if I'm wrong.

You are correct. According to the specification, Android 2.3 and newer
has to use the yuv420sp format.

1) You are right.

cvtColor(myuv, mrgba, CV_YUV420i2BGR, 3);

or

cvtColor(myuv, mrgba, CV_YUV420i2BGR);

will convert the YUV frame from the camera to BGR format.

Please note that in beta2 the constant CV_YUV420i2BGR will be renamed
to CV_YUV420sp2BGR, which is more correct.
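For reference, the per-pixel math behind a yuv420sp conversion can be sketched with the standard BT.601 integer approximation. This is only an illustration of what such a conversion computes, not OpenCV's actual implementation (which also walks the interleaved VU plane and differs in rounding and vectorization):

```cpp
#include <algorithm>
#include <cassert>

struct Rgb { int r, g, b; };

// Sketch: convert one video-range YUV sample to RGB using the common
// BT.601 fixed-point approximation (assumption: this mirrors, but is
// not identical to, what cvtColor does internally).
Rgb yuvToRgb(int y, int u, int v) {
    int c = y - 16;   // luma offset (video range starts at 16)
    int d = u - 128;  // blue-difference chroma offset
    int e = v - 128;  // red-difference chroma offset
    auto clamp = [](int x) { return std::min(255, std::max(0, x)); };
    return {
        clamp((298 * c + 409 * e + 128) >> 8),
        clamp((298 * c - 100 * d - 208 * e + 128) >> 8),
        clamp((298 * c + 516 * d + 128) >> 8)
    };
}
```

In yuv420sp the Y plane comes first (width*height bytes), followed by an interleaved VU plane at quarter resolution, so each chroma pair is shared by a 2x2 block of luma samples.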

2) No, information about the image format is not stored in OpenCV. You
can only get the number of channels, not the channel names.

3) RGBA is one of the image types supported by the Android runtime.
There are no functions in the Android API that accept BGR, and quite
few that work with RGB.
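Since the jintArray above holds one 4-channel pixel per int, packing and unpacking can be sketched as below. The byte order shown (R in the low byte, i.e. RGBA bytes reinterpreted as a little-endian int) is an assumption for illustration; verify it against the pixel format your Bitmap actually uses:

```cpp
#include <cstdint>
#include <cassert>

// Sketch: pack four 8-bit channels into one 32-bit pixel, R in the
// low byte (assumed layout; device byte order may differ).
uint32_t packRgba(uint8_t r, uint8_t g, uint8_t b, uint8_t a) {
    return uint32_t(r) | (uint32_t(g) << 8) |
           (uint32_t(b) << 16) | (uint32_t(a) << 24);
}

// Unpack individual channels from a packed pixel.
uint8_t red(uint32_t px)   { return px & 0xFF; }
uint8_t alpha(uint32_t px) { return px >> 24; }
```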

/Andrey

kishore rathnavel

Aug 3, 2011, 1:28:52 PM8/3/11
to android...@googlegroups.com
Hi Andrey,

I did what you suggested, and I have posted the relevant sections of my code below. My problem is that execution does not go beyond the cvtColor statement; it just returns to the next Android statement after the call to the C function. Since the example application works but mine doesn't, I suspect I am doing something wrong when passing in the byte array, but I really don't know how to make this work. Any help or suggestions are welcome.


JNIEXPORT void JNICALL Java_nd_erwin_ColorimetricAnalysis10_ImageProcessing_squares(JNIEnv* env, jobject thiz, jint mode, jint numOfSquares, jint width, jint height, jbyteArray yuv, jintArray rgba)
{
    __android_log_print(ANDROID_LOG_VERBOSE,"Kishore","Kishore:Working");

    jbyte* _yuv  = env->GetByteArrayElements(yuv, 0);
    jint*  _rgba = env->GetIntArrayElements(rgba, 0);

    Mat myuv(height + height/2, width, CV_8UC1, (unsigned char *)_yuv);
    Mat mrgba(height, width, CV_8UC4, (unsigned char *)_rgba);
    Mat mgray(height, width, CV_8UC1, (unsigned char *)_yuv);

    cvtColor(myuv, mrgba, CV_YUV420i2BGR, 4);
    __android_log_print(ANDROID_LOG_VERBOSE,"Kishore","Kishore:StillWorking");

    Java_nd_erwin_ColorimetricAnalysis10_ImageProcessing_algo(mode, numOfSquares);
}

In the above code, I get the logcat output "Working" but I don't get "StillWorking".
I call the function in the following manner:

public void testing(int width, int height, byte[] b){
        int[] rgba = {0};
        squares(0, 6, width, height, b, rgba);
    }


And this function is in turn called from another activity as follows:
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    byte[] b = {0};

    if (requestCode == REQUEST_FROM_CAMERA && resultCode == RESULT_OK) {
        InputStream is = null;
        File file = getTempFile(this);
        try {
            is = new FileInputStream(file);
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        }
        String str = is.toString();
        b = str.getBytes();
        Bitmap bm = BitmapFactory.decodeStream(is);
        int width = bm.getWidth();
        int height = bm.getHeight();
        ip = new ImageProcessing();
        ip.testing(width, height, b);
    }
}


ImageProcessing is the class from which I call all my C functions.

Andrey Kamaev

Aug 3, 2011, 1:52:03 PM8/3/11
to android-opencv
Hi,

This line is the source of your problem:

int[] rgba = {0};

This array has to be allocated with the proper size before being
passed to the native level:

int[] rgba = new int[4*width*height];

And I recommend using the "null" keyword instead of {0} if you need to
initialize a variable with a dummy value.
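The buffer sizes involved can be sketched as follows, assuming the yuv420sp layout: the byte[] needs width*height luma bytes plus half as many interleaved chroma bytes, and the packed int[] needs one element per pixel (so width*height ints is already sufficient; 4*width*height, as above, also works but overallocates):

```cpp
#include <cassert>

// Sketch of the buffer sizes the JNI code expects (assumptions based
// on the yuv420sp layout and one packed RGBA pixel per int).
int yuvBytes(int width, int height) {
    // Y plane (width*height) + interleaved VU plane (width*height/2).
    return width * height * 3 / 2;
}

int rgbaInts(int width, int height) {
    // One 32-bit element per pixel holds all four channels.
    return width * height;
}
```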

/Andrey