constant error converting byte[] array to Mat


Jim Graham

Aug 1, 2012, 1:16:27 PM
to android...@googlegroups.com
I'm working on re-writing a function in C++/OpenCV that takes two
images, one a photo, and the second a photographic filter (e.g.,
solid color, graduated, or split-field), and blends the two. The
first version had the Java side saving the two source images, and
then in C++, they were opened again with imread, and blended. This
works, but it's WAY too slow (4--5 seconds).

So, now that I (thought I) knew how, I'm trying to re-code this
to accept the two images, one now a byte[] array and the other
a bitmap, passed directly as args. I'm combining code from
jnipart.cpp in tutorial-3-native with code I got as a response
to an earlier post here---that part I know works; I've been using
it as a "wrapper" around image processing routines with great
success---but somewhere outside of that, it's crashing.

The source (so far) and the debug output from addr2line and ndk-stack
are included below. If someone can please help me with this, I would
appreciate it. I've been trying to debug this for some time now, with
no success (or debug output that I can understand).

Here's the source for the revised version (so far):

--------------------------- CUT HERE ---------------------------

#include <jni.h>
#include <cv.h>
#include <highgui.h>
#include <android/bitmap.h>
#include <iostream>
#include <opencv2/core/core.hpp>
#include <opencv2/imgproc/imgproc.hpp>

using namespace std;
using namespace cv;

extern "C" {

JNIEXPORT void JNICALL Java_com_jdgapps_FilterTests_FilterTests_cvBlend2(JNIEnv* env, jobject thiz,
        jint pw, jint ph, jbyteArray yuv, jobject src, jobject dst) {

    // this is adapted from tutorial-3-native/jni/jnipart.cpp

    jbyte* _yuv = env->GetByteArrayElements(yuv, 0);
    Mat myuv(ph, pw, CV_8UC4, (unsigned char *)_yuv);
    Mat pbgra, tmp;

    // I originally had this in one line, as
    //     cvtColor(myuv, pbgra, CV_YUV420sp2BGR, 4)
    // and when that crashed, I tried this instead, to see
    // if that was the problem....it didn't change anything

    cvtColor(myuv, tmp, CV_YUV420sp2BGR, 3);
    cvtColor(tmp, pbgra, CV_BGR2BGRA, 4);

    // now the byte[] array should be a CV_8UC4 format Mat (I think)

    // this is from code in response to another post, that imports
    // two bitmaps, one src, one dest. I know this part works, as
    // I've been using it successfully since I got it, as a wrapper
    // around image processing routines.

    AndroidBitmapInfo infoSrc;
    void* pixelsSrc = 0;
    AndroidBitmapInfo infoDst;
    void* pixelsDst = 0;

    CV_Assert( AndroidBitmap_getInfo(env, src, &infoSrc) >= 0 );
    CV_Assert( infoSrc.format == ANDROID_BITMAP_FORMAT_RGBA_8888 );

    CV_Assert( AndroidBitmap_getInfo(env, dst, &infoDst) >= 0 );
    CV_Assert( infoDst.format == ANDROID_BITMAP_FORMAT_RGBA_8888 );

    CV_Assert( infoSrc.height == infoDst.height && infoSrc.width == infoDst.width );
    int h = infoSrc.height, w = infoSrc.width;

    CV_Assert( AndroidBitmap_lockPixels(env, src, &pixelsSrc) >= 0 );
    CV_Assert( pixelsSrc );

    CV_Assert( AndroidBitmap_lockPixels(env, dst, &pixelsDst) >= 0 );
    CV_Assert( pixelsDst );

    Mat mSrc(h, w, CV_8UC4, pixelsSrc), mDst(h, w, CV_8UC4, pixelsDst);

    // For this test, all I want to do is see if the byte[] array was
    // properly converted, by displaying it as the result (mDst). If I
    // can get that far, blending is easy.... (The next bit will be
    // creating the filter bitmap in C++. <grin>)

    pbgra.copyTo(mDst);

    return;

} // cvBlend2

} // extern C
--------------------------- CUT HERE ---------------------------

And this is the debug output (from addr2line and ndk-stack):

--------------------------- CUT HERE ---------------------------

std::basic_istream<wchar_t, std::char_traits<wchar_t> >& std::basic_istream<wchar_t, std::char_traits<wchar_t> >::_M_extract<unsigned long>(unsigned long&)
eh_type.cc:0


********** Crash dump: **********
Build fingerprint: 'acer/a500_pa_cus1/picasso:4.0.3/IML74K/1333727375:user/release-keys'
pid: 4000, tid: 4000 >>> com.jdgapps.FilterTests <<<
signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr deadbaad
Stack frame #00 pc 000178a8 /system/lib/libc.so

--------------------------- CUT HERE ---------------------------

Can anyone help me with A) understanding the above debug output, and
B) understanding why this is crashing?

Thanks,
--jim

PS: Please remember, I am still learning to use the NDK, C++, AND
OpenCV. I've already learned some, but this is one part I've
never gotten to work, despite the tutorials.

--
THE SCORE: ME: 2 CANCER: 0
73 DE N5IAL (/4) MiSTie #49997 < Running Mac OS X Lion >
spook...@gmail.com ICBM/Hurricane: 30.44406N 86.59909W

"sigh, once upon a time T-1 was fast...."
--seen in alt.sysadmin.net-abuse.email

Android Apps Listing at http://www.jstrack.org/barcodes.html

Andrey Pavlenko

Aug 3, 2012, 9:03:29 AM
to android...@googlegroups.com
1) How do you get that "byte[] yuv"? Are you sure it contains an image of "pw" x "ph" size in "YUV420" format?
2) Are you sure the Bitmaps "src" and "dst" have the same size as the YUV image?
3) Looking at "Tutorial 3" I see slightly different code, in particular:
    Mat myuv(height + height/2, width, CV_8UC1, (unsigned char *)_yuv);
4) Don't forget to release the Java objects after finishing working with them, e.g.
    env->ReleaseByteArrayElements(yuv, _yuv, 0);
and
    AndroidBitmap_unlockPixels(env, bitmap);


Jim Graham

Aug 3, 2012, 9:22:04 AM
to android...@googlegroups.com
On Fri, Aug 03, 2012 at 06:03:29AM -0700, Andrey Pavlenko wrote:
> 1) how do you get that "*byte[] yuv*"? are you sure it contains an
> image of "*pw*" x "*ph*" size in "*YUV420*" format?

Right now, this is just a test. When it goes into the app, the byte[]
array will be the image size, and the first bitmap (a photographic
filter) will be display size, and will be resized before working on it.
That, or more likely, I'll be transferring the java code over to C++ and
make the filters full-size when I need to blend the images.

Right now, though, byte[] data, src, and dst are all the SAME image
(except that byte[] data (yuv) was converted from a bitmap in java).

My only concern at this point is to get the yuv image converted to a Mat
correctly. Then I can proceed with the rest (where the "filter" image,
src here, will be an inverted copy of the original, just to prove 100%
that it's all happy).

> 2) are you sure the Bitmaps "*src*" and "*dst*" have the same size as the
> YUV image?

See above. Yes.

> 3) looking at the "*Tutorial 3*" I see a bit different code, in particular:
> Mat myuv(height + height/2, width, CV_8UC1, (unsigned char *)_yuv);

Ok, but the height for mine should be the height of the yuv image (i.e.,
it should stay the same, not change to 1.5 * ph). Or is there a different
reason for that? Is that actually converting it to a Mat the same size
as yuv? Remember...still learning. :-)

> 4) don't forget to release the Java objects *after* finishing working with
> them:

When I replace the current version of cvBlend that's in the real app, I
certainly will. For this little thing, I'm using small images (original
and source) for playing around with filters (my current PITA is a fog
filter---got it looking nice for a full fog, but ground-hugging fog is
not right---it hugs the bottom of the image, not the ground <grin>).
After this version of the blending (cvBlend) function crashed the
first time in my camera app, it locked the camera, requiring a reboot.
Sadly my Acer Iconia A500 has a bug in its camera code, where for some
random boots, there is a ton of Gaussian (random) noise in various color
combinations. This last time, it required about 3 reboots to get it right.
So I'm working on it offline now (in my FilterTests app---no camera).

Thanks,
--jim

--
THE SCORE: ME: 2 CANCER: 0
73 DE N5IAL (/4) | "This 'telephone' has too many
spook...@gmail.com | shortcomings to be seriously considered
< Running Mac OS X Lion > | as a means of communication. The device
ICBM / Hurricane: | is inherently of no value to us."
30.44406N 86.59909W | (Western Union internal memo, 1876)

Andrey Pavlenko

Aug 3, 2012, 9:39:14 AM
to android...@googlegroups.com
How do you get the "byte[]" values from the Bitmap in Java?

Jim Graham

Aug 3, 2012, 9:47:38 AM
to android...@googlegroups.com
On Fri, Aug 03, 2012 at 06:39:14AM -0700, Andrey Pavlenko wrote:
> How do you get the "byte[]" values from the Bitmap in Java?

ByteArrayOutputStream stream = new ByteArrayOutputStream();
original.compress(Bitmap.CompressFormat.JPEG, 100, stream);
byte[] data = stream.toByteArray();

cvBlend2(width, height, data, original, filtered);

Note that (and I *THINK* I mentioned this in my first post---if not, I
meant to; forgetting things like that happens far too often, thanks to
the damage done by both my first cancer and the treatment, including
three brain surgeries, very harsh chemo, and max-dose radiation to the
brain).

I've already tested to make sure the byte[] data was valid by testing it
in java (convert it back to bitmap, skip JNI and just display the
result).

Thanks,
--jim

--
THE SCORE: ME: 2 CANCER: 0
73 DE N5IAL (/4) | Remember your spelling rules, including:
spook...@gmail.com | I before E except after C
< Running Mac OS X Lion > |
ICBM / Hurricane: | BEING a native-born American, I don't
30.44406N 86.59909W | always notice our WEIRD spelling....

Andrey Pavlenko

Aug 3, 2012, 10:06:10 AM
to android...@googlegroups.com
A JPEG-compressed image can't be decoded via cvtColor(); instead you should use cv::imdecode().
Be aware that instead of W & H you need to pass the array length to JNI.
Something like:

    jbyte* _jpg = env->GetByteArrayElements(jpg, 0);
    Mat mjpg(1, length, CV_8UC1, (unsigned char *)_jpg);
    Mat imgBGR = imdecode(mjpg);
    env->ReleaseByteArrayElements(jpg, _jpg, 0);
    CV_Assert( ! imgBGR.empty() );
    Mat imgRGBA;
    cvtColor(imgBGR, imgRGBA, CV_BGR2RGBA);

Jim Graham

Aug 3, 2012, 11:08:11 AM
to android...@googlegroups.com
Sorry for the delay---was out finishing off my monthly disability pay-day
routine (paying bills).

On Fri, Aug 03, 2012 at 07:06:10AM -0700, Andrey Pavlenko wrote:
> JPEG compressed image can't be decoded via cvtColor(), instead you
> should use
> cv::imdecode<http://docs.opencv.org/trunk/modules/highgui/doc/reading_and_writing_images_and_video.html?highlight=imdecode#imdecode>

I assume that, in C++, that translates to imdecode(data), right?

Btw, I was just following the example in tutorial-3-native, which takes a
byte array from the camera ...... or, no, it doesn't. I just looked
through the code again, and it's taking preview frame data, which is YUV,
not JPEG. Ok, wrong tutorial. Oops.

Which one SHOULD I be using to see where a final photo image is processed
in C++?

> Be aware that instead of W & H you need to pass *array length* to JNI.

> Something like:
> jbyte* _jpg = env->GetByteArrayElements(jpg, 0);
> Mat mjpg(1, length, CV_8UC1, (unsigned char *)_jpg);
> Mat imgBGR = imdecode(mjpg);
> env->ReleaseByteArrayElements(jpg, _jpg, 0);
> CV_Assert( ! imgBGR.empty() );
> Mat imgRGBA;
> cvtColor(imgBGR, imgRGBA, CV_BGR2RGBA);

Ok, I'll try that. But still, please point me to the correct tutorial
(the one that shows this side of it, if for nothing else than future
reference).

Thanks,
--jim

--
THE SCORE: ME: 2 CANCER: 0
73 DE N5IAL (/4) | DMR: So fsck was originally called
spook...@gmail.com | something else.
< Running Mac OS X Lion > | Q: What was it called?
ICBM / Hurricane: | DMR: Well, the second letter was different.
30.44406N 86.59909W | -- Dennis M. Ritchie, Usenix, June 1998.

Andrey Pavlenko

Aug 3, 2012, 11:30:04 AM
to android...@googlegroups.com
Currently OpenCV4Android doesn't have a tutorial that processes a captured photo image, just preview frames.

Jim Graham

Aug 3, 2012, 11:44:18 AM
to android...@googlegroups.com
On Fri, Aug 03, 2012 at 08:30:04AM -0700, Andrey Pavlenko wrote:
> Currently OpenCV4Android doesn't have a tutorial that processes a
> captured photo image, just preview frames.

Oh. Ok. I'd assumed that there would be. But then, we all know what
they say about assumptions....

By the way, just looking through the code you sent:

jbyte* _jpg = env->GetByteArrayElements(jpg, 0);
Mat mjpg(1, length, CV_8UC1, (unsigned char *)_jpg);

Granted, I'm new to the CV_8UCx values, but isn't that saying 1 color
channel is used for all three color channels? Is that another byte
array vs. bitmap thing?

Mat imgBGR = imdecode(mjpg);
env->ReleaseByteArrayElements(jpg, _jpg, 0);
CV_Assert( ! imgBGR.empty() );
Mat imgRGBA;
cvtColor(imgBGR, imgRGBA, CV_BGR2RGBA);

Another question: in every doc I've seen, cvtColor specifies the
number of color channels as the last arg, so shouldn't that be

cvtColor(imgBGR, imgRGBA, CV_BGR2RGBA, 8) ?

Or is 8 a default if nothing is specified (or does it just work
somehow, using some method I'm not aware of because I'm so
new to this)?