
Convert U8 array to 16bit hex num


Shreesha

Aug 10, 2004, 3:20:31 PM
Hi,
I have an array of U8. I want to pull the first two 8 bit elements
and join them to produce a 16 bit hex number.
Ex:
If Array ---> FE 0C 12 D4.......
Here;
Element1---> FE
Element2---> 0C
Then;
16 bit hex--> FE0C

I tried doing this, but could not come up with a way to join two 8
bit values into one 16 bit value.
Please let me know how this can be done.
A small example program would be helpful.
Thanks.

Dave Kaufman

Aug 10, 2004, 4:02:05 PM
Use the Decimate 1D Array VI (Array palette) to split the U8 array into
two arrays, high bytes and low bytes. Index each array, convert the
high byte to U16, multiply it by 256 (100h), and add the low byte.
Do this for each element of the arrays.
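A rough sketch of this approach in Python (an illustration only; the original discussion is about LabVIEW, so the function name here is hypothetical):

```python
# Hypothetical Python sketch of the decimate-then-combine approach:
# even indices are the high bytes, odd indices are the low bytes,
# and each pair becomes high*256 + low.
def u8_pairs_to_u16(data):
    high = data[0::2]   # "decimated" high-byte array
    low = data[1::2]    # "decimated" low-byte array
    return [h * 256 + l for h, l in zip(high, low)]

print([hex(v) for v in u8_pairs_to_u16([0xFE, 0x0C, 0x12, 0xD4])])
# → ['0xfe0c', '0x12d4']
```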

Dave.

Rolf Østvik

Aug 11, 2004, 4:54:23 AM
Dave Kaufman <x...@no.email> wrote in news:50650000000500000098B30100-
107939...@exchange.ni.com:

Do the split as described above. Then multiply the high-byte array by 256
(100h) and add the low-byte array.

You don't need to iterate over all elements in the array.
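The same whole-array variant can be sketched with NumPy (an assumption for illustration; the original is LabVIEW, where the multiply and add nodes accept arrays directly):

```python
# A sketch of the whole-array approach: multiply and add operate on
# the decimated arrays at once, with no per-element loop.
import numpy as np

data = np.array([0xFE, 0x0C, 0x12, 0xD4], dtype=np.uint8)
high = data[0::2].astype(np.uint16)   # high-byte array, widened to U16
low = data[1::2].astype(np.uint16)    # low-byte array, widened to U16
u16 = high * 256 + low                # array-level multiply and add
print([hex(v) for v in u16])          # → ['0xfe0c', '0x12d4']
```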

--
Rolf

Rolf Østvik

Aug 11, 2004, 5:02:48 AM
Shreesha <x...@no.email> wrote in news:50650000000800000036EA0000-
107939...@exchange.ni.com:

Use the "Type Cast" function
(Function palette->Advanced->Data Manipulation->Type cast)

Wire your U8 array to the x terminal.
Wire an empty U16 (or I16) array to the type terminal.

The output should now be a 16 bit array.

I think this should work on all platforms (both big-endian and
little-endian systems).
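A Python analogue of this type-cast approach (an illustration; the original is LabVIEW's Type Cast function, which as far as I know interprets the flattened bytes in big-endian order regardless of the host platform):

```python
# Reinterpret the raw U8 bytes as big-endian U16 values, analogous
# to wiring a U8 array into Type Cast with a U16 array as the type.
import struct

raw = bytes([0xFE, 0x0C, 0x12, 0xD4])
count = len(raw) // 2
u16 = struct.unpack(">%dH" % count, raw)   # ">" = big-endian, "H" = U16
print([hex(v) for v in u16])               # → ['0xfe0c', '0x12d4']
```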

--
Rolf
