
URGENT: converting a bit-array to a number


Hugo Veiga

Jun 4, 2001, 5:24:12 PM
greetings,

please, can anyone tell me how to convert a bit-array to a number
without losing time processing it. Is there a function to do that? The
thing is that I wish to make a board for a game with a bit-array to save
space, but at the same time I wish to have a unique hash value for
different boards, so the most natural way is to convert that bit-array
to a number. I searched all over the internet and found nothing about
such a function.

I thank you all in advance for reading this.

p.s. - please reply to me by email.
--
__________________________________________________________________
Hugo Veiga - Leic - IST
Tm: 965555721 Email1: ha...@netcabo.pt Email2: ha...@mega.ist.utl.pt
"Trabalhar não mata nem nunca matou ninguém... Mas para quê arriscar?"
__________________________________________________________________


Christopher Stacy

Jun 5, 2001, 7:31:25 AM
>>>>> On Mon, 4 Jun 2001 22:24:12 +0100, Hugo Veiga ("Hugo") writes:
Hugo> greetings,
Hugo> please, can anyone tell me how to convert a bit-array to a number
Hugo> without losing time processing it. Is there a function to do that? The
Hugo> thing is that I wish to make a board for a game with a bit-array to save
Hugo> space, but at the same time I wish to have a unique hash value for
Hugo> different boards, so the most natural way is to convert that bit-array
Hugo> to a number. I searched all over the internet and found nothing about
Hugo> such a function.

In order to do this homework problem you need to learn how strings of
base-2 digits are used to represent numbers; you might want to learn
how two's complement arithmetic works.

(I assume that you're going to intern the boards as their identities:
two boards with the same setup are in fact the same board!
Otherwise your proposed hash scheme won't work.)

Something like this:

(defun twos (bitvec)
  (let* ((length (length bitvec))
         (sign (elt bitvec 0))
         (sum (elt bitvec (1- length))))
    (loop with c = 2
          for i from (- length 2) downto 1
          for bit = (elt bitvec i)
          doing
          (when (not (zerop bit))
            (incf sum c))
          (setq c (* c 2)))
    (if (zerop sign)
        sum
        (* -1 sum))))


#2r00101110 => 46
(twos #(0 0 1 0 1 1 1 0)) => 46

Mike McDonald

Jun 5, 2001, 3:11:39 PM
In article <ur8wzg...@spacy.boston.ma.us>,

Christopher Stacy <cst...@spacy.Boston.MA.US> writes:
>>>>>> On Mon, 4 Jun 2001 22:24:12 +0100, Hugo Veiga ("Hugo") writes:
> Hugo> greetings,
> Hugo> please, can anyone tell me how to convert a bit-array to a number
> Hugo> without losing time processing it. Is there a function to do that? The
> Hugo> thing is that I wish to make a board for a game with a bit-array to save
> Hugo> space, but at the same time I wish to have a unique hash value for
> Hugo> different boards, so the most natural way is to convert that bit-array
> Hugo> to a number. I searched all over the internet and found nothing about
> Hugo> such a function.
>
> In order to do this homework problem you need to learn how strings of
> base-2 digits are used to represent numbers; you might want to learn
> how two's complement arithmetic works.

> Something like this:
>
> (defun twos (bitvec)

Doesn't solve the problem as presented. Read the part about "without losing
time processing it" again. I think he wants to do the C type of thing of
coercing an array of bits to an array of ints. Can displaced arrays be of
different types in CL? I.e., could one array be a bit array and then have a
displaced array of 32-bit fixnums onto it?

Mike McDonald
mik...@mikemac.com

Christopher Stacy

Jun 5, 2001, 5:24:35 PM
>>>>> On Tue, 05 Jun 2001 19:11:39 GMT, Mike McDonald ("Mike") writes:

Mike> In article <ur8wzg...@spacy.boston.ma.us>,


Mike> Christopher Stacy <cst...@spacy.Boston.MA.US> writes:
>>>>>>> On Mon, 4 Jun 2001 22:24:12 +0100, Hugo Veiga ("Hugo") writes:
Hugo> greetings,
Hugo> please, can anyone tell me how to convert a bit-array to a number
Hugo> without losing time processing it. Is there a function to do that? The
Hugo> thing is that I wish to make a board for a game with a bit-array to save
Hugo> space, but at the same time I wish to have a unique hash value for
Hugo> different boards, so the most natural way is to convert that bit-array
Hugo> to a number. I searched all over the internet and found nothing about
Hugo> such a function.
>>
>> In order to do this homework problem you need to learn how strings of
>> base-2 digits are used to represent numbers; you might want to learn
>> how two's complement arithmetic works.

>> Something like this:
>>
>> (defun twos (bitvec)

In case anyone wonders (someone asked me), I purposely didn't write it
using explicit shift operations. By the way, I accidentally left out
the "complement" part of the sign processing (but nobody noticed yet)!

Mike> Doesn't solve the problem as presented. Read the part about "without losing
Mike> time processing it" again. I think he wants to do the C type of thing of
Mike> coercing an array of bits to an array of ints. Can displaced arrays be of
Mike> different types in CL? I.e., could one array be a bit array and then have a
Mike> displaced array of 32-bit fixnums onto it?

I'm not following you about an array of ints: he said "bit-array to a number",
which sounds like an urgent homework problem for understanding binary numbers.
He sounded like he wants a single hash-value, not converting an arbitrary
sequence into another sequence. But now that I read the "losing time" part,
maybe he is talking about some kind of casting trick. I don't think you can
do that in Common Lisp, even with displaced arrays: MAKE-ARRAY is going to
check that the element-types are compatible ("the consequences are undefined
if the actual array element type of displaced-to is not type equivalent to
the actual array element type of the array being created.")
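
To make the two cases concrete, here is a rough, non-authoritative sketch
(the variable name BITS is made up just for illustration):

(let ((bits (make-array 64 :element-type 'bit :initial-element 0)))
  ;; The undefined case quoted above: a different element type displaced
  ;; onto a bit vector.  Most implementations signal an error rather than
  ;; hand back a "free" reinterpretation of the bits.
  ;; (make-array 2 :element-type '(unsigned-byte 32) :displaced-to bits)
  ;;
  ;; Conforming: the displaced array has the same element type as BITS.
  (make-array 32 :element-type 'bit
              :displaced-to bits
              :displaced-index-offset 32))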

Kent M Pitman

Jun 5, 2001, 8:47:20 PM
Christopher Stacy <cst...@spacy.Boston.MA.US> writes:

> He sounded like he wants a single hash-value, not converting an arbitrary
> sequence into another sequence.

I took him to be asking for "how to cast"--that is, how to change bits to
something else without processing, but merely in the head of the language.

The answer is that there is no operation to do that in CL because it would
not be portable. CL does not presuppose the bit representations that you
use for your data, hence you cannot take data in some layout you've chosen
and suppose it to be appropriate to cast, hence there are no operations for
casting.

Historically, btw, LSH worked on floats in Maclisp, so (LSH x 0) would cast
a float to a fixnum and FSC (floating point scale) worked on integers, so
(FSC x 0) would cast a fixnum to a float. But junk like that didn't port
well, nor probably did it even cope well with the shift to multiple float
representations in one implementation, and I think got lost mostly for that
reason...

Erik Naggum

Jun 7, 2001, 8:44:54 AM
* "Hugo Veiga" <ha...@netcabo.pt>

> please, can anyone tell me how to convert a bit-array to a number without
> losing time processing it.

This is impossible. It will always take some time. This is not C, where
you can keep the underlying bit representation but change "types" by the
way you look at them.

However, knowing that there _is_ an underlying bit representation may make
this far easier to do than going through it digit by digit, which, by the
way, can be done rather nicely with this function:

(defun bit-array-integer-value (bit-array)
  "Returns the bits of the bit-array as an integer as the primary value.
The secondary value is the number of bits."
  (let ((place -1))
    (values (reduce #'+ bit-array
                    :key (lambda (digit) (ash digit (incf place))))
            (incf place))))

Please note that (bit <bit-array> 0) is the _least_ significant bit.
There are no signs in bit-arrays unless you impute one, and it is not a
good idea to do so. The value of a bit in slot n should be (expt 2 n),
to make for the most mathematically sound representation. Note that the
way a bit array is printed thus produces a _reverse_ sequence of digits
compared to the binary integer representation.
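
For instance, assuming the function above:

  (bit-array-integer-value #*1100) => 3, 4   ; 1*1 + 1*2 + 0*4 + 0*8, four bits
  #2r0011 => 3                               ; the printed bit vector reads reversed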

It is no coincidence that both bignums and bit vectors are stored with
the least significant digit (of their respective bases) first. While
big-endian representations are a lot easier to deal with because we write
our numbers in big-endian digit sequences, hardware-wise little-endian
has serious advantages. If you want the reverse view, you will have to
work a lot harder to get your integer representation. (Note that the
simplest way to convert from big-endian to little-endian bit order at the
byte level is via a table of byte values if you do not have hardware
support for it.)
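
For what it is worth, a rough sketch of such a table in portable Common
Lisp; the names are made up for illustration, not taken from any library:

(defparameter *bit-reversed-bytes*
  ;; table[b] is the byte b with its eight bits in the opposite order.
  (let ((table (make-array 256 :element-type '(unsigned-byte 8))))
    (dotimes (byte 256 table)
      (let ((reversed 0))
        (dotimes (i 8)
          (when (logbitp i byte)
            (setf reversed (logior reversed (ash 1 (- 7 i))))))
        (setf (aref table byte) reversed)))))

(defun reverse-byte-bits (byte)
  (aref *bit-reversed-bytes* byte))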

#+(and franz-inc (or allegro-v6.0 allegro-v6.1) little-endian)
(defun bit-array-integer-value (bit-array)
  (let* ((bits (length bit-array))
         (bigits (ceiling bits 16))
         (value (copy-seq bit-array)))
    ;; Turn it into a possibly denormalized bignum.
    (setf (sys:memref value -16 -2 :unsigned-word) 18)
    (setf (sys:memref value -16 0 :unsigned-word) bigits)
    ;; Normalize it.  This better not be optimized away!
    (+ 0 value)))

It should have been needless to say even in this litigious age, but this
is for information only. Use at your own risk.

> Is there a function to do that? The thing is that I wish to make a
> board for a game with a bit-array to save space, but at the same time I
> wish to have a unique hash value for different boards, so the most natural
> way is to convert that bit-array to a number. I searched all over the
> internet and found nothing about such a function.

You could use integers to represent the board directly. Use logbitp
instead of bit to access an individual bit. Of course, it is no longer a
sequence, if that matters.
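
A rough sketch of that approach, with made-up helper names, for an 8x8
board of occupied/empty squares:

(defun square-index (row col)
  (+ (* row 8) col))

(defun square-occupied-p (board row col)
  ;; BOARD is a non-negative integer; bit n represents square n.
  (logbitp (square-index row col) board))

(defun set-square (board row col)
  ;; Returns a new board value with the given square set.  The integer
  ;; itself is already a unique key for an EQL hash table, so no separate
  ;; hash value is needed.
  (logior board (ash 1 (square-index row col))))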

#:Erik
--
Travel is a meat thing.
