*good* 64-bit random number generator.


Dr. David Kirkby
Aug 24, 2002, 4:53:21 AM

Has anyone any experience of using a good quality random number generator (RNG)
that runs on a 64-bit processor (UltraSPARC in particular), using 64-bit
arithmetic? By good I mean it passes any statistical test thrown at it. One
coded in UltraSPARC assembler would be useful. (Experience has shown me the time
to generate random numbers can be 40% of a numerical simulation, so speeding
these up in assembler might have some advantages).

Experience has shown me that a random number generator that works reasonably
well on a 16-bit processor performs abysmally on a 32-bit processor. For a very
graphical proof of this, see page 199 of chapter 8 of my PhD thesis (page 15
of chapter 8's pdf file)
http://www.medphys.ucl.ac.uk/~davek/phd/chapter8.pdf
While the RNG used in Turbo Pascal performed very poorly with a Sun's 32-bit
integers, it was not quite as bad as it would appear from the diagram if run
with 16-bit integer arithmetic. I still think it is poor, but not quite as bad.

Hence it seems highly likely that a random number generator designed for a
32-bit word length would perform badly if executed using 64-bit arithmetic.
There would be advantages to using 64-bit (longer period), so I'm interested if
anyone has produced a good quality RNG that uses 64-bit arithmetic.

The book 'Numerical Recipes in C' by Press et al offers some $$ for anyone who
can find a statistical test their RNGs fail. But I'm pretty sure they would
fail if the integers were changed to longs and so the word length was 64 bits,
rather than the 32 bits the RNGs are designed for.


--
Dr. David Kirkby PhD,
email: drki...@ntlworld.com
web page: http://www.david-kirkby.co.uk
Amateur radio callsign: G8WRB

John E. Hadstate
Aug 24, 2002, 1:53:20 PM

"Dr. David Kirkby" <drki...@ntlworld.com> wrote in message
news:3D674981...@ntlworld.com...

> Has anyone any experience of using a good quality random number
> generator (RNG) that runs on a 64-bit processor (UltraSPARC in
> particular), using 64-bit arithmetic?

Use Google and...

Look up recent threads in this newsgroup by Marsaglia. He has just
posted a long-period PRNG that passes his tests.

Look up Mersenne Twister.

Look up threads in this NG pertaining to the Lewis & Payne GFSR and make the
obvious adjustment for 64-bit words. This algorithm requires 2 integer
compares, 2 integer additions, 2 array index operations and 2 XORs to
deliver a 64-bit pseudo-random integer. How much faster can you get?
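
For concreteness, a minimal sketch of a two-tap GFSR of that shape, widened to
64-bit words, might look like the following. It is not Lewis and Payne's
published code; the (p, q) = (250, 103) tap pair of the classic R250 generator
is assumed purely as an example (longer lags are generally preferred), and the
table must first be filled with 250 genuinely random 64-bit words from some
other source.

#include <stdint.h>

#define P 250            /* lags from the primitive trinomial x^250 + x^103 + 1 */
#define Q 103

static uint64_t ring[P]; /* seed: fill with 250 random 64-bit words before use */
static int gi = 0, gj = P - Q;

uint64_t gfsr64(void)
{
    /* x(n) = x(n-250) XOR x(n-103), kept in a circular buffer */
    uint64_t r = (ring[gi] ^= ring[gj]);   /* 2 array indexes, 1 XOR, 1 store */
    if (++gi >= P) gi = 0;                 /* 1 compare + 1 add per index */
    if (++gj >= P) gj = 0;
    return r;
}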


George Marsaglia
Aug 24, 2002, 2:11:07 PM

Dr. David Kirkby <drki...@ntlworld.com> wrote in message
news:3D674981...@ntlworld.com...
> Has anyone any experience of using a good quality random number generator (RNG)
> that runs on a 64-bit processor (UltraSPARC in particular), using 64-bit
> arithmetic? By good I mean it passes any statistical test thrown at it. One
> coded in UltraSPARC assembler would be useful. (Experience has shown me the time
> to generate random numbers can be 40% of a numerical simulation, so speeding
> these up in assembler might have some advantages).
>
> Experience has shown me that a random number generator that works reasonably
> well on a 16-bit processor performs abysmally on a 32-bit processor. For a very
> graphical proof of this, see page 199 of chapter 8 of my PhD thesis (page 15
> of chapter 8's pdf file)
> http://www.medphys.ucl.ac.uk/~davek/phd/chapter8.pdf
> While the RNG used in Turbo Pascal performed very poorly with a Sun's 32-bit
> integers, it was not quite as bad as it would appear from the diagram if run
> with 16-bit integer arithmetic. I still think it is poor, but not quite as bad.
>
> Hence it seems highly likely that a random number generator designed for a
> 32-bit word length would perform badly if executed using 64-bit arithmetic.
> There would be advantages to using 64-bit (longer period), so I'm interested if
> anyone has produced a good quality RNG that uses 64-bit arithmetic.

A C version of a very very good 64-bit RNG is given below.
You should be able to adapt it to your particular needs.

It is based on the complementary-multiply-with-carry
sequence
x(n)=a*x(n-4)+carry mod 2^64-1,
which works as follows:
Assume a certain multiplier 'a' and a base 'b'.
Given a current x value and a current carry 'c',
form: t=a*x+c
Then the new carry is c=floor(t/b)
and the new x value is x = b-1-(t mod b).

Ordinarily, for 32-bit mwc or cmwc sequences, the value t=a*x+c can be formed
in 64 bits, then the new c is the top and the new x the bottom 32 bits (with a
little fiddling when b=2^32-1 and cmwc rather than mwc).

To generate 64-bit x's, it is difficult to form t=a*x+c in 128 bits and then
get the new c and new x from the top and bottom halves. But if 'a' has a
special form, for example a=2^62+2^47+2 and b=2^64-1, then the new c and the
new x can be formed with shifts, tests and +/-'s, again with a little fiddling
because b=2^64-1 rather than 2^64. (The latter is not an optimal choice
because, being a square, it cannot be a primitive root of the prime a*b^k+1,
where 'k' is the 'lag': x(n)=a*x(n-k)+carry mod b.) But the multiplier
a=2^62+2^47+2 makes a*b^4+1 a prime for which b=2^64-1 is a primitive root,
and getting the new x and new c can be done with arithmetic on integers the
size of x.

A C version of x(n)=a*x(n-4)+carry mod 2^64-1 follows; it is cmwc64(), with a
main program to test it.

#include <stdio.h>

static unsigned long long x=123456789,y,z,w,c;
/* default seeds y,z,w,c are all zero.  You should choose ANY 4 random 64-bit
   seeds x,y,z,w < 2^64-1 and a random seed c in 0 <= c < a = 2^62+2^47+2.
   There are P=(2^62+2^46+2)*(2^64-1)^4 > 2^318 possible choices for seeds,
   the period of the RNG.  The default seeds x=123456789, y = z = w = c = 0
   are OK for tests (or for real one-time runs), but you should choose
   randomly from the set of P possible seeds. */

unsigned long long cmwc64( ){
    unsigned long long t;
    t = (x<<47)+(x<<62)+(x<<1);
    t = t+c;  t += (t<c);
    c = (t<c)+(x>>17)+(x>>2)+(x>>63);
    x = y;  y = z;  z = w;
    return ( w = ~(t+c)-1 );
}

int main(void){
    int i;
    unsigned long long t;
    /* for serious runs, first assign random seed values to x,y,z,w,c */
    for(i=0;i<100000000;i++) t=cmwc64();
    for(i=0;i<5;i++) printf(" %22llu \n",cmwc64());
}

/* your default output should be
      9064188857900113776
      8021554294545628002
     17280675555674358941
      2631262474002176301
      6376492577913983186
   Compile with an optimizing compiler, run it and count the seconds until it
   prints.  If it takes s seconds, then each random 64-bit integer takes about
   10*s nanoseconds (mine had 7 < s < 8). */

George Marsaglia

Doug Hughes
Aug 24, 2002, 9:18:42 PM

In article <%_P99.180496$SS.73...@bin3.nnrp.aus1.giganews.com>, George Marsaglia wrote:
>
> Dr. David Kirkby <drki...@ntlworld.com> wrote in message
> news:3D674981...@ntlworld.com...
>> Has anyone any experience of using a good quality random number generator (RNG)
>> that runs on a 64-bit processor (UltraSPARC in particular), using 64-bit
>> arithmetic? By good I mean it passes any statistical test thrown at it. One
>> coded in UltraSPARC assembler would be useful. (Experience has shown me the time
>> to generate random numbers can be 40% of a numerical simulation, so speeding
>> these up in assembler might have some advantages).
>>

Have you considered Sun patch 112438-01, which adds /dev/random?

Dr. David Kirkby
Aug 25, 2002, 2:51:04 AM

I don't need the patch on my Solaris 9 machine, since /dev/random is there
already.

However, /dev/random has several drawbacks for scientific research:

1) You cannot 'seed' the RNG with a known number to get back the same
sequence of numbers again. This is really helpful when debugging code. If you
find a problem, it is nice to re-run a program with the same sequence of random
numbers again.

2) The time to generate the random numbers is not known in advance, since
/dev/random gets data from the kernel and stops producing the numbers if the
entropy (state of disorder/randomness) in the kernel is too low. This makes it
difficult to estimate how long a program will run.

3) I can't see that it is easy to test the RNG, because of (1).

Overall, I prefer a pseudo-random number generator, where the output is
deterministic, but random for all practical purposes.

Mok-Kong Shen
Aug 25, 2002, 9:23:23 AM

"Dr. David Kirkby" wrote:
>
[snip]


> The book 'Numerical Recipes in C' by Press et al offer some $$ for anyone who
> can find a statistical test their RNG's fail. But I'm pretty sure they would
> fail if the integers were changed for longs and so the word length was 64-bits,
> rather than the 32-bits the RNGs are designed for.

I don't know the exact terms of the prize. But I tested some of the PRNGs
with smaller moduli listed in their book (the list is reproduced in
Schneier's Applied Cryptography) and found that two of them failed some
simple statistical tests severely. (That was long ago; I can't recall the
details now.)

M. K. Shen

Chris Morgan
Aug 25, 2002, 10:27:23 AM

"Dr. David Kirkby" <drki...@ntlworld.com> writes:

> Has anyone any experience of using a good quality random number generator (RNG)
> that runs on a 64-bit processor (UltraSPARC in particular), using 64-bit
> arithmetic? By good I mean it passes any statistical test thrown at it. One
> coded in UltraSPARC assembler would be useful.

I know that the requirements on the random number generator for the
programming language Ada95 were really harsh, and that the
implementation for the 'GNAT' compiler was a pretty serious attempt to
meet them. This is free software, which can be downloaded from
cs.ny.edu. If it would be useful, I'd be happy to run any tests you
have against a program built with GNAT; I used to work in Ada for
years. However, it would also be possible to provide a C API to it with a
bit of tweaking.

Chris

--
Chris Morgan <cm at mihalis.net> http://www.mihalis.net
Temp sig. - Enquire within

Dr. David Kirkby
Aug 25, 2002, 2:13:29 PM

Mok-Kong Shen wrote:

> I don't know the exact context of the prize. But I
> tested some of the PRNGs with smaller moduli listed
> in their book (these have been taken over in a list
> in Schneier's AC) and found two of them failed some
> simple statistical tests severely. (That's long ago.
> I can't tell the details now.)

Well, in the second edition of the book, on page 281, they say:

"We think, within the limits of its floating point precision, ran2 provides
perfect random numbers; a practical definition of 'perfect' is that we will
pay $1000 to the first reader who convinces us otherwise (by finding a
statistical test that ran2 fails in a nontrivial way, excluding the ordinary
limitations of a machine's floating-point representation)."

so if you find a flaw in ran2, you may be in for some $$$.

Mok-Kong Shen
Aug 25, 2002, 4:41:43 PM

"Dr. David Kirkby" wrote:
>
> Mok-Kong Shen wrote:
>
> > I don't know the exact context of the prize. But I
> > tested some of the PRNGs with smaller moduli listed
> > in their book (these have been taken over in a list
> > in Schneier's AC) and found two of them failed some
> > simple statistical tests severely. (That's long ago.
> > I can't tell the details now.)
>
> Well in the second edition of the book, page 281 they say:
>
> "We think, within the limits of it floating point precision, ran2 provides
> perfect random numbers; a practical definition of 'perfect' is that we will pay
> $1000 to the first reader who convinces us otherwise (by finding a statistical
> test than ran2 fails in a nontrivial way, excluding the ordinary limitations of
> a machine's floating-point representation)."
>
> so if you find a flaw in ran2, you may be in for some $$$.

What I meant was a list of parameters they gave for
good linear congruential PRNGs. (I don't know whether
the newest edition of the book has that list.) Of
that list, there were two PRNGs that I found defective.

M. K. Shen

Dr. David Kirkby
Aug 25, 2002, 8:28:45 PM

"Dr. David Kirkby" wrote:

> However, /dev/random has several drawbacks for scientific research:
>
> 1) You can not 'seed' the RNG with a known number to get back back the same
> sequence of numbers again. This is really helpful when debugging code. If you
> find a problem, it is nice to re-run a program with the same sequence of random
> numbers again.
>
> 2) The time to generate the random numbers is not known in advance, since
> /dev/random gets data from the kernel and stops producing the numbers if the
> entropy (state of disorder/randomness) in the kernel is too low. This makes it
> difficult to estimate how long a program will run.
>
> 3) I can't see that is it easy to test the rng because of (1).

It also appears to be *very* slow, unless I am using it very wrong. I did the
following on my dual 450 MHz processor Ultra 60 (Solaris 9):
% cat /dev/random > foo

It took about 10 mins (guessing) to create an 11 MB file of random
numbers. This suggests the time to create a 4-byte integer is around 200 us,
compared with the 50 ns or so needed to create an 8-byte integer using the
RNG posted by George Marsaglia.
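
For what it's worth, rather than guessing, the device can be timed directly
with a small C harness like the sketch below (illustrative only; the device
path and the 64 KB target are arbitrary choices, and the path can be given as
a command-line argument):

#include <fcntl.h>
#include <stdio.h>
#include <sys/time.h>
#include <unistd.h>

int main(int argc, char **argv)
{
    const char *dev = (argc > 1) ? argv[1] : "/dev/random";
    unsigned char buf[4096];
    long total = 0;
    double secs;
    struct timeval t0, t1;
    ssize_t n;
    int fd = open(dev, O_RDONLY);
    if (fd < 0) { perror(dev); return 1; }
    gettimeofday(&t0, 0);
    while (total < 64L * 1024) {            /* stop after 64 KB */
        n = read(fd, buf, sizeof buf);
        if (n <= 0) break;
        total += n;
    }
    gettimeofday(&t1, 0);
    secs = (t1.tv_sec - t0.tv_sec) + (t1.tv_usec - t0.tv_usec) / 1e6;
    printf("%ld bytes in %.2f s (%.0f bytes/s)\n", total, secs,
           secs > 0 ? total / secs : 0.0);
    close(fd);
    return 0;
}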

Perhaps I'm doing something wrong, but /dev/random does not look suitable
for typical Monte Carlo simulations; then again, perhaps its authors did not
have that use in mind.

Tim Bradshaw
Aug 25, 2002, 8:37:31 PM

* David Kirkby wrote:
> Perhaps I'm doing something wrong, but /dev/random does not look to
> be suitable for typical Monte Carlo simulations, but then perhaps
> its authors did not have that use in mind.

I doubt they did. /dev/random and its ilk are usually aimed at
getting something as close as possible to true randomness from the
system - so they not only want to have good statistics, but they also
want to be nonrepeatable. Ideally they'd be connected to some source
of random noise; if not, they improvise by harvesting randomness from
the system (things that depend on interrupt timings &c). They're
designed for things like keys for cryptography or TCP
sequence numbers, not for generating large amounts of statistically
good but possibly repeatable data.

--tim

Stuart Lamble
Aug 25, 2002, 8:54:45 PM

In article <3D69763D...@ntlworld.com>, Dr. David Kirkby wrote:
>Perhaps I'm doing something wrong, but /dev/random does not look to
>be suitable for typical Monte Carlo simulations, but then perhaps
>its authors did not have that use in mind.

Indeed.

/dev/random is for applications where genuine, non-repeatable entropy
is needed. Mainly things like cryptography -- key generation, etc.
Because of the nature of the beast, this sort of thing tends to be
slow.

I'd suggest using a pseudo-random number generator for testing purposes,
and then, once you're happy with things (assuming that you need fully
random numbers for your purposes) look at using /dev/random. If you don't
need fully random numbers, /dev/random isn't the way to go.

--
I'm waiting for tech support to call me back. I'm also waiting for the
second coming of Jesus. Wanna take bets on which happens first?

Dr. David Kirkby
Aug 26, 2002, 4:56:14 AM

Stuart Lamble wrote:

> I'd suggest using a pseudo-random number generator for testing purposes,
> and then, once you're happy with things (assuming that you need fully
> random numbers for your purposes) look at using /dev/random. If you don't
> need fully random numbers, /dev/random isn't the way to go.

Looking at the 64-bit generator posted by George Marsaglia, it does need four
64-bit seeds. That makes it impossible to use the normal method for seeding
32-bit random number generators, which is to determine the number of seconds
since 1/1/1970 with the time(2) function and use that as the seed. That works,
since each time you run a program you will get a different seed and so produce
a different set of random numbers. (Obviously running your program more than
once per second will break this, but this is rarely an issue - for my
applications at least.)

Clearly one must use some other source of random numbers as a seed for a
generator requiring 4 64-bit numbers. Calling Solaris's time(2) 8 times would
most likely give the same answer for all 8 calls.

I can see that calling /dev/random to get 32 bytes of random numbers, which
would be used as the source of four 64-bit integers to seed the random number
generator posted earlier, might be a sensible use for /dev/random. If the seed
is saved, one could re-run the program with the same seed and get the same set
of random numbers.
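
A minimal sketch of that idea follows (illustrative only, not code from the
earlier post). It pulls 40 bytes rather than 32 so the carry can be seeded as
well, and it assumes the cmwc64() state variables x, y, z, w, c have been made
visible to the caller (e.g. by dropping their 'static' qualifier) and that
a = 2^62 + 2^47 + 2 as stated there. Printing the seeds lets a run be
repeated exactly later.

#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>

extern unsigned long long x, y, z, w, c;   /* state of the cmwc64() generator */

int seed_from_dev_random(void)
{
    unsigned long long s[5];
    int fd = open("/dev/random", O_RDONLY);
    if (fd < 0) return -1;
    if (read(fd, s, sizeof s) != (ssize_t) sizeof s) { close(fd); return -1; }
    close(fd);
    x = s[0]; y = s[1]; z = s[2]; w = s[3];
    c = s[4] % ((1ULL << 62) + (1ULL << 47) + 2);   /* keep 0 <= c < a */
    /* save or print the seeds so the same run can be reproduced */
    printf("seeds: %llu %llu %llu %llu %llu\n", x, y, z, w, c);
    return 0;
}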

I don't know whether others use /dev/random as a source of random numbers in
simulations, but if so, they should think again, as it might well be slowing
their programs down a lot.

Stuart L. Anderson
Aug 26, 2002, 6:53:35 AM

In sci.crypt.random-numbers Dr. David Kirkby <drki...@ntlworld.com> wrote:

> Looking at the 64-bit generator posted by George Marsagliaa, it does need four
> 64-bit seeds. That makes it impossible to use the normal method for seeding
> 32-bit random number generators, which is to determine the number of seconds
> since 1/1/1970 with the time(2) command and use that as the seed. That works,
> since each time you run a program, you will get a different seed and so produce
> a different set of random numbers. (Obviously running your program more than
> once per second will break this, but rarely this is an issue - for my
> applications at least).

My current simulation generator is

Pierre L'Ecuyer and Terry H. Andres.
A random number generator based on the combination of four LCGs.
Mathematics and Computers in Simulation, May 1997, 44(1):99-107.

It uses a combination of four 31-bit congruential generators. The first seed
is obtained by standard techniques. The next three are computed in
sequence from it with a 31-bit inversive generator. Yes, the inversive
generator is slow, but I only call it three times. I expect that it
breaks up the generated seeds better than a congruential generator would.
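
As an illustration of that seeding scheme (not the actual L'Ecuyer-Andres
code), here is a sketch of a 31-bit inversive congruential generator used to
derive the remaining seeds from the first. The parameters p = 2^31 - 1,
a = 1, c = 1 are a common textbook choice and are assumed here, as is the
starting seed.

#include <stdint.h>
#include <stdio.h>

#define PRIME 2147483647UL               /* p = 2^31 - 1 */

/* modular inverse via Fermat's little theorem: v^(p-2) mod p, inv(0) := 0 */
static uint32_t inv_mod_p(uint32_t v)
{
    uint64_t r = 1, b = v, e = PRIME - 2;
    if (v == 0) return 0;
    while (e) {
        if (e & 1) r = r * b % PRIME;
        b = b * b % PRIME;
        e >>= 1;
    }
    return (uint32_t) r;
}

/* inversive step: x <- (a * x^-1 + c) mod p with a = 1, c = 1 */
static uint32_t icg_next(uint32_t v)
{
    return (uint32_t) ((1ULL * inv_mod_p(v) + 1) % PRIME);
}

int main(void)
{
    uint32_t seed[4];
    int i;
    seed[0] = 12345;                     /* e.g. obtained from time(2) */
    for (i = 1; i < 4; i++)
        seed[i] = icg_next(seed[i - 1]);
    for (i = 0; i < 4; i++)
        printf("seed[%d] = %lu\n", i, (unsigned long) seed[i]);
    return 0;
}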

--Stu
___________________________________________________________________
Stu Anderson stua...@drizzle.com Renton, Washington, USA

Scott Nelson
Aug 26, 2002, 2:37:26 PM

On Mon, 26 Aug 2002 01:28:45 +0100, "Dr. David Kirkby"
<drki...@ntlworld.com> wrote:

>"Dr. David Kirkby" wrote:
>
>> However, /dev/random has several drawbacks for scientific research:
>>
>> 1) You can not 'seed' the RNG with a known number to get back back the same
>> sequence of numbers again. This is really helpful when debugging code. If you
>> find a problem, it is nice to re-run a program with the same sequence of random
>> numbers again.
>>
>> 2) The time to generate the random numbers is not known in advance, since
>> /dev/random gets data from the kernel and stops producing the numbers if the
>> entropy (state of disorder/randomness) in the kernel is too low. This makes it
>> difficult to estimate how long a program will run.
>>
>> 3) I can't see that is it easy to test the rng because of (1).
>
>It also appears to be *very* slow, unless I am using it very wrong. I did the
>following on my dual 450 MHz processor Ultra 60 (Solaris 9):
>% cat /dev/random > foo
>

try:
cat /dev/urandom > foo

It will probably be considerably faster.

/dev/random is "true" randomness.
/dev/urandom is cryptographically secure randomness.

Cryptographically secure randomness is more than good enough
for Monte Carlo simulations.

You still won't be able to seed it, though, so it may not be as good as a
traditional pseudo-random number generator.

Scott Nelson <sc...@helsbreth.org>

Scott Nelson
Aug 26, 2002, 2:50:02 PM

That is a big problem for cryptographically secure random
number generators, but why would it matter to a non-secure PRNG?

Seems to me that if each seed is unique, no more is required.

BTW, instead of just time(2), throw in gettimeofday(2) if your
system supports it (which Solaris does). Even if you just want a 32-bit
seed, adding the microseconds to the seconds will avoid some of the
problems that time(2) alone has.
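
A small sketch of that suggestion (illustrative only; the mixing is just the
plain addition described above):

#include <stdio.h>
#include <sys/time.h>

unsigned long make_seed(void)
{
    struct timeval tv;
    gettimeofday(&tv, 0);
    /* seconds since 1/1/1970 plus the current microseconds, so two runs
       within the same second will almost always get different seeds */
    return (unsigned long) tv.tv_sec + (unsigned long) tv.tv_usec;
}

int main(void)
{
    printf("seed = %lu\n", make_seed());
    return 0;
}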

Scott Nelson <sc...@helsbreth.org>

Eric Braeden
Aug 26, 2002, 3:04:07 PM

"Dr. David Kirkby" <drki...@ntlworld.com> wrote in message
news:3D69ED2E...@ntlworld.com...
> Stuart Lamble wrote:
>
> snip

>
> Clearly one must use some other source of random numbers as a seed for a
> generator requiring 4 64-bit numbers. Calling Solaris's time(2) 8 times
> would most likely give the same answer for all 8 calls.
>
> --
> Dr. David Kirkby PhD,
Actually there is a low-level time() function in SPARC Solaris that gives
usec resolution from the system timer. I haven't looked at this for a while,
but I remember it being impossible on a SPARC 5 to get the same value
even with immediate repeated calls to this function. I remember
harvesting about one byte of entropy per call. For crypto work,
of course, hashing would be necessary.


J.D. Baldwin
Aug 29, 2002, 11:00:24 AM

In the previous article, Scott Nelson <sc...@helsbreth.org> wrote:
> >It also appears to be *very* slow, unless I am using it very wrong. I
> >did the following on my dual 450 MHz processor Ultra 60 (Solaris 9):
> >% cat /dev/random > foo
>
> try:
> cat /dev/urandom > foo.
>
> It will probably be considerably faster.

Whoa, no kidding.

Ten seconds of /dev/random redirected to a file in /tmp on an
Ultra-60: 72,800 bytes

Ten seconds of /dev/urandom redirected to a file in /tmp on an
Ultra-60: 14,556,880 bytes

[both run while gcc was compiling in another window]

A bit over two orders of magnitude difference. Thanks for the tip.
--
_+_ From the catapult of |If anyone disagrees with any statement I make, I
_|70|___:)=}- J.D. Baldwin |am quite prepared not only to retract it, but also
\ / bal...@panix.com|to deny under oath that I ever made it. -T. Lehrer
***~~~~-----------------------------------------------------------------------

Rob Stampfli
Sep 1, 2002, 1:30:59 AM

In article <aklcu8$kiq$1...@reader1.panix.com>,

J.D. Baldwin <baldwi...@panix.com> wrote:
>
>In the previous article, Scott Nelson <sc...@helsbreth.org> wrote:
>> >It also appears to be *very* slow, unless I am using it very wrong. I
>> >did the following on my dual 450 MHz processor Ultra 60 (Solaris 9):
>> >% cat /dev/random > foo
>>
>> try:
>> cat /dev/urandom > foo.
>>
>> It will probably be considerably faster.
>
>Whoa, no kidding.
>
>Ten seconds of /dev/random redirected to a file in /tmp on an
>Ultra-60: 72,800 bytes
>
>Ten seconds of /dev/urandom redirected to a file in /tmp on an
>Ultra-60: 14,556,880 bytes
>
>[both run while gcc was compiling in another window]

Be aware that /dev/random does not return until it has acquired
enough randomness from the system through things like disk accesses,
keystrokes and the like (and, no, I'm not an expert at what Sun
uses to derive this randomness), so if you are requesting a lot of
random data, it will take some time. /dev/urandom does not block
when it runs short of randomness -- it will just recycle what it
has through a hash function, like SHA, which makes the data appear
random. In essence, you can think of /dev/random as generating truly
random data, while /dev/urandom is more of a pseudo-random number
generator with an initial random seed. /dev/urandom may be fine for
your application but be aware of its limitations. For instance, I
would prefer using /dev/random for strong cryptography.

An aside: The Andreas Maier random number package found at:

http://www.cosy.sbg.ac.at/~andi/

derives its entropy pool (randomness) from sources that inherently
have no delay, so its /dev/random and /dev/urandom interfaces
operate identically. Whether this is an asset or liability is
something the reader will have to decide for himself.

Rob

Florian Weimer
Sep 3, 2002, 4:43:41 AM

"George Marsaglia" <g...@stat.fsu.edu> writes:

> A C version of a very very good 64-bit RNG is given below.
> You should be able to adapt it to your particular needs.
>
> It is based on the complementary-multiply-with-carry
> sequence
> x(n)=a*x(n-4)+carry mod 2^64-1,
> which works as follows:
> Assume a certain multiplier 'a' and a base 'b'.
> Given a current x value and a current carry 'c',
> form: t=a*x+c
> Then the new carry is c=floor(t/b)
> and the new x value is x = b-1-(t mod b).

Why is this better than an LCG?

(And I doubt that the generator is suitable for
sci.crypt.random-numbers purposes, too.)

Scott Nelson
Sep 3, 2002, 12:01:11 PM

Complementary-multiply-with-carry is superior to an LCG because it has much
better diffusion properties, i.e. it takes less time for a single bit change
in the state to cause all bits in the state to be different.

With an LCG, the low-order bits are completely unaffected by changes to the
high-order bits, no matter how many iterations are performed.
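
A tiny demonstration of that point (hypothetical example; the multiplier and
increment are the widely quoted Knuth MMIX constants, not any particular
library's): two LCG states that differ only in their top bit keep identical
low-order bits forever.

#include <stdio.h>

#define A 6364136223846793005ULL
#define C 1442695040888963407ULL

int main(void)
{
    unsigned long long s1 = 1, s2 = 1ULL | (1ULL << 63);   /* differ in bit 63 */
    int i;
    for (i = 0; i < 10; i++) {
        s1 = A * s1 + C;               /* mod 2^64 happens implicitly */
        s2 = A * s2 + C;
        printf("low 16 bits: %04llx  %04llx\n", s1 & 0xffff, s2 & 0xffff);
    }
    return 0;
}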

The random-numbers group is under the sci.crypt hierarchy because of Usenet
group creation policies. It is a perfectly acceptable place to discuss
"normal" PRNGs, provided it's clear that is what is being discussed. In this
thread, the original poster clarified it completely with the statement "By
good I mean it passes any statistical test thrown at it."

Scott Nelson <sc...@helsbreth.org>

George Marsaglia
Sep 3, 2002, 1:13:19 PM

Florian Weimer <Wei...@CERT.Uni-Stuttgart.DE> wrote in message
news:878z2jt...@Login.CERT.Uni-Stuttgart.DE...

> "George Marsaglia" <g...@stat.fsu.edu> writes:
>
> > A C version of a very very good 64-bit RNG is given below.
> > You should be able to adapt it to your particular needs.
> >
> > It is based on the complementary-multiply-with-carry
> > sequence
> > x(n)=a*x(n-4)+carry mod 2^64-1,
> > which works as follows:
> > Assume a certain multiplier 'a' and a base 'b'.
> > Given a current x value and a current carry 'c',
> > form: t=a*x+c
> > Then the new carry is c=floor(t/b)
> > and the new x value is x = b-1-(t mod b).
>
> Why is this better than an LCG?

1. Depending on the multiplier 'a', the period is typically > 2^276, much
larger than the 2^32 of a 32-bit congruential generator or 2^64 for 64 bits.
2. CMWC RNGs seem to pass tests of randomness much better than the commonly
used congruential generators. Most of the latter use modulus 2^32 or 2^64
and will have unsatisfactory trailing bits. All congruential RNGs fail
certain tests, such as the gcd test described in http://www.jstatsoft.org,
v7, no. 3. And, of course, (congruential) "Random numbers fall mainly in the
planes", G. Marsaglia, Proc. National Academy of Sciences, 1968, v61,
pp. 25-28.
3. For uses of RNGs in cryptography, I have little interest and no
expertise, other than in the underlying mathematics of sequences of the form
x, f(x), f^2(x), f^3(x), ...
where x is a random selection from a set S and f( ) is a function mapping S
onto S. Such are the basis of most RNGs, and methods for determining f( ),
given a few elements of such a sequence, are likely to be of interest in
cryptography.

Given a few successive integers from an LCG, there is an easy way to
reconstruct the entire sequence---see, for example, G. Marsaglia, "Cracking
a congruential RNG", sci.crypt, 19 Apr 2002 (a small numerical illustration
is sketched at the end of this post). I challenge you to do so with
multiply-with-carry or complementary-multiply-with-carry RNGs when you are
not given any of the parameters in the recursion
x(n) = a*x(n-k) + carry mod b (MWC)
or
x(n) = (b-1) - (a*x(n-k) + carry mod b) (CMWC).
Some know how, but not many. (Hint: the x's are, in reverse order, digits of
the reciprocal of a large prime expanded to base 'b', or the complements of
those digits.)
4. Aside from addressing why one RNG seems better than another, there are
advantages to having a variety of choices for RNGs, particularly when they
are to be used as sources of random bits. When, as random integers, they are
floated to give uniform [0,1) reals, it is difficult to give reasons why
some are better than others.
To quote from my Keynote Address, Computer Science and Statistics,
XVI Symposium on the Interface, 1985,
A random number generator is like sex;
When it's good, it's wonderful,
And when it's bad, it's still pretty good.
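
As a footnote to point 3 above, the "easy way" for a plain LCG can be shown
in a few lines: given three consecutive outputs of x' = a*x + c mod 2^64
(full state output, modulus known), a and c fall out by simple modular
arithmetic. This is only an illustration with made-up constants, not the
method of the cited sci.crypt post.

#include <stdio.h>

typedef unsigned long long u64;

/* inverse of an odd number mod 2^64 by Newton iteration */
static u64 inv64(u64 a)
{
    u64 v = a;                  /* correct to 3 bits, since a*a = 1 mod 8 */
    int i;
    for (i = 0; i < 5; i++)
        v *= 2 - a * v;         /* each pass doubles the number of good bits */
    return v;
}

int main(void)
{
    const u64 a = 6364136223846793005ULL, c = 1442695040888963407ULL;
    u64 x1 = 987654321, x2 = a * x1 + c, x3 = a * x2 + c;
    u64 a_rec, c_rec;

    /* the "attacker" sees only x1, x2, x3 and knows the modulus is 2^64 */
    a_rec = (x3 - x2) * inv64(x2 - x1);    /* needs x2 - x1 odd */
    c_rec = x2 - a_rec * x1;
    printf("recovered a = %llu, c = %llu (%s)\n", a_rec, c_rec,
           (a_rec == a && c_rec == c) ? "match" : "mismatch");
    return 0;
}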

George Marsaglia
