(quest) interesting benchmark

Prroffessorr Fir Kenobi

Aug 30, 2015, 3:44:11 PM
This is a benchmark I found on a blog called 'null program':

#include <stdio.h>
#include <stdlib.h>

int trial() {
    int count = 0;
    double sum = 0;
    while (sum <= 1.0) {
        sum += rand() / (double) RAND_MAX;
        count++;
    }
    return count;
}

double monteCarlo(int n) {
    int i, total = 0;
    for (i = 0; i < n; i++) {
        total += trial();
    }
    return total / (double) n;
}

int main() {
    printf("%f\n", monteCarlo(100000000));
    return 0;
}

This measures, if I understood the problem, how many random values
(from 0.0 to 1.0) you have to add to get a sum over 1.0. One might
think it should be 2 on average, but it does not seem so (or maybe I'm
wrong in understanding it, doesn't matter).
They say that it is e = 2.71828...

This is anyway interesting from the optimisation point of view; the
code above executes in 7.6 s on my old C2D and
yields 2.178154.

The quest: optimise it. To optimise it, I think a better rand() routine should be written, one that has both good probabilistic properties and faster execution.

Can someone provide such a routine?

Johann Klammer

Aug 30, 2015, 5:13:44 PM
On 08/30/2015 09:43 PM, Prroffessorr Fir Kenobi wrote:
> This is a benchmark I found on a blog called 'null program':
>
> #include <stdio.h> #include <stdlib.h>
>
> int trial() { int count = 0; double sum = 0; while (sum <= 1.0) { sum
> += rand() / (double) RAND_MAX; count++; } return count; }
>
> double monteCarlo(int n) { int i, total = 0; for (i = 0; i < n; i++)
> { total += trial(); } return total / (double) n; }
>
> int main() { printf("%f\n", monteCarlo(100000000)); return 0; }
>
> This measures, if I understood the problem, how many random values
> (from 0.0 to 1.0) you have to add to get a sum over 1.0. One might
> think it should be 2 on average, but it does not seem so (or maybe
> I'm wrong in understanding it, doesn't matter). They say that it is
> e = 2.71828...
>
> This is anyway interesting from the optimisation point of view; the
> code above executes in 7.6 s on my old C2D and yields 2.178154.
Looks closer to 2 than e....

>
> The quest: optimise it. To optimise it, I think a better rand()
> routine should be written, one that has both good probabilistic
> properties and faster execution.
>
> Can someone provide such a routine?
>

Take the code for rand() and the functions it calls from the C library and inline them in this file.
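
For a sense of what would get inlined: the sample implementation printed
in the C standard itself looks roughly like this (actual library code
varies, and my_rand/my_srand are placeholder names used here so they
don't clash with <stdlib.h>):

static unsigned long int next_state = 1;

/* sample rand()/srand() from the C standard; its RAND_MAX is 32767 */
static int my_rand(void) {
    next_state = next_state * 1103515245 + 12345;
    return (unsigned int)(next_state / 65536) % 32768;
}

static void my_srand(unsigned int seed) {
    next_state = seed;
}

If you drop that into the benchmark, remember its range is 0..32767, so
the divisor (or the reciprocal) has to match.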

Prroffessorr Fir Kenobi

Aug 30, 2015, 5:21:39 PM
Where is that code?
(Changing the division to a multiplication by the inverse improves things
quite a lot, 7.6 s -> 5.6 s or so, but I really wonder whether something
better than the C library rand() can be found.)
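
In code, that change is just this (inv_rand_max is only an illustrative
name; the reciprocal is computed once instead of dividing every iteration):

static const double inv_rand_max = 1.0 / RAND_MAX;

int trial() {
    int count = 0;
    double sum = 0;
    while (sum <= 1.0) {
        sum += rand() * inv_rand_max;  /* was: rand() / (double) RAND_MAX */
        count++;
    }
    return count;
}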

Do the various platforms' rand() implementations all use the same algorithm?

David Brown

Aug 30, 2015, 6:28:27 PM
On 30/08/15 23:21, Prroffessorr Fir Kenobi wrote:
> On Sunday, 30 August 2015 at 23:13:44 UTC+2, Johann Klammer wrote:
>> On 08/30/2015 09:43 PM, Prroffessorr Fir Kenobi wrote:
>>> This is anyway interesting from the optimisation point of view;
>>> the code above executes in 7.6 s on my old C2D and yields
>>> 2.178154.
>> Looks closer to 2 than e....

That's a typo. With 100000000 iterations the answer is about 2.718359.
With more iterations, it appears to get closer to e - but converges
extremely slowly.

I haven't done the maths to work out the answer correctly, but I can't
say it would shock me to see the answer being e.


>>
>>>
>>> The quest: optimise it. To optimise it, I think a better rand()
>>> routine should be written, one that has both good probabilistic
>>> properties and faster execution.
>>>
>>> Can someone provide such a routine?
>>>
>>
>> Take the code for rand() and the functions it calls from the C
>> library and inline them in this file.
>
> Where is that code? (Changing the division to a multiplication by the
> inverse improves things quite a lot, 7.6 s -> 5.6 s or so, but I
> really wonder whether something better than the C library rand() can
> be found.)

Use -ffast-math, which lets the compiler do optimisations like that
automatically. And making file-local functions (and variables, though
there are none here) "static" can sometimes be of significant help in
optimising, as well as being good programming practice.
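
Concretely, assuming gcc, that means compiling with something like

    gcc -O3 -ffast-math bench.c -o bench

(the file name is arbitrary) and giving the helpers internal linkage in
the source:

static int trial(void);          /* definition as before */
static double monteCarlo(int n); /* definition as before */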

>
> Do the various platforms' rand() implementations all use the same
> algorithm?
>

There are vast numbers of pseudo-random number generating algorithms. I
have no idea which one your code will be using from the C library on
your system. But it won't take much googling to find some. The quality
(which can be measured in many different ways) varies - you might find
some algorithms that look good, and give a good long-term average
pattern, yet fail to give anything close to e on this test.
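
To give one concrete example of the kind of small, fast generator that
is easy to inline (this is a xorshift64* variant, offered only as a
sketch; the function name and the seed value are arbitrary):

#include <stdint.h>

static uint64_t rng_state = 88172645463325252ULL;  /* any nonzero seed */

static double fast_uniform(void) {
    /* xorshift64* step */
    rng_state ^= rng_state >> 12;
    rng_state ^= rng_state << 25;
    rng_state ^= rng_state >> 27;
    uint64_t r = rng_state * 0x2545F4914F6CDD1DULL;
    /* take the top 53 bits -> uniform double in [0, 1) */
    return (r >> 11) * (1.0 / 9007199254740992.0);
}

The idea would be to use fast_uniform() in place of
rand() / (double) RAND_MAX in trial(); whether its statistical quality
is good enough is exactly the kind of thing that varies from generator
to generator, so it would still need checking against this test.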

bartekltg

Aug 30, 2015, 6:44:44 PM
On 30.08.2015 23:13, Johann Klammer wrote:
> On 08/30/2015 09:43 PM, Prroffessorr Fir Kenobi wrote:
>> This is a benchmark I found on a blog called 'null program':
>>
>> #include <stdio.h> #include <stdlib.h>
>>
>> int trial() { int count = 0; double sum = 0; while (sum <= 1.0) { sum
>> += rand() / (double) RAND_MAX; count++; } return count; }
>>
>> double monteCarlo(int n) { int i, total = 0; for (i = 0; i < n; i++)
>> { total += trial(); } return total / (double) n; }
>>
>> int main() { printf("%f\n", monteCarlo(100000000)); return 0; }
>>
>> This measures, if I understood the problem, how many random values
>> (from 0.0 to 1.0) you have to add to get a sum over 1.0. One might
>> think it should be 2 on average, but it does not seem so (or maybe
>> I'm wrong in understanding it, doesn't matter). They say that it is
>> e = 2.71828...
>>
>> This is anyway interesting from the optimisation point of view; the
>> code above executes in 7.6 s on my old C2D and yields 2.178154.
> Looks closer to 2 than e....

It is e. Fir just doesn't care and smashes the keyboard without looking ;-)

Let n(x) be the expected number of times one needs to draw a random
number, iid U[0,1], before the sum exceeds x. We are looking for n(1).

We draw a number p. If p >= x we are done. Otherwise we need another
n(x-p) draws on average, so

n(x) = 1 + \int_0^x n(x-p) dp

There is a methodical way to solve this (substitute u = x-p, then
differentiate both sides to get n'(x) = n(x) with n(0) = 1), but we can
also just guess n(x) = exp(x). And it works:

1 + \int_0^x exp(x-p) dp = 1 + [-exp(x-p)]_0^x
                         = 1 - (exp(0) - exp(x)) = exp(x)

n(1) = e.


>> The quest: optimise it. To optimise it, I think a better rand()
>> routine should be written, one that has both good probabilistic
>> properties and faster execution.
>>
>> Can someone provide such a routine?
>>
>
> Take the code for rand() and the functions it calls from the C library and inline them in this file.

I'm pretty sure a decent compiler does it for us.


A version with default_random_engine (probably x = x * 48271 %
2147483647) from <random>, without any tricks, is only a bit slower:
#include <random>

using std::random_device;
using std::default_random_engine;
using std::uniform_real_distribution;

int trial2() {
    static random_device rd;
    static default_random_engine gen(rd());
    uniform_real_distribution<double> dist(0.0, 1.0);
    int count = 0;
    double sum = 0;
    while (sum <= 1.0) {
        sum += dist(gen);
        count++;
    }
    return count;
}

double monteCarlo2(int n) {
    int i, total = 0;
    for (i = 0; i < n; i++) {
        total += trial2();
    }
    return total / (double) n;
}


c like rand      2.718359   5.64892 s
c++11 <random>   2.718232   5.73054 s
c like rand      2.718111   5.67173 s
c++11 <random>   2.718149   5.73041 s
c like rand      2.718318   5.67199 s
c++11 <random>   2.718329   5.73014 s

And mt19937, a real pseudorandom number generator, not a toy,
gets results below 8 seconds.

bartekltg



bartekltg

Aug 30, 2015, 6:50:24 PM
On 31.08.2015 00:28, David Brown wrote:
> On 30/08/15 23:21, Prroffessorr Fir Kenobi wrote:
>> On Sunday, 30 August 2015 at 23:13:44 UTC+2, Johann Klammer wrote:
>>> On 08/30/2015 09:43 PM, Prroffessorr Fir Kenobi wrote:
>>>> This is anyway interesting from the optimisation point of view;
>>>> the code above executes in 7.6 s on my old C2D and yields
>>>> 2.178154.
>>> Looks closer to 2 than e....
>
> That's a typo. With 100000000 iterations the answer is about 2.718359.
> With more iterations, it appears to get closer to e - but converges
> extremely slowly.

Monte Carlo is always extremely slow. It converges like 1/sqrt(n),
so we get the next significant digit only after 100 times more work.
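
One way to see that rate directly is to track the per-trial variance as
well and print the standard error. A sketch (it reuses trial() from the
code above, needs <math.h> for sqrt, and monteCarloErr is just a made-up
name):

double monteCarloErr(int n) {
    double sum = 0, sumsq = 0;
    for (int i = 0; i < n; i++) {
        double c = trial();
        sum += c;
        sumsq += c * c;
    }
    double mean = sum / n;
    double var = sumsq / n - mean * mean;       /* per-trial variance */
    printf("std. error ~ %g\n", sqrt(var / n)); /* shrinks like 1/sqrt(n) */
    return mean;
}

With n = 10^8 and a per-trial standard deviation a bit below 1, that
comes out at roughly 1e-4, which matches the scatter in the results
posted above.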

bartekltg
