| So, what is all this? In particular, is there something special about
| the value of 3.7 billion?
No, nothing special at all.
The purpose of the exercise is just to confirm that after generating
1000000000 random numbers, you get the same answer as George does.
"robin" wrote in message
| So, what is all this? In particular, is there something special about
| the value of 3.7 billion?
>No, nothing special at all.
>The purpose of the exercise is just to confirm that after generating
>1000000000 random numbers, you get the same answer as George does.
Alas, I think you are making some strong assumptions about the state of
computing in the hereafter.
George lives on in his code.
All we have now are George Marsaglia's posts and writings.
I know there's now a move on the way to 64-bit processors,
which I take to mean the x86_64 or AMD64 design/instruction set.
In any case, with an executable compiled with a C compiler,
there's the function sizeof, which might be useful
in some cases at run time.
For example, one could add to main() in C :
printf("the size of an unsigned long in bytes is %d\n", sizeof(unsigned long));
There's also the Itanium architecture and others, and even with a known
processor, some compiler flags affect the number of bytes for
some data types, such as "long double" with the -m64 flag
on Fujitsu SPARC IV with Sun Solaris (--> 16 byte long
doubles with the -m64 flag).
AFAIK, sizeof(unsigned long) can be relied upon to give the size
in 8-bit bytes of a C "unsigned long".
Perhaps some documentation of language, machine, compiler, compiler
options examples where KISS4691 works as per the Marsaglia
specs could be helpful as a reference ...
David Bernier
--
The MegaPenny Project | One Trillion Pennies:
<http://www.kokogiak.com/megapenny/thirteen.asp>
The move happened several years ago (at least on the desktop and server).
> In any case, with an executable compiled with a C compiler,
> there's the function sizeof, which might be useful
> in some cases at run time.
Being pedantic, sizeof is a compile time operator when used with
integral types.
> For example, one could add to main() in C :
>
> printf("the size of an unsigned long in bytes is %d\n", sizeof(unsigned long));
Given the code as written, assert(sizeof(unsigned long) == 4) would be
more use.
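For what it's worth, a minimal sketch of both flavours of that check (the
macro and typedef names are just placeholders; nothing beyond C90 is assumed):

#include <assert.h>

/* run-time check, as suggested above */
#define CHECK_ULONG_IS_32BIT() assert(sizeof(unsigned long) == 4)

/* compile-time variant: the array size becomes negative, and the
   translation unit fails to compile, if unsigned long is not 4 bytes */
typedef char ulong_is_4_bytes[(sizeof(unsigned long) == 4) ? 1 : -1];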
> There's also the Itanium architecture and others, and even with a known
> processor, some compiler flags affect the number of bytes for
> some data types, such as "long double" with the -m64 flag
> on Fujitsu SPARC IV with Sun Solaris (--> 16 byte long
> doubles with the -m64 flag).
>
> AFAIK, sizeof(unsigned long) can be relied upon to give the size
> in 8-bit bytes of a C "unsigned long".
sizeof(unsigned long) is by definition the size in (not necessarily 8
bit) bytes of an unsigned long.
> Perhaps some documentation of language, machine, compiler, compiler
> options examples where KISS4691 works as per the Marsaglia
> specs could be helpful as a reference ...
I suggested long ago the code be updated to use fixed width types, thus
removing any ambiguities.
--
Ian Collins
(snip)
> All we have now are George Marsaglia's posts and writings.
> I know there's now a move on the way to 64-bit processors,
> which I take to mean the x86_64 or AMD64 design/instruction set.
I have an actual Itanium system, but not so many people do.
> In any case, with an executable compiled with a C compiler,
> there's the function sizeof, which might be useful
> in some cases at run time.
Well, sizeof is a compile time constant, but, yes, you can
use the value at run time.
(snip)
> AFAIK, sizeof(unsigned long) can be relied upon to give the size
> in 8-bit bytes of a C "unsigned long".
No. You need CHAR_BIT to tell how many bits are in a char.
It has been known to get to 64 on word addressed 64 bit machines.
It must be at least 8, but can be more.
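As a rough sketch, one can report both without hard-coding 8 anywhere
(plain C90, nothing exotic assumed):

#include <limits.h>
#include <stdio.h>

int main(void)
{
    /* storage width in bits is CHAR_BIT * sizeof, not 8 * sizeof */
    printf("char is %d bits; unsigned long occupies %lu bits\n",
           CHAR_BIT, (unsigned long)(CHAR_BIT * sizeof(unsigned long)));
    return 0;
}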
-- glen
... for suitable values of "several." DEC's first Alpha CPUs
shipped in 1992, and are now old enough to vote.
>> In any case, with an executable compiled with a C compiler,
>> there's the function sizeof, which might be useful
>> in some cases at run time.
>
> Being pedantic, sizeof is a compile time operator when used with
> integral types.
It's an operator, always. It's evaluable at compile time for
any operand, integral or not, except a variable-length array (whose
element count is not determined until run time).
--
Eric Sosman
eso...@ieee-dot-org.invalid
The context was integral types, I didn't want to venture anywhere near
VLAs on a cross-post!
--
Ian Collins
I'm sorry about the inaccuracies and falsehoods in my post.
I support your suggestion, perhaps also adding comments.
I'm the wrong person to update the code, but I would be
willing to test updated C code.
David Bernier
It's not just evaluable at compile time, it is evaluated using
only the type and not the value of the expression. C99 6.5.3.4
paragraph 2.
Regards,
Nick Maclaren.
I'm dealing with this same issue in my real life, as my algebraist uncle
seems to have early-onset alzheimer's. So he has his demonstrable
achievements in science and the last few years where he embarrassed himself.
George's C was atrocious at the end. I'd like to let that die.
A kiss is a kiss; it's simple, effective, and not improved by
redefinition when you've lost your marbles.
--
Uno
Certainly CHAR_BIT *could* be 64 (or even more), but I've never heard
of a 64-bit system with CHAR_BIT==64. Such an implementation would
have trouble dealing with octet-oriented data from other systems.
Some DSPs (digital signal processors) do have CHAR_BIT > 8.
There have been systems with, for example, 9-bit bytes, but they
had pretty much become obsolete by the time C was standardized.
C defines two kinds of implementations, hosted and freestanding.
Hosted implementations must support the full standard library
(stdio and so forth); freestanding implementations are typically for
systems where the program doesn't run under an operating system,
such as embedded systems and OS kernels. I've never heard of a
hosted implementation with CHAR_BIT > 8.
Do you know of such a system?
--
Keith Thompson (The_Other_Keith) ks...@mib.org <http://www.ghoti.net/~kst>
Nokia
"We must do something. This is something. Therefore, we must do this."
-- Antony Jay and Jonathan Lynn, "Yes Minister"
(snip, I wrote)
>> No. You need CHAR_BIT to tell how many bits are in a char.
>> It has been known to get to 64 on word addressed 64 bit machines.
>> It must be at least 8, but can be more.
> Certainly CHAR_BIT *could* be 64 (or even more), but I've never heard
> of a 64-bit system with CHAR_BIT==64. Such an implementation would
> have trouble dealing with octet-oriented data from other systems.
> Some DSPs (digital signal processors) do have CHAR_BIT > 8.
> There have been systems with, for example, 9-bit bytes, but they
> had pretty much become obsolete by the time C was standardized.
The story I remember was for a Cray machine, but I never used
one to check on it. I do know that there are some 64 bit
word addressed machines, especially some from Cray.
-- glen
I used to work on a Cray T90. It had 64-bit words, and no direct
hardware access to smaller chunks of memory; to read an 8-bit byte,
you had to read the word containing it and then extract the bits
you wanted. But the C compiler (C90, it never implemented C99)
had CHAR_BIT==8. char was 8 bits; short, int, and long were all
64 bits. A pointer to anything smaller than a 64-bit word stored
the byte offset in the high-order 3 bits (the address space was
much smaller than 64 bits, so the high-order bits weren't being
used for anything else). This was all handled in software; the
compiler had to generate extra instructions to deal with byte data.
Making CHAR_BIT==64 would have made much more sense for the machine
itself in isolation, but it would have made interoperability with
other systems quite difficult. It ran Unicos, Cray's BSDish Unix;
I'm not sure you can even implement Unix with bytes bigger than
8 bits.
It made it very slow for things like string processing, but since
it was primarily used for floating-point computations that wasn't
much of an issue.
Some older Cray vector systems ran other, non-Unix, operating systems
(COS, CTSS), but I never used any of those, and I don't know what
value their compilers used for CHAR_BIT -- if C was even supported.
No, it can be relied upon to give the size in "bytes" in C of "unsigned long",
but "byte" is just a synonym for "the type char", and there have been
systems where that's not an 8-bit byte.
-s
--
Copyright 2011, all wrongs reversed. Peter Seebach / usenet...@seebs.net
http://www.seebs.net/log/ <-- lawsuits, religion, and funny pictures
http://en.wikipedia.org/wiki/Fair_Game_(Scientology) <-- get educated!
I am not speaking for my employer, although they do rent some of my opinions.
I've heard of at least one system where sizeof(int64_t) was 1, but I'm not
sure what range of values was allowed in chars. Presumably unsigned char
was 0..2^64-1.
It was also a LONG time ago, and may not have been standard C.
| Alas, I think you are making some strong assumptions about the state of
| computing in the hereafter.
The sun is not scheduled for burnout for a few billion years yet.
The Unisys U2200 implementation that we used in the mid-1990's had 9-bit
characters, and had a full C implementation. I didn't use it much personally
(I was working on the Ada compiler), but we had extensive facilities to
interface to the C compiler which definitely used pointers to 9-bit bytes.
(The machine was most efficient when used with 36-bit word addressing, which
we used extensively in the Ada compiler, but the C-compiler used double-word
pointers including a byte offset -- so we had to have conversion facilities
to get back and forth).
Not sure of exactly what version of C that was, but surely well after "C was
standardized".
Randy.
This is internet in all its beauty:
(1) He has "heard" of at least one system...
(2) The "system" is not named, nor identified even in a remote sense
> but I'm not
> sure what range of values was allowed in chars. Presumably unsigned char
> was 0..2^64-1.
>
Presumably (of course) the system in question used 64 bits for
character data. VERY efficient.
> It was also a LONG time ago, and may not have been standard C.
People tell stories because it is so easy to start them...
Nobody will complain.
So what's the problem? Perhaps someone else will remember more
about it, and we can all learn something.
>> but I'm not
>> sure what range of values was allowed in chars. Presumably unsigned char
>> was 0..2^64-1.
>>
>
> Presumably (of course) the system in question used 64 bits for
> character data. VERY efficient.
Presumably the system in question was not optimized for working
with character data. I've mentioned the Cray T90 here before;
its use of software-defined byte pointers made character processing
extraordinarily inefficient, but it was blazingly fast on vectors
of 64-bit floating-point numbers, which is exactly what it was
designed for. (And the front of the machine had a really cool
glassed-in Fluorinert "waterfall".)
>> It was also a LONG time ago, and may not have been standard C.
>
> People tell stories because it is so easy to start them...
> Nobody will complain.
I can't think of any good reason why anybody should complain.
Note: This is cross-posted to sci.math and comp.lang.c. I've
redirected followups just to comp.lang.c. Feel free to restore
the cross-post if you think the sci.math folks would be interested.
| I'm dealing with this same issue in my real life, as my algebraist uncle
| seems to have early-onset alzheimer's. So he has his demonstrable
| achievements in science and the last few years where he embarrassed himself.
|
| George's C was atrocious at the end. I'd like to let that die.
You might not have liked George's programming style,
but there is no suggestion that he was losing it.
Here is a modification of the program with masking to produce correct results
with any conforming C implementation. It truncates where required. A good
optimizer should eliminate the unneeded masking for 32-bit unsigned long.
static unsigned long xs = 521288629;
static unsigned long xcng = 362436069;
static unsigned long Q[4691];
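/* mask used to reduce intermediate results modulo 2^32 when unsigned long is wider than 32 bits */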
#define M32 0xffffffff
unsigned long MWC(void)
{
static unsigned long c = 0;
static unsigned long j = 4691;
unsigned long t;
unsigned long x;
j = (j < 4690) ? j + 1 : 0;
x = Q[j];
t = ((x << 13) + c) & M32;
if (t < c) {
c = (x >> 19) + 1;
t = (t + x) & M32;
} else {
t = (t + x) & M32;
c = (x >> 19) + (t < x);
}
return (Q[j] = t);
}
void initMWC(void)
{
unsigned long i;
for (i = 0; i < sizeof Q / sizeof Q[0]; i++)
Q[i] = ((xcng = 69069 * xcng + 123) + (xs = (xs ^ (xs << 13)) & M32,
xs ^= (xs >> 17), xs ^= (xs << 5))) & M32;
}
#ifdef UNIT_TEST
#include <stdio.h>
int main()
{
unsigned long i;
unsigned long x;
initMWC();
printf("Does MWC result=3740121002 ?\n");
for (i = 0; i < 1000000000; i++)
x = MWC();
printf("%27u\n", x);
printf("Does KISS result=2224631993 ?\n");
for (i = 0; i < 1000000000; i++)
x = (MWC() + (xcng = 69069 * xcng + 123) + (xs = (xs ^ (xs << 13)) & M32,
xs ^= (xs >> 17), xs ^= (xs << 5)));
printf("%27u\n", x);
return 0;
}
#endif
--
Thad
Why oh why can't people just use fixed width types? What's the
obsession with unsigned long and masking?
Add
#include <stdint.h>
change all references to "unsigned long" to uint32_t, remove all
references to M32.
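Something along these lines, as an untested sketch (identifiers as in the
quoted code below; only the state and MWC() are shown):

#include <stdint.h>

static uint32_t xs = 521288629;
static uint32_t xcng = 362436069;
static uint32_t Q[4691];

uint32_t MWC(void)
{
    static uint32_t c = 0;
    static uint32_t j = 4691;
    uint32_t t;
    uint32_t x;

    j = (j < 4690) ? j + 1 : 0;
    x = Q[j];
    t = (x << 13) + c;           /* wraps modulo 2^32 by definition */
    if (t < c) {                 /* carry out of the first addition */
        c = (x >> 19) + 1;
        t += x;
    } else {
        t += x;
        c = (x >> 19) + (t < x); /* carry out of the second addition */
    }
    return (Q[j] = t);
}

initMWC() and the test driver change the same way: drop the & M32 masks and
print with "%" PRIu32 from <inttypes.h>, or cast the result to unsigned long
and keep %lu.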
> static unsigned long xs = 521288629;
> static unsigned long xcng = 362436069;
> static unsigned long Q[4691];
> #define M32 0xffffffff
>
> unsigned long MWC(void)
> {
> static unsigned long c = 0;
> static unsigned long j = 4691;
> unsigned long t;
> unsigned long x;
> j = (j < 4690) ? j + 1 : 0;
> x = Q[j];
> t = ((x << 13) + c) & M32;
> if (t < c) {
> c = (x >> 19) + 1;
> t = (t + x) & M32;
> } else {
> t = (t + x) & M32;
> c = (x >> 19) + (t < x);
> }
> return (Q[j] = t);
>
> }
>
> void initMWC(void)
> {
> unsigned long i;
> for (i = 0; i < sizeof Q / sizeof Q[0]; i++)
> Q[i] = ((xcng = 69069 * xcng + 123) + (xs = (xs ^ (xs << 13)) & M32,
> xs ^= (xs >> 17), xs ^= (xs << 5))) & M32;
>
> }
>
> #ifdef UNIT_TEST
>
> #include <stdio.h>
>
> int main()
> {
> unsigned long i;
> unsigned long x;
>
> initMWC();
>
> printf("Does MWC result=3740121002 ?\n");
> for (i = 0; i < 1000000000; i++)
> x = MWC();
> printf("%27lu\n", x);
>
> printf("Does KISS result=2224631993 ?\n");
> for (i = 0; i < 1000000000; i++)
> x = (MWC() + (xcng = 69069 * xcng + 123) + (xs = (xs ^ (xs << 13)) & M32,
> xs ^= (xs >> 17), xs ^= (xs << 5))) & M32;
> printf("%27lu\n", x);
> return 0;
> }
>
> #endif
>
>
>
--
Ian Collins
Because, in C at least, it's not mandatory for a conforming
implementation to support any of the fixed-width types.
--
James Kuyper
But it's hardly rocket science to declare them for implementations that
lack them.
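For instance, a sketch for a pre-C99 compiler that lacks <stdint.h>, assuming
the target has an exactly-32-bit unsigned type at all:

#include <limits.h>

#if UINT_MAX == 0xffffffffUL
typedef unsigned int uint32_t;
#elif ULONG_MAX == 0xffffffffUL
typedef unsigned long uint32_t;
#else
#error "no exactly 32-bit unsigned type available"
#endif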
--
Ian Collins
More accurately, it's generally impossible to declare them for
implementations that lack them. If the implementation chooses not to
provide a particular fixed-width type, it's generally because it's not
possible to do so on that platform; less frequently, it's because it
would be inconvenient to support them.
--
James Kuyper
(snip, someone wrote)
>> Because, in C at least, it's not mandatory for a conforming
>> implementation to support any of the fixed-width types.
> But it's hardly rocket science to declare them for implementations that
> lack them.
With the assumption that the fixed-width is one implemented on
the host machine. Neither Fortran nor C require a machine to
support a 32 bit type. (Java does.)
-- glen
In my case, I work with a lot of compilers not supporting C99. The code is thus
more portable to existing applications. Standard C doesn't guarantee the
existence of a 32-bit integer type.
--
Thad
As those of us with significant experience of portability will
remember, building fixed sizes into a program is a cardinal
error. Sooner or later, you will want to run that code on a
system with different sizes, and it won't work. The vast
majority of clean programs, even in C90, needed NO source
changes to go from 32 to 64 bits; I have written code that was
moved to systems with 16, 24, 32, 36, 48, 60, 64 and perhaps
other sizes, and so have many other people.
I accept that Marsaglia's generator is a valid case for using
fixed-width types, but that is likely to make it very inefficient
indeed on a system where they are not natural. Masking is a
far more portably efficient approach.
Regards,
Nick Maclaren.
Most of your posts are such a waste of bandwidth...
Back into the bit bucket you go. No soup for you!!
> You might not have liked George's programming style,
> but there is no suggestion that he was losing it.
If no one has suggested it, then I do so herewith. What proof do you
need? (That the guy died?)
I'm not a stranger to numbers or C. It's not "potentially top-ranked"
so much as "rank."
I apologize to those I might offend with such rhetoric. His
acknowledged contributions to computer science live on.
- -
Uno
> Making CHAR_BIT==64 would have made much more sense for the machine
> itself in isolation, but it would have made interoperability with
> other systems quite difficult. It ran Unicos, Cray's BSDish Unix;
> I'm not sure you can even implement Unix with bytes bigger than
> 8 bits.
Could tack on a satellite machine for interoperability
the way ARPANET was first configured. Each big machine
at Illinois, etc. had a Burroughs (6600?) that actually
did the networking.
--
Michael Press
So Ian,
I wouldn't mind hearing your thoughts about corruption and empires. Are
they the same thing?
But to be topical, I thought I might re-write george in fortran, using
the ISO_C_BINDING.
Attended the first funeral after my dad's today.
--
Uno
Oh my goodness, you can't re-write george in these here united states.
--
mz
| As those of us with significant experience of portability will
| remember, building fixed sizes into a program is a cardinal
| error. Sooner or later, you will want to run that code on a
| system with different sizes, and it won't work.
That's because FORTRAN was designed around a word machine,
where the float precisions offered were either single or double precision.
That could cause different actions when moving from a 32-bit word machine
to a 60- or 64-bit word machine.
Later languages, including PL/I, were designed around
byte machines. The way in which float precisions are specified
lends itself to portability. Thus, asking for, say, 12 decimal digits
got you about that number of digits, whether the machine offered
32-bit or 60-bit words. Some machines actually offered decimal
(BCD) floats in hardware,* and asking for 12 decimals got you about
that number of digits.
| The vast
| majority of clean programs, even in C90, needed NO source
| changes to go from 32 to 64 bits; I have written code that was
| moved to systems with 16, 24, 32, 36, 48, 60, 64 and perhaps
| other sizes, and so have many other people.
Indeed. (Including doing portable character handling before F77.)
| I accept that Marsaglia's generator is a valid case for using
| fixed-width types, but that is likely to make it very inefficient
| indeed on a system where they are not natural.
I very much doubt that any new machine is going to offer less than 32-bit
integer words; for some years now the trend has been to machines with
64-bit integers. Marsaglia's (integer) algorithms will run on such machines,
some possibly with minor alterations, and will not, even if modified,
run "inefficiently".
| Masking is a
| far more portably efficient approach.
_______________
* some still do, in an updated version of decimal.
> | I accept that Marsaglia's generator is a valid case for using
> | fixed-width types, but that is likely to make it very inefficient
> | indeed on a system where they are not natural.
>
> I very much doubt that any new machine is going to offer less than 32 bit
> integer words ; for some years now the trend has been to machines of
> 64-bit integers.
Not sure about the kind and specifics of technical systems in which
number generation plays a part.
But in case it does matter in computers embedded in other
systems: the number of <32bit microcontrollers exceeds
the number of >=32bit processors built into PCs and
mobile computers (including "cellphones").
According to Wikipedia, counting all CPUs sold, even the share
of 8bit µcontrollers is more than a half. (They give sources for
the numbers.)
TTBOMK, it is very common that C compilers for 8bit
µcontrollers will offer int as a 16bit type.
Looks like ISP server's malfunctioned again.
In fact, I believe the C standard says that "int" must be at least 16
bits.
--
Ilmari Karonen
To reply by e-mail, please replace ".invalid" with ".net" in address.
Wikipedia is not a reliable source.
> "Georg Bauhaus" <rm.dash...@futureapps.de> wrote in message
> news:4dda09ca$0$6629$9b4e...@newsspool2.arcor-online.net...
> |
> | According to Wikipedia, counting all CPUs sold, even the share
> | of 8bit µcontrollers is more than a half. (They give sources for
> | the numbers.)
>
> Wikipedia is not a reliable source.
It's still right. A coffee machine or toaster doesn't necessarily need
a 32-bit processor, so there are a hell of a lot of 8-bit micros still being
sold these days.
Vinzent.
--
f u cn rd ths, u cn gt a gd jb n cmptr prgrmmng.
They list their references, you didn't.
Or as Vinzent explains, it takes just a moment of reflection to
learn about the number of devices and parts controlled
by µcontrollers whose words are sized < 32bit.
Whilst Wikipedia is not accurate, all the other figures I have seen*
showed that up until about 5 years ago 1 in 3 MCUs on the planet were
the 8-bit 8051 types. Remember at one time every PC had at least two
8051s in it.
* Various sets of non public figures on industry surveys, also talking
to various silicon companies.
When the ARM parts arrived they hardly touched the 8 bit market but did
squeeze the 16 bit market. The 16 bit market was small compared to the
8 and 4 bit markets.
Yes there were (and still are) a lot of 4 bit micros used. AFAIK the 4
bit market is not small as all sorts of things like toys use them.
However it is not growing.
The 8 bit market is declining but that is only very recently with the
advent of cheap cortex parts. It is still large and the installed base
of parts is enormous.
--
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
\/\/\/\/\ Chris Hills Staffs England /\/\/\/\/
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/
I've lost count of the number of times I've corrected incorrect
information in Wikipedia.
"But you would say that" :-)
And then someone else "corrects" your stuff..... I personally have
seen a case where someone went back to "correct" a page they had written,
after the originator of the work, on finding the wiki page, had changed a
lot of it to be accurate.
So even when someone who really knows corrects the Wiki there is no
guarantee that some idiot will not change it again.
Some companies, religions, pressure groups and political groups spend time
"correcting" wiki pages to reflect their views and "repairing" damage
caused by vandals.... ie other companies, religions, pressure groups and
political groups spending time "correcting" wiki pages to reflect their
views.
Wikipedia is certainly not the most reliable source in the world, but
personally, I've found Wikipedia to be substantially more reliable than
any other information source that is comparably easy to access.
However, since that Wikipedia article (like most such articles) cites
sources for its numbers, the reliability of Wikipedia is irrelevant;
what matters is the reliability of those sources. Have you anything
useful to say about that?
It's only worthwhile pointing out the unreliability of wikipedia if you
can identify a more reliable source.
--
James Kuyper
Then go ahead and correct it: what do you claim to be the correct
numbers, and what are your sources for those numbers?
--
James Kuyper
That is probably true... which is why people are manipulating it.
>It's only worthwhile pointing out the unreliability of wikipedia if you
>can identify a more reliable source.
That is not true. Unreliable information should be removed if it is
wrong. Even if you don't have anything better. A blank screen is better
than an incorrect or misleading one.
We are drowning in a sea of unreliable information and complete
fabrications that people believe are true because no one challenges them
and people want easy access.
Are you saying that sources (about analysis of number of
processors sold) referenced on the current page are incorrect
and should not be referred to?
If you are justified in your belief that something is wrong, you will
have an alternative source that you consider more reliable. If so, you
should cite it; without such a citation, other people cannot judge the
accuracy of your belief that it is, in fact, a more reliable source.
--
James Kuyper
Not always. Also, in many cases it is not information that can be put on a
public web page. It might surprise you that in the information age
information is power and a lot of it is NOT in the public domain.
There is a very stupid belief these days that if it is not on the
Internet it is not real. So if you can't provide a link it is not
real.... I was discussing something similar with a friend who was there at
the start of the Internet and was discussing this in a forum. When
challenged for links to prove what he said (him saying "I was there"
did not count) he replied with "two filing cabinets beside my desk".
> If so, you
>should cite it; without such a citation, other people cannot judge the
>accuracy of your belief that it is, in fact, a more reliable source.
So if I write some complete crap on a wiki page (with no citations) it
should stand unless someone has citations to prove otherwise?
What you are saying is that any old rubbish can go on wiki unless someone
has the time and resources (ie money) to maintain the page and put
something else up?
Besides, often you have to be prepared to battle nutters and zealots who
won't accept reality. Why should I spend time and effort on that?
No wonder wiki is a mess. Yes a lot of it is good and accurate but a
hell of a lot is a mess.
If Wiki is not correct then it is wrong
Eventually some of these groups get black-listed. Last I heard the
Scientologists were locked out from updating anything having to do with
Scientology.
> "Georg Bauhaus" <rm.dash...@futureapps.de> wrote in message
> news:4dda09ca$0$6629$9b4e...@newsspool2.arcor-online.net...
> |
> | According to Wikipedia, counting all CPUs sold, even the share
> | of 8bit µcontrollers is more than a half. (They give sources for
> | the numbers.)
>
> Wikipedia is not a reliable source.
It's as reliable as any encyclopedia. What it isn't, is original
research -- and it doesn't pretend to be.
Do you have a more reliable source that says otherwise?
That is the problem we face... stupidity like that.
Wiki is very far from being as reliable as any other encyclopaedia.
I know someone who has written a page entry for the Encyclopaedia
Britannica. He is a world expert in the subject, which is why he was
asked to do it. When he finished the item it was peer reviewed by other
world-class experts. It is like that for all their entries. The same
with most other encyclopaedias. They take a lot of care.
That sort of level of care does not go into wiki pages. Anyone can write
anything on any page. There was an experiment done 3-4 years ago to see
if it was possible to get ridiculous changes past the page editors. It was
so successful that after owning up some of the changes were not reversed
until the experimenters re-edited the pages themselves.
So apart from the usual mistakes, and the authors being anything but
experts, there are those with differing views counter-editing and of
course malicious editing. You don't get these problems in other
encyclopaedias.
In short, due to the openness of the wiki, it is far less reliable than any
other encyclopaedia, because no one is responsible in any meaningful way
for what is on Wikipedia.
The only study I've seen showed an error rate for wikipedia roughly 30%
higher (per article) than britannica, while a follow-on pointed out that
the articles themselves were 2.6 times longer.
Yes, it's possible to deliberately "game" wikipedia in order to prove a
point, and politicians and others who have an interest in the content of
articles vandalize them. The number of 8-bit processors doesn't come
into these categories, and besides (as has been pointed out by others)
the wikipedia article cites sources.
So, rather than blanket statements regarding the unreliability of wikis,
or anecdotes regarding your friend the expert, are you claiming the
wikipedia author mis-quoted the statistics? Or do you have a "more
reliable" source that says says something different?
A few years back I had written that the Wikipedia page
on Brownian motion was about as good as what I'd expect from
Encyclopaedia Britannica. Unfortunately, I hadn't looked at
what Encyclopaedia Britannica had on Brownian motion.
That was at a time when Encyclopaedia Britannica was a more accurate reference
than Wikipedia. It would be interesting to find those old web pages,
if they still exist.
David Bernier
--
The MegaPenny Project | One Trillion Pennies:
<http://www.kokogiak.com/megapenny/thirteen.asp>
That is the key point that nobody else has mentioned. Material on
wikipedia must be sourced. You should not edit just because you "know"
it is wrong. You must have a source. If you can show from another source
that the original source is incorrect, that is fine. Not using sources is
called "original research" on wikipedia and it is not allowed under the
guidelines. It is why wikipedia is actually so much better than you
might expect it to be.
--
Brian Salter-Duke Melbourne, Australia
My real address is b_duke(AT)bigpond(DOT)net(DOT)au
Use this for reply or followup
"A good programmer can write Fortran in any language"
A real encyclopedia.
If so, please provide a counter-example. If it happens often enough to
justify bothering to mention that possibility, it shouldn't be hard to
come up with one.
> ... Also in many cases not information that can be put on a
> public web page. It might surprise you that in the information age
> information is power and a lot of it is NOT in the public domain.
One of the costs of secrecy is that people reach incorrect conclusions
and make bad decisions based upon the absence of the information that's
been kept secret. That's not their fault, it's the fault of the secret
keepers, and in an ideal world the secret keepers would be held liable
for the costs of those badly made decisions.
The existence of secrets is not adequate justification for criticizing
Wikipedia; it makes no claim to being able to penetrate people's secrets
- that's Wikileaks you're thinking of.
> There is a very stupid belief these days that if it is not on the
> Internet it is not real. So if you can't provide a link it is not
Who said anything about a link? I just asked for a citation. You
remember those - they predate the Internet; they predate the invention
of electronics; they predate the invention of the printing press.
> real.... I was discussing something similar with a friend who was at
> the start or the Internet and was discussing this in a forum. When
> challenged for links to prove what he said (as him saying "I was there
> did not count") he replied with "two filing cabinets beside my desk".
A citation that cannot be checked by the person you're communicating
with is useless; if such a citation is the only reason you can give for
believing something, the other person is well justified in being
skeptical about it. You might be right, but you've not given him
adequate justification to believe you.
>> If so, you
>> should cite it; without such a citation, other people cannot judge the
>> accuracy of your belief that it is, in fact, a more reliable source.
>
> SO if I write some complete crap on a wiki page (with no citations) it
> should stand unless some one has citations to prove otherwise?
How did you reach such a stupid conclusion? There's not even the
remotest connection between what I said and your response. Wikipedia's
standards require citations; the editors do clean up wiki pages that
lack them; and the particular page currently under discussion had citations.
> What you are saying is that any old rubbish can go on wiki unless some
> one has the time and resources (ie money) maintain the page to put
> something else up?
Again, that comment has no logical connection to anything which I said,
which was about a wiki page which did have citations, just as most of them do.
> Besides often you have to be prepared to battle nutters and zealots who
> won't accept reality. Why should I spend time and effort on that?
That's a different matter; I've never bothered fixing a wiki page, so I
could hardly criticize someone else for failing to do so. On the other
hand, I've recognized very few errors on those pages. This is partly
because I use Wikipedia mainly to look up things I don't know about.
However, I've also frequently looked at Wikipedia pages covering topics
I'm an expert in; I've seldom seen any defect in any of those pages that
was serious enough that I'd want to bother correcting it, even if I had
endless free time. The worst cases I've seen are pages that were clearly
written by non-native speakers of English, and even those were far
cleaner than the typical message I receive from co-workers and in-laws
who aren't native speakers of English.
> If Wiki is not correct then it is wrong.
Of course it's not correct. No significant repository of knowledge is
free from errors. It's just a question of how many, and what type. If
you expect perfection, you're dreaming. If you consider a source of
information unusable solely because it has errors, without
quantification of those errors, there aren't any usable sources.
--
James Kuyper
Wikipedia's quite real, it's just not on paper.
Which dead-tree encyclopedia has an up-to-date article covering this
topic, and what does it say about the question?
--
James Kuyper
He's been asked several times now whether he has a more reliable source,
and he hasn't bothered responding to those questions. That's pretty
strong evidence that he doesn't have any, though it's not conclusive.
--
James Kuyper
> "Joe Pfeiffer" <pfei...@cs.nmsu.edu> wrote in message news:1bei2e5...@snowball.wb.pfeifferfamily.net...
> | "robin" <rob...@dodo.mapson.com.au> writes:
> |
> | > "Georg Bauhaus" <rm.dash...@futureapps.de> wrote in message
> | > news:4dda09ca$0$6629$9b4e...@newsspool2.arcor-online.net...
> | > |
> | > | According to Wikipedia, counting all CPUs sold, even the share
> | > | of 8bit µcontrollers is more than a half. (They give sources for
> | > | the numbers.)
> | >
> | > Wikipedia is not a reliable source.
> |
> | It's as reliable as any encyclopedia. What it isn't, is original
> | research -- and it doesn't pretend to be.
> |
> | Do you have a more reliable source that says otherwise?
>
> A real encyclopedia.
Remember the "says otherwise" part. Which "real" encyclopedia claims
less than all processors are 8-bit?
That seems like a pretty safe bet. I just get really annoyed by people
who respond to wikipedia quotes by dismissing it as "unreliable" or "not
authoritative" or any of a dozen other near-synonyms, as though
reliability were a boolean function, and as though any encyclopedia has
ever been useable as a source for... well, just about anything beyond
general information or an initial overview before turning to real
sources, really.
> "Joe Pfeiffer" <pfei...@cs.nmsu.edu> wrote in message news:1bei2e5...@snowball.wb.pfeifferfamily.net...
> | "robin" <rob...@dodo.mapson.com.au> writes:
> |
> | > "Georg Bauhaus" <rm.dash...@futureapps.de> wrote in message
> | > news:4dda09ca$0$6629$9b4e...@newsspool2.arcor-online.net...
> | > |
> | > | According to Wikipedia, counting all CPUs sold, even the share
> | > | of 8bit µcontrollers is more than a half. (They give sources for
> | > | the numbers.)
> | >
> | > Wikipedia is not a reliable source.
> |
> | It's as reliable as any encyclopedia. What it isn't, is original
> | research -- and it doesn't pretend to be.
> |
> | Do you have a more reliable source that says otherwise?
>
> A real encyclopedia.
(assuming "real" == "printed") Which one?
And much more importantly, what does it say about the proportion of
total CPUs sold that are 8 bit microcontrollers?
--
Online waterways route planner | http://canalplan.eu
Plan trips, see photos, check facilities | http://canalplan.org.uk
> In message <iuco82$u6s$1...@dont-email.me>, James Kuyper
> <james...@verizon.net> writes
> >On 06/28/2011 09:03 AM, Chris H wrote:
> >> In message <iuchk8$gha$1...@dont-email.me>, James Kuyper
> >> <james...@verizon.net> writes
> >...
> >>> It's only worthwhile pointing out the unreliability of wikipedia if you
> >>> can identify a more reliable source.
> >>
> >> That is not true. Unreliable information should be removed if it is
> >> wrong.
> >
> >If you are justified in your belief that something is wrong, you will
> >have an alternative source that you consider more reliable.
>
> Not always. Also in many cases not information that can be put on a
> public web page. It might surprise you that in the information age
> information is power and a lot of it is NOT in the public domain.
'Twas ever thus. Look at the trouble it took to get
Christian scripture published in English. Likely
it would never have happened without movable type.
> There is a very stupid belief these days that if it is not on the
> Internet it is not real. So if you can't provide a link it is not
> real.... I was discussing something similar with a friend who was at
> the start or the Internet and was discussing this in a forum. When
> challenged for links to prove what he said (as him saying "I was there
> did not count") he replied with "two filing cabinets beside my desk".
>
> > If so, you
> >should cite it; without such a citation, other people cannot judge the
> >accuracy of your belief that it is, in fact, a more reliable source.
>
> SO if I write some complete crap on a wiki page (with no citations) it
> should stand unless some one has citations to prove otherwise?
Else how are we to know it is complete crap?
> What you are saying is that any old rubbish can go on wiki unless some
> one has the time and resources (ie money) maintain the page to put
> something else up?
As I understand it, Wikipedia demands references
to peer reviewed literature
> Besides often you have to be prepared to battle nutters and zealots who
> won't accept reality. Why should I spend time and effort on that?
Agree. Battling nutters and zealots is usually a poor use of our time.
> No wonder wiki is a mess.
Disagree. It is as orderly as, or more so than, many public forums.
> Yes a lot of it is good ad accurate but a
> hell of a lot is a mess.
>
>
> If Wiki is not correct then it is wrong
--
Michael Press
>>>> Wikipedia is not a reliable source.
[...]
>>> Do you have a more reliable source that says otherwise?
>> A real encyclopedia.
> Wikipedia's quite real, it's just not on paper.
Printed versions are also available, not to mention that one can
always print it him- or herself.
> Which dead-tree encyclopedia has an up-to-date article covering this
> topic, and what does it say about the question?
... The problem is that it's not as hard to have some kind of
utter rubbish published as someone may imagine. Actually, I
guess that more than half of the stuff printed nowadays is
anything but reliable.
Wikipedia has its policies, just as virtually any other serious
(for certain definitions of the word) encyclopedia. My belief
is that these policies work. Most of the time, at the least.
--
FSF associate member #7257
I believe that you meant to write "... less than half of all ..."?
--
James Kuyper
Encyclopediae have their own set of problems. The data is often
outdated, and they don't have the breadth of coverage something like
Wikipedia can have.
And when was the article written?
On the other hand, the original is definitely a true statement;-)
Well, I used to work for a top-10 semiconductor manufacturer and there,
by number of units, microcontrollers easily outsold 32-bit CPUs.
Even if every moderately advanced phone has at least 3 32-bit ARM
cores on it (many have 7 or more). The number of microcontrollers in
automotive applications absolutely dwarfs that. I know we were
supplying 30 processors for some vehicles, and I know we weren't the
only supplier.
Every time they make things like microwaves more hi-tech so that they
need a 16- or 32-bit processor, they find something even dumber to
put 8-bit microcontrollers into.
Phil
--
"At least you know where you are with Microsoft."
"True. I just wish I'd brought a paddle." -- Matthew Vernon
Au contraire - he responded. In the same way that a dead frog's
leg responds to a galvanic impulse.
Most mobile phone SIMs are (or were) 8051 types. I know I worked on
them.
> The number of microcontrollers in
>automative applications absolutely dwarfs that. I know we were
>supplying 30 processors for some vehicles, and I know we weren't the
>only supplier.
>
>Every time they make things like microwaves more hi-tech so that they
>need a 16- or 32-bit processor, they find something even dumber to
>put 8-bit microcontrollers into.
Also most of the smart cards (ie credit, debit charge etc) with a chip
in were 8051 types.
It will change as the cheap low power 32bit platforms gain market share
but there are already billions of 8 bit systems out there.
--
Support Sarah Palin for the next US President
Go Palin! Go Palin! Go Palin!
In God We Trust! Rapture Ready!!!
http://www.sarahpac.com/
> Dr Nick <3-no...@temporary-address.org.uk> writes:
>> "robin" <rob...@dodo.mapson.com.au> writes:
>> > "Joe Pfeiffer" <pfei...@cs.nmsu.edu> wrote in message news:1bei2e5...@snowball.wb.pfeifferfamily.net...
>> > | "robin" <rob...@dodo.mapson.com.au> writes:
>> > |
>> > | > "Georg Bauhaus" <rm.dash...@futureapps.de> wrote in message
>> > | > news:4dda09ca$0$6629$9b4e...@newsspool2.arcor-online.net...
>> > | > |
>> > | > | According to Wikipedia, counting all CPUs sold, even the share
>> > | > | of 8bit µcontrollers is more than a half. (They give sources for
>> > | > | the numbers.)
>> > | >
>> > | > Wikipedia is not a reliable source.
>> > |
>> > | It's as reliable as any encyclopedia. What it isn't, is original
>> > | research -- and it doesn't pretend to be.
>> > |
>> > | Do you have a more reliable source that says otherwise?
>> >
>> > A real encyclopedia.
>>
>> (assuming "real" == "printed") Which one?
>>
>> And much more importantly, what does it say about the proportion of
>> total CPUs sold that are 8 bit microcontrollers?
>
> Well, I used to work for a top-10 semiconductor manufacturer and there,
> by number of units, microcontrollers easily outsold 32-bit CPUs.
>
> Even if every moderately advanced phone has at least 3 32-bit ARM
> cores on it (many have 7 or more). The number of microcontrollers in
> automative applications absolutely dwarfs that. I know we were
> supplying 30 processors for some vehicles, and I know we weren't the
> only supplier.
>
> Every time they make things like microwaves more hi-tech so that they
> need a 16- or 32-bit processor, they find something even dumber to
> put 8-bit microcontrollers into.
Just for the record, I have no dog in the fight about the proportion of
microprocessors. I'm just responding to the point that "Wikipedia is
not a reliable source" and "a real encyclopedia" is with a gentle
request for what a real encyclopedia says on the subject.
So it is unreliable.
>It's a volunteer effort,
Yes anyone can write any old rubbish to a wiki page.
>so if you have expertise feel free to improve
>the situation.
No thanks.
>Wiki is certainly more reliable than the mainstream
>news media.
And less reliable than encyclopaedias.
>Is it better or worse than printed encyclopedias? Yes. Some of them
>have errors as egregious as those in Wiki.
But generally better and not as open to abuse.
The same things that make printed encyclopaedias not as open to abuse,
make them not open to correction.
--
When a true genius appears in the world, you may know him by
this sign, that the dunces are all in confederacy against him.
Jonathan Swift: Thoughts on Various Subjects, Moral and Diverting