
Computer of the century


John Hendrickx (Dec 21, 1999)

Bohemian Rhapsody was voted record of the century (in a Dutch poll), the
T-Ford is car of the century, Time magazine has been polling readers
about persons of the century in various fields for some months now. But
what was the computer of the century?

A quick AltaVista search turned up two quotes: one naming the eMate
computer of the century, the other "William B. Sanders, Computer of
the Century: A Guide to the Radio Shack Model 100 (Chatsworth, CA:
Datamost, Inc., 1984)." The former is hype, the latter was premature. An
obvious choice would be the IBM 360. It brought computing to businesses,
for better or for worse, and gave IBM a virtual monopoly during the 60s
and 70s. Or does some other computer deserve this distinction? And have
there been any polls for something that has only been around for half a
century?

gnohm...@my-deja.com (Dec 21, 1999)

Apple II. The historic starting point of home computing, an incredible piece of
engineering, and an incredible commercial success. Without the Apple II,
there'd be no TRS-80 nor IBM PC.

The 360 isn't such a bad choice, but the Apple II is the Model T of computing.


Sent via Deja.com http://www.deja.com/
Before you buy.

John Birch (Dec 21, 1999)

On Tue, 21 Dec 1999 12:38:44 +0100, John Hendrickx
<J.Hen...@mailbox.kun.nl> wrote:

>Bohemian Rhapsody was voted record of the century (in a Dutch poll), the
>T-Ford is car of the century, Time magazine has been polling readers
>about persons of the century in various fields for some months now. But
>what was the computer of the century?

Hmmm.... I'm a Queen fan too :-)

>A quick AltaVista search turned up two quotes, one for the e-mate as
>computer of the century, the second of "William B. Sanders. Computer of
>the Century: A Guide to the Radio Shack Model 100 . (Chatsworth, CA. :
>Datamost, Inc.) 1984." The former is hype, the latter was premature. An
>obvious choice would be the IBM 360. It brought computing to businesses,
>for better or for worse, and gave IBM a virtual monopoly during the 60s
>and 70s. Or does some other computer deserve this distinction? And have
>there been any polls for something that has only been around for half a
>century?

Surely the Sinclair ZX80/1 should be in the running for the impact it
had on personal computing; so many people got their first experience
of a computer with one of them. 'Course there were a number of other
also-rans, but I'd propose a Sinclair.....


regards John B.


John Ahlstrom (Dec 21, 1999)

John Hendrickx wrote:

> But what was the computer of the century?

--snip snip

> An obvious choice would be the IBM 360. It brought computing to businesses,
> for better or for worse, and gave IBM a virtual monopoly during the 60s
> and 70s. Or does some other computer deserve this distinction?

No one can ignore the commercial success of the 360. I think on
technological grounds, however, the B5000 might be a better choice
as the first with a number of important characteristics that then
became almost universal, to wit:
1. Commercial and Scientific data types and instructions
2. Multiprocessing
3. Protected Multi-tasking OS
4. System and User modes of operation
5. Floating IO/Processors/Channels
6. Economical high-level-language programming
7. HLL Operating System
8. Commercial Virtual Memory - Atlas beat it by some months
but was never commercially successful

--
Will we be reviled in the 9990s for not having had the
foresight to use 5 digit dates? We know the need for them
is coming. We pretend none of our apps or data bases will
still be in use by then.

Dave Hansen (Dec 21, 1999)

On Tue, 21 Dec 1999 12:38:44 +0100, John Hendrickx
<J.Hen...@mailbox.kun.nl> wrote:

>Bohemian Rhapsody was voted record of the century (in a Dutch poll), the

Record or cut? "Bohemian Rhapsody" isn't a record, it's a single. It
was on the "A Night at the Opera" album. Not a horrible choice, but
I'm sure I'm not alone in disagreeing that it is "the (song) of the
century." For album, I'd nominate the Beatles' "Sgt. Pepper". The
single is tougher, but it'd have to be earlier -- probably one of
Elvis'. You can see where my prejudices lie...

>T-Ford is car of the century, Time magazine has been polling readers
>about persons of the century in various fields for some months now. But
>what was the computer of the century?

I'd have to say the IBM PC. It wasn't the first. It wasn't the best.
But it was the one that changed everything. For better or for worse.

Regards,

-=Dave
Just my (10-010) cents
I can barely speak for myself, so I certainly can't speak for B-Tree.
Change is inevitable. Progress is not.

Jonesy (Dec 21, 1999)

John Hendrickx <J.Hen...@mailbox.kun.nl> wrote:
> Bohemian Rhapsody was voted record of the century (in a Dutch poll), the
> T-Ford is car of the century, Time magazine has been polling readers
> about persons of the century in various fields for some months now. But
> what was the computer of the century?

I'll give it some thought over the next 376 days and get back to you.

Jonesy

--
Marvin L. Jones jonz-AT-rmi.net W3DHJ
Gunnison, Colorado
10 days to go until the Year 2000 -- So what!
376 days to go until the 3rd Millennium of the C.E.
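Jonesy's countdown is easy to check with Python's standard `datetime`; the dates here are taken from his signature. Note that `timedelta` counts the posting day itself, so the results come out one higher than his "10 days" and "376 days" figures, which apparently count from the following day:

```python
from datetime import date

posted = date(1999, 12, 21)      # date of the post
y2000 = date(2000, 1, 1)         # the Year 2000
millennium = date(2001, 1, 1)    # 3rd Millennium of the C.E.

print((y2000 - posted).days)       # 11
print((millennium - posted).days)  # 377 (2000 is a leap year: 11 + 366)
```

Either way, the millennium was indeed just over a year off when this was posted.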

David Given (Dec 21, 1999)

In article <385f8fe8...@news.i12.com>,
jo...@invision.co.uk (John Birch) writes:
[...]

> Surely the Sinclair ZX80/1 should be in the running for the impact it
> had on personal computing, so many people got their first experience
> of a computer with one of them. 'course there were a number of other
> also rans but I'd propose a Sinclair.....

I'd agree with this; it would have to be either a Sinclair (the ZX8[01] or
the Spectrum), or the Apple II. Those machines changed the face of
computing. Before then, computers were big, bulky, expensive machines
hidden away in a secret temple, only accessible by their acolytes; after
then, computers were consumer items.

I'd say that without the 8-bit microcomputer, the modern PC would be
completely different; and without the ZX8[01] and the Apple, the 8-bit
microcomputer market would have been unrecognisable.

(Aren't there a bunch of nutters somewhere recreating a ZX80 from scratch
with modern components?)

--
+- David Given ---------------McQ-+
| Work: d...@tao-group.com | Closed mouths gather no feet.
| Play: dgi...@iname.com |
+- http://wired.st-and.ac.uk/~dg -+

lwin (Dec 21, 1999)

My vote would be for S/360 as it revolutionized commercial computing.
It did not result in a computer being on everyone's desk, but it did
bring computers to the business mainstream and allowed for more people
to have contact with them than before.

The system design was revolutionary for its day--one universal
architecture serving high and low, business and science, and a wide
variety of peripherals. This led to a great deal of standardization
and mass production. The standardization allowed many independent
vendors to get in the business by developing alternative peripherals
and software. And the architecture remains in use today, 35 years
later.

Brix (Dec 21, 1999)

In article <385f8fe8...@news.i12.com>, John Birch says...
> On Tue, 21 Dec 1999 12:38:44 +0100, John Hendrickx
> <J.Hen...@mailbox.kun.nl> wrote:
>
> >Bohemian Rhapsody was voted record of the century (in a Dutch poll), the
> >T-Ford is car of the century, Time magazine has been polling readers
> >about persons of the century in various fields for some months now. But
> >what was the computer of the century?
>
> Surely the Sinclair ZX80/1 should be in the running for the impact it
> had on personal computing, so many people got their first experience
> of a computer with one of them. 'course there were a number of other
> also rans but I'd propose a Sinclair.....

I'd suggest 2 of them:
1) the IBM PC, because IBM-Compatibles rule today's world (sadly).
2) The C64. It was the most successful computer ever built by a single
manufacturer.

-Wanja-

--
In the 1960s you needed the power of two C64s to get a
rocket to the moon. Now you need a machine which is a vast
number of times more powerful just to run the most popular GUI.

Personal HP: http://www.plush.de/brix

GO64!/Commodore World Magazine:
CSW-Verlag, Goethestr.22, 71364 Winnenden, Deutschland
Tel:/Fax: +49(0)7195/61120, http://www.go64.c64.org
--

wri...@sabu.ebay.sun.com (Dec 21, 1999)

Has to be the IBM PC.

Sabu

mstrjon32 (Dec 21, 1999)

I think the Apple Lisa would fit the description; it was the first computer to
show the world the graphical user interface, what we all use today...


John Hendrickx <J.Hen...@mailbox.kun.nl> wrote in message
news:MPG.12c98531d...@news.uci.kun.nl...
> Bohemian Rhapsody was voted record of the century (in a Dutch poll), the
> T-Ford is car of the century, Time magazine has been polling readers
> about persons of the century in various fields for some months now. But
> what was the computer of the century?
>
> A quick AltaVista search turned up two quotes, one for the e-mate as
> computer of the century, the second of "William B. Sanders. Computer of
> the Century: A Guide to the Radio Shack Model 100 . (Chatsworth, CA. :
> Datamost, Inc.) 1984." The former is hype, the latter was premature. An
> obvious choice would be the IBM 360. It brought computing to businesses,

Dennis O'Connor (Dec 21, 1999)

John Hendrickx wrote:
> But what was the computer of the century?

The "bombes" the British used to crack Enigma.
Arguably the computers that saved the world,
and the beginning of the computer industry.
--
Dennis O'Connor dm...@primenet.com
Vanity Web Page http://www.primenet.com/~dmoc/

Dave Daniels (Dec 21, 1999)

In article <83oeck$2ee$1...@nnrp1.deja.com>,
<gnohm...@my-deja.com> wrote:
> The 360 isn't such a bad choice, but the Apple II is the model-T of computing.

The Apple II might hold that position in the US, but from the UK
perspective I agree with the people who suggest the ZX80 and ZX81.

The ZX81 was a usable computer that cost under 100 UKP. IIRC the
kit version was just under 50 UKP and an extra 16K of RAM (the
infamous wobbly 16K RAM pack) cost the same. The documentation was
comprehensive and included things such as a list of Z80 opcodes
that was invaluable to a generation of budding computer geeks.
Before the ZX machines you either paid several hundred quid for a
machine or ended up with something that had a hex keypad and an
eight-segment LED display. The ZX machines were on the leading
edge of the UK microcomputer boom of the early eighties and IMHO
did an enormous amount to promote and encourage it.

Dave Daniels

--
ANTISPAM: Please note that the email address above is false. My
correct address is:

dave_daniels<at>argonet<dot>co<dot>uk

Please replace the <at> and <dot>s with @ and . respectively when
replying - Thanks!

we...@nospam.mediaone.net (Dec 22, 1999)

John Hendrickx <J.Hen...@mailbox.kun.nl> writes:

>Bohemian Rhapsody was voted record of the century (in a Dutch poll), the
>T-Ford is car of the century, Time magazine has been polling readers
>about persons of the century in various fields for some months now. But
>what was the computer of the century?

DEC PDP-10. Great hardware, great software, wonderful to program. When
it was time to move on, people did, but they never forgot where they had
been.

From http://www.inwap.com/pdp10/uta-shutdown.txt and elsewhere:

THE SOUL OF AN OLD MACHINE

Clive B. Dawson

(C) 1984


I started work for the University Computation Center in 1975 as a systems
programmer for the DEC-10, just a couple of months after it arrived on
campus. My previous experience with a DEC-10 had ended when I graduated
from Stevens Tech. Since then, I had spent four rather painful years doing
graduate work on a CDC 6600 system for which I had to learn how to keypunch
again. I welcomed the arrival of the 10 with the joy of someone being
released from prison. I can't begin to count the hours I spent on that
system--well over 10,000 connect hours--developing software for it, fixing
bugs, and helping users. In turn, it helped produce dissertations for both
my wife and me, and was an endless source of fun and relaxation as well.
(It was also responsible for extending my graduate school career by at
least four years!)

The KI processor had served the campus well for seven and a half years.
Now the user population was drifting over to the two new DEC-20's, and it
was only a matter of time before the rising maintenance costs could no
longer be justified. A flurry of last-minute rescue efforts followed the
announcement that the system would have to be shut down. It seemed
incomprehensible that a perfectly good machine would be removed from
service given the chronic state of saturation common to most computer
systems on most university campuses. The efforts failed, and on October
31, 1982, the DECsystem-10 at the University of Texas at Austin was turned
off for the last time.

The event did not pass without due ceremony. We held a farewell party on
that Halloween Sunday, well attended by current and ex staff members as
well as a few users. Many brought cameras to record a vanishing
breed--they don't make 'em with lights anymore. In one of the stranger
moments we cranked up the PTP: and had paper tape (might as well use it up)
and scratch magtape draped all over the place. The laughing and joking
helped. Many of us on the staff had built up an extremely close-knit group
over the years which had slowly drifted apart as new machines and new
responsibilities came along. This "wake" had a good cathartic effect,
bringing us together at a time when we needed to share feelings that had
hit us harder than we might have cared to admit.

I wondered about the users--all the faceless people scattered throughout
dozens of small offices and terminal rooms throughout the campus--the
complete opposite of our small, close-knit staff. Were they feeling the
same emotions? If so, who could each of them share with?

At home very late that night, I felt the urge to dial up one last time. As
I went through my normal routine of checking mail, the Bboard, and the
various system mailboxes, I discovered something completely unexpected.
During the last few hours users had logged in and sent mail to the bboard
and to other system mailboxes like Operator. The curious thing is that
these people had no way of knowing that anybody would ever be around to
read these messages. They were, in the best way they knew how, sharing
their feelings directly with the machine. Some of the messages are
reproduced here as I found them, with only the senders' names altered.


. From: R. B.
. Subject: Dec10
. To: GRIPE
Farewell DEC10 and thank you!


. From: [4435,244]
. Subject: The death of a friend
. To: Bboard
Goodbye, DEC-10, you've been a great friend and co-worker. I'm
going to miss you for a long time. I feel worse than when they
killed Hal in 2001.


. From: B. J.
. Subject: November the 2 is too late
. To: Bboard
it feels like this is the end of an old friend. who says
computers haven't got any personality?


. From: GVCE333
. Subject: Good-Bye old paint
. To: Bboard
The glue factory beckons... Sigh!


. From: [1276,1]
. Subject: Good-bye, DEC-10
. To: Bboard
As a well-spent day bring happy sleep,
so life well used brings happy death.
Leonardo Da Vinci, 1452-1519
Notebooks [c. 1500]

DEC-10, you've been a good and faithful (for the most part)
servant and companion. Farewell.


. From: BSAB553
. Subject: bye
. To: GRIPE
This is last "bye" to the DEC 10; too bad. I liked the DEC 10
better than the DEC 20. I find it hard to believe that this
system could not have been supported to some extent... So long
forever!


. From: LSDT141
. Subject: Bye
. To: OPERATOR
BYE BYE FAITHFUL FRIEND - THE DEC-10

--
Ric Werme | we...@nospam.mediaone.net
http://people.ne.mediaone.net/werme | ^^^^^^^ delete

Ian Joyner (Dec 22, 1999)

John Ahlstrom wrote:

> John Hendrickx wrote:
>
> > But what was the computer of the century?
>
> --snip snip
>
> > An obvious choice would be the IBM 360. It brought computing to businesses,
> > for better or for worse, and gave IBM a virtual monopoly during the 60s
> > and 70s. Or does some other computer deserve this distinction?
>
> No one can ignore the commercial success of the 360. I think on
> technological grounds, however, the B5000 might be a better choice
> as the first with a number of important characteristics that then
> became almost universal, to wit:

I certainly agree that the B5000 had a far-reaching architecture, which has
evolved and is still present in the A Series. The OS (MCP) is still the best
in the industry, far better than Unix or NT, and is worked on by a handful
of programmers rather than the 1000s needed for these other systems.

The B5000 was given a bad rap by Hennessy and Patterson, who made silly
comparisons with the CDC 6000 at the time. I think these comments are still
in their latest edition. That the B5000 has evolved into a machine 100s or
1000s of times faster shows speed has nothing to do with stack architecture.

In fact the A Series also runs as a virtual machine on top of Intel. Perhaps
Unisys could wake up to the fact that they could yet create the perfect
virtual machine which runs in every browser!

--
Ian Joyner
i.jo...@acm.org Eiffel for Macintosh
http://homepages.tig.com.au/~ijoyner/
http://www.object-tools.com/ot/apple.htm
Objects Unencapsulated -- The book
http://www.prenhall.com/allbooks/ptr_0130142697.html
http://www.elj.com/oue1/ (review)

Zeder (Dec 22, 1999)

In article <83pq80$omn$1...@nnrp02.primenet.com>,
"Dennis O'Connor" <dm...@primenet.com> wrote:
> John Hendrickx wrote:
> > But what was the computer of the century?
>
> The "bombes" the British used to crack Enigma.
> Arguably the computers that saved the world,
> and the beginning of the computer industry.

I second that.

Andrew Paul Cadley (Dec 22, 1999)


On Tue, 21 Dec 1999, John Birch wrote:

> Surely the Sinclair ZX80/1 should be in the running for the impact it
> had on personal computing, so many people got their first experience
> of a computer with one of them. 'course there were a number of other
> also rans but I'd propose a Sinclair.....

Definitely. They were what really brought computing to the masses.

AndyC


Andrew Paul Cadley (Dec 22, 1999)


On Tue, 21 Dec 1999, Dave Hansen wrote:

> On Tue, 21 Dec 1999 12:38:44 +0100, John Hendrickx
> <J.Hen...@mailbox.kun.nl> wrote:
>
> >Bohemian Rhapsody was voted record of the century (in a Dutch poll), the
>
> Record or cut? "Bohemian Rhapsody" isn't a record, it's a single. It
> was on the "A Night at the Opera" album. Not a horrible choice, but
> I'm sure I'm not alone is disagreeing that it is "the (song) of the
> century. For album, I'd nominate the Beatles' "Sgt. Pepper". The
> single is tougher, but it's have to be earlier -- probably one of
> Elvis'. You can see where my prejudices lie...

I'd put my vote next to "Motorcycle Emptiness" by the Manics. Which
quite probably indicates that I'm too young to pick a song of the century.
Not that surprising really, I wasn't alive for a good two thirds of it.
:-)

AndyC


Don Stokes (Dec 22, 1999)

In article <83pq80$omn$1...@nnrp02.primenet.com>,
Dennis O'Connor <dm...@primenet.com> wrote:
> John Hendrickx wrote:
> > But what was the computer of the century?
>
>The "bombes" the British used to crack Enigma.
>Arguably the computers that saved the world,
>and the beginning of the computer industry.

The "bombes" weren't computers by any stretch of the imagination,
merely electro-mechanical searching devices. They contained something
like one logic circuit.

Not to be confused with the Colossus machines, which were much more
advanced and computer-like. These were probably where Turing et al.
figured out how to build the early British computers such as the
Manchester Mk 1 and Pilot ACE.

-- don

Scott Wheeler (Dec 22, 1999)

On Tue, 21 Dec 1999 23:16:29 -0700, "Dennis O'Connor"
<dm...@primenet.com> wrote:

> John Hendrickx wrote:
> > But what was the computer of the century?
>
>The "bombes" the British used to crack Enigma.
>Arguably the computers that saved the world,
>and the beginning of the computer industry.

They weren't computers, though (BTW, I heard recently that they
were originally Polish). I agree in principle: I'd nominate the
Colossus computers that produced the solutions that the bombes ran for
the decoding. Important for the war, they introduced the first computer
use of some technologies such as paper tape, and led to the Manchester
Mk1 and the LEO.

Scott
--
(please de-mung address if replying by email)

Terry C. Shannon (Dec 22, 1999)


"Zeder" <mfo...@dcscorp.com> wrote in message
news:83qlqb$jtl$1...@nnrp1.deja.com...
> In article <83pq80$omn$1...@nnrp02.primenet.com>,
> "Dennis O'Connor" <dm...@primenet.com> wrote:
> > John Hendrickx wrote:
> > > But what was the computer of the century?
> >
> > The "bombes" the British used to crack Enigma.
> > Arguably the computers that saved the world,
> > and the beginning of the computer industry.
>
> I second that.

The "bombes" would have been duds if the Brits hadn't managed to acquire an
Enigma machine to study and analyze.

A couple more nominations for Computer of the Century.

Mass-market Impact: IA32

Out of the Glass House Award: VAX

Mother of all Distributed Systems: SETI@Home

Highest Concentration of Computing Power: NSA's Basement, Ft. Meade,
Maryland

Jan Vorbrueggen (Dec 22, 1999)

"Terry C. Shannon" <sha...@world.std.com> writes:

> Mother of all Distributed Systems: SETI@Home

I'd rather nominate the Internet Routing Algorithm for that. Also, for the
longest running and most robust application - it hasn't been restarted since
when?

Jan

Dave Hansen (Dec 22, 1999)

On Tue, 21 Dec 1999 21:22:57 -0500, "mstrjon32" <mstr...@rcn.com>
wrote:

>i think apple lisa would fit the description, it was the first computer to
>show the world the graphic user interface, what we all use today...
>

Aargh!

Yet another demonstration of what an amazing resource Xerox had at
PARC, and how horribly they wasted it...

Edward Reid (Dec 22, 1999)

On Tue, 21 Dec 1999 11:29:29 -0500, John Ahlstrom wrote

> No one can ignore the commercial success of the 360.

Technically, I believe OS/360 also incorporated major advances in
processing asynchronous interrupts. Part of the reason it was so late
to be delivered.

In marketing, OS/360 was the prototype of all vaporware and still one
of the outstanding examples. Early 360s were delivered with no
operating system.

In jurisprudence, this practice of vaporware was one of the major
factors leading to the interminable antitrust suit against IBM.

So really, the 360 made progress on quite a few fronts. Or perhaps it
was more erosion than progress.

> 8. Commercial Virtual Memory - Atlas beat it by some months
> but was never commercially successful

I thought it was several years: Atlas 1955, B5000 1961. But I don't
have a reference handy, except for an email someone sent me several
years ago giving these dates.

> Will we be reviled in the 9990s for not having had the
> foresight to use 5 digit dates?

"This problem will be addressed in a future paradigm shift."
(attributed iirc to a DEC response to a TR on this issue)

Edward Reid


amoli...@visi-dot-com.com (Dec 22, 1999)

In article <Fn5DE...@world.std.com>,
Terry C. Shannon <sha...@world.std.com> wrote:
>Highest Concentration of Computing Power: NSA's Basement, Ft. Meade,
>Maryland

Isn't this actually "Some Warehouse in Seattle just before
the Christmas Buying Rush On Nintendo 64s" in whatever year the big rush

Terry C. Shannon (Dec 22, 1999)


<amoli...@visi-dot-com.com> wrote in message
news:MK684.12$2O6...@ptah.visi.com...

Nope. No Such Agency operates at a, well, higher ECHELON than does
Amazon.com...

Or so the Spook Secretly SPOKE.

Tim Bradshaw (Dec 22, 1999)

* mstrjon32 wrote:
> i think apple lisa would fit the description, it was the first computer to
> show the world the graphic user interface, what we all use today...

Apart from all the work at Xerox before it, that is.

Jeffrey S. Dutky (Dec 22, 1999)

Edward Reid wrote:
>
> On Tue, 21 Dec 1999 11:29:29 -0500, John Ahlstrom wrote
> > No one can ignore the commercial success of the 360.
>
> Technically, I believe OS/360 also incorporated major
> advances in processing asynchronous interrupts. Part of
> the reason it was so late to be delivered.
>
> In marketing, OS/360 was the prototype of all vaporware
> and still one of the outstanding examples. Early 360s
> were delivered with no operating system.
>
> In jurisprudence, this practice of vaporware was one of
> the major factors leading to the interminable antitrust
> suit against IBM.
>
> So really, the 360 made progress on quite a few fronts.
> Or perhaps it was more erosion than progress.

Let's not forget that Fred Brooks' experiences working
on the OS/360 project led to the seminal text in software
engineering 'The Mythical Man-Month'.

- Jeff Dutky

amoli...@visi-dot-com.com (Dec 22, 1999)

In article <Fn5Hy...@world.std.com>,
Terry C. Shannon <sha...@world.std.com> wrote:
>
><amoli...@visi-dot-com.com> wrote in message
>news:MK684.12$2O6...@ptah.visi.com...
>> In article <Fn5DE...@world.std.com>,
>> Terry C. Shannon <sha...@world.std.com> wrote:
>> >Highest Concentration of Computing Power: NSA's Basement, Ft. Meade,
>> >Maryland
>>
>> Isn't this actually "Some Warehouse in Seattle just before
>> the Christmas Buying Rush On Nintendo 64s" in whatever year the big rush
>
>Nope. No Such Agency operates at a, well, higher ECHELON than does
>Amazon.com...

Not Amazon ;) There was supposedly a warehouse full of N64s
just before Christmas a couple of years ago which represented the
largest concentration of compute power ever. Even if it WAS just
umpteen game consoles in brightly colored boxes, all powered off.

David M. Razler (Dec 22, 1999)

gnohm...@my-deja.com wrote:

| Apple II. The historic stempoint of home computing, an incredible piece of
| engineering, and an incredible commercial success. Withou the Apple II,
| there'd be no trs80 nor ibmpc.
|
| The 360 isn't such a bad choice, but the Apple II is the model-T of computing.

Wellllll,

my votes:

Colossus proved much of computer theory practical for dealing with an
"unsolvable" problem - the 12-rotor encoding machine.

ENIAC was a workhorse and testbed

Whirlwind gave us core, and the concept of large RAM (vs. 1 or 2 registers and
some sort of slow storage)

The LINC, DEC 12-bit machines and all the minis that followed became the
Model-Ts of computing - anyone with a grant could own one. They also made
radically powerful computing available to the average researcher and became
the first full-fledged CAM machines.

Seymour Cray's machines, from CDC to the end, showed us what brute force and
raw speed could do: cutting-edge jets in an era of Model Ts and maybe Buicks.

The Apple I proved anyone could own and use a computer

The IBM PC made the desktop computer "acceptable" to industry

The Xerox PARC team's theoretical and prototype machines and code (the Lisa's
grandparents) made computers accessible to those without special training or
the need to custom-code everything.

Osborne proved it *could* be taken with you.

dmr

David M. Razler
david....@worldnet.att.net

Thomas Jespersen (Dec 22, 1999)

"Dennis O'Connor" <dm...@primenet.com> writes:

> The "bombes" the British used to crack Enigma.
> Arguably the computers that saved the world,
> and the beginning of the computer industry.

Wasn't that machine kept classified for many years? That means it
could not have started the computer industry.

Derek Peschel (Dec 22, 1999)

In article <3bpgOM8gwgehvLnf=OSstp...@4ax.com>,
Scott Wheeler <sco...@bmtech.co.uk_DELETE_THIS> wrote:
>On Tue, 21 Dec 1999 23:16:29 -0700, "Dennis O'Connor"
><dm...@primenet.com> wrote:
>> John Hendrickx wrote:
>> > But what was the computer of the century?
>>
>>The "bombes" the British used to crack Enigma.
>>Arguably the computers that saved the world,
>>and the beginning of the computer industry.
>
>They weren't the computers though (BTW, I heard recently that they
>were originally Polish). I agree in principle: I'd nominate the
>Colossus computers that produced the solutions that the bombes ran for
>the decoding. Important for the war, introduced the first computer use
>of some technologies such as paper tape, and led to the Manchester Mk1
>and the LEO.

As far as I know, the Colossus machines and the Bombes never did related
work. Colossus was used to break the Lorenz (or Fish) cipher; I suppose it
could have been used on Enigma traffic but that would have been uneconomical.
Instead, the preparation of Bombe set-ups was done by hand from guessed
plaintext or other logical thought.

That's the way I understand it, anyway.

You are right about paper tape. I doubt Colossus had much of an influence
on the later machines (even Turing's own at the National Physical
Laboratory) because it was so hugely secret.

-- Derek

Nick Spalding (Dec 22, 1999)

Derek Peschel wrote:

> I doubt Colossus had much of an influence
> on the later machines (even Turing's own at the National Physical
> Laboratory) because it was so hugely secret.

And Turing just wiped all his detailed knowledge of it from his
memory?
--
Nick Spalding

Tony Lima (Dec 22, 1999)

On Tue, 21 Dec 1999 08:29:29 -0800, John Ahlstrom
<jahl...@cisco.com> wrote:

>John Hendrickx wrote:
>
>> But what was the computer of the century?
>
>--snip snip
>
>> An obvious choice would be the IBM 360. It brought computing to businesses,
>> for better or for worse, and gave IBM a virtual monopoly during the 60s
>> and 70s. Or does some other computer deserve this distinction?
>
>No one can ignore the commercial success of the 360. I think on
>technological grounds, however, the B5000 might be a better choice
>as the first with a number of important characteristics that then
>became almost universal, to wit:

[snip]

Second the nomination of the B5000 series, the first true
time-sharing computer. Subject to the caveat that there are
still 53.5 weeks remaining in the century and who knows what
will be invented during that period. - Tony

Alexandre Pechtchanski (Dec 22, 1999)

On Wed, 22 Dec 1999 10:00:21 -0500, Edward Reid <edw...@paleo.org> wrote:
[ comp.sys.unisys dropped from follow-up for obvious reasons ]

>On Tue, 21 Dec 1999 11:29:29 -0500, John Ahlstrom wrote

[ big snip ]


>> Will we be reviled in the 9990s for not having had the
>> foresight to use 5 digit dates?
>
>"This problem will be addressed in a future paradigm shift."
>(attributed iirc to a DEC response to a TR on this issue)

IIRC, the complaint was that the VAX/VMS calendar counted 2000 as a leap year
(as it should), and DEC's response pointed out that the VMS clock will run out
in the year [sorry, don't remember, but many thousand years from now]. The
quoted line dealt with the situation of clock overflow and was prefaced by
congratulating the customer on the lo-o-ong view ;-)

[ When replying, remove *'s from address ]
Alexandre Pechtchanski, Systems Manager, RUH, NY
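
[As a sanity check on the two claims above, here is a minimal sketch. The
leap-year logic is just the standard Gregorian rule; the 100 ns tick size and
the 17-Nov-1858 epoch are the documented VMS conventions, but the overflow
year printed here is only an order-of-magnitude estimate, not a figure quoted
from the SPR itself.]

```python
# Gregorian rule: every 4th year is a leap year, except centuries,
# unless the century is divisible by 400 -- so 2000 IS a leap year.
def is_leap(year: int) -> bool:
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

print(is_leap(1900), is_leap(2000))  # False True

# Order-of-magnitude check on the clock claim: a 63-bit count of
# 100 ns ticks from the (documented) VMS epoch of 17-Nov-1858.
SECONDS_PER_YEAR = 365.25 * 24 * 3600
overflow_year = 1858 + int((2 ** 63 * 100e-9) / SECONDS_PER_YEAR)
print(overflow_year)  # roughly the year 31,000
```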

Andrew W. Rogers

unread,
Dec 22, 1999, 3:00:00 AM12/22/99
to
In article <c0a26skka0refij18...@4ax.com> Alexandre Pechtchanski <alex*@*rockvax.rockefeller.edu> writes:
>>"This problem will be addressed in a future paradigm shift."
>>(attributed iirc to a DEC response to a TR on this issue)
>
>IIRC, the complaint was that VAX/VMS calendar counted 2000 as a leap year (as it
>should), and DEC response pointed out that VMS clock will run out in year
>[sorry, don't remember, but many thousand years from now].

The text of the SPR response is available at

http://www.openvms.digital.com/openvms/products/year-2000/leap.html

Andrew
(firstname...@east.sun.com)

Charlie Gibbs

unread,
Dec 22, 1999, 3:00:00 AM12/22/99
to
In article <83pcne$65a$1...@bob.news.rcn.net> mstr...@rcn.com (mstrjon32)
writes:

>i think apple lisa would fit the description, it was the first computer
>to show the world the graphic user interface, what we all use today...

"What you mean _we_, paleface?"

--
cgi...@sky.bus.com (Charlie Gibbs)
Remove the first period after the "at" sign to reply.


John Ahlstrom

unread,
Dec 22, 1999, 3:00:00 AM12/22/99
to
John Ahlstrom wrote:

> 8. Commercial Virtual Memory - Atlas beat it by some months
> but was never commercially successful

Then Edward Reid wrote:

> > 8. Commercial Virtual Memory - Atlas beat it by some months
> > but was never commercially successful
>
>

According to
The History of the Development of Parallel
Computing
by Gregory V. Wilson g...@cs.toronto.edu
at http://ei.cs.vt.edu/~history/Parallel.html

The Atlas project began in 1956 and Atlas was operational in 1966

I have no information on when the 5000 project began.
I believe it was operational in 62 or 63.

Can anyone help?


JKA
--
We are having a wonderful October this
December.

John Ahlstrom

unread,
Dec 22, 1999, 3:00:00 AM12/22/99
to
Previously I wrote:

John Ahlstrom wrote:

> 8. Commercial Virtual Memory - Atlas beat it by some months
> but was never commercially successful

Then Edward Reid wrote:

> > 8. Commercial Virtual Memory - Atlas beat it by some months
> > but was never commercially successful
>
>

Actually Edward wrote:


> I thought it was several years: Atlas 1955, B5000 1961. But I don't
> have a reference handy, except for an email someone sent me several
> years ago giving these dates.
>
>

To which I responded:

> According to
> The History of the Development of Parallel
> Computing
> by Gregory V. Wilson g...@cs.toronto.edu
> at http://ei.cs.vt.edu/~history/Parallel.html
>
> The Atlas project began in 1956 and Atlas was operational in 1966
>
> I have no information on when the 5000 project began.
> I believe it was operational in 62 or 63.
>
> Can anyone help?
>
>

bill_h

unread,
Dec 22, 1999, 3:00:00 AM12/22/99
to
John Ahlstrom wrote:

> Then Edward Reid wrote:
>
> > > 8. Commercial Virtual Memory - Atlas beat it by some months
> > > but was never commercially successful

Wasn't RCA using VM around 1960?

Bill
Tucson


Mark W Brehob

unread,
Dec 22, 1999, 3:00:00 AM12/22/99
to
In comp.arch John Ahlstrom <jahl...@cisco.com> wrote:
> John Hendrickx wrote:


>> But
>> what was the computer of the century?
>>
> --snip snip

I have to go with the IBM PC. No other computer has had as much influence
on as many people as that series of computers. Moderate architecture
(mainly just old) but the market impact is obvious. The 360 would get my
2nd place vote. It probably made the PC possible.

Here is a different question:
Assuming a fairly free and open society, was the "Personal Computer
revolution" going to happen with or without the IBM PC? Was it an
idea whose time had come?

Mark


Erik Trulsson

unread,
Dec 22, 1999, 3:00:00 AM12/22/99
to

It had already started. The IBM PC was just IBM's attempt to break into the
personal computer market. Remember that CP/M and the Apple II were already
out and fairly successful when IBM released its PC. (And the C64, which is
arguably the best-selling computer model of all, was released shortly after.)
One reason the IBM PC isn't better than it is is that IBM didn't really
think it would be a great success, so they didn't put that many resources
into its development. (This is also why all the clones exist: IBM didn't
bother to protect the architecture much against copies -- at least not in
the beginning.)

--
<Insert your favourite quote here.>
Erik Trulsson
ertr...@student.csd.uu.se


Gene Wirchenko

unread,
Dec 22, 1999, 3:00:00 AM12/22/99
to
Mark W Brehob <bre...@cse.msu.edu> wrote:

>In comp.arch John Ahlstrom <jahl...@cisco.com> wrote:
>> John Hendrickx wrote:
>
>
>>> But
>>> what was the computer of the century?
>>>
>> --snip snip
>
>I have to go with the IBM PC. No other computer has had as much influence
>on as many people as that series of computers. Moderate architecture
>(mainly just old) but the market impact is obvious. The 360 would get my
>2nd place vote. It probably made the PC possible.

IBM rode on the coattails of those who started the micro
industry.

>Here is a different question:
> Assuming a fairly free and open society, was the "Personal Computer
> revolution" going to happen with or without the IBM PC? Was it an
> idea who's time had come?

Yes. After all, there already was an 8-bit market. IBM didn't
get in until the markets had been created. If they hadn't done it,
someone else would have. I used an 8086-based system before the IBM
PC was released.

Sincerely,

Gene Wirchenko

Computerese Irregular Verb Conjugation:
I have preferences.
You have biases.
He/She has prejudices.

John Ahlstrom

unread,
Dec 22, 1999, 3:00:00 AM12/22/99
to
Bill H wrote:

> Wasn't RCA using VM around 1960?

I believe not. I believe the first RCA VM was a Spectra 70/xx (45?),
which was post-1965.

JKA

Chris Espinosa

unread,
Dec 22, 1999, 3:00:00 AM12/22/99
to
lwin wrote:
>
> > Here is a different question:
> > Assuming a fairly free and open society, was the "Personal Computer
> > revolution" going to happen with or without the IBM PC? Was it an
> > idea who's time had come?
>
> IBM gave the PC industry standardization. Everything for PCs from that
> point on--both hardware units and software packages--HAD to wear
> the "IBM Compatible" label or die. That laid a major foundation for
> economies of scale and training and development.
>
> I question whether the industry would have been so standardized without
> IBM's involvement.

In that case I'd give the award to the Kaypro or the original Compaq.
Without them, the concept of "IBM-compatible" would not have existed.
It was the early IBM clones, not the IBM Personal Computer itself, that
made compatibility an issue, and gave the platform popularity that
outgrew its original vendor.

Chris

mstrjon32

unread,
Dec 22, 1999, 3:00:00 AM12/22/99
to
Obviously they got it from Xerox, but almost no one ever knew what
Xerox did.....
The Lisa had a short life too, until the Mac 128 and 512K came out.....
but it was the first hardly-affordable, barely-available GUI machine


And Charlie Gibbs --- if you are still using a command prompt, good for
you.....

mstrjon32 <mstr...@rcn.com> wrote in message
news:83pcne$65a$1...@bob.news.rcn.net...


> i think apple lisa would fit the description, it was the first computer to
> show the world the graphic user interface, what we all use today...
>
>
>
>

> John Hendrickx <J.Hen...@mailbox.kun.nl> wrote in message
> news:MPG.12c98531d...@news.uci.kun.nl...
> > Bohemian Rhapsody was voted record of the century (in a Dutch poll), the
> > T-Ford is car of the century, Time magazine has been polling readers
> > about persons of the century in various fields for some months now. But


> > what was the computer of the century?
> >

> > A quick AltaVista search turned up two quotes, one for the e-mate as
> > computer of the century, the second of "William B. Sanders. Computer of
> > the Century: A Guide to the Radio Shack Model 100 . (Chatsworth, CA. :
> > Datamost, Inc.) 1984." The former is hype, the latter was premature. An


> > obvious choice would be the IBM 360. It brought computing to businesses,
> > for better or for worse, and gave IBM a virtual monopoly during the 60s

> > and 70s. Or does some other computer deserve this distinction? And have
> > there been any polls for something that has only been around for half a
> > century?
>
>

Chris Espinosa

unread,
Dec 22, 1999, 3:00:00 AM12/22/99
to
Dave Hansen wrote:

>
> On Tue, 21 Dec 1999 12:38:44 +0100, John Hendrickx
> <J.Hen...@mailbox.kun.nl> wrote:
>
> >T-Ford is car of the century, Time magazine has been polling readers
> >about persons of the century in various fields for some months now. But
> >what was the computer of the century?
>
> I'd have to say the IBM PC. It wasn't the first. It wasn't the best.
> But it was the one that changed everything. For better or for worse.
>

As much as I'd like to I can't really say that any PC would count. When
we invented personal computing, what we were rebelling against was the
kind of computing establishment where you had terminals in the $1000
range that hooked up over phone lines or hard-wired connections to big
iron that held all your storage, were the sole source of the data you
needed, and had arcane access controls that severely limited your
freedom to do what you wanted with your data. And you paid for your
connect time and storage. And it was never fast enough and you were
competing with others for the same CPU and connectivity resources. And
when it went down, your terminal was useless.

Sound familiar?

The PC era lasted from 1975 to 1995, a fifth of the century and less
than half of the electronic computing era. The net killed it, and
minicomputers (which were written off as dead in 1983 or so) came back
as servers to be the center of computing. How much of the Web runs on
desktop machines?

I'd nominate some Sun SparcServer as the computer of the century, not
out of any love for it, but because high-transaction-volume web servers
are what makes the Internet possible, and even modern PCs are just
slaves to it.

Chris

Paul DeMone

unread,
Dec 22, 1999, 3:00:00 AM12/22/99
to

Ian Joyner wrote:
[snip]
> The B5000 was given a bad rap by Hennessy and Patterson, who made silly
> comparisons with the CDC 6000 at the time. I think these comments are still
> in their latest edition. That the B5000 has evolved into a machine 100s or
> 1000s of times faster shows speed has nothing to do with stack architecture.

And exactly what was their heresy? To sully the rarefied world of
ivory tower conjecture about instruction set encoding efficiency or
instruction fetch bandwidth requirements with grubby blue collar
details like physical implementation, and *gasp*, performance?

Stack machines were the perfect fit to brain dead compilers and expensive
flip-flops. But we are long past that phase. We can afford to perform
data flow analysis, common sub-expression reuse, and interprocedural
register allocation. We can also afford reasonably large and capacious
general purpose register files with access times comparable to the minimum
logic delay to perform basic integer arithmetic and logic functions.

Sure it is possible to speed up classical stack machine implementations
with shadow caching of the top section of the stack in multiported
registers. But why create an architectural and implementation edifice to
hide the best attributes of GPRs - high speed random access through
multiple independent ports - while adding unnecessary complication to
superscalar implementation? (T9000 ring a bell?)

Setting aside the anachronistic stack architecture, what was so great
about a 48 bit architecture with tagged data words to reduce opcode
encoding size? Or an inefficient and wasteful floating point format
chosen to overlap with integer data representation? And maybe compilers
have grown up enough to decide when to do a full fledged block structured
activation stack frame with display and when to do something a bit more
lightweight. So don't build unwieldy assumptions about language
implementation into hardware.


All opinions strictly my own.
--
Paul W. DeMone The 801 experiment SPARCed an ARMs race of EPIC
Kanata, Ontario proportions to put more PRECISION and POWER into
dem...@mosaid.com architectures with MIPSed results but ALPHA's well
pde...@igs.net that ends well.
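
[To make the stack-vs-GPR contrast above concrete, here is a toy sketch.
No real ISA is being modeled -- the opcodes and register names are invented
for illustration. The same expression a*b + c*d is "compiled" two ways: the
classical stack form serializes everything through the top of stack, while
the three-address register form leaves the two multiplies independent, so a
superscalar implementation could issue them in parallel.]

```python
def run_stack(code, env):
    """Toy classical stack machine: all operand traffic goes through one stack."""
    stack = []
    for op, arg in code:
        if op == "push":
            stack.append(env[arg])
        elif op == "mul":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        elif op == "add":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
    return stack.pop()

def run_regs(code, regs):
    """Toy three-address register machine: (op, dst, src1, src2) tuples."""
    for op, dst, s1, s2 in code:
        if op == "mul":
            regs[dst] = regs[s1] * regs[s2]
        elif op == "add":
            regs[dst] = regs[s1] + regs[s2]
    return regs

env = {"a": 2, "b": 3, "c": 4, "d": 5}

# a*b + c*d on the stack machine: strictly sequential, every op
# depends on the top-of-stack result of the one before it.
stack_code = [("push", "a"), ("push", "b"), ("mul", None),
              ("push", "c"), ("push", "d"), ("mul", None),
              ("add", None)]

# Same expression in three-address form: the two muls read and write
# disjoint registers, so nothing forces them to execute in order.
regs = {"r0": 2, "r1": 3, "r2": 4, "r3": 5}
reg_code = [("mul", "r4", "r0", "r1"),
            ("mul", "r5", "r2", "r3"),
            ("add", "r6", "r4", "r5")]

print(run_stack(stack_code, env))      # 26
print(run_regs(reg_code, regs)["r6"])  # 26
```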

Paul DeMone

unread,
Dec 22, 1999, 3:00:00 AM12/22/99
to

"Terry C. Shannon" wrote:
[snip]
> Mass-market Impact: IA32
>
Without a doubt.

It is a crappy, bizarre, and unwieldy architecture. And through
no virtue other than being non-threatening to IBM's alternative
low end architectures, it scored the design win of the century.

The popularity of the x86 platform and the self-reinforcing
coalescing of software development around it has led to a
revolution in bringing computing to the masses in the 20th
century.

The real question is what will be the computer of the *next*
century and what is the MPU in it? I doubt it will be a beige
box sitting under everyone's desk. Either the massive
behind-the-scenes data servers driving the internet economy or the
ubiquitous embedded control processors in handheld or worn
communications/computation devices. Regardless, the x86 isn't
an attractive choice for either application.

Intel's entries for the big/little 21st century computers are
IA-64 and StrongARM-II. But the competition for both is formidable
and the answer to this question is far from resolved. Here are
the MPU contenders of the next century (alphabetical ordering):

Big Little

(high performance, 100's (Embeddable CPU core, very
of watts, huge bandwidth low power consumption, good
massive die size and pin performance, reasonable code
count) density, SIMD DSP)

Alpha ARM/StrongARM
IA-64 MAJC
POWER MIPS
SPARC PowerPC

Hank Murphy

unread,
Dec 22, 1999, 3:00:00 AM12/22/99
to
A good idea, but IMHO the wrong ocean.

This may sound like blasphemy in comp.arch, but the Battle of the Atlantic
would have been won without Colossus. Contributing factors were, in no
particular order:
- Signals intelligence - Admiral Doenitz overused the radio, giving many
hints as to the location of wolf packs
- Long range air patrols from Iceland starting in 1942, which hindered
U-boat daylight surface travel
- Improved convoy and ASW tactics
- RAF Coastal Command attacking U-boats close to their bases
- The entrance of the US into the war, which led to hundreds of additional
escort vessels
For a good, fairly concise overview, see "The Price Of Admiralty" by John
Keegan.

OTOH...the comparatively unheralded codebreakers in a Pearl Harbor basement,
using IBM punched card tabulating equipment, probably did more than we can
imagine to win the Battle Of Midway, which was the turning point of the
Pacific War. But they weren't really computers, so I don't think they
qualify for this contest.

My own contenders would be the 360 and 808x, with possible honorable
mentions going to the RISC architectures and PDP-11/VAX for popularizing
departmental computing. There are several others deserving of praise for
design breakthroughs - e.g. B5000, Eniac - but which did not have the direct
impact of the 360 and 808x.

But the computer of the century probably is the Intel 4004, for starting the
embedded microcontroller revolution. We don't put 360s or 808xes into heart
pacemakers, automobiles, or Tomahawk missiles. And this is probably what
history will recognize as the more significant change in 100 years.

The most significant architecture
would have been the
Dennis O'Connor wrote in message <83pq80$omn$1...@nnrp02.primenet.com>...


> John Hendrickx wrote:
> > But what was the computer of the century?
>

>The "bombes" the British used to crack Enigma.
>Arguably the computers that saved the world,
>and the beginning of the computer industry.

>--
>Dennis O'Connor dm...@primenet.com
>Vanity Web Page http://www.primenet.com/~dmoc/
>
>


lwin

unread,
Dec 23, 1999, 3:00:00 AM12/23/99
to

we...@nospam.mediaone.net

unread,
Dec 23, 1999, 3:00:00 AM12/23/99
to
John Ahlstrom <jahl...@cisco.com> writes:


>According to
>The History of the Development of Parallel
>Computing
> by Gregory V. Wilson g...@cs.toronto.edu
>at http://ei.cs.vt.edu/~history/Parallel.html

>The Atlas project began in 1956 and Atlas was operational in 1966

Hadn't seen that page. Odd, it doesn't mention the Bailey Meter 756,
the first commercially successful parallel system, introduced in 1962.
(It was designed for controlling power plants, one in Australia ran until
last year.) Executed off a drum, used germanium transistors. It's now
at the Australia Computer Museum.

Gotta get a Web page up for that.

-Ric Werme

--
Ric Werme | we...@nospam.mediaone.net
http://people.ne.mediaone.net/werme | ^^^^^^^ delete

Gene Wirchenko

unread,
Dec 23, 1999, 3:00:00 AM12/23/99
to
lwi...@bbs.cpcn.com (lwin) wrote:

Sure it would have been. It was. One word: CP/M. It was used
widely and most every man[ufacturer] and his dog selling business
systems had a version of CP/M to run on their boxes.

Charles Richmond

unread,
Dec 23, 1999, 3:00:00 AM12/23/99
to
amoli...@visi-dot-com.com wrote:
>
> In article <Fn5DE...@world.std.com>,
> Terry C. Shannon <sha...@world.std.com> wrote:
> >Highest Concentration of Computing Power: NSA's Basement, Ft. Meade,
> >Maryland
>
> Isn't this actually "Some Warehouse in Seattle just before
> the Christmas Buying Rush On Nintendo 64s" in whatever year the big rush
>
Or some *huge* garbage dump next year, when Nintendo comes out with its new
great game system to compete with the Sega Dreamcast. All the stores are trying
to push *real* hard to sell all the Nintendo 64s they can this Christmas,
because they know it is their last real chance. But people seem to be catching
on and waiting for the new game system. Result: stacks and stacks of unsold
Nintendo 64s in the middle of electronics stores everywhere.

--
+-------------------------------------------------------------+
| Charles and Francis Richmond <rich...@plano.net> |
+-------------------------------------------------------------+

donald tees

unread,
Dec 23, 1999, 3:00:00 AM12/23/99
to

Nick Spalding wrote in message ...
Actually, that is exactly what he did ...


Bill Pechter

unread,
Dec 23, 1999, 3:00:00 AM12/23/99
to
In article <Fn5DE...@world.std.com>,
Terry C. Shannon <sha...@world.std.com> wrote:
>
>A couple more nominations for Computer of the Century.
>
>Mass-market Impact: IA32
>
>Out of the Glass House Award: VAX
>
>Mother of all Distributed Systems: SETI@Home

>
>Highest Concentration of Computing Power: NSA's Basement, Ft. Meade,
>Maryland
>

Well, I'd like to nominate the DEC PDP-8.

The PDP-8 was the machine that brought the computer out of the Glass
House and made possible the PDP11, Vax and most of the microcomputers by
showing the uses for mass produced embeddable machines.

The PDP-8 was produced up through the '80's (over 15 years) in forms
from boards to the 8 on a chip by Harris and Intersil.

The PDP-8 began the idea of "Personal Computers" before CP/M, Apple,
and IBM knew what they were. Its OSes influenced later OSes
like RT11 (and, through RT11, CP/M and its evil clone MS-DOS).

If not the PDP8, I'd say the IBM 360 line has had the greatest impact
on the industry.

Bill
--
-------------------------------------------------------------------------------
bpechter@.monmouth.com|pec...@pechter.dyndns.org|pec...@pechter.bsdonline.org
Three things never anger: First, the one who runs your DEC,
The one who does Field Service and the one who signs your check.

Scott Wheeler

unread,
Dec 23, 1999, 3:00:00 AM12/23/99
to
On 22 Dec 1999 18:00:15 GMT, dpes...@u.washington.edu (Derek Peschel)
wrote:

>As far as I know, the Colossus machines and the Bombes never did related
>work. Colossus was used to break the Lorenz (or Fish) cipher;

You're right - I'd forgotten that.

Scott
--
(please de-mung address if replying by email)

Scott Wheeler

unread,
Dec 23, 1999, 3:00:00 AM12/23/99
to
On 22 Dec 1999 18:01:52 +0100, Thomas Jespersen <tho...@daimi.au.dk>
wrote:

>Wasn't that machine kept classified for many years?

Until the 70's (dash it, it's only sporting to give the cousins time
to catch up).

>That means it could not have started the computer industry.

Yes, but as Derek and others said, Turing ran the Pilot ACE at the
NPL. Looking at one Web site I found, that had a tiny amount of memory
but was interesting because it looked a bit like a free-clocking
processor with a RISC-type instruction set which gave very good
performance for the time.

I'm pretty sure some of the staff ended up at Manchester on the Baby
and the Mk1. Does anyone know what Tommy Flowers did after Colossus?

Larry Kilgallen

unread,
Dec 23, 1999, 3:00:00 AM12/23/99
to
In article <FB9iOKnesnRUNu...@4ax.com>, Scott Wheeler <sco...@bmtech.co.uk_DELETE_THIS> writes:

> and the Mk1. Does anyone know what Tommy Flowers did after Colossus?

A recent PBS broadcast in the US said he "went back to the Post Office".

Scott Wheeler

unread,
Dec 23, 1999, 3:00:00 AM12/23/99
to
On Wed, 22 Dec 1999 15:00:26 GMT, "Terry C. Shannon"
<sha...@world.std.com> wrote:

>Highest Concentration of Computing Power: NSA's Basement, Ft. Meade,
>Maryland

I wonder about this. The oft-repeated statement that they measure
computing power in acres looks like marketing aimed at Washington, and
frankly I don't believe it. Since no-one other than GCHQ staff ever
finds out much about the NSA, and they make the NSA seem as verbose as a race
commentator on amphetamines, it's a bit difficult to get information
on what they really use. Maybe it's a bit naive, but I suspect that
some CGI firms would give them a run for their money.

Scott Wheeler

unread,
Dec 23, 1999, 3:00:00 AM12/23/99
to
On Wed, 22 Dec 1999 15:00:26 GMT, "Terry C. Shannon"
<sha...@world.std.com> wrote:

>The "bombes" would have been duds if the Brits didn't manage to acquire an
>Enigma machine to study and analyze.

No: Bill Tutte and his group deduced the structure of the 12-rotor
Lorenz machine from messages
(http://www.cranfield.ac.uk/ccc/bpark/lectures/ts-ieee-feb19-1999.txt),
so I think we can assume they could have done it with the 3-rotor
Enigma.

Coming back to Colossus, another "first" for it is that it was the
first ur-computer to be produced in more than one-off quantities.
There were around 12 (Mk1 and Mk2) produced altogether.

Eric J. Korpela

unread,
Dec 23, 1999, 3:00:00 AM12/23/99
to
In article <83pq80$omn$1...@nnrp02.primenet.com>,

Dennis O'Connor <dm...@primenet.com> wrote:
> John Hendrickx wrote:
> > But what was the computer of the century?
>
>The "bombes" the British used to crack Enigma.
>Arguably the computers that saved the world,
>and the beginning of the computer industry.

And now we loop back to a thread from a year or two ago about
the Norden bombsight being the computer that is responsible for the
most deaths. (No one limited "computer" to "digital-computer" in
this thread). I'd say it was at least as instrumental to saving the
world as what you're describing, and more of a computer than the bulk
of what was used to crack enigma.

Eric
--
Eric Korpela | An object at rest can never be
kor...@ssl.berkeley.edu | stopped.
<a href="http://sag-www.ssl.berkeley.edu/~korpela">Click for home page.</a>


Jeffrey S. Dutky

unread,
Dec 23, 1999, 3:00:00 AM12/23/99
to
Paul DeMone wrote:
> Intel's entries for the big/little 21st century computers
> are IA-64 and StrongARM-II. But the competition to both
> are formidible and answer to this question is far from

> resolved. Here are the MPU contenders of the next century
> (alphabetical ordering):
>
> Big Little
>
> (high performance, 100's (Embeddable CPU core, very
> of watts, huge bandwidth low power consumption, good
> massive die size and pin performance, reasonable code
> count) density, SIMD DSP)
>
> Alpha ARM/StrongARM
> IA-64 MAJC
> POWER MIPS
> SPARC PowerPC
>

I think you missed a few under the 'Little' category. What
about the Dragonball and the Hitachi H8 and SuperH, which are
seeing use in some reasonably popular handheld systems.
Aren't there also a few DSPs still breathing? (the TMS320C
series at least)

- Jeff Dutky

Grace H J Sturgess

unread,
Dec 23, 1999, 3:00:00 AM12/23/99
to
Atlas was operational way before 1966. I used it in 1963. Mind you there
were severe limitations to their Algol compiler at that time. My program
blew all their compiler limits. So, they used my program as a test of their
compiler. Atlas was the most impressive machine I have ever seen. I used the
one at London University and would go down there in the morning to run my
program. The techs would release the computer to the operators. About 4-5
operators would grab a cart and start feeding paper tape into the machine.
Pretty soon the dozen or so tape drives would be whizzing around and paper
would start pouring out of 6 or more printers. Then when my program started
compiling, everything stopped as the compiler grabbed all the resources it
could. Fortunately the compile only took a couple of minutes, then the run
would die and I would get a stack traceback dump to look at.

Grace

John Ahlstrom <jahl...@cisco.com> wrote in message
news:38613509...@cisco.com...
> Previously I wrote:


>
> John Ahlstrom wrote:
>
> > 8. Commercial Virtual Memory - Atlas beat it by some months
> > but was never commercially successful
>
>
>
> Then Edward Reid wrote:
>
> > > 8. Commercial Virtual Memory - Atlas beat it by some months
> > > but was never commercially successful
> >
> >
>

> Actually Edward wrote:
>
>
> > I thought it was several years: Atlas 1955, B5000 1961. But I don't
> > have a reference handy, except for an email someone sent me several
> > years ago giving these dates.
> >
> >
>
> To which I responded:
>

> > According to
> > The History of the Development of Parallel
> > Computing
> > by Gregory V. Wilson g...@cs.toronto.edu
> > at http://ei.cs.vt.edu/~history/Parallel.html
> >
> > The Atlas project began in 1956 and Atlas was operational in 1966
> >

> > I have no information on when the 5000 project began.
> > I believe it was operational in 62 or 63.
> >
> > Can anyone help?
> >
> >

bill_h

unread,
Dec 23, 1999, 3:00:00 AM12/23/99
to
Chris Espinosa wrote:
>
> Dave Hansen wrote:

> > I'd have to say the IBM PC. It wasn't the first. It wasn't the best.
> > But it was the one that changed everything. For better or for worse.

> The PC era lasted from 1975 to 1995, a fifth of the century and less


> than half of the electronic computing era. The net killed it, and
> minicomputers (which were written off as dead in 1983 or so) came back
> as servers to be the center of computing. How much of the Web runs on
> desktop machines?

A not insignificant question. But hard to answer. Every site I've seen
seems to have one or more desktops connected in, handling some mundane
chore or another (like Unix Shell accounts for those wanting a bomb
proof way of receiving email).

> I'd nominate some Sun SparcServer as the computer of the century, not
> out of any love for it, but because high-transaction-volume web servers
> are what makes the Internet possible, and even modern PCs are just
> slaves to it.

No, what IS making the WEB possible is people willing to haul
traffic belonging to OTHERS without PAY for doing so. And as
the dollar value (and PROFITS going to those OTHERS!) goes up,
I would guess most of those carrying that traffic will be trying
to figure out some way to set up toll booths and load that traffic
with higher costs. About all they get out of it right now is the
opportunity to run an address sniffer, assuming they have customers
in the spam business willing to buy or trade for those addresses.

One benefit would be, if spammers actually had to PAY for each and
every message, most of the spam ''problem'' would simply go away.

Bill
Tucson


Dowe Keller

unread,
Dec 23, 1999, 3:00:00 AM12/23/99
to
Dave Daniels wrote:
>
>The Apple II might hold that position in the US, but from the UK
>perspective I agree with the people who suggest the ZX80 and ZX81.
>The ZX81 was a usable computer that cost under 100 UKP. IIRC the
>kit version was just under 50 UKP and an extra 16K of RAM (the
>infamous wobbly 16K RAM pack) cost the same. The documentation was
>comprehensive and included things such as a list of Z80 opcodes
>that were of invaluable use to a generation of budding computer
>geeks. Before the ZX machines you either paid several hundred quid
>for a machine or ended up with something that had a hex keypad and
>an eight segment LED display. The ZX machines were on the leading
>edge of the UK micro computer boom of the early eighties and IMHO

I had a ZX81 (they called them Timex/Sinclair 1000s over here on the
left side o' the pond); IMNSHO the ZX81 had more of an impact than
the Apple II (OK, I know I'm about to be flamed here). True, the
Apple came out before, and was arguably the better computer. But
the ZX81 was the first computer that was truly affordable. At twelve
years old, I was able to buy a ZX81 by saving up my allowance. That
mitigates some of the little ZX's shortcomings by a long way (like that
crummy little keyboard :( ).

BTW: How much did an Apple II cost in 1982?

Dowe Keller do...@worldnet.att.net http://home.att.net/~dowe

Joseph Yuska

unread,
Dec 23, 1999, 3:00:00 AM12/23/99
to
Scott Wheeler wrote:
>
> On Wed, 22 Dec 1999 15:00:26 GMT, "Terry C. Shannon"
> <sha...@world.std.com> wrote:
>
> >Highest Concentration of Computing Power: NSA's Basement, Ft. Meade,
> >Maryland
>
> I wonder about this. The oft-repeated statement that they measure
> computing power in acres looks like marketing aimed at Washington, and
> frankly I don't believe it. Since no-one other than GCHQ staff ever
> find out much about NSA, and they make the NSA seem as verbose as race
> commentator on amphetamines, it's a bit difficult to get information
> on what they really use. Maybe it's a bit naive, but I suspect that
> some CGI firms would give them a run for their money.
>
> Scott
> --
> (please de-mung address if replying by email)


I can't comment on their contemporary status, but from 1968-74, the top
salesman for CDC by a very large margin was the gentleman who handled
the NSA account.


Joe Yuska

George Herbert

unread,
Dec 23, 1999, 3:00:00 AM12/23/99
to
Scott Wheeler <sco...@bmtech.co.uk_DELETE_THIS> wrote:
>>Highest Concentration of Computing Power: NSA's Basement, Ft. Meade,
>>Maryland
>
>I wonder about this. The oft-repeated statement that they measure
>computing power in acres looks like marketing aimed at Washington, and
>frankly I don't believe it. Since no-one other than GCHQ staff ever
>find out much about NSA, and they make the NSA seem as verbose as race
>commentator on amphetamines, it's a bit difficult to get information
>on what they really use. Maybe it's a bit naive, but I suspect that
>some CGI firms would give them a run for their money.

While it's not usually a subject of open postings or discussion,
most people active in the industry over here know someone who
has something to do with No Such Agency, and while they are
usually pretty circumspect about any sort of details which would
actually break security, rough order of magnitude ideas of what's
there are around. It's pretty big.

CGI firms are big, but not world class, concentrations of CPU
horsepower. If you look at the numbers, there are many banks and
research labs and computer companies which have more CPU horsepower
than any CGI firm.


-george william herbert
gher...@crl.com


Tim McCaffrey

unread,
Dec 23, 1999, 3:00:00 AM12/23/99
to
In article <jv306s891khd1p8g5...@4ax.com>,
david....@worldnet.att.net says...
>

>Colossus proved much of computer theory practical for dealing with an
>"unsolvable" problem - the 12-rotor encoding machine.
>
>ENIAC was a workhorse and testbed
>
>
I agree with David here; consider what ENIAC did:

1) It proved an electronic calculating machine that complex
(more complex than it really needed to be, actually) could be
made to work. (Yes, Colossus and others did this, but they
were not public).

2) It gave Eckert and Mauchly credibility.

3) That credibility translated to credit, which allowed them
to create the Eckert-Mauchly Computer Corporation (later Univac).

4) Various people who worked with Eckert & Mauchly went off to
create their own companies, such as CDC.

5) IBM got really interested in computers because of E & M.

In other words, while the ENIAC wasn't really a computer in the
modern sense of the word (it was more like a FPGA), and wasn't really
the first computer, I believe it really kicked off commercial computing.

Tim McCaffrey

All opinions strictly my own.



Terry C. Shannon

unread,
Dec 23, 1999, 3:00:00 AM12/23/99
to

> Scott Wheeler <sco...@bmtech.co.uk_DELETE_THIS> wrote:
> >>Highest Concentration of Computing Power: NSA's Basement, Ft. Meade,
> >>Maryland
> >
> >I wonder about this. The oft-repeated statement that they measure
> >computing power in acres looks like marketing aimed at Washington, and
> >frankly I don't believe it. Since no-one other than GCHQ staff ever
> >find out much about NSA, and they make the NSA seem as verbose as race
> >commentator on amphetamines, it's a bit difficult to get information
> >on what they really use. Maybe it's a bit naive, but I suspect that
> >some CGI firms would give them a run for their money.
>

Gee, I wouldn't really know. After all, I never worked, um, **directly**
for NSA.

Terry Shannon
SGT USASA 509th Radio Research Group Viet Nam 1970-72 MOS 96B20, 98B20,
98C20

PS-- Bamford's "The Puzzle Palace" sheds interesting light on NSA and its
doings. The book is dated; at the time it was written the NSA's basement
boasted only 15 acres of computers.

Jim Stewart

unread,
Dec 23, 1999, 3:00:00 AM12/23/99
to


Add a number 6,

It provided Johnny von Neumann with the inspiration to define the CPU
architecture that still bears his name.

Jim

John Ahlstrom

unread,
Dec 23, 1999, 3:00:00 AM12/23/99
to
Paul DeMone wrote:


> And exactly what was their heresy? To sully the rarefied world of
> ivory tower conjecture about instruction set encoding efficiency or
> instruction fetch bandwidth requirements with grubby blue collar
> details like physical implementation, and *gasp*, performance?
>
>

Gee Paul, I don't remember writing anything about
stacks or tags. I did forget to mention protected multi-
tasking. And Ian explicitly said, in the passage you quoted:


> That the B5000 has evolved into a machine 100s or
> 1000s of times faster shows speed has nothing to do with stack
> architecture.
>

Watch this space for a different way to look at the
Hennessy-Patterson numbers. It ain't surprising
that a 100 nsec CPU coupled with a 1 microsec
memory machine with 10 CPU functional units can beat
a 1 microsec CPU with a 4 microsec memory and
1 CPU functional unit.

What's surprising is that the 6600 didn't beat the 5500 by MUCH
more.


JKA

Paul DeMone

unread,
Dec 23, 1999, 3:00:00 AM12/23/99
to

Charles Richmond wrote:
[snip]


> Result: stacks and stacks of unsold
> Nintendo 64's in the middle of electronics stores everywhere.

Yet rambus still cannot turn in decent earnings. ROFL.
At least MIPS made a few nickles on N64 ;-)

All opinions strictly my own.

Toon Moene

unread,
Dec 23, 1999, 3:00:00 AM12/23/99
to
Alexandre Pechtchanski <alex*@*rockvax.rockefeller.edu> wrote:

> IIRC, the complaint was that VAX/VMS calendar counted 2000 as a leap year
> (as it should), and DEC response pointed out that VMS clock will run out
> in year [sorry, don't remember, but many thousand years from now].

Well, if your slide rule is still operational (the models I know are all Y2K
compliant), you could even calculate it yourself.

The epoch is 18 (?) November 1858, the increment is in units of 100
nanoseconds and the clock tick is kept in a signed 64-bit integer.

Success !

[ That's what *I* would call a home assignment :-) ]

--
Toon Moene (mailto:to...@moene.indiv.nluug.nl)
Saturnushof 14, 3738 XG Maartensdijk, The Netherlands
Phone: +31 346 214290; Fax: +31 346 214286
GNU Fortran: http://egcs.cygnus.com/onlinedocs/g77_news.html

lwin

unread,
Dec 24, 1999, 3:00:00 AM12/24/99
to
> In that case I'd give the award to the Kaypro or the original Compaq.
> Without them, the concept of "IBM-compatible" would not have existed.
> It was the early IBM clones, not the IBM Personal Computer itself, that
> made compatibility an issue, and gave the platform popularity that
> outgrew its original vendor.

To have a compatibility standard, you have to have something people are
willing to agree on as the standard. The key point of the IBM PC was
that it had wide acceptability and was usable as a standard.

lwin

unread,
Dec 24, 1999, 3:00:00 AM12/24/99
to
> 4) Various people that worked with Eckert & Mauchley went off to
> create there own companies, such as CDC.

I think the CDC crew was an outgrowth of the ERA unit, which was
separate, even though both were owned by Remington Rand.


> 5) IBM got real interested in computers because of E & M.

It is true E-M's publicity gave IBM the kick in the pants to get
moving on computers aggressively. But IBM was ALREADY into computer
development. IBM developed the Mark I for Harvard, and then the
SSEC for itself. While both weren't pure "electronic" computers,
they were far more sophisticated computers than anything developed
to date, and in some ways more sophisticated than ENIAC's original
version. IBM had an electronic calculator on the market in 1948
and it was very popular as a "poor man's computer". IBM was
developing a business computer (its "tape processing machine")
but then put it aside to develop its scientific computer, the 701.


> In other words, while the ENIAC wasn't really a computer in the
> modern sense of the word (it was more like a FPGA), and wasn't really
> the first computer, I believe it really kicked off commercial computing.

The experience of ENIAC was critical in developing the UNIVAC, which was
the first commercial computer. The 1952 presidential election forecasting
was a major public relations coup for computers.

lwin

unread,
Dec 24, 1999, 3:00:00 AM12/24/99
to
> It had already started. The IBM PC was just IBM's attempt to break into the
> personal computer market. Remember that CP/M and the Apple II was already
> out and fairly successful when IBM released its PC....

Automobiles and automobile manufacturers were in place when Henry Ford
introduced the Model T, but that car is recognized as the beginning of
the auto industry. There were lots of other cars before and during
the reign of the Model T, including some names of today, but the Model
T is what we remember and credit as the real start.


It was the IBM PC that set the micro market on fire. Maybe the IBM PC was
a success solely because it had the IBM label on it. Nonetheless, it was
that label that made microcomputers "legitimate" instead of a curiosity
or a "toy".

> One reason that the IBM PC isn't better than it is is that IBM didn't
> really think it would be a great success so they didn't put all
> that much resources into the development.

IBM wanted to get it out the door fast with a minimum of overhead,
and used tried and true off-the-shelf hardware. Per the IBM way, they
also had software packages ready to go. That decision was quite prudent.

I believe the "open architecture" design was intentional to purposely
allow other hardware and software developers the chance to produce
things that would improve the utility of the product. In other words,
if you were to develop a 3rd party add-on hardware card, that's great,
because now another customer would have a reason to buy the IBM PC.
And that's exactly what happened. (I suspect back then they were still
a little sensitive about anti-trust, too). Another smart decision.


As I understand it, IBM blew it AFTER the PC became a big success, when
the rest of the company tried to milk its profits. I think IBM started
to expect the same profitability from its PC as it did from its mainframe
market, but instead should've looked at it more like its typewriters--
high volume.

Bernd Paysan

unread,
Dec 24, 1999, 3:00:00 AM12/24/99
to
Paul DeMone wrote:
> Stack machines were the perfect fit to brain dead compilers and expensive
> flip-flops. But we are long past that phase. We can afford to perform
> data flow analysis, common sub-expression reuse, and interprocedural
> register allocation. We can also afford reasonably large and capacious
> general purpose register files with access times comparable to the minimum
> logic delay to perform basic integer arithmetic and logic functions.

No, not in general. CPUs don't just sit in large tower cases under your
desk, they now sit almost everywhere. There are many more CPUs in
embedded systems, and all those CPUs still have to be small. Most of
them are ACCU machines, and most of their ISAs are so brain-dead that
writing a compiler is really a pain. Stack computers, which can be
implemented with about the same amount of resources as simple ACCU
machines, allow you to produce code several times more compact than for
the latter - with a much simpler compiler (especially since you can pass
parameters to common subroutines much more easily) - and every K of ROM
is costly in these applications (a few cents of price increase on a
sub-dollar chip is expensive).

Stack machines scale well up to the point where you have to go
superscalar. The doctor tells you: if it hurts, don't do that. Stack
oriented code is already a good enough target for recompilation, so you
have that as an upgrade path.

I'd also like to point out that "closing the semantic gap" ISA elements
are at least as bad for stack computers as for any other sort of CISCy
ISA: there is typically no point in doing that, and on a stack computer,
a corresponding subroutine is also about as cheap to call as the complex
microcoded operation itself. Ok, that's only true for two-stack machines
with a separate return stack, but that should be state of the art by now
;-).
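As a toy illustration (not any real embedded ISA), here is a minimal zero-address stack machine sketched in Python: the arithmetic opcodes carry no operand fields at all, which is where the code-density win over operand-addressing machines comes from.

```python
# Toy illustration: evaluating (a + b) * (a - b) on a zero-address
# stack machine.  Each arithmetic op needs no operand field, so it
# can encode in a single byte; an accumulator machine would have to
# name a memory operand in nearly every instruction.

def run_stack(program, vars):
    """Tiny stack VM; opcodes are ('push', name) or 0-operand ops."""
    stack = []
    for op in program:
        if isinstance(op, tuple):               # push <var>
            stack.append(vars[op[1]])
        elif op == 'add':
            b, a = stack.pop(), stack.pop(); stack.append(a + b)
        elif op == 'sub':
            b, a = stack.pop(), stack.pop(); stack.append(a - b)
        elif op == 'mul':
            b, a = stack.pop(), stack.pop(); stack.append(a * b)
    return stack.pop()

prog = [('push', 'a'), ('push', 'b'), 'add',
        ('push', 'a'), ('push', 'b'), 'sub', 'mul']
print(run_stack(prog, {'a': 7, 'b': 3}))        # (7+3)*(7-3) = 40
```

Seven short instructions, three of them single "bytes" with no operands; the expression stack also makes parameter passing to subroutines implicit, as noted above.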

--
Bernd Paysan
"If you want it done right, you have to do it yourself"
http://www.jwdt.com/~paysan/

David Rifkind

unread,
Dec 24, 1999, 3:00:00 AM12/24/99
to
On Thu, 23 Dec 1999 14:54:10 -0800, Jim Stewart wrote:
>
>
>Tim McCaffrey wrote:
>>
>> In article <jv306s891khd1p8g5...@4ax.com>,
>> david....@worldnet.att.net says...
>> >
>>
>> >Colossus proved much of computer theory practical for dealing with an
>> >"unsolvable" problem - the 12-rotor encoding machine.
>> >
>> >ENIAC was a workhorse and testbed
>> >
>> >
>> I agree with David here, consider what would ENIAC did:
>>
>> 1) It proved an electronic calculating machine that complex
>> (more complex than it really needed to be, actually) could be
>> made to work. (Yes, Colussus and others did this, but they
>> were not public).
>>
>> 2) It gave Eckert and Mauchley credibility.
>>
>> 3) That credibility translated to credit, which allowed them
>> to create Eckert-Mauchley computers (later Univac).
>>
>> 4) Various people that worked with Eckert & Mauchley went off to
>> create there own companies, such as CDC.
>>
>> 5) IBM got real interested in computers because of E & M.
>>
>> In other words, while the ENIAC wasn't really a computer in the
>> modern sense of the word (it was more like a FPGA), and wasn't really
>> the first computer, I believe it really kicked off commercial computing.
>>
>> Tim McCaffrey

>>
>> All opinions strictly my own.
>> whole thing off.
>
>
>Add a number 6,
>
>It provided Johnny Von Neuman with the inspiration to define the CPU
>architecture that still bears his name.

Questionable. Von Neumann's name is attached to the architecture
because his draft report on the EDVAC was widely circulated and
influential, but EDVAC itself was another Eckert-Mauchly design.

EDVAC didn't run until around 1952, but interest sparked by the design,
through the draft report and events at the Moore School, led directly to
EDSAC, and influenced most later designs. That makes EDVAC my
candidate.

--
"The privileged being which we call human is distinguished from other
animals only by certain double-edged manifestations which in charity we
can only call 'inhuman.'" -- Epiktistes

Ian Joyner

unread,
Dec 24, 1999, 3:00:00 AM12/24/99
to
Paul DeMone wrote:

> Ian Joyner wrote:
> [snip]
> > The B5000 was given a bad wrap by Hennesy and Patterson who made silly
> > comparisons with the CDC 6000 at the time. I think these comments are still
> > in their latest edition. That the B5000 has evolved into a machine 100s or


> > 1000s of times faster shows speed has nothing to do with stack architecture.
>

> And exactly what was their heresy?

I didn't say anything about heresy. The heresy is usually seen the other way--you
are treated as a heretic if you doubt Hennessy and Patterson. But what they did
wrong was to use a dubious comparison.

> To sully the rarefied world of
> ivory tower conjecture about instruction set encoding efficiency or
> instruction fetch bandwidth requirements with grubby blue collar
> details like physical implementation, and *gasp*, performance?

Grubby blue collar?

No I see that the two can be separate, and indeed the past 30 years have shown
this with the same ISAs getting faster and faster.

> Stack machines were the perfect fit to brain dead compilers and expensive
> flip-flops.

Brain-dead? Is your argument so groundless that you have to characterize things
this way?

No. Paradoxically, correct separation of concerns, such as ISA design and
physical implementation as above, does not prevent other considerations from
being integrated. Indeed the B5000 was the first machine whose hardware
design took into account that it was going to be programmed. So they
integrated software and hardware considerations. However, the ISA has survived
well--precisely because it abstracted away from the physical details. That way
programmers were protected from such details and their software investment was
protected from the last 30 years of hardware change. Contrast that with the
CDC 6000: where is that software investment today?

> But we are long past that phase.

Are we? Seems like software is more a mess than ever. And my thesis here is that
this has resulted from low-level architectures programmed in low-level languages
(although some of these are dressed up like HLLs, like C and C++).

> We can afford to perform
> data flow analysis, common sub-expression reuse, and interprocedural
> register allocation. We can also afford reasonably large and capacious
> general purpose register files with access times comparable to the minimum
> logic delay to perform basic integer arithmetic and logic functions.
>

> Sure it is possible to speed up classical stack machine implementations
> with shadow caching of the top section of the stack in multiported
> registers. But why create an architecture and implemention edifice to
> hide the best attributes about GPRs - high speed random access through
> multiple independent ports, while adding unnecessary complication to
> superscalar implementation (T9000 ring a bell?)
>
> Setting aside the anachronistic stack architecture, what was so great
> about a 48 bit architecture with tagged data words to reduce opcode
> encoding size? Or an inefficient and wasteful floating point format
> chosen to overlap with integer data representation? And maybe compilers
> have grown up enough to decide when to do a full fledged block structured
> activation stack frame with display and when to do something a bit more
> lightweight. So don't build unwieldy assumptions about language
> implementation into hardware.


>
> All opinions strictly my own.

> --
> Paul W. DeMone The 801 experiment SPARCed an ARMs race of EPIC
> Kanata, Ontario proportions to put more PRECISION and POWER into
> dem...@mosaid.com architectures with MIPSed results but ALPHA's well
> pde...@igs.net that ends well.

The key to good performance is to keep the data as close to the processor as
possible. You can do that with both register-based and stack-based machines.
Optimizations therefore--for the most part--lie below the level of instruction
set architecture.

--
Ian Joyner
i.jo...@acm.org Eiffel for Macintosh
http://homepages.tig.com.au/~ijoyner/ http://www.object-tools.com/ot/apple.htm

Objects Unencapsulated -- The book
http://www.prenhall.com/allbooks/ptr_0130142697.html
http://www.elj.com/oue1/ (review)

Ian Joyner

unread,
Dec 24, 1999, 3:00:00 AM12/24/99
to
Gene Wirchenko wrote:

> Mark W Brehob <bre...@cse.msu.edu> wrote:


>
> >In comp.arch John Ahlstrom <jahl...@cisco.com> wrote:
> >> John Hendrickx wrote:
> >
> >
> >>> But
> >>> what was the computer of the century?
> >>>

> >> --snip snip
> >
> >I have to go with the IBM PC. No other computer has had as much influence
> >on as many people as that series of computers. Moderate architecture
> >(mainly just old) but the market impact is obvious. The 360 would get my
> >2nd place vote. It probably made the PC possible.

I wouldn't go for a machine that two kids in a garage could have invented.
Five years earlier for the two kids, it might have been a significant
achievement. But for a multi-billion dollar corporation 5 years later, I don't
think so.

Rui Pedro Mendes Salgueiro

unread,
Dec 24, 1999, 3:00:00 AM12/24/99
to
In comp.arch Toon Moene <to...@moene.indiv.nluug.nl> wrote:
> Alexandre Pechtchanski <alex*@*rockvax.rockefeller.edu> wrote:
>> IIRC, the complaint was that VAX/VMS calendar counted 2000 as a leap year
>> (as it should), and DEC response pointed out that VMS clock will run out
>> in year [sorry, don't remember, but many thousand years from now].

> The epoch is 18 (?) November 1858, the increment is in units of 100

> nanoseconds and the clock tick is kept in a signed 64-bit integer.

> [ That's what *I* would call a home assignment :-) ]

% bc
2^63
9223372036854775808
^D
% units
You have: 922337203685477580800 ns
You want: years
* 29227.727
/ 3.4214088e-05

So, the clock will wrap around in the year 31086.

BTW, Linux (for Alpha)'s clock (still the only one in which time_t is
64 bits?) will wrap around in the year 292277266665. That is 292 GigaYears.
Since the remaining life span of the Sun is only ~5 GigaYears, this
should be enough.
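The same back-of-the-envelope check for a 64-bit time_t (signed seconds since the Unix epoch of 1970), sketched in Python:

```python
# 64-bit time_t counts signed seconds from the Unix epoch (1970),
# so the reach is 2**63 - 1 seconds.
max_secs = 2**63 - 1
mean_year = 365.2425 * 24 * 3600     # Gregorian mean year in seconds
years = max_secs / mean_year
print(round(years / 10**9))          # roughly 292 GigaYears
```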

--
http://www.mat.uc.pt/~rps/f1/ a born-again-tifoso
Mark Sandman - Morphine, RIP (July 3th, 1999, Italy)
.pt is Portugal| `Whom the gods love die young'-Menander (342-292 BC)
Europe | Villeneuve 50-82, Toivonen 56-86, Senna 60-94

jmfb...@aol.com

unread,
Dec 24, 1999, 3:00:00 AM12/24/99
to
In article <83uefo$r...@netaxs.com>, lwi...@bbs.cpcn.com (lwin) wrote:
>> It had already started. The IBM PC was just IBM's attempt to break into
>> the personal computer market. Remember that CP/M and the Apple II was
>> already out and fairly successful when IBM released its PC....
>
>Automobiles and automobile manufacturers were in place when Henry Ford
>introduced the Model T, but that car is recognized as the beginning of
>the auto industry. There were lots of other cars before and during
>the reign of the Model T, including some names of today, but the Model
>T is what we remember and credit as the real start.

And that's a very important point to remember. The Model T is
credited as the real start because people who didn't have tons
of money could use the car. The device became a utility rather
than a luxury.

The reason the PC (or anything else you want to call it) became
popular was because enough people got exposed to using computers
through timesharing. Timesharing gave a person the illusion
that the whole computer was his. IBM didn't do this since
the card turn around time was very long in busy shops.

Those people who could login in their office, _with the door
shut_, started using computers as an integral part of their
work. (They could make mistakes without somebody laughing at
them.) So my vote is the PDP-10 with all those timesharing
systems that were implemented on it. It wasn't just the
hardware alone that was important. It was the thousands and
thousands of people that got used to using the computer on
a day-to-day basis. Computer usage became a necessity rather
than a privilege.


>It was the IBM PC that set the micro market on fire. Maybe the IBM PC was
>a success solely because it had the IBM label on it. None the less, it was
>that that made microcomputers "legitimate" instead of a curiosity or a
>"toy".

You got the idea. But it wasn't the PC that overcame the curiousity
factor.


>> One reason that the IBM PC isn't better than it is is that IBM didn't
>> really think it would be a great success so they didn't put all
>> that much resources into the development.
>
>IBM wanted to get it out the door fast with a minimum of overhead,
>and used tried and true off-the-shelf hardware. Per the IBM way, they
>also had software packages ready to go. That decision was quite prudent.
>
>I believe the "open architecture" design was intentional to purposely
>allow other hardware and software developers the chance to produce
>things that would improve the utility of the product.

IBM was brought to that "open architecture" concept, kicking and
screaming, by DEC. One of the reasons DEC was successful (in
those olden days) was because we didn't have a rule about "thou
shalt not talk to competitors' computers").


> In other words,
>if you were to develop a 3rd party add-on hardware card, that's great,
>because now another customer would have a reason to buy the IBM PC.
>And that's exactly what happened. (I suspect back then they were still
>a little sensitive about anti-trust, too). Another smart decision.

This doesn't sound right. That flavor of hardware business was
generated by the intense startups that solved computing problems.
IBM was playing catchup. Think of all those gamers who had
a lot of money to throw at hardware.

>
>
>As I understand it, IBM blew it AFTER the PC became a big success, when
>the rest of the company tried to milk its profits. I think IBM started
>to expect the same profitability from its PC as it did from its mainframe
>market, but instead should've looked at it more like its typewriters--
>high volume.

IBM's targets were still the business world, weren't they? Didn't
the home computer market grow out of the gaming market?

/BAH


Subtract a hundred and four for e-mail.

Russell Crook

unread,
Dec 24, 1999, 3:00:00 AM12/24/99
to
Rui Pedro Mendes Salgueiro wrote:
>

<snip VMS clock expiry calculation>

>
> So, the clock will wrap-around in the year 31086.
>
> BTW, Linux (for Alpha)'s clock (still the only one in which time_t is
> 64 bits ?)

Not so. 64-bit Solaris 7 (and beyond) kernels have 64 bit
time_t (which is what 64-bit applications use,
although legacy 32-bit applications still use a
32-bit time_t in that environment).

Don't know about other 64-bit Unixes.

> will wrap-around in 292277266665. That is 292 GigaYears.
> Since the remaining life span of the Sun is only ~5 GigaYears, this
> should be enough.

For most things :->

>
> --
> http://www.mat.uc.pt/~rps/f1/ a born-again-tifoso
> Mark Sandman - Morphine, RIP (July 3th, 1999, Italy)
> .pt is Portugal| `Whom the gods love die young'-Menander (342-292 BC)
> Europe | Villeneuve 50-82, Toivonen 56-86, Senna 60-94

--
Russell Crook, Systems Engineer, Computer Systems
Sun Microsystems of Canada Inc. 19 Allstate Parkway, Suite 305
Markham, Ontario, Canada L3R 5A4 rmc...@Canada.Sun.com
Tel: +1-905-943-4625 Fax: +1-905-943-4601
Not speaking officially for Sun (or anyone else, for that matter).

In theory, there's no difference between theory and practice;
in practice, there is.

Alexandre Pechtchanski

unread,
Dec 24, 1999, 3:00:00 AM12/24/99
to
On 23 Dec 1999 20:26:25 GMT, Toon Moene <to...@moene.indiv.nluug.nl> wrote:

>Alexandre Pechtchanski <alex*@*rockvax.rockefeller.edu> wrote:
>
>> IIRC, the complaint was that VAX/VMS calendar counted 2000 as a leap year
>> (as it should), and DEC response pointed out that VMS clock will run out
>> in year [sorry, don't remember, but many thousand years from now].
>

>Well, if your slide rule is still operational (the models I know are all Y2K
>compliant), you could even calculate it yourself.
>

>The epoch is 18 (?) November 1858, the increment is in units of 100
>nanoseconds and the clock tick is kept in a signed 64-bit integer.

Ugh, thanks, I guess... I've forgotten epoch start during the three years I
don't administer VMS anymore...

>Success !


>
>[ That's what *I* would call a home assignment :-) ]

;-)

[ When replying, remove *'s from address ]
Alexandre Pechtchanski, Systems Manager, RUH, NY

Peter Seebach

unread,
Dec 24, 1999, 3:00:00 AM12/24/99
to
In article <83vns3$8bv$1...@rena.mat.uc.pt>,

Rui Pedro Mendes Salgueiro <r...@rena.mat.uc.pt> wrote:
>BTW, Linux (for Alpha)'s clock (still the only one in which time_t is
>64 bits ?) will wrap-around in 292277266665. That is 292 GigaYears.

>Since the remaining life span of the Sun is only ~5 GigaYears, this
>should be enough.

Oh, *MY* code won't still be running, sure. That's what everyone says.

Short-term thinkers. Bah!

-s
--
Copyright 1999, All rights reserved. Peter Seebach / se...@plethora.net
C/Unix wizard, Pro-commerce radical, Spam fighter. Boycott Spamazon!
Consulting & Computers: http://www.plethora.net/
Get paid to surf! No spam. http://www.alladvantage.com/go.asp?refid=GZX636

lwin

unread,
Dec 24, 1999, 3:00:00 AM12/24/99
to
> The reason the PC (or anything else you want to call it) became
> popular was because enough people got exposed to using computers
> through timesharing. Timesharing gave a person the illusion
> that the whole computer was his. IBM didn't do this since
> the card turn around time was very long in busy shops.

If you're saying timesharing was the most important innovation,
then I would say the Kemeny-Kurtz Dartmouth SYSTEM (the BASIC
language, timesharing support, host computer, and Teletypes and
modems all together) represents the most important.


> This doesn't sound right. That flavor of hardware business was
> generated by the intense startups that solved computing problems.
> IBM was playing catchup. Think of all those gamers who had
> a lot of money to throw at hardware.

IBM almost always played catchup. But when it got a product out
the door, the industry took notice.


John Stott

unread,
Dec 24, 1999, 3:00:00 AM12/24/99
to
Toon Moene <to...@moene.indiv.nluug.nl> wrote:

>Alexandre Pechtchanski <alex*@*rockvax.rockefeller.edu> wrote:
>
>> IIRC, the complaint was that VAX/VMS calendar counted 2000 as a leap year
>> (as it should), and DEC response pointed out that VMS clock will run out
>> in year [sorry, don't remember, but many thousand years from now].
>
>Well, if your slide rule is still operational (the models I know are all Y2K
>compliant), you could even calculate it yourself.
>
>The epoch is 18 (?) November 1858, the increment is in units of 100
>nanoseconds and the clock tick is kept in a signed 64-bit integer.

To quote Digital:

This base time of Nov. 17, 1858 has since been used by TOPS-10,
TOPS-20, and VAX/VMS. Given this base date, the 100 nanosecond
granularity implemented within VAX/VMS, and the 63-bit absolute time
representation (the sign bit must be clear), VMS should have no
trouble with time until:

31-JUL-31086 02:48:05.47

At this time, all clocks and time-keeping operations within VMS will
suddenly stop, as system time values go negative.

--
John P. Stott jps...@src.wisc.edu
Synchrotron Radiation Center http://www.src.wisc.edu
University of Wisconsin-Madison http://www.wisc.edu

Doug Siebert

unread,
Dec 24, 1999, 3:00:00 AM12/24/99
to
Rui Pedro Mendes Salgueiro <r...@rena.mat.uc.pt> writes:

>BTW, Linux (for Alpha)'s clock (still the only one in which time_t is
>64 bits ?) will wrap-around in 292277266665. That is 292 GigaYears.
>Since the remaining life span of the Sun is only ~5 GigaYears, this
>should be enough.


I think all the 64 bit versions of Unix have a 64 bit time_t. I know
HP-UX does. I just checked IRIX and it appears that it deliberately
uses int for time_t on 64 bit versions. Oh well, IRIX probably won't be
around in 2038 anyway. Considering how SGI is racing faster than anyone
to port all their useful proprietary Unix technology over to Linux, it
probably won't even take until 2008.

HP-UX 11's release notes mention the 64-bit time_t but say that,
due to standards, various functions like asctime() are only good up
through 9999 -- because the standards state the return value is
provided in a 26-character string. I really hope anyone who is concerned
about a Y10K problem due to this is just kidding...
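For the curious, Python's time.asctime() reproduces the same fixed C format, which makes the limit easy to see: 24 visible characters, to which C's asctime() adds a newline and a NUL for the 26-byte total. A 5-digit year simply doesn't fit.

```python
import time

# asctime()'s fixed format: "Www Mmm dd hh:mm:ss yyyy" -- 24 visible
# characters; C's version appends a newline and a NUL for 26 bytes.
# The 9-tuple below is 3 AM on Fri 24 Dec 1999 (tm_wday=4 is Friday).
t = time.struct_time((1999, 12, 24, 3, 0, 0, 4, 358, 0))
s = time.asctime(t)
print(s, len(s))      # "Fri Dec 24 03:00:00 1999", 24
```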

--
Douglas Siebert Director of Computing Facilities
douglas...@uiowa.edu Division of Mathematical Sciences, U of Iowa

I'm not too interested in caller ID. But caller IQ, I'll pay a lot for that!

John Savard

unread,
Dec 24, 1999, 3:00:00 AM12/24/99
to
Paul DeMone <pde...@igs.net> wrote, in part:

> The popularity of the x86 platform and the self-reinforcing
> coalescing of software development around it has led to a
> revolution in bringing computing to the masses in the 20th
> century.

Yes, and that same popularity, I think, makes it virtually certain
that the IA-64 will be the dominant architecture of the next
generation: it will simply eclipse the 386 architecture the way the
386 architecture replaced the 8086 architecture.

Anything not in the direct line of Intel development will simply not
have a real chance to compete.

John Savard (jsavard<at>ecn<dot>ab<dot>ca)
http://www.ecn.ab.ca/~jsavard/crypto.htm

John Savard

unread,
Dec 24, 1999, 3:00:00 AM12/24/99
to
Paul DeMone <pde...@igs.net> wrote, in part:

> The real question is what will be the computer of the *next*
> century and what is the MPU in it?

While I think the IA-64 has the next decade or so sewn up, a century
is a *long* time. Reconfigurable hardware will probably be one of the
first things to make a big splash...

by 2099, I think we'll have seen sentient silicon.

John Savard

unread,
Dec 24, 1999, 3:00:00 AM12/24/99
to
lwi...@bbs.cpcn.com (lwin) wrote, in part:

>> It had already started. The IBM PC was just IBM's attempt to break into the
>> personal computer market. Remember that CP/M and the Apple II was already
>> out and fairly successful when IBM released its PC....

>Automobiles and automobile manufacturers were in place when Henry Ford
>introduced the Model T, but that car is recognized as the beginning of
>the auto industry. There were lots of other cars before and during
>the reign of the Model T, including some names of today, but the Model
>T is what we remember and credit as the real start.

>It was the IBM PC that set the micro market on fire. Maybe the IBM PC was


>a success solely because it had the IBM label on it. None the less, it was
>that that made microcomputers "legitimate" instead of a curiosity or a
>"toy".

It did knock out the CP/M machines, basically because it had more
growth potential.

Of course, MS-DOS was an imitation of CP/M, and CP/M was an imitation
of some DEC operating systems. (The "PIP" command for copying in CP/M
was a dead giveaway...)

But then, the early DEC assembly languages imitated the IBM 704 (just
about everyone did, with three-letter mnemonics, until the IBM 360
showed people there was another way), so it comes full circle...

John Savard

unread,
Dec 24, 1999, 3:00:00 AM12/24/99
to
dpes...@u.washington.edu (Derek Peschel) wrote, in part:

>You are right about paper tape. I doubt Colossus had much of an influence
>on the later machines (even Turing's own at the National Physical
>Laboratory) because it was so hugely secret.

It was also very special-purpose in nature; but things like how one
makes a logic gate from vacuum tubes, or how one keeps a machine with
thousands of vacuum tubes running with a decent MTBF, may well have
been influenced by it - as independently rediscovered on this side of
the pond, you run the tubes well below spec.

The secret high-level characteristics of Colossus probably did have
little influence: but its low-level characteristics could well have
been the starting point for many people in Britain. Just by revealing
that an electronic computer is possible, it had an influence.

John Savard

Dec 24, 1999
John Ahlstrom <jahl...@cisco.com> wrote, in part:

>Will we be reviled in the 9990s for not having had the
>foresight to use 5 digit dates? We know the need for them
>is coming. We pretend none of our apps or data bases will
>still be in use by then.

That's a little *too* far off to worry about. By then, computers will
be using flexible-length data types, and will be intelligent enough to
see if the code they're executing makes sense...
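
The two-digit-year problem being patched right now was, more often
than not, fixed by windowing rather than by widening the stored
field; a minimal Python sketch of that approach (the pivot value of
50 is purely an assumption, not any particular shop's choice):

```python
# Sliding-window expansion of a two-digit year, the common Y2K
# remediation: years at or above the pivot are taken as 19xx,
# years below it as 20xx.  PIVOT = 50 is an illustrative choice.

PIVOT = 50

def expand_year(yy):
    """Map a two-digit year (0-99) into the window 1950-2049."""
    return 1900 + yy if yy >= PIVOT else 2000 + yy

print(expand_year(99))  # -> 1999
print(expand_year(4))   # -> 2004
```

Of course, windowing only postpones the problem - the window has to
slide again before 2049 - which is rather the point being argued here.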

John Savard

Dec 24, 1999
Thomas Jespersen <tho...@daimi.au.dk> wrote, in part:

>Wasn't that machine kept classified for many years? That means it
>could not have started the computer industry.

While it was indeed secret, those who knew the secret still would have
been able to make use of some of its lower-level details that were
strictly in the domain of electronics, not cryptography, as a starting
point to build on.

Dennis Yelle

Dec 24, 1999
"Keith R. Williams" wrote:
>
> On Tue, 21 Dec 1999 16:29:29, John Ahlstrom <jahl...@cisco.com>
> wrote:
>
> > John Hendrickx wrote:
> >
> > > But what was the computer of the century?
>
> Well, I was thinking of the DigiComp-1. I know a lot of
> programmers/engineers that had one of these plastic pups as their
> first computers.

Is that the one with 3 bits of RAM and about 36 bits of ROM?

I had one of those.

Dennis Yelle

P. S.
I misread "plastic pups" as "plastic pumps" the first time I read
the message above. "plastic pumps" describes the device, because
the "clock" was a plastic thing that one moved in and out by hand
to clock the thing into the next state.
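
That hand-pumped clock makes the DigiComp-1 a tiny finite-state
machine: a few bits of state, a fixed transition table set up on the
plastic tubes, and one step per pump. A minimal sketch in Python -
the transition table below is made up for illustration, not the
actual DigiComp-1 program:

```python
# Sketch of a hand-clocked 3-bit finite-state machine in the spirit
# of the DigiComp-1.  The "ROM" is a fixed next-state table indexed
# by the current 3-bit state; each call to pump() is one manual
# clock cycle.

def make_machine(transitions, state=0):
    """Return a pump() function; each call advances one clock cycle."""
    current = [state]
    def pump():
        current[0] = transitions[current[0]]  # look up next 3-bit state
        return current[0]
    return pump

# Illustrative 8-entry table (one entry per 3-bit state): a counter.
count_up = [1, 2, 3, 4, 5, 6, 7, 0]

pump = make_machine(count_up)
states = [pump() for _ in range(4)]
print(states)  # -> [1, 2, 3, 4]
```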

--
Want to get paid for using the internet?
If so, go to: http://alladvantage.com/go.asp?refid=BAL536

Keith R. Williams

Dec 25, 1999
On Tue, 21 Dec 1999 16:29:29, John Ahlstrom <jahl...@cisco.com>
wrote:

> John Hendrickx wrote:
>
> > But what was the computer of the century?

Well, I was thinking of the DigiComp-1. I know a lot of
programmers/engineers that had one of these plastic pups as their
first computers.

----
Keith


Keith R. Williams

Dec 25, 1999
On Fri, 24 Dec 1999 17:05:06, jsa...@domain.ctry (John Savard)
wrote:

> Paul DeMone <pde...@igs.net> wrote, in part:
>

> > The popularity of the x86 platform and the self-reinforcing
> > coalescing of software development around it has led to a
> > revolution in bringing computing to the masses in the 20th
> > century.
>
> Yes, and that same popularity, I think, makes it virtually certain
> that the IA-64 will be the dominant architecture of the next
> generation: it will simply eclipse the 386 architecture the way the
> 386 architecture replaced the 8086 architecture.

I disagree totally. The 386 "eclipsed" the 8086 because it was
upward compatible, equally cheap, and many times faster. In fact
the software was many years late in exploiting the features of the
386 and 486.

> Anything not in the direct line of Intel development will simply not
> have a real chance to compete.

That's exactly the point. IA64 is not in the "direct line" of
Intel development. It has the same chance to fail as the
IAPX-432. Meanwhile the x86 will live on just because of the
weight of the applications behind it.

----
Keith

Gene Wirchenko

Dec 25, 1999
jsa...@domain.ctry (John Savard) wrote:

>lwi...@bbs.cpcn.com (lwin) wrote, in part:

[snip]

>>It was the IBM PC that set the micro market on fire. Maybe the IBM PC was
>>a success solely because it had the IBM label on it. None the less, it was
>>that that made microcomputers "legitimate" instead of a curiosity or a
>>"toy".
>
>It did knock out the CP/M machines, basically because it had more
>growth potential.

Hardly. CP/M on 8-bit machines was at about the end of its
lifetime anyway.

[snip]

Sincerely,

Gene Wirchenko

Computerese Irregular Verb Conjugation:
I have preferences.
You have biases.
He/She has prejudices.

Victor A. Garcia

Dec 25, 1999
Afraid you might be wrong. When I started working around computers (circa
1974), we asked our boss (he had a Ph.D. in Mathematics) about the year
2000 problem; guess what he said to us:

"Why worry about something so far ahead, they will have solved it by
then"

Yeah, sure..........

"John Savard" <jsa...@domain.ctry> wrote in message
news:3863ab2f...@news.prosurfr.com...
