Is computer history taught now?


dk

Feb 15, 2007, 9:18:46 PM
Hi,

I am wondering if computer history is still taught. It seems to me that
what is taught is limited to only about five years back. The reason I
say this is that in some academic groups, some people think Windows is
the first PC operating system, forgetting about CP/M and others.

David

Anne & Lynn Wheeler

Feb 15, 2007, 9:55:07 PM

cms was definitely personal computing from mid-60s ... some past posts
mentioning that at least some cp/m influenced by cp67/cms
http://www.garlic.com/~lynn/2004b.html#5 small bit of cp/m & cp/67 trivia from alt.folklore.computers n.g. (thread)
http://www.garlic.com/~lynn/2004e.html#38 [REALLY OT!] Overuse of symbolic constants
http://www.garlic.com/~lynn/2004h.html#40 Which Monitor Would You Pick??????

Kildall was using cp67/cms at the navy postgraduate school in '72
http://www.khet.net/gmc/docs/museum/en_cpmName.html

cms personal computing ran in (the really old, new thing) virtual
machine ... originally cp67/cms and later vm370/cms (in the morph from
cp67 to vm370, cms was renamed from the cambridge monitor system to
the conversational monitor system).

internal cms implementation had/has "handles" like CON1, RDR1, PUN1,
PTR1, TAP1, TAP2, DSK1, DSK2, DSK3, ....

table from gh20-0859, pg. 5, cp-67/cms user's guide
http://www.garlic.com/~lynn/2004.html#45 40th anniversary of IBM System/360 on 7 Apr 2004
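For illustration only, the fixed symbolic device handles listed above can be modeled as a simple lookup table. This is a modern sketch, not actual CMS code (CMS was written in S/360 assembler), and the device classes assigned to each handle here are assumptions inferred from the handle names:

```python
# Illustrative sketch only -- not actual CMS source. Models the idea of
# CMS-style fixed symbolic device handles. The class for each handle is
# an assumption based on its name (CON=console, RDR=reader, PUN=punch,
# PTR=printer, TAP=tape, DSK=disk).
DEVICE_HANDLES = {
    "CON1": "console",
    "RDR1": "card reader",
    "PUN1": "card punch",
    "PTR1": "printer",
    "TAP1": "tape",
    "TAP2": "tape",
    "DSK1": "disk",
    "DSK2": "disk",
    "DSK3": "disk",
}

def device_class(handle):
    """Look up the device class for a handle (case-insensitive)."""
    return DEVICE_HANDLES[handle.upper()]
```

The point of the fixed-handle scheme is that programs name devices symbolically and the system binds the handles to real devices, which is the idiom CP/M's logical devices later echoed.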

other references:
http://www.garlic.com/~lynn/2002m.html#11 DOS history question
http://www.garlic.com/~lynn/2004b.html#0 Is DOS unix?
http://www.garlic.com/~lynn/2004b.html#56 Oldest running code
http://www.garlic.com/~lynn/2004c.html#3 Oldest running code

also
http://www.khet.net/gmc/docs/museum/en_cpmName.html

from above:

And CP/CMS stands for Control Program/Cambridge Monitor System, the
first "virtual machine" OS to go "prime time", and was written not by
the product OS people, but by the research laboratory!

... snip ...

there is actually some amount of sensitivity regarding the above
statement.

There was an article that appeared in a corporate monthly publication
that made assertions that virtual machines were first done by
corporate Researchers. There were a number of protests written by
internal employees from the cambridge science center
http://www.garlic.com/~lynn/subtopic.html#545tech

demanding a retraction (which never happened).

Later a similar article appeared that claimed that virtual machines
were first done by corporate researchers. The letters of protest were
again repeated; this time the publication editor responded that
"researcher" (with a lowercase "r") could be construed as including
people at the corporate science centers (as opposed to the previous
scenario, where the uppercase "R" could only be construed to mean
people from the corporate Research division).

Randy Howard

Feb 15, 2007, 9:57:09 PM
On Thu, 15 Feb 2007 20:18:46 -0600, dk wrote
(in article <1171592326....@a75g2000cwd.googlegroups.com>):

CP/M was late to the party.

--
Randy Howard (2reply remove FOOBAR)
"The power of accurate observation is called cynicism by those
who have not got it." - George Bernard Shaw

Michael Black

Feb 15, 2007, 10:46:56 PM
Randy Howard (randy...@FOOverizonBAR.net) writes:
> On Thu, 15 Feb 2007 20:18:46 -0600, dk wrote
> (in article <1171592326....@a75g2000cwd.googlegroups.com>):
>
>> Hi,
>>
>> I am wondering if computer is still taught. It seems to me that it is
>> limited to only 5 years back. The reason I say this, is that in some
>> academic groups, some people think windows is the first pc operating
>> system, forgetting about cp/m and others.
>
> CP/M was late to the party.
>
But doesn't that prove the poster's point?

He's lamenting that "MSDOS" is seen by many as the first operating system,
and then mentions CP/M.

He's a result of that same mentality, only the specific time has shifted
a few years earlier.

If people think CP/M is an example of an "early operating system" then
of course history isn't being taught in this area.

Michael


Randy Howard

Feb 15, 2007, 11:14:28 PM
On Thu, 15 Feb 2007 21:46:56 -0600, Michael Black wrote
(in article <er39fg$af1$1...@theodyn.ncf.ca>):

> Randy Howard (randy...@FOOverizonBAR.net) writes:
>> On Thu, 15 Feb 2007 20:18:46 -0600, dk wrote
>> (in article <1171592326....@a75g2000cwd.googlegroups.com>):
>>
>>> Hi,
>>>
>>> I am wondering if computer is still taught. It seems to me that it is
>>> limited to only 5 years back. The reason I say this, is that in some
>>> academic groups, some people think windows is the first pc operating
>>> system, forgetting about cp/m and others.
>>
>> CP/M was late to the party.
>>
> But doesn't that prove the poster's point?
>
> He's lamenting that "MSDOS" is seen by many as the first operating system,
> and then mentions CP/M.
>
> He's a result of that same mentality, only the specific time has shifted
> a few years earlier.

Correct. I was trying to be a bit more subtle about calling him on it.

> If people think CP/M is an example of an "early operating system" then
> of course history isn't being taught in this area.

One wonders why "history" is a term used to describe teaching of
something in the so recent past. I think of things happening centuries
ago or longer as such.

I wonder if 1000 years from now anyone will care at all about the perks
of 8-bit processors?

Walter Bushell

Feb 15, 2007, 11:33:36 PM
In article <1171592326....@a75g2000cwd.googlegroups.com>,
"dk" <sa...@kanecki.com> wrote:

cp/m was a late comer.

--
"The power of the Executive to cast a man into prison without formulating any
charge known to the law, and particularly to deny him the judgement of his
peers, is in the highest degree odious and is the foundation of all totali-
tarian government whether Nazi or Communist." -- W. Churchill, Nov 21, 1943

Walter Bushell

Feb 15, 2007, 11:37:37 PM
In article <0001HW.C1FA8BC4...@news.verizon.net>,
Randy Howard <randy...@FOOverizonBAR.net> wrote:

They may be more interested in chipping flint. I think this period, from
the early 1900s will be of great interest to historians, if civilization
and humanity survive.

David Librik

Feb 16, 2007, 2:49:41 AM
Walter Bushell <pr...@panix.com> writes:
>In article <1171592326....@a75g2000cwd.googlegroups.com>,
> "dk" <sa...@kanecki.com> wrote:
>> The reason I say this, is that in some
>> academic groups, some people think windows is the first pc operating
>> system, forgetting about cp/m and others.

>cp/m was a late comer.

I don't see how you can say that. Notice that he said "PC operating system."

If "PC" means the IBM Personal Computer, then MSDOS (not Windows) was
basically the first DOS. I'm excluding the ROM Basic as an "operating
system," though you could argue it was if you wanted to bore everybody
with tedious semantics. (I don't remember if there was an earlier
in-house disk operating system used for testing and development.)

If "PC" refers to any home microcomputer, as it sometimes did in the
pre-IBM-PC days, then CP/M has a pretty good case for being one of the
first DOSes. (Again, I'm excluding the Micro-Soft BASIC that came on
paper tape, cassette, or ROM.) Written in 1974, it ran on the Altair
and predates every other microcomputer DOS that I know of.

If "PC" means *any* personal computer, as in "well *I* had a PDP-11 for
myself at home back in '73 and it ran RT-11, the finest operating system
ever written" then you win. :-)

- David Librik
lib...@panix.com

grey...@gmaildo.tcom

Feb 16, 2007, 5:10:40 AM
On 2007-02-16, Randy Howard <randy...@FOOverizonBAR.net> wrote:
>>>
>> But doesn't that prove the poster's point?
>>
>> He's lamenting that "MSDOS" is seen by many as the first operating system,
>> and then mentions CP/M.
>>
>> He's a result of that same mentality, only the specific time has shifted
>> a few years earlier.
>
> Correct. I was trying to be a bit more subtle about calling him on it.
>
>> If people think CP/M is an example of an "early operating system" then
>> of course history isn't being taught in this area.
>
> One wonders why "history" is a term used to describe teaching of
> something in the so recent past. I think of things happening centuries
> ago or longer as such.
>
> I wonder if 1000 years from now anyone will care at all about the perks
> of 8-bit processors?

For the same reasons that `History' (rather than stories of how our
side was always right) is taught: to show how different systems were
tried over the years, how some failed, and others succeeded.

--
Greymaus
Just another grumpy old man

Steve O'Hara-Smith

Feb 16, 2007, 4:50:48 AM
On Fri, 16 Feb 2007 07:49:41 +0000 (UTC)
David Librik <lib...@panix.com> wrote:

> Walter Bushell <pr...@panix.com> writes:

> >cp/m was a late comer.
>
> I don't see how you can say that. Notice that he said "PC operating
> system."
>
> If "PC" means the IBM Personal Computer, then MSDOS (not Windows) was
> basically the first DOS. I'm excluding the ROM Basic as an "operating

CP/M-86 was around at the same time as MSDOS first appeared. I'd be
hard put to say which was first, but whichever it was the gap was small.

> If "PC" refers to any home microcomputer, as it sometimes did in the
> pre-IBM-PC days, then CP/M has a pretty good case for being one of the
> first DOSes.

It's certainly been around just about as long as micro based
computers.

> (Again, I'm excluding the Micro-Soft BASIC that came on
> paper tape, cassette, or ROM.) Written in 1974, it ran on the Altair
> and predates every other microcomputer DOS that I know of.

What did the early SWTPC 6800 based machines run ? They were out at
the same time as the Altair IIRC.

--
C:>WIN | Directable Mirror Arrays
The computer obeys and wins. | A better way to focus the sun
You lose and Bill collects. | licences available see
| http://www.sohara.org/

Peter Flass

Feb 16, 2007, 6:57:21 AM
David Librik wrote:
> Walter Bushell <pr...@panix.com> writes:
>
>>In article <1171592326....@a75g2000cwd.googlegroups.com>,
>>"dk" <sa...@kanecki.com> wrote:
>>
>>> The reason I say this, is that in some
>>>academic groups, some people think windows is the first pc operating
>>>system, forgetting about cp/m and others.
>
>
>>cp/m was a late comer.
>
>
> I don't see how you can say that. Notice that he said "PC operating system."

The fallacy is that windoze is *not* an operating system, at least until
NT, and probably later.

jmfb...@aol.com

Feb 16, 2007, 8:10:10 AM
>Hi,
>
>I am wondering if computer is still taught.

Not yet. We're still making it.

>It seems to me that it is
>limited to only 5 years back. The reason I say this, is that in some
>academic groups, some people think windows is the first pc operating
>system, forgetting about cp/m and others.

They do that because the past is too near. There's no money nor
prestige(sp?) in studying the auld stuff yet. I think that
part of this is due to the computer biz not being developed under
constraints similar to those of science. We never documented what
didn't work as part of the job; we patched it and then edited the
mistake out of the sources. No other human being saw the mistake.

/BAH

jmfb...@aol.com

Feb 16, 2007, 8:14:11 AM
In article <er3nml$s2e$1...@reader2.panix.com>,

David Librik <lib...@panix.com> wrote:
>Walter Bushell <pr...@panix.com> writes:
>>In article <1171592326....@a75g2000cwd.googlegroups.com>,
>> "dk" <sa...@kanecki.com> wrote:
>>> The reason I say this, is that in some
>>> academic groups, some people think windows is the first pc operating
>>> system, forgetting about cp/m and others.
>
>>cp/m was a late comer.
>
>I don't see how you can say that. Notice that he said "PC operating system."
>
>If "PC" means the IBM Personal Computer, then MSDOS (not Windows) was
>basically the first DOS. I'm excluding the ROM Basic as an "operating
>system," though you could argue it was if you wanted to bore everybody
>with tedious semantics. (I don't remember if there was an earlier
>in-house disk operating system used for testing and development.)
>
>If "PC" refers to any home microcomputer, as it sometimes did in the
>pre-IBM-PC days, then CP/M has a pretty good case for being one of the
>first DOSes. (Again, I'm excluding the Micro-Soft BASIC that came on
>paper tape, cassette, or ROM.) Written in 1974, it ran on the Altair
>and predates every other microcomputer DOS that I know of.

Thank you for kicking our main frame butt ;-).


>
>If "PC" means *any* personal computer, as in "well *I* had a PDP-11 for
>myself at home back in '73

Bragger...

>and it ran RT-11, the finest operating system
>ever written" then you win. :-)

I didn't care for RT-11 as much as I cared for IAS. RT-11 was
too [emoticon gropes for a good word...fails] constraining.

/BAH

jmfb...@aol.com

Feb 16, 2007, 8:17:57 AM
In article <45d59c23$0$24709$4c36...@roadrunner.com>,

Sigh! It is not an OS now--I don't give a shit where it resides
in core. You determine an OS by the services it provides and
the level of execution it needs to provide them. The functionality
of Windows is simply not monitor-exec level.

What it does is our equivalent of doing direct I/O to the raw
disk from a host NFTing in.

There! How's that analogy?

/BAH

Charlie Gibbs

Feb 16, 2007, 11:30:53 AM
In article <0001HW.C1FA8BC4...@news.verizon.net>,
randy...@FOOverizonBAR.net (Randy Howard) writes:

> On Thu, 15 Feb 2007 21:46:56 -0600, Michael Black wrote
> (in article <er39fg$af1$1...@theodyn.ncf.ca>):
>
>> Randy Howard (randy...@FOOverizonBAR.net) writes:
>>
>>> On Thu, 15 Feb 2007 20:18:46 -0600, dk wrote
>>> (in article <1171592326....@a75g2000cwd.googlegroups.com>):
>>>
>>>> Hi,
>>>>
>>>> I am wondering if computer is still taught. It seems to me that it
>>>> is limited to only 5 years back.

It saves a lot of effort, and makes revisionism much easier.

>>>> The reason I say this, is that in some academic groups, some people
>>>> think windows is the first pc operating system, forgetting about
>>>> cp/m and others.
>>>
>>> CP/M was late to the party.
>>>
>> But doesn't that prove the poster's point?
>>
>> He's lamenting that "MSDOS" is seen by many as the first operating
>> system, and then mentions CP/M.

He never mentioned MS-DOS. I'm sure that many of the people he's
talking about would be mystified if you opened a command window.

>> He's a result of that same mentality, only the specific time has
>> shifted a few years earlier.
>
> Correct. I was trying to be a bit more subtle about calling him on it.
>
>> If people think CP/M is an example of an "early operating system"
>> then of course history isn't being taught in this area.
>
> One wonders why "history" is a term used to describe teaching
> of something in the so recent past. I think of things happening
> centuries ago or longer as such.

There's nothing in the definition of "history" that says it has
to be centuries ago. For really old events, the word "history"
often has the word "ancient" prepended. There's nothing wrong
with this; think about it the next time you use the command-line
history feature on your machine.

> I wonder if 1000 years from now anyone will care at all about the
> perks of 8-bit processors?

There'll always be ancient historians to think about ancient history.

--
/~\ cgi...@kltpzyxm.invalid (Charlie Gibbs)
\ / I'm really at ac.dekanfrus if you read it the right way.
X Top-posted messages will probably be ignored. See RFC1855.
/ \ HTML will DEFINITELY be ignored. Join the ASCII ribbon campaign!

Dave Hansen

Feb 16, 2007, 11:47:54 AM

I'm not sure if you're more interested in learning more about computer
history, or lamenting that it isn't taught anymore (if it ever was).
But if it's the former, see http://www.cbi.umn.edu/, The Charles
Babbage Institute at the University of MN.

Regards,

-=Dave

Al _Kossow

Feb 16, 2007, 11:50:58 AM

The Computer History Museum (computerhistory.org) is a more accessible
source for computer history than CBI, which is mainly a research library.


--
Posted via a free Usenet account from http://www.teranews.com

hanc...@bbs.cpcn.com

Feb 16, 2007, 1:46:15 PM
On Feb 15, 9:18 pm, "dk" <s...@kanecki.com> wrote:

> I am wondering if computer is still taught.

When I took computer science classes, they gave a brief mention of
the technology prior to that point. They mentioned Babbage's
Difference Engine, Hollerith doing the 1890 Census, the ENIAC, and
that computers were getting smaller, cheaper, and more powerful
through the years to date.

None of this "was on the test", it was just part of the introductory
chapter.

Other than perhaps a teacher digressing into some personal
reminiscing, no other history was taught. When I got into industry I
learned stuff from workers and the fact my employer had some "old"
stuff still running.

I don't recall students, including myself at the time, being very much
interested in computer history. Indeed, we wanted to learn the very
newest stuff. I think that is a characteristic of youth.


Frankly, I don't think there'd be much interest in, nor use for,
computer history. It's really a separate subject.

In other fields, how much history is taught? Do engineers spend time
learning about long obsolete methods of bridge construction (beyond a
general intro)? Do doctors learn about long obsolete healing methods
(beyond a general intro)?

I think predecessor techniques ought to be taught if they have
relevance to modern-day problem solving. Perhaps doctors should learn
about house calls and the techniques of the old-fashioned doctors seen
in the movies so as to understand the importance of merely
comforting the patient beyond medicine. If a seldom-used old drug or
treatment still has value in some cases, then it still should be
taught (sulfa drugs are still needed as antibiotics occasionally).
Computer people should be taught basic concepts of categorization,
sorting, collating, and tabulation, even without a tab machine
process.
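Those tab-machine concepts (sorting a deck, collating two sorted decks, tabulating totals by category) map directly onto modern code. A minimal sketch, with invented example data:

```python
# Sketch of the old tab-machine workflow in modern terms: sort a "deck"
# of records, collate (merge) two sorted decks, and tabulate (total a
# value per category). The example records are made up.
from heapq import merge
from itertools import groupby

def tabulate(records):
    """Total the count field per category; records must arrive sorted."""
    return {cat: sum(n for _, n in grp)
            for cat, grp in groupby(records, key=lambda r: r[0])}

deck_a = sorted([("sales", 5), ("accounting", 3)])    # sort one deck
deck_b = sorted([("shipping", 1), ("accounting", 2)]) # sort the other
collated = list(merge(deck_a, deck_b))                # collate the decks
totals = tabulate(collated)                           # tabulate by category
```

The division of labor mirrors the machine room: the sorter, the collator, and the tabulator were separate pieces of equipment, and the data had to pass through them in that order, just as it does here.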

However, there is no need to teach about the conditions of vacuum
tubes or drum memory or wiring a plugboard.


Brian Inglis

Feb 16, 2007, 1:53:00 PM
On 16 Feb 07 08:30:53 -0800 in alt.folklore.computers, "Charlie Gibbs"
<cgi...@kltpzyxm.invalid> wrote:

>In article <0001HW.C1FA8BC4...@news.verizon.net>,
>randy...@FOOverizonBAR.net (Randy Howard) writes:
>

>> One wonders why "history" is a term used to describe teaching
>> of something in the so recent past. I think of things happening
>> centuries ago or longer as such.
>
>There's nothing in the definition of "history" that says it has
>to be centuries ago. For really old events, the word "history"
>often has the word "ancient" prepended. There's nothing wrong
>with this; think about it the next time you use the command-line
>history feature on your machine.
>
>> I wonder if 1000 years from now anyone will care at all about the
>> perks of 8-bit processors?
>
>There'll always be ancient historians to think about ancient history.

Come on, a millennium isn't that long; ancient history probably starts
in the Roman era BCE: I'm not sure I'd label the Roman departure from
Britannia, the Viking excursions, or the Dark Ages as ancient history.
The pros assuredly have their own definition, which may not agree with
my opinion.

--
Thanks. Take care, Brian Inglis Calgary, Alberta, Canada

Brian....@CSi.com (Brian[dot]Inglis{at}SystematicSW[dot]ab[dot]ca)
fake address use address above to reply

Pascal Bourguignon

Feb 16, 2007, 1:58:12 PM

hanc...@bbs.cpcn.com writes:
> Frankly I don't think there'd be much interest nor use of computer
> history. It's really a separate subject.

On the contrary, IF you can get some old computer running, it might be
very interesting, because the components were macroscopic. You could
see bits perforated in cards or paper tape. Nowadays, computers reduce
to a small number of chips and there's nothing to see. If you can
play with old hardware, a lot of CS can be learned from it.


> In other fields, how much history is taught? Do engineers spend time
> learning about long obsolete methods of bridge construction (beyond a
> general intro)? Do doctors learn about long obsolete healing methods
> (beyond a general intro)?

But perhaps there's less obsolescence in CS than in the other domains.
Of course devices and interfaces change, but the fundamentals of CS,
from theory to programming, are about the same. After all, we still
use programming languages whose design started 50 years ago, and OSes
whose design started 20 or 30 years ago (it's as if M.D.s or
engineers used tools invented 700 years ago).


> I think predecessor techniques ought to be taught if they have
> relevance to modern-day problem solving. Perhaps doctors should learn
> about house calls and the techniques of the old-fashioned doctors seen
> in the movies so as to understand the importance of merely
> comforting the patient beyond medicine. If a seldom-used old drug or
> treatment still has value in some cases, then it still should be
> taught (sulfa drugs are still needed as antibiotics occasionally).
> Computer people should be taught basic concepts of categorization,
> sorting, collating, and tabulation, even without a tab machine
> process.

Yes, but having access to the actual hardware is a good pedagogical
device.


> However, there is no need to teach about the conditions of vacuum
> tubes or drum memory or wiring a plugboard.

Any interested student will learn about them by himself anyway.


Perhaps the big difference between CS and any other domain is that
more people start to learn CS stuff by themselves than medicine or
architecture. It's easier to write a computer program at 10 than
it is to do an appendicectomy.


--
__Pascal Bourguignon__ http://www.informatimago.com/

"Our users will know fear and cower before our software! Ship it!
Ship it and let them flee like the dogs they are!"

hanc...@bbs.cpcn.com

Feb 16, 2007, 2:20:09 PM
On Feb 16, 1:58 pm, Pascal Bourguignon <p...@informatimago.com> wrote:

> On the contrary, IF you can get some old computer running, it might be
> very interesting because the components were macroscopic. You could
> see bits perfored in cards or paper tape. Nowadays, computers reduce
> to a little number of chips and there's nothing to see. If you can
> play with old hardware, a lot of CS can be learned with them.

I agree that playing with hardware can teach a lot. But even years
ago in my day that was impractical; the mainframe was too busy and too
secured to allow that kind of downtime for individual hardware
experimentation. We had paper assemblers and simulator programs to
teach fundamentals.


> But perhaps there's less obsolescence in CS than in the other domains.
> Of course devices and interfaces change, but the fundamentals of CS,
> from theory to programming are about the same. After all, we still
> use programming languages whose design started 50 years ago, and OSes
> whose design started 30 or 20 years ago (it's like if M.D. or
> engineers used tools invented 700 years ago).

The degree of obsolescence depends on what direction the student is
going. Certain very basic concepts of systems analysis haven't
changed too much, although the tools have. But some of the working
environments are radically different--Unix and the Internet are very
different from batch and COBOL or Fortran. I understand Excel is used
as a substitute where Fortran was used. Certainly CAD eliminates
some other needs. Likewise, Excel eliminates some COBOL tasks.

Hardware has dramatically changed, not only in its technology but in
how we use it. Because of the massive decreases in the price of
hardware, we can deploy it differently than in the past. We don't have
to ration out bandwidth or storage as in the past. We don't have to
worry about every bit. Some bit-saving techniques were even obsolete
in my day, when machines got bigger than 64K.


> Perhaps the big difference between CS and any other domain, is that
> more people start to learn CS stuff by themselves than MD or
> architecture. It's easier to write a computer program at 10 than
> it is to do an appendicectomy.

Very true. It's also more accessible.

Aspiring doctors in my high school would work as hospital volunteers
so as to get some early medical exposure. However, the duties and
whereabouts of such teen volunteers were sharply limited; I don't
think they were allowed to follow along on rounds or observe
procedures. I was a volunteer as well, but interested in getting
business experience (I was too young to get a paid job, and those were rare anyway). I
was able to get into the hospital computer room and observe a lot
more, as well as be a part of administrative procedures.


CBFalconer

Feb 16, 2007, 3:20:34 PM
Brian Inglis wrote:
>
... snip ...

>
> Come on, a millennium isn't that long; ancient history probably
> starts in the Roman era BCE: not sure I'd label the Roman
> departure from Britannia, Viking excursions, or the Dark Ages, as
> ancient history. The pros assuredly have their own definition,
> which may not agree with my opinion.

According to my kids, before they grew up, 1970 was ancient
history. 1940 was pre-history. It seems to depend on the
viewpoint.

--
<http://www.cs.auckland.ac.nz/~pgut001/pubs/vista_cost.txt>
<http://www.securityfocus.com/columnists/423>

"A man who is right every time is not likely to do very much."
-- Francis Crick, co-discover of DNA
"There is nothing more amazing than stupidity in action."
-- Thomas Matthews


CBFalconer

Feb 16, 2007, 3:24:03 PM
hanc...@bbs.cpcn.com wrote:
>
... snip ...

>
> However, there is no need to teach about the conditions of vacuum
> tubes or drum memory or wiring a plugboard.

Well, you can build a vacuum tube with much less technology.
Something like an adept technician, a glassblower, and a vacuum
pump. Could come in handy.

hanc...@bbs.cpcn.com

Feb 16, 2007, 4:41:16 PM
On Feb 16, 3:24 pm, CBFalconer <cbfalco...@yahoo.com> wrote:
> Well, you can build a vacuum tube with much less technology.
> Something like an adept technician, a glassblower, and a vacuum
> pump. Could come in handy.

Sorry to nitpick, but actually you couldn't, at least not for digital
service.

When IBM began to use tubes for its machines, it used stock radio
tubes. It found they didn't work too well.

A radio tube was designed to amplify varying signals, usually in the
middle of the range and that was it. A digital tube, in contrast, was
either fully conducting or fully off. Radio tubes weren't meant to do
that and some had trouble reversing state as needed. Also, some
minute dirt in a radio tube would not be noticeable on the radio but
would interfere with digital switching.

IBM had to work with tube makers to bring digital tubes up to a far
higher level of performance. This was a tough job.


Eugene Miya

Feb 16, 2007, 4:28:08 PM
Sales?
>I am wondering if computer <history> is still taught.

It, by and large, is not taught. It's not necessary to know the history
of computers in order to use them. History tends to be more a story of
people than of technology. Very few classes on the history of computers
and computation exist. It's like a sidebar to other classes, or taught
as part of social science, etc.

> It seems to me that it is
>limited to only 5 years back. The reason I say this, is that in some
>academic groups, some people think windows is the first pc operating
>system, forgetting about cp/m and others.

No surprise. I work with a librarian who thinks Steve Wozniak invented
the mouse and is responsible for her carpal tunnel when the real guy
worked a building over (wind tunnel really) and did it after he left
here. And this is in the Santa Clara Valley.

--

Lawrence Statton XE2/N1GAK

Feb 16, 2007, 5:51:34 PM
hanc...@bbs.cpcn.com writes:

> On Feb 16, 3:24 pm, CBFalconer <cbfalco...@yahoo.com> wrote:
> > Well, you can build a vacuum tube with much less technology.
> > Something like an adept technician, a glassblower, and a vacuum
> > pump. Could come in handy.
>
> Sorry to nitpick, but actually you couldn't, at least not for digital
> service.

I'll have to agree with Hancock here.

With naught but an adept tech, a glassblower, and a vacuum pump, you
could probably build a low-mu triode good to a megahertz, but
generating high-performance vacuum tubes was a black art that was
continuously developed over 60 years.

It seems to me that every generation sees the technology they grew up
with as "obvious" and "trivial", whether it was the Steam Engine, the
6SN7, the 74LS245 or the C&T entire-PC-in-one-custom-ASIC.

What will your grandchildren write fifty years from now (in this very
froup)?

"You can build a Pentium Twelve with much less technolgoy.
Something like an adept technician, some sand and some bondout
wire and leadframes. Could come in handy."

--
Lawrence Statton - lawre...@abaluon.abaom s/aba/c/g
Computer software consists of only two components: ones and
zeros, in roughly equal proportions. All that is required is to
place them into the correct order.

Charlie Gibbs

Feb 16, 2007, 5:59:01 PM
In article <1171651574....@a75g2000cwd.googlegroups.com>,
hanc...@bbs.cpcn.com (hancock4) writes:

> Frankly I don't think there'd be much interest nor use of computer
> history. It's really a separate subject.
>
> In other fields, how much history is taught? Do engineers spend time
> learning about long obsolete methods of bridge construction (beyond a
> general intro)? Do doctors learn about long obsolete healing methods
> (beyond a general intro)?
>
> I think predecessor techniques ought to be taught if they have
> relevance to modern day problem solving.

They have more relevance than many people think - if for no other
reason than that they colour existing designs. Even brand-new
designs would look much different if they weren't influenced -
for good or bad - by what came before. I think it's good that
people learn the fundamentals of design principles that remain
valid much longer than our planned-obsolescence generation believes.

"Those who ignore history are doomed to repeat it."

Eugene Miya

Feb 16, 2007, 5:16:39 PM
>> >>> I am wondering if computer is still taught.
>> > If people think CP/M is an example of an "early operating system" then
>> > of course history isn't being taught in this area.

In article <proto-8E9086....@reader2.panix.com>,


Walter Bushell <pr...@panix.com> wrote:
>In article <0001HW.C1FA8BC4...@news.verizon.net>,
> Randy Howard <randy...@FOOverizonBAR.net> wrote:
>> One wonders why "history" is a term used to describe teaching of
>> something in the so recent past. I think of things happening centuries
>> ago or longer as such.

Many professional historians subscribe to that belief.
In point of fact, that's why many historians get into the professional
history game: to escape the present, modern world. This is merely one
reason why the field pissed off feminists, prompting their adoption of
"herstory". We may be "itstory." It's like E. A. Robinson's poem about
Miniver Cheevy. I know a fair number of non-technologists like this.

>> I wonder if 1000 years from now anyone will care at all about the perks
>> of 8-bit processors?
>
>They may be more interested in chipping flint. I think this period, from
>the early 1900s will be of great interest to historians, if civilization
>and humanity survive.

I would not just say "anyone." A small number of historically minded
technologists may care. Definitely not most historians.


My favorite quote, which I wish I had heard and remembered in 4th grade,
would have to be:

From a long view of the history of mankind - seen from, say, ten thousand
years from now, there can be little doubt that the most significant event
of the 19th century will be judged as Maxwell's discovery of the laws of
electrodynamics. The American Civil War will pale into provincial
insignificance in comparison with this important scientific event of the
same decade.

R. P. Feynman, Lectures on Physics, Vol. II, Addison-Wesley, 1964, page 1-11.


That's what has to survive.

--

Eugene Miya

unread,
Feb 16, 2007, 5:45:01 PM2/16/07
to
In article <2267.638T2...@kltpzyxm.invalid>,

Charlie Gibbs <cgi...@kltpzyxm.invalid> wrote:
>In article <0001HW.C1FA8BC4...@news.verizon.net>,
>randy...@FOOverizonBAR.net (Randy Howard) writes:
>> One wonders why "history" is a term used to describe teaching
>> of something in the so recent past. I think of things happening
>> centuries ago or longer as such.

Then use the word "Chronology."

>There's nothing in the definition of "history" that says it has
>to be centuries ago. For really old events, the word "history"
>often has the word "ancient" prepended. There's nothing wrong
>with this; think about it the next time you use the command-line
>history feature on your machine.

Basically true.

>> I wonder if 1000 years from now anyone will care at all about the
>> perks of 8-bit processors?
>
>There'll always be ancient historians to think about ancient history.

Not as simple as that.
It's going to be the story of those who care: "his story."
Historians have their specific interests; they aren't quite
archeologists for instance. The education system attempts to force them
to teach generalist history. In turn we, as students, tend to have
General Education requirements which tend to have 1-2 history courses.

The general masses of students (our colleagues) tended to have some
amount of world history and, in the US, US history. Some of this comes
in primary and elementary school, much less college. I watched my classmates
take and grumble about "Western Civ." and "US history" (4 and 17) as
undergrads. I bided my time a bit, and I started with "The History of
the Atomic Age." I lucked out. I took the special seminar part first,
then the main reading class second. In the seminar, we met with and had
dinner with people like Fermi's wife, and this guy named Feynman came up
and his talk was turned into a section and chapter in a book. And Teller.
And many lesser knowns (not to me; these guys were big guys like Manley,
Norris Bradbury, Kistiakowsky (sp), others). To this day I stay in touch
with my history prof and toss him all the old historically relevant
reports and meeting notes, and the weird thing is that these guys I
would read about or have to buy textbooks for, I know many of them,
work with some of them, etc. Not the really old guys. I should not
have eaten that salad last evening, because McCarthy asked me to join him
and his daughter at dinner.

In a few colleges there will be computing history classes. And I expect
them to get the history wrong. But very few.

--

Richard

unread,
Feb 16, 2007, 7:04:02 PM2/16/07
to
[Please do not mail me a copy of your followup]

Peter Flass <Peter...@Yahoo.com> spake the secret code
<45d59c23$0$24709$4c36...@roadrunner.com> thusly:

>The fallacy is that windoze is *not* an operating system, at least until
>NT, and probably later.

Correct. Windoze is not an operating system.

However, Windows is an operating system.
--
"The Direct3D Graphics Pipeline" -- DirectX 9 draft available for download
<http://www.xmission.com/~legalize/book/download/index.html>

Legalize Adulthood! <http://blogs.xmission.com/legalize/>

Richard F. Drushel

unread,
Feb 16, 2007, 8:00:19 PM2/16/07
to
Charlie Gibbs <cgi...@kltpzyxm.invalid> wrote:

: Even brand-new


: designs would look much different if they weren't influenced -
: for good or bad - by what came before.

For many years, I had a 1975/6 Amana RadarRange -- the first
commercial microwave oven IIRC. It was donated by an elderly member of
our Coleco ADAM users' group who was moving and no longer had room for
it. In appearance, it was *EXACTLY* like the over-the-stove fold-down-
drawer toaster oven of the period, brushed stainless steel exterior,
shiny chrome handle, mechanical timer with a bell. Even though the
internals had nothing to do with a resistive-heat oven, in order to sell,
it had to look like people's existing idea of an oven. It would have
to fit in with existing kitchen decor, maybe replace the cabinet space
of an existing built-in toaster oven. It even had to be made of materials
that homemakers could clean with their existing scrubbing supplies.
Novel way to cook, but disturb as little else as possible. Over time, of
course, the design and materials of microwaves have changed.

Our RadarRange lasted until about 2001, when kids in the basement
managed to knock it off the VCR cart it was on, denting it very badly,
and I didn't want to chance trying to power it up ever again. It weighed
a ton...

*Rich*

--
Richard F. Drushel, Ph.D. | "They fell: for Heaven to them no hope
Instructor and Executive Officer | imparts / Who hear not for the beating
Department of Biology, CWRU | of their hearts."
Cleveland, Ohio 44106-7080 U.S.A. | -- Edgar Allan Poe, "Al-Aaraaf"

Richard F. Drushel

unread,
Feb 16, 2007, 8:03:18 PM2/16/07
to
Richard F. Drushel <r...@po.cwru.edu> wrote:

: For many years, I had a 1975/6 Amana RadarRange -- the first

And I have misremembered the spelling: it's Radarange, elided R.
I was going back and forth on it...

CBFalconer

unread,
Feb 16, 2007, 10:19:36 PM2/16/07
to
Eugene Miya wrote:
>
... snip ...

>
> My favorite quote which I wish I had heard and remembered in 4th
> grade would have to be:
>
> From a long view of the history of mankind - seen from, say, ten
> thousand years from now, there can be little doubt that the most
> significant event of the 19th century will be judged as Maxwell's
> discovery of the laws of electrodynamics. The American Civil War
> will pale into provincial insignificance in comparison with this
> important scientific event of the same decade.
>
> R. P. Feynman, Lectures on Physics, Vol. II, Addison-Wesley, 1964, page 1-11.
>
> That's what has to survive.

I sat and stared at that for several minutes, trying to come up
with a valid competitor.  I failed.  The Michelson-Morley
experiment was considered, since it affected so much later work, but it is
not epochal. Rigor in mathematics was another candidate.

CBFalconer

unread,
Feb 16, 2007, 10:26:45 PM2/16/07
to

You're thinking of cathode poisoning, or sleeping sickness. That's
why we used the 5692 in place of the 12AU7. (IIRC)

Quadibloc

unread,
Feb 16, 2007, 10:39:58 PM2/16/07
to
dk wrote:
> I am wondering if computer is still taught. It seems to me that it is

> limited to only 5 years back. The reason I say this, is that in some
> academic groups, some people think windows is the first pc operating
> system, forgetting about cp/m and others.

It is certainly true that many other operating systems, such as CP/M
or OS-9 (not Mac OS 9, but the operating system used on the Tandy Color
Computer) were used on computers sold at prices that let ordinary
individuals own them in large numbers.

The Honeywell 316 in the Neiman-Marcus catalog just came with an ASR
33 for input/output, and wasn't a disk-based system which would have
had what we would call an operating system by modern standards.

However, the term "PC" is ambiguous, and may be taken to refer not to
a computer for personal use in general, but instead specifically to
the IBM Personal Computer and the machines which followed it, such as
the IBM Personal Computer XT (presumed to be "Extended Technology",
with a 10 megabyte hard drive) and the IBM Personal Computer AT (with
an 80286 microprocessor).

In that case, the first operating system for the IBM Personal
Computer, in all its 4.77 MHz 8088 glory, was PC-DOS, written by
Microsoft, and later sold directly to the public by Microsoft under
the name MS-DOS. CP/M-86 was another operating system later offered
for that machine.

There were programs that ran on top of PC-DOS or MS-DOS which provided
an enhanced user environment. Some, such as Borland's Sidekick, did
not really provide any functions that one would normally associate
with an operating system. But others did provide an API (applications
program interface) of their own, thus earning the right to be called
operating systems (at least by one definition). TopView, by IBM, was
one example of such a program; it provided a text-mode windowing
environment, complete with menus and mouse support.

Graphical User Interfaces that ran on top of MS-DOS included GEM
Desktop (the distribution of which for the IBM PC architecture ended
up being curtailed because of a lawsuit, although it continued to be
used for the Atari ST), GeoWorks, and, of course, Microsoft Windows.

While the ability to run 32-bit code was introduced with the Win32s
extension to Windows 3.1, a later version of Windows, Windows 95,
extended the ability of Microsoft Windows to handle 32-bit code, and,
as well, was sold as a single, integrated operating system rather than
a separate shell to run on top of a version of MS-DOS.

John Savard

Quadibloc

unread,
Feb 16, 2007, 10:43:40 PM2/16/07
to
Brian Inglis wrote:
> Come on, a millenium isn't that long; ancient history probably starts in
> the Roman era BCE: not sure I'd label the Roman departure from
> Britannia, Viking excursions, or the Dark Ages, as ancient history. The
> pros assuredly have their own definition, which may not agree with my
> opinion.

As I recall, the official definition of "ancient history" is as
follows:

Prehistory was when people didn't know how to write yet.

The oldest possible history is "ancient"; we don't have any kind of
really old history that is older than ancient.

So ancient history started when Mesopotamia, Egypt, and China first
started writing.

Ancient history was followed by Mediaeval history, and the transition
between the two is marked by the downfall of the western Roman Empire.
So the Dark Ages were at the beginning of Mediaeval history,
definitely not in ancient history.

John Savard

Quadibloc

unread,
Feb 16, 2007, 10:51:43 PM2/16/07
to
Lawrence Statton XE2/N1GAK wrote:
> With not but an adept tech, a glass blower, and a vaccum pump, you
> could probably build a low-mu triode good to a megahertz, but
> generating high performance vacuum tubes was a black art that
> continuously developed over 60 years.

This is true enough.

If one could build a vacuum tube good enough to use in a radio,
though, it could be used in a computer. The idea of running vacuum
tubes well below their rated specs for radio service to make them more
reliable is the key here, not the improvements in the quality of
vacuum tubes later achieved by IBM.

But a glassblower and some metalworking technology and a vacuum pump
are *not* all that's needed to make even a barely-useful radio tube.

To me, the most obvious omission is that the cathode needs a special
coating to make it efficiently emit electrons when heated - somewhat
analogous to the use of a Welsbach mantle in a lantern.

I could throw in the mica insulators too, that make everything stand
up, but that's nit-picking, being a small detail people would pick up
on their own easily enough. And miniature tubes dispensed with the
Bakelite base... the point is, once one has the *basics*, then one can
experiment and relearn the "black art" over time, particularly if one
has competition and an economic incentive.

John Savard

Joe Pfeiffer

unread,
Feb 16, 2007, 11:00:48 PM2/16/07
to
hanc...@bbs.cpcn.com writes:

Tubes were before my time so I can't really speak from experience, but
I have to wonder if this is true. In the solid-state world saturating
logic is a whole lot easier than a good analog amplifier; if you can
amplify in the middle of the range then doing a switch is trivial.
There are stories (again before my time) about early-days CDC people
going dumpster-diving behind semiconductor fabs grabbing transistors
that had failed QC.

IBM working with tube companies to develop cheaper, lower-quality
components would actually make more sense to me.

Walter Bushell

unread,
Feb 17, 2007, 12:27:30 AM2/17/07
to
In article <45D67448...@yahoo.com>,
CBFalconer <cbfal...@yahoo.com> wrote:

> Eugene Miya wrote:
> >
> ... snip ...
> >
> > My favorite quote which I wish I had heard and remembered in 4th
> > grade would have to be:
> >
> > From a long view of the history of mankind - seen from, say, ten
> > thousand years from now, there can be little doubt that the most
> > significant event of the 19th century will be judged as Maxwell's
> > discovery of the laws of electrodynamics. The American Civil War
> > will pale into provincial insignificance in comparison with this
> > important scientific event of the same decade.
> >
> > R. P. Feynman, Lectures on Physics, Vol. II, Addison-Wesley, 1964, page
> > 1-11.
> >
> > That's what has to survive.
>
> I sat and stared at that for several minutes, trying to come up
> with a valid competitor. I failed. The Michaelson-Morley
> experiment was considered, since it affected much later, but it is
> not epochal. Rigor in mathematics was another candidate.

20th century Quantum Mechanics and Relativity, nuclear "weapons". OTOH
probably something unnoticed by most people today.

--
"The power of the Executive to cast a man into prison without formulating any
charge known to the law, and particularly to deny him the judgement of his
peers, is in the highest degree odious and is the foundation of all totali-
tarian government whether Nazi or Communist." -- W. Churchill, Nov 21, 1943

Walter Bushell

unread,
Feb 17, 2007, 12:31:33 AM2/17/07
to
In article <1171683598.5...@s48g2000cws.googlegroups.com>,
"Quadibloc" <jsa...@ecn.ab.ca> wrote:

> here were programs that ran on top of PC-DOS or MS-DOS which provided
> an enhanced user environment.

Windows 3.x among others. The graphics did look like an explosion in a
Disney Animation factory.

Walter Bushell

unread,
Feb 17, 2007, 12:33:19 AM2/17/07
to
In article <1267.638T...@kltpzyxm.invalid>,
"Charlie Gibbs" <cgi...@kltpzyxm.invalid> wrote:

> In article <1171651574....@a75g2000cwd.googlegroups.com>,
> hanc...@bbs.cpcn.com (hancock4) writes:
>
> > Frankly I don't think there'd be much interest nor use of computer
> > history. It's really a separate subject.
> >
> > In other fields, how much history is taught? Do engineers spend time
> > learning about long obsolete methods of bridge construction (beyond a
> > general intro)? Do doctors learn about long obsolete healing methods
> > (beyond a general intro)?
> >
> > I think predecessor techniques ought to be taught if they have
> > relevance to modern day problem solving.
>
> They have more relevance than many people think - if for no other
> reason than that they colour existing designs. Even brand-new
> designs would look much different if they weren't influenced -
> for good or bad - by what came before. I think it's good that
> people learn the fundamentals of design principles that remain
> valid much longer than our planned-obsolescence generation believes.
>
> "Those who ignore history are doomed to repeat it."

In Summer school.

Steve O'Hara-Smith

unread,
Feb 17, 2007, 1:47:06 AM2/17/07
to
On 16 Feb 07 14:59:01 -0800
"Charlie Gibbs" <cgi...@kltpzyxm.invalid> wrote:

> "Those who ignore history are doomed to repeat it."

"History would be a better teacher if it weren't so stupefyingly
repetitious"

--
C:>WIN | Directable Mirror Arrays
The computer obeys and wins. | A better way to focus the sun
You lose and Bill collects. | licences available see
| http://www.sohara.org/

Trog Woolley

unread,
Feb 17, 2007, 7:14:09 AM2/17/07
to
While stranded on the hard shoulder of the information super highway sa...@kanecki.com typed:
> Hi,

>
> I am wondering if computer is still taught. It seems to me that it is
> limited to only 5 years back. The reason I say this, is that in some
> academic groups, some people think windows is the first pc operating
> system, forgetting about cp/m and others.

I work in a school as the ICT Technician. The pupils will tell you that
Bill Gates invented both the PC and the internet. The older ones have
heard of Windows NT & 98 (which we used to have 3 years ago), while the
younger ones think something called XP is all that has ever run on
computers. The simple fact that all of the systems that they use in
school run Linux seems to have escaped them. If I show them a picture of
an IBM 360, which was the first system I worked on, they don't believe
it's a computer.

If anyone knows of a really good picture of a CDC Cyber 173, which is what
we had when I was at uni, I'd love to show them that.

I said to one the other day (he's about 15) that I built my first computer
in high school, when I was his age. He said that's not impressive, as
he had been building computers since he was 10. I then pointed out that
by "building a computer" I didn't mean going to a shop and buying a
motherboard, some memory, disc drive, etc. and plugging it all together.
Oh no, I mean soldering wires to relays, toggle switches and light bulb
holders, and standing well back when you first powered it up. We had 50V
running through the busbar, which got a bit warm sometimes.

--
Trog Woolley | trog at trogwoolley dot com
(A Croweater back residing in Pommie Land with Linux)
Isis Astarte Diana Hecate Demeter Kali Inanna

jmfb...@aol.com

unread,
Feb 17, 2007, 8:14:56 AM2/17/07
to

I would think a lot of them would get it wrong. Those who were making
it have rewritten it. :-)

But computing is technology. I didn't get math history in my
math classes; it was little sidebars in the texts and a few
comments from the instructor, if s/he cared--most didn't care.

Science is a different story. You get the history, kind of,
because of what you have to learn first before you can get
to the more advanced step.

/BAH

jmfb...@aol.com

unread,
Feb 17, 2007, 8:20:21 AM2/17/07
to
In article <87y7mya...@thalassa.informatimago.com>,

Pascal Bourguignon <p...@informatimago.com> wrote:
>
>hanc...@bbs.cpcn.com writes:
>> Frankly I don't think there'd be much interest nor use of computer
>> history. It's really a separate subject.
>
>On the contrary, IF you can get some old computer running, it might be
>very interesting because the components were macroscopic. You could
>see bits perfored in cards or paper tape. Nowadays, computers reduce
>to a little number of chips and there's nothing to see. If you can
>play with old hardware, a lot of CS can be learned with them.
>
>
>> In other fields, how much history is taught? Do engineers spend time
>> learning about long obsolete methods of bridge construction (beyond a
>> general intro)? Do doctors learn about long obsolete healing methods
>> (beyond a general intro)?
>
>But perhaps there's less obsolescence in CS than in the other domains.
>Of course devices and interfaces change, but the fundamentals of CS,
>from theory to programming are about the same.

This is the strange thing about what is getting taught. The fundamentals
aren't getting taught.

> After all, we still
>use programming languages whose design started 50 years ago, and OSes
>whose design started 30 or 20 years ago (it's like if M.D. or
>engineers used tools invented 700 years ago).
>
>
>> I think predecessor techniques ought to be taught if they have
>> relevance to modern day problem solving. Perhaps doctors should learn
>> about house calls and the techniques of the old fashioned doctors seen
>> in the movies so as to understand the importantance of merely
>> comforting the patient beyond medicine. If a seldom used old drug or
>> treatment still has value in some cases, then it still should be
>> taught (sulfa drugs are still needed as anti-biotics occassionally).
>> Computer people should be taught basic concepts of cateogrization,
>> sorting, collating, and tabulation, even without a tab machine
>> process.
>
>Yes, but having access to the actual hardware is a good pedagogical
>device.

To learn well, I think it takes hand manipulations. I can't
think of a better way to learn how to manipulate bits than
to toggle switches and have instantaneous feedback of
blinkenlights or smoke or screeches.


>
>
>> However, there is no need to teach about the conditions of vacuum
>> tubes or drum memory or wiring a plugboard.
>
>Any interested student will learn about them by himself anyways.

You assume that the student will hear about these things. They are
not, so how can they investigate something they don't know they don't
know?

>
>
>Perhaps the big difference between CS and any other domain, is that
>more people start to learn CS stuff by themselves than MD or
>architecture. It's easier to write a computer program at 10 than
>it is to do an appendicectomy.

That is because not much damage can be done. Testing and making
mistakes are not a danger to one's breathing continuation.

/BAH

jmfb...@aol.com

unread,
Feb 17, 2007, 8:24:23 AM2/17/07
to
In article <proto-9B4399....@reader2.panix.com>,

Walter Bushell <pr...@panix.com> wrote:
>In article <1267.638T...@kltpzyxm.invalid>,
> "Charlie Gibbs" <cgi...@kltpzyxm.invalid> wrote:
>
>> In article <1171651574....@a75g2000cwd.googlegroups.com>,
>> hanc...@bbs.cpcn.com (hancock4) writes:
>>
>> > Frankly I don't think there'd be much interest nor use of computer
>> > history. It's really a separate subject.
>> >
>> > In other fields, how much history is taught? Do engineers spend time
>> > learning about long obsolete methods of bridge construction (beyond a
>> > general intro)? Do doctors learn about long obsolete healing methods
>> > (beyond a general intro)?
>> >
>> > I think predecessor techniques ought to be taught if they have
>> > relevance to modern day problem solving.
>>
>> They have more relevance than many people think - if for no other
>> reason than that they colour existing designs. Even brand-new
>> designs would look much different if they weren't influenced -
>> for good or bad - by what came before. I think it's good that
>> people learn the fundamentals of design principles that remain
>> valid much longer than our planned-obsolescence generation believes.
>>
>> "Those who ignore history are doomed to repeat it."
>
>In Summer school.

Nope, not in the computing biz. Unfortunately, it will be done
on the job.

/BAH

jmfb...@aol.com

unread,
Feb 17, 2007, 8:26:26 AM2/17/07
to
In article <er5gpi$op1$1...@news.xmission.com>,

legaliz...@mail.xmission.com (Richard) wrote:
>[Please do not mail me a copy of your followup]
>
>Peter Flass <Peter...@Yahoo.com> spake the secret code
><45d59c23$0$24709$4c36...@roadrunner.com> thusly:
>
>>The fallacy is that windoze is *not* an operating system, at least until
>>NT, and probably later.
>
>Correct. Windoze is not an operating system.
>
>However, Windows is an operating system.

Nope. Note that not even Microsoft considers it an OS.

/BAH

jmfb...@aol.com

unread,
Feb 17, 2007, 8:29:24 AM2/17/07
to
In article <er6rih$llg$1$8300...@news.demon.co.uk>,

Start a building project just like when you were a kid. Try to
let the kiddies play, too :-).

/BAH

Peter Flass

unread,
Feb 17, 2007, 8:38:09 AM2/17/07
to

Has anyone thought about what should be covered in these courses? Right
now it's up to the prof., but maybe someone should develop a standard
curriculum, like the ACM has for CS.

jmfb...@aol.com

unread,
Feb 17, 2007, 8:32:41 AM2/17/07
to
In article <proto-D50C6F....@reader2.panix.com>,

Walter Bushell <pr...@panix.com> wrote:
>In article <45D67448...@yahoo.com>,
> CBFalconer <cbfal...@yahoo.com> wrote:
>
>> Eugene Miya wrote:
>> >
>> ... snip ...
>> >
>> > My favorite quote which I wish I had heard and remembered in 4th
>> > grade would have to be:
>> >
>> > From a long view of the history of mankind - seen from, say, ten
>> > thousand years from now, there can be little doubt that the most
>> > significant event of the 19th century will be judged as Maxwell's
>> > discovery of the laws of electrodynamics. The American Civil War
>> > will pale into provincial insignificance in comparison with this
>> > important scientific event of the same decade.
>> >
>> > R. P. Feynman, Lectures on Physics, Vol. II, Addison-Wesley, 1964, page
>> > 1-11.
>> >
>> > That's what has to survive.
>>
>> I sat and stared at that for several minutes, trying to come up
>> with a valid competitor. I failed. The Michaelson-Morley
>> experiment was considered, since it affected much later, but it is
>> not epochal. Rigor in mathematics was another candidate.
>
>20th century Quantum Mechanics and Relativity, nuclear "weapons". OTOH
>probably something unnoticed by most people today.

What did QM and Relativity (Special or General) have to do with developing
the first atom bomb?

/BAH

Peter Flass

unread,
Feb 17, 2007, 8:42:27 AM2/17/07
to
Lawrence Statton XE2/N1GAK wrote:

> hanc...@bbs.cpcn.com writes:
>
>
>>On Feb 16, 3:24 pm, CBFalconer <cbfalco...@yahoo.com> wrote:
>>
>>>Well, you can build a vacuum tube with much less technology.
>>>Something like an adept technician, a glassblower, and a vacuum
>>>pump. Could come in handy.
>>
>>Sorry to nitpick, but actually you couldn't, at least not for digital
>>service.
>
>
> I'll have to agree with Hancock here.
>
> With not but an adept tech, a glass blower, and a vaccum pump, you
> could probably build a low-mu triode good to a megahertz, but
> generating high performance vacuum tubes was a black art that
> continuously developed over 60 years.
>
> It seems to me that every generation see the technology they grew up
> with as "obvious" and "trivial", whether it was the Steam Engine, the
> 6SN7, the 74LS245 or the C&T entire-PC-in-one-custom-ASIC.

This is what makes the history of technology interesting. You
appreciate current stuff a lot more when you see all the false starts
and dead ends, and all the problems that had to be solved to get where
we are.

The Antikythera mechanism seems to be as good as early clocks 1000+
years later, lacking only spring steel to make it self-moving. On the
other hand, since all the gears had to be cut by hand, it was almost
necessarily a one-off piece. How much math had to be invented to build it?


Peter Flass

unread,
Feb 17, 2007, 8:48:23 AM2/17/07
to
Charlie Gibbs wrote:

> In article <1171651574....@a75g2000cwd.googlegroups.com>,
> hanc...@bbs.cpcn.com (hancock4) writes:
>
>
>>Frankly I don't think there'd be much interest nor use of computer
>>history. It's really a separate subject.
>>
>>In other fields, how much history is taught? Do engineers spend time
>>learning about long obsolete methods of bridge construction (beyond a
>>general intro)? Do doctors learn about long obsolete healing methods
>>(beyond a general intro)?
>>
>>I think predecessor techniques ought to be taught if they have
>>relevance to modern day problem solving.
>
>
> They have more relevance than many people think - if for no other
> reason than that they colour existing designs. Even brand-new
> designs would look much different if they weren't influenced -
> for good or bad - by what came before. I think it's good that
> people learn the fundamentals of design principles that remain
> valid much longer than our planned-obsolescence generation believes.
>

My last post on this topic, I promise ;-) This is a very good point.
Computer architecture started off as "100 flowers" and seems to have
gotten channeled into a very few streams, to mix metaphors. IA-32, for
example, is "fast" only because a tremendous amount of effort has gone
into making it so. Who's to say that, given the same effort, some other
current or past architecture might not have been as fast or faster?

A path taken years ago because of some technological limitation that has
since been superseded should probably be revisited from time to time and
not taken for granted. For example, lots of things were done in the old
days because of limitations on the number of gates on a chip that are
simply no longer relevant.


Peter Flass

unread,
Feb 17, 2007, 8:49:13 AM2/17/07
to
Richard wrote:

> [Please do not mail me a copy of your followup]
>
> Peter Flass <Peter...@Yahoo.com> spake the secret code
> <45d59c23$0$24709$4c36...@roadrunner.com> thusly:
>
>
>>The fallacy is that windoze is *not* an operating system, at least until
>>NT, and probably later.
>
>
> Correct. Windoze is not an operating system.
>
> However, Windows is an operating system.

Nope, Windows(tm) is a GUI on top of an OS, originally DOS, but now
some rip-off from DEC.

CBFalconer

unread,
Feb 17, 2007, 8:56:29 AM2/17/07
to
Walter Bushell wrote:
> CBFalconer <cbfal...@yahoo.com> wrote:
>> Eugene Miya wrote:
>>>
>> ... snip ...
>>>
>>> My favorite quote which I wish I had heard and remembered in 4th
>>> grade would have to be:
>>>
>>> From a long view of the history of mankind - seen from, say, ten
>>> thousand years from now, there can be little doubt that the most
>>> significant event of the 19th century will be judged as Maxwell's
>>> discovery of the laws of electrodynamics. The American Civil War
>>> will pale into provincial insignificance in comparison with this
>>> important scientific event of the same decade.
>>>
>>> R. P. Feynman, Lectures on Physics, Vol. II, Addison-Wesley, 1964,
>>> page 1-11.
>>>
>>> That's what has to survive.
>>
>> I sat and stared at that for several minutes, trying to come up
>> with a valid competitor. I failed. The Michaelson-Morley
>> experiment was considered, since it affected much later, but it is
>> not epochal. Rigor in mathematics was another candidate.
>
> 20th century Quantum Mechanics and Relativity, nuclear "weapons".
> OTOH probably something unnoticed by most people today.

He limited it to the 19th century.

Quadibloc

unread,
Feb 17, 2007, 9:48:03 AM2/17/07
to
jmfb...@aol.com wrote:
> What did QM and Relativity (Special or General) have to do with developing
> the first atom bomb?

The Special Theory of Relativity led to the equation E=mc^2. This
allowed the energy yield of a fission reaction to be estimated from
measurements of isotopic masses.
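As a back-of-envelope illustration of that point (not from the original
post; the mass values are approximate textbook atomic masses for one
common fission channel, and are illustrative only):

```python
# Estimate fission energy from isotopic masses via E = m*c^2.
# Approximate atomic masses in unified atomic mass units (u).
U235  = 235.0439   # uranium-235
N     = 1.0087     # free neutron
BA141 = 140.9144   # barium-141 (one typical fragment pair)
KR92  = 91.9262    # krypton-92

MEV_PER_U = 931.494  # energy equivalent of 1 u, in MeV

# n + U-235 -> Ba-141 + Kr-92 + 3 n
mass_before = U235 + N
mass_after  = BA141 + KR92 + 3 * N
defect      = mass_before - mass_after   # mass defect, ~0.19 u
energy_mev  = defect * MEV_PER_U         # ~170-180 MeV per fission

print(f"mass defect ~ {defect:.4f} u, energy ~ {energy_mev:.0f} MeV")
```

The defect works out to roughly 0.19 u, i.e. on the order of 175 MeV
for this channel, consistent with the familiar ~200 MeV per fission
once the fragments' subsequent decays are counted.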

The relationship of Quantum Mechanics to the development of atomic
weapons is considerably more far-reaching.

When radioactivity was first discovered, it was a mysterious
phenomenon. Thus, we still use the name "X-rays" for the form of
electromagnetic radiation discovered and produced by Roentgen.

The same scientists who were making theoretical advances in quantum
mechanics, and using it to explain the structure of the electron
shells of the atom, were also working to make sense out of the
discoveries related to radioactivity. For just one example, the
"half-life" of a radioisotope derives from the properties of its potential
well in relation to the phenomenon of quantum tunneling.
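The tunneling physics is what *sets* a half-life; what a half-life
*means* is plain exponential-decay arithmetic, which a minimal sketch
can show (the carbon-14 figure is just a familiar example; nothing
isotope-specific or quantum-mechanical is modeled here):

```python
import math

# Relation between half-life and decay constant: N(t) = N0 * exp(-lambda*t),
# with lambda = ln(2) / half_life.
half_life   = 5730.0                    # years (carbon-14, as an example)
decay_const = math.log(2) / half_life

def fraction_remaining(t_years: float) -> float:
    """Fraction of the original sample left after t_years."""
    return math.exp(-decay_const * t_years)

# After one half-life, half remains; after two, a quarter.
print(fraction_remaining(5730.0))    # 0.5
print(fraction_remaining(11460.0))   # 0.25
```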

It is true that much of the work that led to the discovery of the
neutron, and the recognition of the existence of nuclear fission, was
experimental and pragmatic, and did not depend greatly on quantum
mechanics. But there was still a relationship, even if the flow of
ideas mostly went the other way; the new discoveries about the atom
were a tremendous stimulus to the field of quantum mechanics.

John Savard

Michael Black

unread,
Feb 17, 2007, 10:49:49 AM2/17/07
to
Peter Flass (Peter...@Yahoo.com) writes:

> This is what makes the history of technology interesting. You
> appreciate current stuff a lot more whan you see all the false starts
> and dead-ends, and all the problems that had to be solved to get where
> we are.
>
> The Antikithera mechanism seems to be as good as early clocks 1000+
> years later, lacking only spring-steel to make it self-moving. On the
> other hand, since all the gears had to be cut by hand it was almost
> necessarily a one-off piece. How much math had to be invented to build it?
>
>

And often, design ends up being serial. So this gets added to that, and
then someone else comes along and adds to that.

Sometimes it's helpful to go back to the beginning, or around then, and
revisit the early things. Then maybe one of those early points is a start
for a whole new branch of design, rather than merely extending the previous
branch.

In some cases, something comes along and is then discarded because it's
too hard to implement at the time. But when it's revisited decades later,
other things have changed that make it far more viable.

But if history/the basics aren't taught or learned, then few will go
off and find those early things to pursue now. They won't know there
was something before last week.

In other cases, they'll be doomed to some bulky design because that's
what's current, while someone older who lived through earlier designs
could instantly suggest a far simpler solution from an earlier era.

I saw this happen in one of the sci.electronics.* newsgroups a couple
of years ago. Someone started a long thread (admittedly, he did keep
returning to it) asking how to do something, but only with time was his
real intent revealed, after all kinds of fancy and more recent solutions
were offered up. And every time someone suggested something different,
the original poster would go off on that tangent, because he didn't
have the background or history to evaluate that specific bit, so he
just followed. Eventually, it turned out there was a really simple solution
from days gone by, which would have been obvious if he'd been looking over
older books (not "learning history", but reading material from the time
that later became history), but by now it had become so foreign that he
had a hard time wrapping his mind around it.

In some ways, history shouldn't be about dates and such, though the
general timeline is important; it should give an overview of what has
happened, so that when something from the past is needed, someone will
have a tendency to check out the past when looking for solutions.

Michael

Brian Inglis

unread,
Feb 17, 2007, 11:24:41 AM2/17/07
to
On 16 Feb 2007 19:43:40 -0800 in alt.folklore.computers, "Quadibloc"
<jsa...@ecn.ab.ca> wrote:

So the 5th Century is the official transition.
I thought the Dark Ages started later, from the decline of the
Byzantines and rise of the Turks until the start of the Renaissance.

--
Thanks. Take care, Brian Inglis Calgary, Alberta, Canada

Brian....@CSi.com (Brian[dot]Inglis{at}SystematicSW[dot]ab[dot]ca)
fake address use address above to reply

Walter Bushell

unread,
Feb 17, 2007, 12:24:30 PM2/17/07
to
In article <er705p$8qk...@s994.apx1.sbo.ma.dialup.rcn.com>,
jmfb...@aol.com wrote:

Er, they were the theoretical underpinnings of a _major_ engineering
effort that was the Manhattan Project. On the order of _Vista_
apparently. >;)

Walter Bushell

unread,
Feb 17, 2007, 12:33:21 PM2/17/07
to
In article <4v6et25lq8gpcvbnb...@4ax.com>,
Brian Inglis <Brian....@SystematicSW.Invalid> wrote:

> So the 5th Century is the official transition.
> I thought the Dark Ages started later, from the decline of the
> Byzantines and rise of the Turks until the start of the Renaissance.

The Renaissance was partially led by, and perhaps sparked by, the flight
of scholars from Byzantium to the West, IIRC. Compare to the rise of
American physics and chemistry from the massive influx of German
physicists and chemists before WWII. In fact, the American post-war
pre-eminence in many other fields cannot be explained without mentioning
this influx of mostly unwanted Germans.

Walter Bushell

unread,
Feb 17, 2007, 12:41:05 PM2/17/07
to
In article <45d707aa$0$5214$4c36...@roadrunner.com>,
Peter Flass <Peter...@Yahoo.com> wrote:

<snip>

> >
>
> My last post on this topic, I promise;-) This is a very good point.
> Computer architecture started off as "100 flowers", and seems to have
> gotten channeled into a very few streams, to mix metaphors. IA-32, for
> example, is "fast" only because a tremendous amount of effort has gone
> into making it so. Who's to say that, given the same effort, some other
> current or past architectures might not have been as fast or faster?

Many people think PPC could have been a contender. But Intel had so much
money from personal computer chips that no one wanted to compete in
chips for personal computers. I would have thought IBM would have done
it for corporate pride and morale reasons, and to keep their chip
designers on their toes, but they didn't.

> A path taken years ago because of some technological limitation that has
> since been superseded should probably be looked at from time to time and
> not taken for granted. For example, lots of things were done in the old
> days because of limitations on the number of gates on a chip that are
> simply not relevant.

--

Walter Bushell

unread,
Feb 17, 2007, 1:01:18 PM2/17/07
to
In article <er786t$g2c$1...@theodyn.ncf.ca>,
et...@FreeNet.Carleton.CA (Michael Black) wrote:

> In some cases, something comes along, and then is discarded because it's
> too hard to implement at the time. but when revisited decades later, other
> things have changed to make that a far more viable thing.

Computers are an interesting example. Babbage comes to mind.

David Wade

unread,
Feb 17, 2007, 1:50:08 PM2/17/07
to

"dk" <sa...@kanecki.com> wrote in message
news:1171592326....@a75g2000cwd.googlegroups.com...

> Hi,
>
> I am wondering if computer is still taught. It seems to me that it is
> limited to only 5 years back. The reason I say this, is that in some
> academic groups, some people think windows is the first pc operating
> system, forgetting about cp/m and others.
>

I think it's even worse than that. In Manchester I can visit our Museum of
Science and see cotton machinery from the last century, and equipment from
the early part of this. But the only working computer is a replica of the
"Baby". I think that the mainframe has been forgotten...

> David
>


Anne & Lynn Wheeler

unread,