I am wondering if computer history is still taught. It seems to me that it is
limited to only 5 years back. The reason I say this is that in some
academic groups, some people think Windows is the first PC operating
system, forgetting about CP/M and others.
David
cms was definitely personal computing from the mid-60s ... some past posts
mention that at least some of cp/m was influenced by cp67/cms
http://www.garlic.com/~lynn/2004b.html#5 small bit of cp/m & cp/67 trivia from alt.folklore.computers n.g. (thread)
http://www.garlic.com/~lynn/2004e.html#38 [REALLY OT!] Overuse of symbolic constants
http://www.garlic.com/~lynn/2004h.html#40 Which Monitor Would You Pick??????
Kildall was using cp67/cms at the navy postgraduate school in '72
http://www.khet.net/gmc/docs/museum/en_cpmName.html
cms personal computing ran in a virtual machine (the really old "new
thing") ... originally cp67/cms and later vm370/cms (in the morph from
cp67 to vm370, cms was renamed from the cambridge monitor system to
the conversational monitor system).
internal cms implementation had/has "handles" like CON1, RDR1, PUN1,
PTR1, TAP1, TAP2, DSK1, DSK2, DSK3, ....
table from gh20-0859, pg. 5, cp-67/cms user's guide
http://www.garlic.com/~lynn/2004.html#45 40th anniversary of IBM System/360 on 7 Apr 2004
other references:
http://www.garlic.com/~lynn/2002m.html#11 DOS history question
http://www.garlic.com/~lynn/2004b.html#0 Is DOS unix?
http://www.garlic.com/~lynn/2004b.html#56 Oldest running code
http://www.garlic.com/~lynn/2004c.html#3 Oldest running code
also
http://www.khet.net/gmc/docs/museum/en_cpmName.html
from above:
And CP/CMS stands for Control Program/Cambridge Monitor System, the
first "virtual machine" OS to go "prime time", and was written not by
the product OS people, but by the research laboratory!
... snip ...
there is actually some amount of sensitivity regarding the above
statement.
There was an article that appeared in a corporate monthly publication
that made some assertions that virtual machines were first done by
corporate Researchers. There were a number of protests written by
internal employees of the cambridge science center
http://www.garlic.com/~lynn/subtopic.html#545tech
demanding a retraction (which never happened).
Later a similar article appeared that claimed that virtual machines
were first done by corporate researchers. The letters of protest were
repeated; this time the publication editor responded that
"researcher" (with a lowercase "r") could be construed as including
people at the corporate science centers (as opposed to the previous
scenario, where the uppercase "R" could only be construed to mean people
from the corporate Research division).
CP/M was late to the party.
--
Randy Howard (2reply remove FOOBAR)
"The power of accurate observation is called cynicism by those
who have not got it." - George Bernard Shaw
He's lamenting that "MSDOS" is seen by many as the first operating system,
and then mentions CP/M.
He's a result of that same mentality, only the specific time has shifted
a few years earlier.
If people think CP/M is an example of an "early operating system" then
of course history isn't being taught in this area.
Michael
> Randy Howard (randy...@FOOverizonBAR.net) writes:
>> On Thu, 15 Feb 2007 20:18:46 -0600, dk wrote
>> (in article <1171592326....@a75g2000cwd.googlegroups.com>):
>>
>>> Hi,
>>>
>>> I am wondering if computer history is still taught. It seems to me that it is
>>> limited to only 5 years back. The reason I say this is that in some
>>> academic groups, some people think Windows is the first PC operating
>>> system, forgetting about CP/M and others.
>>
>> CP/M was late to the party.
>>
> But doesn't that prove the poster's point?
>
> He's lamenting that "MSDOS" is seen by many as the first operating system,
> and then mentions CP/M.
>
> He's a result of that same mentality, only the specific time has shifted
> a few years earlier.
Correct. I was trying to be a bit more subtle about calling him on it.
> If people think CP/M is an example of an "early operating system" then
> of course history isn't being taught in this area.
One wonders why "history" is a term used to describe the teaching of
something in such a recent past. I think of things happening centuries
ago or longer as such.
I wonder if 1000 years from now anyone will care at all about the perks
of 8-bit processors?
cp/m was a late comer.
--
"The power of the Executive to cast a man into prison without formulating any
charge known to the law, and particularly to deny him the judgement of his
peers, is in the highest degree odious and is the foundation of all totali-
tarian government whether Nazi or Communist." -- W. Churchill, Nov 21, 1943
They may be more interested in chipping flint. I think this period, from
the early 1900s on, will be of great interest to historians, if civilization
and humanity survive.
>cp/m was a late comer.
I don't see how you can say that. Notice that he said "PC operating system."
If "PC" means the IBM Personal Computer, then MSDOS (not Windows) was
basically the first DOS. I'm excluding the ROM Basic as an "operating
system," though you could argue it was if you wanted to bore everybody
with tedious semantics. (I don't remember if there was an earlier
in-house disk operating system used for testing and development.)
If "PC" refers to any home microcomputer, as it sometimes did in the
pre-IBM-PC days, then CP/M has a pretty good case for being one of the
first DOSes. (Again, I'm excluding the Micro-Soft BASIC that came on
paper tape, cassette, or ROM.) Written in 1974, it ran on the Altair
and predates every other microcomputer DOS that I know of.
If "PC" means *any* personal computer, as in "well *I* had a PDP-11 for
myself at home back in '73 and it ran RT-11, the finest operating system
ever written" then you win. :-)
- David Librik
lib...@panix.com
For the same reasons that `History' (rather than stories of how our
side was always right) is taught: to show how different systems were
tried over the years, how some failed, and others succeeded.
--
Greymaus
Just another grumpy old man
> Walter Bushell <pr...@panix.com> writes:
> >cp/m was a late comer.
>
> I don't see how you can say that. Notice that he said "PC operating
> system."
>
> If "PC" means the IBM Personal Computer, then MSDOS (not Windows) was
> basically the first DOS. I'm excluding the ROM Basic as an "operating
CP/M-86 was around at the same time as MSDOS first appeared. I'd be
hard put to say which was first, but whichever it was the gap was small.
> If "PC" refers to any home microcomputer, as it sometimes did in the
> pre-IBM-PC days, then CP/M has a pretty good case for being one of the
> first DOSes.
It's certainly been around just about as long as micro-based
computers.
> (Again, I'm excluding the Micro-Soft BASIC that came on
> paper tape, cassette, or ROM.) Written in 1974, it ran on the Altair
> and predates every other microcomputer DOS that I know of.
What did the early SWTPC 6800-based machines run? They were out at
the same time as the Altair, IIRC.
--
C:>WIN | Directable Mirror Arrays
The computer obeys and wins. | A better way to focus the sun
You lose and Bill collects. | licences available see
| http://www.sohara.org/
The fallacy is that windoze is *not* an operating system, at least until
NT, and probably later.
Not yet. We're still making it.
>It seems to me that it is
>limited to only 5 years back. The reason I say this is that in some
>academic groups, some people think Windows is the first PC operating
>system, forgetting about CP/M and others.
They do that because the past is too near. There's no money nor
prestige in studying the auld stuff yet. I think that
part of this is due to the computer biz not being developed under
the same constraints as science. We never documented what
didn't work as part of the job; we patched it and then edited the
mistake out of the sources. No other human being saw the mistake.
/BAH
Thank you for kicking our mainframe butt ;-).
>
>If "PC" means *any* personal computer, as in "well *I* had a PDP-11 for
>myself at home back in '73
Bragger...
>and it ran RT-11, the finest operating system
>ever written" then you win. :-)
I didn't care for RT-11 as much as I cared for IAS. RT-11 was
too [emoticon gropes for a good word...fails] constraining.
/BAH
Sigh! It is not an OS now--I don't give a shit where it resides
in core. You determine an OS by the services it provides and
the level of execution it needs to provide them. The functionality
of Windows is simply not monitor-exec level.
What it does is our equivalent of doing direct I/O to the raw
disk from a host NFTing in.
There! How's that analogy?
/BAH
> On Thu, 15 Feb 2007 21:46:56 -0600, Michael Black wrote
> (in article <er39fg$af1$1...@theodyn.ncf.ca>):
>
>> Randy Howard (randy...@FOOverizonBAR.net) writes:
>>
>>> On Thu, 15 Feb 2007 20:18:46 -0600, dk wrote
>>> (in article <1171592326....@a75g2000cwd.googlegroups.com>):
>>>
>>>> Hi,
>>>>
>>>> I am wondering if computer history is still taught. It seems to me
>>>> that it is limited to only 5 years back.
It saves a lot of effort, and makes revisionism much easier.
>>>> The reason I say this is that in some academic groups, some people
>>>> think Windows is the first PC operating system, forgetting about
>>>> CP/M and others.
>>>
>>> CP/M was late to the party.
>>>
>> But doesn't that prove the poster's point?
>>
>> He's lamenting that "MSDOS" is seen by many as the first operating
>> system, and then mentions CP/M.
He never mentioned MS-DOS. I'm sure that many of the people he's
talking about would be mystified if you opened a command window.
>> He's a result of that same mentality, only the specific time has
>> shifted a few years earlier.
>
> Correct. I was trying to be a bit more subtle about calling him on it.
>
>> If people think CP/M is an example of an "early operating system"
>> then of course history isn't being taught in this area.
>
> One wonders why "history" is a term used to describe teaching
> of something in the so recent past. I think of things happening
> centuries ago or longer as such.
There's nothing in the definition of "history" that says it has
to be centuries ago. For really old events, the word "history"
often has the word "ancient" prepended. There's nothing wrong
with this; think about it the next time you use the command-line
history feature on your machine.
> I wonder if 1000 years from now anyone will care at all about the
> perks of 8-bit processors?
There'll always be ancient historians to think about ancient history.
--
/~\ cgi...@kltpzyxm.invalid (Charlie Gibbs)
\ / I'm really at ac.dekanfrus if you read it the right way.
X Top-posted messages will probably be ignored. See RFC1855.
/ \ HTML will DEFINITELY be ignored. Join the ASCII ribbon campaign!
I'm not sure whether you're more interested in learning more about computer
history, or in lamenting that it isn't taught anymore (if it ever was).
But if it's the former, see http://www.cbi.umn.edu/, the Charles
Babbage Institute at the University of Minnesota.
Regards,
-=Dave
The Computer History Museum (computerhistory.org) is a more accessible
source for computer history than CBI, which is mainly a research library.
--
Posted via a free Usenet account from http://www.teranews.com
> I am wondering if computer history is still taught.
When I took computer science classes, they gave a brief mention of
the technology prior to that point. They mentioned Babbage's
Difference Engine, Hollerith doing the 1890 Census, the ENIAC, and
that computers were getting smaller, cheaper, and more powerful
through the years to date.
None of this "was on the test", it was just part of the introductory
chapter.
Other than perhaps a teacher digressing into some personal
reminiscing, no other history was taught. When I got into industry I
learned things from co-workers and from the fact that my employer had
some "old" stuff still running.
I don't recall students, including myself at the time, being very much
interested in computer history. Indeed, we wanted to learn the very
newest stuff. I think that is a characteristic of youth.
Frankly I don't think there'd be much interest in, nor use for, computer
history. It's really a separate subject.
In other fields, how much history is taught? Do engineers spend time
learning about long obsolete methods of bridge construction (beyond a
general intro)? Do doctors learn about long obsolete healing methods
(beyond a general intro)?
I think predecessor techniques ought to be taught if they have
relevance to modern-day problem solving. Perhaps doctors should learn
about house calls and the techniques of the old-fashioned doctors seen
in the movies so as to understand the importance of merely
comforting the patient beyond medicine. If a seldom-used old drug or
treatment still has value in some cases, then it still should be
taught (sulfa drugs are still needed as antibiotics occasionally).
Computer people should be taught the basic concepts of categorization,
sorting, collating, and tabulation, even without a tab machine
process.
However, there is no need to teach about the conditions of vacuum
tubes or drum memory or wiring a plugboard.
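As a rough aside on that categorization/sorting/collating/tabulation
point: the core of an old tab-machine run (sort the cards on a key
column, then collate them and total each group) can be sketched in a
few lines of modern Python. The record layout and numbers here are my
own made-up illustration, not anything from an actual tab shop:

# Toy "tab run": categorize, sort, collate, tabulate.
# Sample data is invented purely for illustration.
from itertools import groupby
from operator import itemgetter

cards = [
    {"dept": "B", "amount": 40},
    {"dept": "A", "amount": 10},
    {"dept": "B", "amount": 5},
    {"dept": "A", "amount": 25},
]

# The sorter's job: put the cards in collating order on the key field.
cards.sort(key=itemgetter("dept"))

# The tabulator's job: collate each group and print a total per key.
for dept, group in groupby(cards, key=itemgetter("dept")):
    print(dept, sum(card["amount"] for card in group))
# prints:  A 35
#          B 45

The concepts survive even though the plugboards and card decks don't.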
>In article <0001HW.C1FA8BC4...@news.verizon.net>,
>randy...@FOOverizonBAR.net (Randy Howard) writes:
>
>> One wonders why "history" is a term used to describe teaching
>> of something in the so recent past. I think of things happening
>> centuries ago or longer as such.
>
>There's nothing in the definition of "history" that says it has
>to be centuries ago. For really old events, the word "history"
>often has the word "ancient" prepended. There's nothing wrong
>with this; think about it the next time you use the command-line
>history feature on your machine.
>
>> I wonder if 1000 years from now anyone will care at all about the
>> perks of 8-bit processors?
>
>There'll always be ancient historians to think about ancient history.
Come on, a millennium isn't that long; ancient history probably starts
back in the Roman era BCE: I'm not sure I'd label the Roman departure from
Britannia, Viking excursions, or the Dark Ages as ancient history. The
pros assuredly have their own definition, which may not agree with my
opinion.
--
Thanks. Take care, Brian Inglis Calgary, Alberta, Canada
Brian....@CSi.com (Brian[dot]Inglis{at}SystematicSW[dot]ab[dot]ca)
fake address use address above to reply
On the contrary: if you can get some old computer running, it might be
very interesting, because the components were macroscopic. You could
see bits perforated in cards or paper tape. Nowadays, computers reduce
to a small number of chips and there's nothing to see. If you can
play with old hardware, a lot of CS can be learned from it.
> In other fields, how much history is taught? Do engineers spend time
> learning about long obsolete methods of bridge construction (beyond a
> general intro)? Do doctors learn about long obsolete healing methods
> (beyond a general intro)?
But perhaps there's less obsolescence in CS than in the other domains.
Of course devices and interfaces change, but the fundamentals of CS,
from theory to programming, are about the same. After all, we still
use programming languages whose design started 50 years ago, and OSes
whose design started 30 or 20 years ago (it's as if M.D.s or
engineers used tools invented 700 years ago).
> I think predecessor techniques ought to be taught if they have
> relevance to modern day problem solving. Perhaps doctors should learn
> about house calls and the techniques of the old fashioned doctors seen
> in the movies so as to understand the importantance of merely
> comforting the patient beyond medicine. If a seldom used old drug or
> treatment still has value in some cases, then it still should be
> taught (sulfa drugs are still needed as anti-biotics occassionally).
> Computer people should be taught basic concepts of cateogrization,
> sorting, collating, and tabulation, even without a tab machine
> process.
Yes, but having access to the actual hardware is a good pedagogical
device.
> However, there is no need to teach about the conditions of vacuum
> tubes or drum memory or wiring a plugboard.
Any interested student will learn about them by himself anyways.
Perhaps the big difference between CS and any other domain is that
more people start to learn CS stuff by themselves than medicine or
architecture. It's easier to write a computer program at 10 than
it is to do an appendicectomy.
--
__Pascal Bourguignon__ http://www.informatimago.com/
"Our users will know fear and cower before our software! Ship it!
Ship it and let them flee like the dogs they are!"
> On the contrary: if you can get some old computer running, it might be
> very interesting, because the components were macroscopic. You could
> see bits perforated in cards or paper tape. Nowadays, computers reduce
> to a small number of chips and there's nothing to see. If you can
> play with old hardware, a lot of CS can be learned from it.
I agree that playing with hardware can teach a lot. But even years
ago in my day that was impractical; the mainframe was too busy and too
tightly secured to allow that kind of downtime for individual hardware
experimentation. We had paper assemblers and simulator programs to
teach the fundamentals.
> But perhaps there's less obsolescence in CS than in the other domains.
> Of course devices and interfaces change, but the fundamentals of CS,
> from theory to programming, are about the same. After all, we still
> use programming languages whose design started 50 years ago, and OSes
> whose design started 30 or 20 years ago (it's as if M.D.s or
> engineers used tools invented 700 years ago).
The degree of obsolescence depends on what direction the student is
going. Certain very basic concepts of systems analysis haven't
changed too much, although the tools have. But some of the working
environments are radically different--Unix and the Internet are very
different from batch and COBOL or Fortran. I understand Excel is used
as a substitute where Fortran was once used. Certainly CAD eliminates
some other needs. Likewise, Excel eliminates some COBOL tasks.
Hardware has dramatically changed, not only in its technology, but in
how we use it. Because of the massive decreases in the price of hardware, we
can deploy it differently than in the past. We don't have to ration
out bandwidth or storage as in the past. We don't have to worry about
every bit. Some bit-saving techniques were already obsolete in my day,
when machines got bigger than 64K.
> Perhaps the big difference between CS and any other domain is that
> more people start to learn CS stuff by themselves than medicine or
> architecture. It's easier to write a computer program at 10 than
> it is to do an appendicectomy.
Very true. It's also more accessible.
Aspiring doctors in my high school would work as hospital volunteers
so as to get some early medical exposure. However, the duties and
whereabouts of such teen volunteers were sharply limited; I don't
think they were allowed to follow around on rounds or observe
procedures. I was a volunteer as well, but interested in getting
business experience (I was too young to get a paid job, and they were rare anyway). I
was able to get into the hospital computer room and observe a lot
more, as well as be a part of administrative procedures.
According to my kids, before they grew up, 1970 was ancient
history. 1940 was pre-history. It seems to depend on the
viewpoint.
--
<http://www.cs.auckland.ac.nz/~pgut001/pubs/vista_cost.txt>
<http://www.securityfocus.com/columnists/423>
"A man who is right every time is not likely to do very much."
-- Francis Crick, co-discover of DNA
"There is nothing more amazing than stupidity in action."
-- Thomas Matthews
Well, you can build a vacuum tube with much less technology.
Something like an adept technician, a glassblower, and a vacuum
pump. Could come in handy.
Sorry to nitpick, but actually you couldn't, at least not for digital
service.
When IBM began to use tubes for its machines, it used stock radio
tubes. It found they didn't work too well.
A radio tube was designed to amplify varying signals, usually in the
middle of the range and that was it. A digital tube, in contrast, was
either fully conducting or fully off. Radio tubes weren't meant to do
that and some had trouble reversing state as needed. Also, some
minute dirt in a radio tube would not be noticeable on the radio but
would interfere with digital switching.
IBM had to work with tube makers to bring digital tubes up to a far
higher level of performance. This was a tough job.
It, by and large, is not taught. It's not necessary to know the history
of computers in order to use them. History tends to be more a story of
people than of technology. Very few classes on the history of computers
and computation exist. It's more like a sidebar to other classes, or taught
as part of social science, etc.
> It seems to me that it is
>limited to only 5 years back. The reason I say this is that in some
>academic groups, some people think Windows is the first PC operating
>system, forgetting about CP/M and others.
No surprise. I work with a librarian who thinks Steve Wozniak invented
the mouse and is responsible for her carpal tunnel, when the real guy
worked a building over (a wind tunnel, really) and did it after he left
here. And this is in the Santa Clara Valley.
--
> On Feb 16, 3:24 pm, CBFalconer <cbfalco...@yahoo.com> wrote:
> > Well, you can build a vacuum tube with much less technology.
> > Something like an adept technician, a glassblower, and a vacuum
> > pump. Could come in handy.
>
> Sorry to nitpick, but actually you couldn't, at least not for digital
> service.
I'll have to agree with Hancock here.
With nothing but an adept tech, a glass blower, and a vacuum pump, you
could probably build a low-mu triode good to a megahertz, but
generating high-performance vacuum tubes was a black art that
developed continuously over 60 years.
It seems to me that every generation sees the technology it grew up
with as "obvious" and "trivial", whether it was the Steam Engine, the
6SN7, the 74LS245 or the C&T entire-PC-in-one-custom-ASIC.
What will your grandchildren write fifty years from now (in this very
froup)?
"You can build a Pentium Twelve with much less technolgoy.
Something like an adept technician, some sand and some bondout
wire and leadframes. Could come in handy."
--
Lawrence Statton - lawre...@abaluon.abaom s/aba/c/g
Computer software consists of only two components: ones and
zeros, in roughly equal proportions. All that is required is to
place them into the correct order.
> Frankly I don't think there'd be much interest nor use of computer
> history. It's really a separate subject.
>
> In other fields, how much history is taught? Do engineers spend time
> learning about long obsolete methods of bridge construction (beyond a
> general intro)? Do doctors learn about long obsolete healing methods
> (beyond a general intro)?
>
> I think predecessor techniques ought to be taught if they have
> relevance to modern day problem solving.
They have more relevance than many people think - if for no other
reason than that they colour existing designs. Even brand-new
designs would look much different if they weren't influenced -
for good or bad - by what came before. I think it's good that
people learn the fundamentals of design principles that remain
valid much longer than our planned-obsolescence generation believes.
"Those who ignore history are doomed to repeat it."
In article <proto-8E9086....@reader2.panix.com>,
Walter Bushell <pr...@panix.com> wrote:
>In article <0001HW.C1FA8BC4...@news.verizon.net>,
> Randy Howard <randy...@FOOverizonBAR.net> wrote:
>> One wonders why "history" is a term used to describe teaching of
>> something in the so recent past. I think of things happening centuries
>> ago or longer as such.
Many professional historians subscribe to that belief.
In point of fact, that's why many historians get into the professional
history game: to escape the present, modern world. This is merely one
reason why the field pissed off feminists, prompting their adoption of
"herstory". We may be "itstory." It's like E. A. Robinson's poem about
Miniver Cheevy. I know a fair number of non-technologists like this.
>> I wonder if 1000 years from now anyone will care at all about the perks
>> of 8-bit processors?
>
>They may be more interested in chipping flint. I think this period, from
>the early 1900s will be of great interest to historians, if civilization
>and humanity survive.
I would not just say "anyone." A small number of historically minded
technologists may care. Definitely not most historians.
My favorite quote which I wish I had heard and remembered in 4th grade
would have to be:
From a long view of the history of mankind - seen from, say, ten thousand
years from now, there can be little doubt that the most significant event
of the 19th century will be judged as Maxwell's discovery of the laws of
electrodynamics. The American Civil War will pale into provincial
insignificance in comparison with this important scientific event of the
same decade.
R. P. Feynman, Lectures on Physics, Vol. II, Addison-Wesley, 1964, page 1-11.
That's what has to survive.
--
Then use the word "Chronology."
>There's nothing in the definition of "history" that says it has
>to be centuries ago. For really old events, the word "history"
>often has the word "ancient" prepended. There's nothing wrong
>with this; think about it the next time you use the command-line
>history feature on your machine.
Basically true.
>> I wonder if 1000 years from now anyone will care at all about the
>> perks of 8-bit processors?
>
>There'll always be ancient historians to think about ancient history.
Not as simple as that.
It's going to be the story of those who care: "his story."
Historians have their specific interests; they aren't quite
archeologists for instance. The education system attempts to force them
to teach generalist history. In turn we, as students, tend to have
General Education requirements which tend to have 1-2 history courses.
The general masses of students (our colleagues) tended to have some
amount of world history and, in the US, US history. Some of this comes
in primary and elementary school, much less in college. I watched my
classmates take and grumble about "Western Civ." and "US history" (4 and 17) as
undergrads. I bided my time a bit, and I started with "The History of
the Atomic Age." I lucked out. I took the special seminar part first
then the main reading class second. In the seminar, we met with and had
dinner with people like Fermi's wife, and this guy named Feynman came up
and his talk was turned into a section and chapter in a book. And Teller.
And many lesser knowns (not to me; these were big guys like Manley,
Norris Bradbury, Kistiakowsky, others). To this day I stay in touch
with my history prof, toss him all the old historically relevant
reports and meeting notes, and the weird thing is that these are guys I
would read about or have to buy textbooks for; I know many of them,
work with some of them, etc. Not the really old guys. I should not
have eaten that salad last evening because McCarthy asked me to join him
and his daughter at dinner.
In a few colleges there will be computing history classes. And I expect
them to get the history wrong. But very few.
--
Peter Flass <Peter...@Yahoo.com> spake the secret code
<45d59c23$0$24709$4c36...@roadrunner.com> thusly:
>The fallacy is that windoze is *not* an operating system, at least until
>NT, and probably later.
Correct. Windoze is not an operating system.
However, Windows is an operating system.
--
"The Direct3D Graphics Pipeline" -- DirectX 9 draft available for download
<http://www.xmission.com/~legalize/book/download/index.html>
Legalize Adulthood! <http://blogs.xmission.com/legalize/>
: Even brand-new
: designs would look much different if they weren't influenced -
: for good or bad - by what came before.
For many years, I had a 1975/6 Amana RadarRange -- the first
commercial microwave oven IIRC. It was donated by an elderly member of
our Coleco ADAM users' group who was moving and no longer had room for
it. In appearance, it was *EXACTLY* like the over-the-stove fold-down-
drawer toaster oven of the period, brushed stainless steel exterior,
shiny chrome handle, mechanical timer with a bell. Even though the
internals had nothing to do with a resistive-heat oven, in order to sell,
it had to look like people's existing idea of an oven. It would have
to fit in with existing kitchen decor, maybe replace the cabinet space
of an existing built-in toaster oven. It even had to be made out of stuff
that homemakers could use their existing cleaning and scrubbing stuff with.
Novel way to cook, but disturb as little else as possible. Over time, of
course, the design and materials of microwaves have changed.
Our RadarRange lasted until about 2001, when kids in the basement
managed to knock it off the VCR cart it was on, denting it very badly,
and I didn't want to chance trying to power it up ever again. It weighed
a ton...
*Rich*
--
Richard F. Drushel, Ph.D. | "They fell: for Heaven to them no hope
Instructor and Executive Officer | imparts / Who hear not for the beating
Department of Biology, CWRU | of their hearts."
Cleveland, Ohio 44106-7080 U.S.A. | -- Edgar Allan Poe, "Al-Aaraaf"
: For many years, I had a 1975/6 Amana RadarRange -- the first
And I have misremembered the spelling: it's Radarange, elided R.
I was going back and forth on it...
I sat and stared at that for several minutes, trying to come up
with a valid competitor. I failed. The Michelson-Morley
experiment was considered, since it affected so much that came later,
but it is not epochal. Rigor in mathematics was another candidate.
You're thinking of cathode poisoning, or sleeping sickness. That's
why we used the 5692 in place of the 12AU7. (IIRC)
It is certainly true that many other operating systems, such as CP/M
or OS-9 (not Mac OS 9, but the operating system used on the Tandy Color
Computer), were used on computers sold at prices such that ordinary
individuals owned them in large numbers.
The Honeywell 316 in the Neiman-Marcus catalog just came with an ASR
33 for input/output, and wasn't a disk-based system which would have
had what we would call an operating system by modern standards.
However, the term "PC" is ambiguous, and may be taken to refer not to
a computer for personal use in general, but instead specifically to
the IBM Personal Computer and the machines which followed it, such as
the IBM Personal Computer XT (presumed to be "Extended Technology",
with a 10 megabyte hard drive) and the IBM Personal Computer AT (with
an 80286 microprocessor).
In that case, the first operating system for the IBM Personal
Computer, in all its 4.77 MHz 8088 glory, was PC-DOS, written by
Microsoft, and later sold directly to the public by Microsoft under
the name MS-DOS. CP/M-86 was another operating system later offered
for that machine.
There were programs that ran on top of PC-DOS or MS-DOS which provided
an enhanced user environment. Some, such as Borland's Sidekick, did
not really provide any functions that one would normally associate
with an operating system. But others did provide an API (applications
program interface) of their own, thus earning the right to be called
operating systems (at least by one definition). TopView, by IBM, was
one example of such a program; it provided only text-mode windows,
but still offered a menu and mouse environment.
Graphical User Interfaces that ran on top of MS-DOS included GEM
Desktop (the distribution of which for the IBM PC architecture ended
up being curtailed because of a lawsuit, although it continued to be
used for the Atari ST), GeoWorks, and, of course, Microsoft Windows.
While the ability to run 32-bit code was introduced with the Win32s
extension to Windows 3.1, a later version of Windows, Windows 95,
extended the ability of Microsoft Windows to handle 32-bit code, and,
as well, was sold as a single, integrated operating system rather than
a separate shell to run on top of a version of MS-DOS.
John Savard
As I recall, the official definition of "ancient history" is as
follows:
Prehistory was when people didn't know how to write yet.
The oldest possible history is "ancient"; we don't have any kind of
really old history that is older than ancient.
So ancient history started when Mesopotamia, Egypt, and China first
started writing.
Ancient history was followed by Mediaeval history, and the transition
between the two is marked by the downfall of the western Roman Empire.
So the Dark Ages were at the beginning of Mediaeval history,
definitely not in ancient history.
John Savard
This is true enough.
If one could build a vacuum tube good enough to use in a radio,
though, it could be used in a computer. The idea of running vacuum
tubes well below their rated specs for radio service to make them more
reliable is the key here, not the improvements in the quality of
vacuum tubes later achieved by IBM.
But a glassblower and some metalworking technology and a vacuum pump
are *not* all that's needed to make even a barely-useful radio tube.
To me, the most obvious omission is that the cathode needs a special
coating to make it efficiently emit electrons when heated - somewhat
analogous to the use of a Welsbach mantle in a lantern.
I could throw in the mica insulators too, that make everything stand
up, but that's nit-picking, being a small detail people would pick up
on their own easily enough. And miniature tubes dispensed with the
Bakelite base... the point is, once one has the *basics*, then one can
experiment and relearn the "black art" over time, particularly if one
has competition and an economic incentive.
John Savard
Tubes were before my time so I can't really speak from experience, but
I have to wonder if this is true. In the solid-state world saturating
logic is a whole lot easier than a good analog amplifier; if you can
amplify in the middle of the range then doing a switch is trivial.
There are stories (again before my time) about early-days CDC people
going dumpster-diving behind semiconductor fabs grabbing transistors
that had failed QC.
IBM working with tube companies to develop cheaper, lower-quality
components would actually make more sense to me.
> Eugene Miya wrote:
> >
> ... snip ...
> >
> > My favorite quote which I wish I had heard and remembered in 4th
> > grade would have to be:
> >
> > From a long view of the history of mankind - seen from, say, ten
> > thousand years from now, there can be little doubt that the most
> > significant event of the 19th century will be judged as Maxwell's
> > discovery of the laws of electrodynamics. The American Civil War
> > will pale into provincial insignificance in comparison with this
> > important scientific event of the same decade.
> >
> > R. P. Feynman, Lectures on Physics, Vol. II, Addison-Wesley, 1964, page
> > 1-11.
> >
> > That's what has to survive.
>
> I sat and stared at that for several minutes, trying to come up
> with a valid competitor. I failed. The Michaelson-Morley
> experiment was considered, since it affected much later, but it is
> not epochal. Rigor in mathematics was another candidate.
20th century Quantum Mechanics and Relativity, nuclear "weapons". OTOH
probably something unnoticed by most people today.
--
"The power of the Executive to cast a man into prison without formulating any
charge known to the law, and particularly to deny him the judgement of his
peers, is in the highest degree odious and is the foundation of all totali-
tarian government whether Nazi or Communist." -- W. Churchill, Nov 21, 1943
> There were programs that ran on top of PC-DOS or MS-DOS which provided
> an enhanced user environment.
Windows 3.x among others. The graphics did look like an explosion in a
Disney Animation factory.
> In article <1171651574....@a75g2000cwd.googlegroups.com>,
> hanc...@bbs.cpcn.com (hancock4) writes:
>
> > Frankly I don't think there'd be much interest nor use of computer
> > history. It's really a separate subject.
> >
> > In other fields, how much history is taught? Do engineers spend time
> > learning about long obsolete methods of bridge construction (beyond a
> > general intro)? Do doctors learn about long obsolete healing methods
> > (beyond a general intro)?
> >
> > I think predecessor techniques ought to be taught if they have
> > relevance to modern day problem solving.
>
> They have more relevance than many people think - if for no other
> reason than that they colour existing designs. Even brand-new
> designs would look much different if they weren't influenced -
> for good or bad - by what came before. I think it's good that
> people learn the fundamentals of design principles that remain
> valid much longer than our planned-obsolescence generation believes.
>
> "Those who ignore history are doomed to repeat it."
In Summer school.
> "Those who ignore history are doomed to repeat it."
"History would be a better teacher if it weren't so stupefyingly
repetitious"
--
C:>WIN | Directable Mirror Arrays
The computer obeys and wins. | A better way to focus the sun
You lose and Bill collects. | licences available see
| http://www.sohara.org/
I work in a school as the ICT Technician. The pupils will tell you that
Bill Gates invented both the PC and the internet. The older ones have
heard of Windows NT & 98 (which we used to have 3 years ago), while the
younger ones think something called XP is all that has ever run on
computers. The simple fact that all of the systems that they use in
school are Linux seems to have escaped them. If I show them a picture of
an IBM 360, which was the first system I worked on, they don't believe
it's a computer.
If anyone knows of a really good picture of a CDC Cyber 173, which is what
we had when I was at uni, I'd love to show them that.
I said to one the other day (he's about 15) that I built my first computer
in high school, when I was his age. He said that's not impressive, as
he had been building computers since he was 10. I then pointed out that
by "building a computer" I didn't mean going to a shop and buying a
motherboard, some memory, a disc drive, etc. and plugging it all together.
Oh no, I meant soldering wires to relays, toggle switches and light bulb
holders, and standing well back when you first powered it up. We had 50V
running through the busbar, which got a bit warm sometimes.
--
Trog Woolley | trog at trogwoolley dot com
(A Croweater back residing in Pommie Land with Linux)
Isis Astarte Diana Hecate Demeter Kali Inanna
I would think a lot of them would get it wrong. Those who were making
it have rewritten it. :-)
But computing is technology. I didn't get math history in my
math classes. They were little sidebars in the texts and a few
comments from the instructor, if s/he cared--most didn't care.
Science is a different story. You get the history, kind of,
because of what you have to learn first before you can get
to the more advanced step.
/BAH
This is the strange thing about what is getting taught. The fundamentals
aren't getting taught.
> After all, we still
>use programming languages whose design started 50 years ago, and OSes
>whose design started 30 or 20 years ago (it's like if M.D. or
>engineers used tools invented 700 years ago).
>
>
>> I think predecessor techniques ought to be taught if they have
>> relevance to modern day problem solving. Perhaps doctors should learn
>> about house calls and the techniques of the old fashioned doctors seen
>> in the movies so as to understand the importantance of merely
>> comforting the patient beyond medicine. If a seldom used old drug or
>> treatment still has value in some cases, then it still should be
>> taught (sulfa drugs are still needed as anti-biotics occassionally).
>> Computer people should be taught basic concepts of cateogrization,
>> sorting, collating, and tabulation, even without a tab machine
>> process.
>
>Yes, but having access to the actual hardware is a good pedagogical
>device.
To learn well, I think it takes hand manipulations. I can't
think of a better way to learn how to manipulate bits than
to toggle switches and have instantaneous feedback of
blinkenlights or smoke or screeches.
>
>
>> However, there is no need to teach about the conditions of vacuum
>> tubes or drum memory or wiring a plugboard.
>
>Any interested student will learn about them by himself anyways.
You assume that the student will hear about these things. They won't,
so how can they investigate something they don't know they don't
know?
>
>
>Perhaps the big difference between CS and any other domain, is that
>more people start to learn CS stuff by themselves than MD or
>architecture. It's easier to write a computer program at 10 than
>it is to do an appendicectomy.
That is because not much damage can be done. Testing and making
mistakes are not a danger to one's breathing continuation.
/BAH
Nope, not in the computing biz. Unfortunately, it will be done
on the job.
/BAH
Nope. Note that not even Microsoft considers it an OS.
/BAH
Start a building project just like when you were a kid. Try to
let the kiddies play, too :-).
/BAH
Has anyone thought about what should be covered in these courses? Right
now it's up to the prof., but maybe someone should consider a standard
curriculum, like the ACM has for CS.
What did QM and Relativity (Special or General) have to do with developing
the first atom bomb?
/BAH
> hanc...@bbs.cpcn.com writes:
>
>
>>On Feb 16, 3:24 pm, CBFalconer <cbfalco...@yahoo.com> wrote:
>>
>>>Well, you can build a vacuum tube with much less technology.
>>>Something like an adept technician, a glassblower, and a vacuum
>>>pump. Could come in handy.
>>
>>Sorry to nitpick, but actually you couldn't, at least not for digital
>>service.
>
>
> I'll have to agree with Hancock here.
>
> With not but an adept tech, a glass blower, and a vaccum pump, you
> could probably build a low-mu triode good to a megahertz, but
> generating high performance vacuum tubes was a black art that
> continuously developed over 60 years.
>
> It seems to me that every generation see the technology they grew up
> with as "obvious" and "trivial", whether it was the Steam Engine, the
> 6SN7, the 74LS245 or the C&T entire-PC-in-one-custom-ASIC.
This is what makes the history of technology interesting. You
appreciate current stuff a lot more when you see all the false starts
and dead-ends, and all the problems that had to be solved to get where
we are.
The Antikythera mechanism seems to be as good as early clocks 1000+
years later, lacking only spring-steel to make it self-moving. On the
other hand, since all the gears had to be cut by hand it was almost
necessarily a one-off piece. How much math had to be invented to build it?
> In article <1171651574....@a75g2000cwd.googlegroups.com>,
> hanc...@bbs.cpcn.com (hancock4) writes:
>
>
>>Frankly I don't think there'd be much interest nor use of computer
>>history. It's really a separate subject.
>>
>>In other fields, how much history is taught? Do engineers spend time
>>learning about long obsolete methods of bridge construction (beyond a
>>general intro)? Do doctors learn about long obsolete healing methods
>>(beyond a general intro)?
>>
>>I think predecessor techniques ought to be taught if they have
>>relevance to modern day problem solving.
>
>
> They have more relevance than many people think - if for no other
> reason than that they colour existing designs. Even brand-new
> designs would look much different if they weren't influenced -
> for good or bad - by what came before. I think it's good that
> people learn the fundamentals of design principles that remain
> valid much longer than our planned-obsolescence generation believes.
>
My last post on this topic, I promise ;-) This is a very good point.
Computer architecture started off as "100 flowers", and seems to have
gotten channeled into a very few streams, to mix metaphors. IA-32, for
example, is "fast" only because a tremendous amount of effort has gone
into making it so. Who's to say that, given the same effort, some other
current or past architectures might not have been as fast or faster?
A path taken years ago because of some technological limitation that has
since been superseded should probably be looked at from time to time and
not taken for granted. For example, lots of things were done in the old
days because of limitations on the number of gates on a chip that are
simply no longer relevant.
> [Please do not mail me a copy of your followup]
>
> Peter Flass <Peter...@Yahoo.com> spake the secret code
> <45d59c23$0$24709$4c36...@roadrunner.com> thusly:
>
>
>>The fallacy is that windoze is *not* an operating system, at least until
>>NT, and probably later.
>
>
> Correct. Windoze is not an operating system.
>
> However, Windows is an operating system.
Nope, Windows(tm) is a GUI on top of an OS, originally DOS, but now
some rip-off from DEC.
He limited it to the 19th century.
The Special Theory of Relativity led to the equation E=mc^2. This
allowed the energy yield of a fission reaction to be estimated from
measurements of isotopic masses.
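As a rough back-of-the-envelope illustration (my own approximate
numbers, not anything from the measurements being described): in a
typical U-235 fission the fragments plus the emitted neutrons weigh
about 0.2 atomic mass units less than the original nucleus plus the
absorbed neutron, so

    E = \Delta m \, c^2 \approx 0.2\,\mathrm{u} \times 931.5\,\mathrm{MeV/u} \approx 190\,\mathrm{MeV}

per fission, which is roughly the ~200 MeV figure usually quoted. The
mass difference in that estimate comes from exactly the kind of
measured isotopic masses referred to above.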
The relationship of Quantum Mechanics to the development of atomic
weapons is considerably more far-reaching.
When radioactivity was first discovered, it was a mysterious
phenomenon. Thus, we still use the name "X-rays" for the form of
electromagnetic radiation discovered and produced by Roentgen.
The same scientists who were making theoretical advances in quantum
mechanics, and using it to explain the structure of the electron
shells of the atom, were also working to make sense out of the
discoveries related to radioactivity. For just one example, the "half-
life" of a radiosotope derives from the properties of its potential
well in relation to the phenomenon of quantum tunneling.
It is true that much of the work that led to the discovery of the
neutron, and the recognition of the existence of nuclear fission, was
experimental and pragmatic, and did not depend greatly on quantum
mechanics. But there was still a relationship, even if the flow of
ideas mostly went the other way; the new discoveries about the atom
were a tremendous stimulus to the field of quantum mechanics.
John Savard
> This is what makes the history of technology interesting. You
> appreciate current stuff a lot more when you see all the false starts
> and dead-ends, and all the problems that had to be solved to get where
> we are.
>
> The Antikythera mechanism seems to be as good as early clocks 1000+
> years later, lacking only spring-steel to make it self-moving. On the
> other hand, since all the gears had to be cut by hand it was almost
> necessarily a one-off piece. How much math had to be invented to build it?
>
>
And often, design ends up being serial. So this gets added to that, and
then someone else comes along and adds to that.
Sometimes it's helpful to go back to the beginning, or around then, and
revisit the early things. Then maybe one of those early points is a start
for a whole new branch of design, rather than merely extending the previous
branch.
In some cases, something comes along, and then is discarded because it's
too hard to implement at the time. But when revisited decades later, other
things have changed to make it a far more viable thing.
But if history/the basics aren't taught or learned, then few will go
off and find those early things to pursue now. They won't know there
was something before last week.
In other cases, they'll be doomed to some bulky design because that's
what's current, while someone older who lived through earlier designs
could instantly suggest a far simpler solution from an earlier era.
I saw this happen in one of the sci.electronics.* newsgroups a couple
of years ago. Someone started a long thread, admittedly he did return
to it, asking about how to do something. But only with time was his
real intent revealed, after all kinds of fancy and more recent solutions
were offered up. And every time someone suggested something different,
the original poster would go off on that tangent, because he didn't
have the background or history to evaluate that specific bit, so he
just followed. Eventually, it turned out to be a really simple solution
from days gone by, which would have been obvious if he'd been looking over
older books (not "learning history", but reading material from the time
that later became history), but by now it had become so foreign that he
had a hard time wrapping his mind around it.
In some ways, history shouldn't be about dates and such, though the
general timeline is important, but about giving an overview of what has
happened, so that when something from the past is needed, someone will
have a tendency to check out the past when looking for solutions.
Michael
So the 5th Century is the official transition.
I thought the Dark Ages started later, running from the decline of the
Byzantines and rise of the Turks until the start of the Renaissance.
--
Thanks. Take care, Brian Inglis Calgary, Alberta, Canada
Brian....@CSi.com (Brian[dot]Inglis{at}SystematicSW[dot]ab[dot]ca)
fake address use address above to reply
Er, they were the theoretical underpinnings of a _major_ engineering
effort that was the Manhattan Project. On the order of _Vista_
apparently. >;)(
> So the 5th Century is the official transition.
> I thought the Dark Ages started later, from the decline of the
> Byzantines and rise of the Turks until the start of the Renaissance.
The Renaissance was partially led by, and perhaps sparked by, the flight
of scholars from Byzantium to the West, IIRC. Compare it to the rise of
American physics and chemistry from the massive influx of German
physicists and chemists before WWII. In fact the American postwar
preeminence in many other fields cannot be explained without mentioning
this influx of mostly unwanted Germans.
<snip>
> >
>
> My last post on this topic, I promise;-) This is a very good point.
> Computer architecture started off as "100 flowers", and seems to have
> gotten channeled into a very few streams, to mix metaphors. IA-32, for
> example, is "fast" only because a tremendous amount of effort has gone
> into making it so. Who's to say that, given the same effort, some other
> current or past architectures might not have been as fasty or faster?
Many people think PPC could have been a contender. But Intel had so much
money from personal computer chips that no one wanted to compete in
chips for personal computers. I would have thought IBM would have done
it for corporate pride and morale reasons, and to keep their chip
designers on their toes, but no.
> A path taken years ago because of some technological limitation that has
> since been superceded should probably be looked at from time to time and
> not taken for granted. For example, lots of things were done in the old
> days because of limitations on the number of gates on a chip that are
> simply not relevant.
--
> In some cases, something comes along, and then is discarded because it's
> too hard to implement at the time. but when revisited decades later, other
> things have changed to make that a far more viable thing.
Computers are an interesting example. Babbage comes to mind.
I think it's even worse than that. In Manchester I can visit our Museum of
Science and see cotton machinery from the last century, and equipment from
the early part of this. But the only working computer is a replica of
the "Baby". I think that the mainframe has been forgotten....
> David
>
there was a large amount of corporate politics ... remember there was a
huge organization that was selling intel-based PC products.
ppc was another 801 follow-on to romp (pc/rt) and rios (rs/6000). lots
of people thought ppc would be a natural way for risc to compete with '86
processors. however there were large segments that effectively viewed
it as competition.
various posts with old email mentioning 801, iliad, romp, rios, etc
http://www.garlic.com/~lynn/lhwemail.html#801
and general posts mentioning 801, iliad, fort knox, romp, rios,
somerset, etc
http://www.garlic.com/~lynn/subtopic.html#801
the other issue in that time-frame ... as i've mentioned before, the
SAA effort was effectively attempting to maintain the terminal emulation
paradigm for PCs
http://www.garlic.com/~lynn/subnetwork.html#emulation
limiting the per adapter thruput (like with the discussion of LAN
adapter cards) for desktop machines ... to what was needed for
terminal emulation ... helped box-in emerging client/server and 3-tier
architecture
http://www.garlic.com/~lynn/subnetwork.html#3tier
part of the issue limiting rs/6000 to PC adapter cards ... was that it
not only limited the desktop thruput ... but also thruput of server &
3-tier configurations (again restricting the transition away
from terminal emulation).
various specific posts mentioning rs/6000 being pressured into using
various PC adapter cards (LAN, disk, display, etc) ... joke was that
you too could have rs/6000 with thruput of a PC
http://www.garlic.com/~lynn/2001j.html#20 OT - Internet Explorer V6.0
http://www.garlic.com/~lynn/2002g.html#9 IBM MIcrochannel??
http://www.garlic.com/~lynn/2004p.html#59 IBM 3614 and 3624 ATM's
http://www.garlic.com/~lynn/2005h.html#12 practical applications for synchronous and asynchronous communication
http://www.garlic.com/~lynn/2005q.html#20 Ethernet, Aloha and CSMA/CD -
http://www.garlic.com/~lynn/2005q.html#21 Ethernet, Aloha and CSMA/CD -
http://www.garlic.com/~lynn/2005q.html#38 Intel strikes back with a parallel x86 design
http://www.garlic.com/~lynn/2005u.html#50 Channel Distances
http://www.garlic.com/~lynn/2006k.html#42 Arpa address
http://www.garlic.com/~lynn/2006l.html#35 Token-ring vs Ethernet - 10 years later
http://www.garlic.com/~lynn/2006l.html#36 Token-ring vs Ethernet - 10 years later
http://www.garlic.com/~lynn/2007b.html#46 'Innovation' and other crimes
that strategy somewhat restricted rs/6000 to numerical intensive
applications ... which wasn't a particularly large market ... and
didn't already have a large corporate install base.
there was some conjecture that similar objectives were partly behind
taking ha/medusa scaleup away from us and moving it to another
organization.
referenced here
http://www.garlic.com/~lynn/95.html#13
http://www.garlic.com/~lynn/96.html#15
we were looking at doing as much scaleup in the commercial market
segment as in the numerical intensive market segment. the resulting
transfer eventually announced a product addressed only at the numerical
intensive market segment. misc. past posts with old email discussing
ha/medusa scaleup
http://www.garlic.com/~lynn/lhwemail.html#medusa
similar observations could also be made about cancelling our
activities for high-speed NSFNET backbone ... even tho there were
extensive lobbying efforts by NSF ... including all the way up to the
director of NSF communicating with corporate CEO/chairman ... that
what we already had running was at least five years ahead of all bid
submissions to build something new (i've made various comments in the
past that while tcp/ip was the technology basis for internetworking
... the NSFNET backbone was the operational basis for internetworking
and eventually the modern internet). misc. past posts with old email
mentioning various high-speed networking related activity
http://www.garlic.com/~lynn/lhwemail.html#nsfnet
You're lucky in that you have something to see. Here in Brum the computers,
one of which is an ICL 1900 I think, are squirrelled away in the museum store
never to be on display ever again. They used to be on display, but they moved
our fabulous Museum of Science and Industry to something called Thinktank, and
only moved about 20% of the exhibits to the new building. They also made the new
place inaccessible and charge people to visit (the Science Museum was free) and
they wonder why hardly anybody goes.
IBM never seems to do anything logical. How the heck have they been so
successful? I always thought that they should make more effort to push
their own products, and have one support the other.
Ever hear of 615? Dilbert is not fiction.
<snip>
--
Keith
> Peter Flass <Peter...@Yahoo.com> wrote:
>>
>> IA-32, for example, is "fast" only because a tremendous amount of
>> effort has gone into making it so. Who's to say that, given the
>> same effort, some other current or past architectures might not
>> have been as fasty or faster?
>
> Many people think PPC could have been a contender. But Intel had so much
> money from personal computer chips that no one wanted to compete in
> chips for personal computers. I would have thought IBM would have done it
Huh? Most years, the PowerPC is faster (also cooler).
It's currently the basis of most of the top supercomputers.
And the most recent thing is IBM's Cell architecture.
And while PowerPC is also very popular for embedded systems,
it's lost (IBM didn't seem to want) the laptop market,
which is why Apple switched over.
re:
http://www.garlic.com/~lynn/2007d.html#43 Is computer history taught now?
pressuring RS/6000 to use PS/2 adapter cards ... was "helping your
brethren" ... frequently the issue was who was to help who. one might
claim that too much helping ... could result in having hodge-podge of
pieces that weren't designed for the targeted market (stuff that had
been designed for a totally different market).
Another possible way of viewing the situation is that lots of times
efforts were being pressured into supporting major installed legacy
operations ... at the expense of being able to agilely move into new
markets.
There has been quite a bit written about the original acorn effort
starting out as an independent business operation ... not having to worry
about compromising as part of supporting existing legacy
operations. However, once a major market segment had been established
... especially in the scenario of terminal emulation
http://www.garlic.com/~lynn/subnetwork.html#emulation
... lots of pressure mounted for other products to not be inconsistent
and/or impact that installed product base.
another possible scenario that i've mentioned before was the original
acorn effort was not looking at doing its own software ... somewhat as
a result a west coast group formed to provide software for the
product. at some point, the acorn effort changed their mind and
decided that they also wanted to "own" their own software (even if
that meant going with outside companies under contractual relationships
... eliminating the possibility that they'd cede control to other internal
organizations). misc. past posts mentioning ...
http://www.garlic.com/~lynn/2002g.html#79 Coulda, Woulda, Shoudda moments?
http://www.garlic.com/~lynn/2005q.html#24 What ever happened to Tandem and NonStop OS ?
http://www.garlic.com/~lynn/2005r.html#8 Intel strikes back with a parallel x86 design
http://www.garlic.com/~lynn/2006p.html#41 Device Authentication - The answer to attacks lauched using stolen passwords?
http://www.garlic.com/~lynn/2006y.html#29 "The Elements of Programming Style"
8-bit processors were a relatively "late" development - merely an
extension of the hardware architectures that preceded them, and unique
only in their physical size.
History deals with the major events that have influenced the course of
events to the present. In the case of computers, with the exception of
Charles Babbage and Herman Hollerith, all the major influencing
developments in computer hardware didn't start until the late 1930's and
1940's (Atanasoff & Berry, Eckert & Mauchly, von Neumann, ENIAC, MARK I,
Zuse, & others), followed by the advent of commercial computers in the 1950's
from UNIVAC, IBM, & others. If you want the history of the programming
and algorithmic side of computing then you may also have to throw in the
Greeks and centuries of developments in Mathematics, because much of
what was done starting in the 1940's to actually utilize computers and
understand their capabilities drew heavily on that background,
especially in areas of Logic & Automata Theory, Numerical Analysis,
Discrete Analysis, & Queuing Theory.
Some familiarization with the above history used to be required for
Graduate study in Computer Science, and hopefully still should be.
There is unfortunately not enough time in typical undergraduate programs
to deal with history.
IBM's z/OS (a flavor of MVS) flagship mainframe Operating System and
z-architecture hardware is simply an evolutionary growth of the OS/360
Operating System and S/360 hardware from the mid 1960's and as such
still builds on design concepts that are over 40 years old, enhanced
with the accumulated experience of 40 years. MS Windows history, by
comparison, is at best 15 years old. Many of its developers, being
mostly ignorant of the hardware and software developments that preceded the
PC, have been condemned to re-invent multi-tasking, multi-processing,
memory management, etc. and are still struggling mightily with issues of
reliability and security.
--
Joel C. Ewing, Fort Smith, AR jREMOVEc...@acm.org
somewhat related topic drift in this old email (talking about MIP Envy
and possible paths that "small" processor evolution might follow)
http://www.garlic.com/~lynn/2007.html#email801006
in this post from several weeks ago
http://www.garlic.com/~lynn/2007.html#1 "The Elements of Programming Style"
along with this related old email (in the same post)
http://www.garlic.com/~lynn/2007.html#email801016
this might be considered an evolution of large clusters of mid-range
4341s ... mentioned in these collected old emails
http://www.garlic.com/~lynn/lhwemail.html#4341
with workstations and larger PCs later taking over that market segment.
and the MIP Envy topic wouldn't be complete w/o these more
recent postings mentioning Jim
http://www.garlic.com/~lynn/2007d.html#4 Jim Gray Is Missing
http://www.garlic.com/~lynn/2007d.html#6 Jim Gray Is Missing
http://www.garlic.com/~lynn/2007d.html#8 Jim Gray Is Missing
http://www.garlic.com/~lynn/2007d.html#17 Jim Gray Is Missing
http://www.garlic.com/~lynn/2007d.html#33 Jim Gray Is Missing
Given the collapse of the Russian economy after the fall of Communism,
despite _Scientific American_ taking an opposing editorial position,
we really should have tried harder to recruit as many as possible of
the qualified professors in scientific disciplines in the former
Soviet Union.
As well as enriching our own country - particularly if it is done by
expanding science education, and not throwing American professors out
of work - this would have helped ensure Russia would have less
opportunity to menace world peace in the future.
A new American renaissance fueled by scholars escaping Russia would
have been a nice thing to have.
John Savard
>
> "dk" <sa...@kanecki.com> writes:
>> I am wondering if computer is still taught. It seems to me that it is
>> limited to only 5 years back. The reason I say this, is that in some
>> academic groups, some people think windows is the first pc operating
>> system, forgetting about cp/m and others.
>
> cms was definitely personal computing from mid-60s ...
But what was the physically smallest and cheapest computer CMS ran on?
Could a hobbyist have owned one?
This is why I don't buy the PDP-8 or the PDP-11 as a PC: The whole point
of the 'PC Revolution' was that you could build a useful computer out of
parts that fit on a hobbyist workbench and within a hobbyist budget. You
can play games about whether a useful PDP-8 installation could fit on a
desk, but that is completely and utterly missing the point. A Personal
Computer is PERSONAL.
--
My address happens to be com (dot) gmail (at) usenet (plus) chbarts,
wardsback and translated.
It's in my header if you need a spoiler.
This recently caused me some problems. Some years ago we bought a
couple of PS-580's with a Microchannel bus. We recently considered
installing a recent version of AIX to use these as special-purpose
machines, only to discover that AIX no longer supports Microchannel...
for a little additional (old email) drift:
From: wheeler
Date: 03/15/85 09:22:19
looks like XXXXXX will have to handle presentation to Bloch/NSF on
tuesday. YYYYYY wants to hold a meeting all next week on vlsi
processor clusters in ykt. Packaging, systems, architecture, straight
370, 370/801 mixed, and dedicated 801 systems, etc.
... snip ...
somewhat related to this old email
http://www.garlic.com/~lynn/2007c.html#email841016
in this post
http://www.garlic.com/~lynn/2007c.html#50 How many 36-bit Unix ports in the old days?
part of this was I had written a series of papers starting nearly a
year earlier on the concept ... previously referenced here
http://www.garlic.com/~lynn/2004m.html#17 mainframe and microprocessor
and referenced in this recent post
http://www.garlic.com/~lynn/2007c.html#7 Miniature clusters
similar, but different to ha/medusa
http://www.garlic.com/~lynn/lhwemail.html#medusa
and of course, various old emails mentioning director of NSF,
NSFNET, etc.
http://www.garlic.com/~lynn/lhwemail.html#nsfnet
other old email mentioning 801
http://www.garlic.com/~lynn/lhwemail.html#801
and other posts about 801
http://www.garlic.com/~lynn/subtopic.html#801
I think the internal competition is one way of keeping folks on their feet.
The sales guys have a range of product...
perhaps they are successful because of internal competition. I think that
lack of competition is one of Microsoft's problems...
Firstly "ran" implies its totally dead. Thats not true, we still run CMS
where I work, but not for many users...
In its hey day the XT/370 was the smallest and cheapest total cost machine
but that was late in the product life cycle.
I think the cheapest machine "of its time" was the 4331 which would allow
> Could a hobbyist have owned one?
I doubt it. It was far too expensive for even major use. However VM/CMS on a
4331 would allow a small business access to "personal computing"...
http://www-03.ibm.com/ibm/history/exhibits/mainframe/mainframe_PP4331.html
>
>"Chris Barts" <puonegf...@tznvy.pbz> wrote in message
>news:pan.2007.02.18....@tznvy.pbz...
>> On Thu, 15 Feb 2007 19:55:07 -0700, Anne & Lynn Wheeler wrote:
>> > cms was definitely personal computing from mid-60s ...
>>
>> But what was the physically smallest and cheapest computer CMS ran on?
Personal business computing. IBM 3x0; 9370 was probably the smallest,
cheapest, mini-like configuration, other than limited XT/AT/370.
>Firstly "ran" implies its totally dead. Thats not true, we still run CMS
>where I work, but not for many users...
>In its hey day the XT/370 was the smallest and cheapest total cost machine
>but that was late in the product life cycle.
>I think the cheapest machine "of its time" was the 4331 which would allow
>
>> Could a hobbyist have owned one?
>
>I doubt it. It was far too expensive for even major use. However VM/CMS on a
>4331 would allow a small business access to "personal computing"...
>
>http://www-03.ibm.com/ibm/history/exhibits/mainframe/mainframe_PP4331.html
>
>>
>> This is why I don't buy the PDP-8 or the PDP-11 as a PC: The whole point
>> of the 'PC Revolution' was that you could build a useful computer out of
>> parts that fit on a hobbyist workbench and within a hobbyist budget. You
>> can play games about whether a useful PDP-8 installation could fit on a
>> desk, but that is completely and utterly missing the point. A Personal
>> Computer is PERSONAL.
ISTM you're defining a self-built home computer. That's a matter of
skill and price. Apples, TRS-80s, IBM PCs, etc. were machines anyone
could buy and use if they had ~$2000.
> Walter Bushell <pr...@panix.com> writes:
>> In article <1171592326....@a75g2000cwd.googlegroups.com>,
>> "dk" <sa...@kanecki.com> wrote:
>>> The reason I say this, is that in some
>>> academic groups, some people think windows is the first pc operating
>>> system, forgetting about cp/m and others.
>
>> cp/m was a late comer.
>
> I don't see how you can say that. Notice that he said "PC operating system."
>
> If "PC" means the IBM Personal Computer, then MSDOS (not Windows)
It does today. At the time, "personal computers" didn't have to come
from IBM. Apple made some. Heathkit made some. Commodore and Vic-20
made appearances in the same space. I don't remember when Amiga first
showed up, and don't care enough to look it up.
The IBM marketing department hit a home run on naming, but so what?
HDOS, for systems like the H-89, was certainly very early to the party,
and had a lot of advantages over later attempts, such as DOS.
> was basically the first DOS.
Disagree. MS-DOS was misnamed. It wasn't an operating system at all,
just a very limited shell and a primitive file system.
> If "PC" refers to any home microcomputer, as it sometimes did in the
> pre-IBM-PC days, then CP/M has a pretty good case for being one of the
> first DOSes.
No, others were earlier. CP/M was one of the most popular to be sure.
In fact, when DOS came out, people claimed that CP/M could never die,
because there were too many applications available for it. Similar
things are said about windows today; I hope they are false for similar
reasons.
> (Again, I'm excluding the Micro-Soft BASIC that came on
> paper tape, cassette, or ROM.) Written in 1974, it ran on the Altair
> and predates every other microcomputer DOS that I know of.
That's an interpreter, and has basically none of the features common to
operating systems. It was also stolen (like DOS) for the most part,
and made usable after MS stole Heathkit's top developer out of Benton
Harbor.
--
Randy Howard (2reply remove FOOBAR)
"The power of accurate observation is called cynicism by those
who have not got it." - George Bernard Shaw
> "Chris Barts" <puonegf...@tznvy.pbz> wrote in message
> news:pan.2007.02.18....@tznvy.pbz...
>>
>>But what was the physically smallest and cheapest computer CMS ran on?
>
>
> Firstly "ran" implies its totally dead. Thats not true, we still run CMS
> where I work, but not for many users...
We still do much of our production work on CMS. Maybe not 100's of
users, but at least 10's.
> In its hey day the XT/370 was the smallest and cheapest total cost machine
> but that was late in the product life cycle.
> I think the cheapest machine "of its time" was the 4331 which would allow
>
>>
>>Could a hobbyist have owned one?
Right now it runs fine on my Pentium (233?) at home. I get about the
same MIPS as a 4331.
>>This is why I don't buy the PDP-8 or the PDP-11 as a PC: The whole point
>>of the 'PC Revolution' was that you could build a useful computer out of
>>parts that fit on a hobbyist workbench and within a hobbyist budget. You
>>can play games about whether a useful PDP-8 installation could fit on a
>>desk, but that is completely and utterly missing the point. A Personal
>>Computer is PERSONAL.
If I owned one, it would be PERSONAL; what's the terminology problem?
In size and capability a -8 or -11 certainly fits the definition. The
problem was price.
> fOn Sun, 18 Feb 2007 17:46:05 -0000 in alt.folklore.computers, "David
> Wade" <g8...@yahoo.com> wrote:
>
>>
>>"Chris Barts" <puonegf...@tznvy.pbz> wrote in message
>>news:pan.2007.02.18....@tznvy.pbz...
>>>
>>> This is why I don't buy the PDP-8 or the PDP-11 as a PC: The whole point
>>> of the 'PC Revolution' was that you could build a useful computer out of
>>> parts that fit on a hobbyist workbench and within a hobbyist budget. You
>>> can play games about whether a useful PDP-8 installation could fit on a
>>> desk, but that is completely and utterly missing the point. A Personal
>>> Computer is PERSONAL.
>
> ISTM you're defining a self-built home computer. That's a matter of
> skill and price. Apples, TRS-80s, IBM PCs, etc. were machines anyone
> could buy and use if they had ~$2000.
Note how you said 'and price'. Until RAM got cheap enough that people could buy
more than a few bytes without taking out a second mortgage, PCs were a
pipe dream.
is no worse than a TV set in this regard), and the cost of finding a
physical space for it to live (a PC demands essentially zero here if you
already have a home). The skill required is important but the price is the
defining characteristic.
Andrew Swallow
> David Wade wrote:
>
>> "Chris Barts" <puonegf...@tznvy.pbz> wrote in message
>> news:pan.2007.02.18....@tznvy.pbz...
>
>>>This is why I don't buy the PDP-8 or the PDP-11 as a PC: The whole point
>>>of the 'PC Revolution' was that you could build a useful computer out of
>>>parts that fit on a hobbyist workbench and within a hobbyist budget. You
>>>can play games about whether a useful PDP-8 installation could fit on a
>>>desk, but that is completely and utterly missing the point. A Personal
>>>Computer is PERSONAL.
>
> If I owned one, it would be PERSONAL, what's the terminology problem.
> In size and capability a -8 or -11 certainly fits the definition. The
> problem was price.
"The very rich are different from you and I."
"Yes. They have their own minicomputers."
More seriously, if you bought it with your own money as opposed to going
through a corporate funds-allocation procedure it would indeed be your
personal computer. But that isn't what the majority of the world means
when they say personal computer. They mean a computer the average duffer
in schluburbia can buy after seeing a couple glossy magazines with
breathless/brainless copy. The existence of such machines is dependent
upon RAM density, power consumption, and how much cubic the whole package
demands. (A disk drive that takes up as much as a washing machine is not
in the cards, even if the CPU will fit on a desk.) A PDP-8 doesn't cut it
because it wasn't nearly miniaturized enough, even if it had ever sold for
no more than the Altair 8800.
I know I missed a chance to purchase a 4331 for $200.
Still, if someone asks "what was the first operating system for a
personal computer", to answer "OS/360, since it was written before any
of the other candidates, and I have it running under Hercules on my
personal computer at home right now" is to fail to provide the
information requested.
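For anyone curious what "running under Hercules" actually involves: a Hercules
system is driven by a small plain-text configuration file naming the emulated
CPU and I/O devices. A minimal sketch follows; the directive names are standard
Hercules configuration statements as I recall them, but the particular CPU
model, storage size, device numbers, and backing filenames are purely
illustrative assumptions, not taken from any real installation.

   # minimal Hercules configuration sketch (illustrative values only)
   CPUMODEL  3033        # emulated CPU model (assumed for illustration)
   ARCHMODE  S/370       # architecture mode
   MAINSIZE  16          # main storage, in MB
   NUMCPU    1
   CNSLPORT  3270        # port for tn3270 console sessions
   # device statements: device number, device type, backing file
   # (all filenames below are hypothetical)
   0009  3215-C  /                 # operator console
   000C  3505    jobs/reader.jcl   # card reader
   000E  1403    prt/printer.txt   # line printer
   0150  3350    dasd/sysres.150   # system-residence DASD volume

With a configuration along those lines and a disk image of an old system, the
emulator boots the guest operating system on an ordinary PC.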
A question of the form "what was the first operating system for a
personal computer" is assumed to mean, in the absence of context
specifying otherwise, "what was the first operating system which, at
the time it was first offered for sale, distribution, or use, ran on a
computer that was sold as and thought of as a computer for personal
use at that time".
This doesn't mean that it isn't appropriate to challenge
preconceptions and barriers. Thus, the site that points out that
before the IBM PC, there was the Apple ][; before the Apple ][, there
was the Altair 8800; before the Altair 8800, there was the Mark-8;
before the Mark-8, there was the Kenbak-1; before the Kenbak-1, there
was the Geniac; before the Geniac, there was Simon...
http://www.blinkenlights.com/pc.shtml
performs a useful service in stretching our horizons and reminding us
of the extent of the history of the field, but there is also a place
for using conventional thinking to understand what people are asking
about when they ask questions.
John Savard
It is a question of WHEN. By the early 1980s, the VT78 and its variants
such as the WPS-8 were actually very nice personal computers, with
extremely useful software. But Digital had 4 or 5 contenders and
did not put adequate weight behind any of them.
A VT78 looked like a VT52 terminal with a dual floppy drive.
It ran the full OS/8 with Fortran, TECO and RUNOFF.
/ Lars Poulsen
> ISTM you're defining a self-built home computer. That's a matter of
> skill and price. Apples, TRS-80s, IBM PCs, etc. were machines anyone
> could buy and use if they had ~$2000.
TRS-80s were a lot less than $2000 - even the stunningly powerful
16K Level II was less than $1000 complete with display and cassette
recorder. With an expansion interface and a disc drive you were getting
into the $2000 region though.
--
C:>WIN | Directable Mirror Arrays
The computer obeys and wins. | A better way to focus the sun
You lose and Bill collects. | licences available see
| http://www.sohara.org/
David Ahl tried to talk DEC into making a smaller PDP-8 (he did create a
prototype) when he worked there, in a section working on getting
minicomputers into schools.
And of course, people did pursue non-home-computers in their quest for a
personal computer. Before there were the Mark-8 and the Scelbi, people were
putting together surplus systems and getting computers any way they could.
There was the "Amateur Computer Society", including a "Jim Gray" (and I
keep wondering if it's the same one that got lost recently while sailing).
It was a pretty small assortment, but they nevertheless had their own
computers at a time when most people wouldn't dream of such a thing.
Even when the "home computers" hit, some went after minicomputers. A
friend of mine bought a used PDP-8 in the post-Altair seventies. There were
lots of people scheming to get an LSI-11 board so they could run PDP-11
software, including the Southern California Computer Society's group buy.
Michael
IIRC Heathkit did.
--
<http://www.cs.auckland.ac.nz/~pgut001/pubs/vista_cost.txt>
<http://www.securityfocus.com/columnists/423>
"A man who is right every time is not likely to do very much."
-- Francis Crick, co-discoverer of DNA
"There is nothing more amazing than stupidity in action."
-- Thomas Matthews
> Walter Bushell wrote:
> > The Renaissance was partially led by and perhaps sparked by the flight
> > of scholars from Byzantium to the West, IIRC. Compare to the rise of
> > American physics and chemistry from the massive influx of German
> > physicists and chemists before WWII. In fact the American post War pre
> > eminence in many other fields can not be explained without mentioning
> > this influx of mostly unwanted Germans.
>
> Given the collapse of the Russian economy after the fall of Communism,
> despite _Scientific American_ taking an opposing editorial position,
> we really should have tried harder to recruit as many as possible of
> the qualified professors in scientific disciplines in the former
> Soviet Union.
>
> As well as enriching our own country - particularly if it is done by
> expanding science education, and not throwing American professors out
> of work - this would have helped ensure Russia would have less
> opportunity to menace world peace in the future.
Plus fewer to take lucrative work in Iran or South Korea or such.
> A new American renaissance fueled by scholars escaping Russia would
> have been a nice thing to have.
>
> John Savard
--
"The power of the Executive to cast a man into prison without formulating any
charge known to the law, and particularly to deny him the judgement of his
peers, is in the highest degree odious and is the foundation of all totali-
tarian government whether Nazi or Communist." -- W. Churchill, Nov 21, 1943
> > As well as enriching our own country - particularly if it is done by
> > expanding science education, and not throwing American professors out
> > of work - this would have helped ensure Russia would have less
> > opportunity to menace world peace in the future.
>
> Plus fewer to take lucrative work in Iran or South Korea or such.
I trust you meant Iran or _North_ Korea.
John Savard
> Peter Flass wrote:
>
>>If I owned one, it would be PERSONAL, what's the terminology problem.
>>In size and capability a -8 or -11 certainly fits the definition. The
>>problem was price.
>
>
> I know I missed a chance to purchase a 4331 for $200.
Wow! Of course it would have cost a fortune to move, but IIRC it ran on
standard household current and didn't require A/C or a raised floor. Nice.
>
> A question of the form "what was the first operating system for a
> personal computer" is assumed to mean, in the absence of context
> specifying otherwise, "what was the first operating system which, at
> the time it was first offered for sale, distribution, or use, ran on a
> computer that was sold as and thought of as a computer for personal
> use at that time".
Back in the early 70s before the world in general (and myself) was
familiar with PCs, a CS prof at RIT had a small -11 unix system. The
whole thing sat on a typewriter table in his office so it could be
wheeled around. I really wanted one:-). What is the timeline of unix
vs. micros? I would suggest that maybe unix is the first PC operating
system.
UNICS was started in September 1969.
http://www.levenez.com/unix/history.html#01
The first micro-computer (PC), named the Micral-N, was made by André
Truong and François Gernelle in 1972 (three years before the
Altair). The first system was delivered on January 15th, 1973. There
was no OS on it, the first application being cross-developed on the
bare hardware.
http://mapage.noos.fr/ahti/CNAM%2090.pdf
--
__Pascal Bourguignon__ http://www.informatimago.com/
For me, the big question has never been "Who am I? Where am I going?",
as our friend Pascal put it so adroitly, but rather:
"How am I going to get out of this?" -- Jean Yanne
Not in 1980. But when the LSI11 chipset came out it was an
option. ISTR that was around 1983, but that could be another
"senior moment".
The LSI11 was priced for a different world, though.
They _could_ have made a single-board PDP11 around 1984, and run a
server-based OS on that, with a NIC on-board.
This would have made a stellar PC server and "glue box" to
3270 and serial ports, and a good disk server. Such boxes regularly
moved for $5000+.
Come to think of it, an 80186 could also have fit the niche
as well. I saw some very good MP/M based systems around this
time. With DEC's clout they could have made an impact.
Novell took that niche when the 80286 came out in volume.
DEC was a lost cause by 1983, when they failed to grok the
PC revolution.
-- mrr
> In that case, the first operating system for the IBM Personal
> Computer, in all its 4.77 MHz 8088 glory, was PC-DOS, written by
> Microsoft, and later sold directly to the public by Microsoft under
> the name MS-DOS. CP/M-86 was another operating system later offered
> for that machine.
Ahem! Once again, someone has fallen for Microsoft's claims of
innovation. They did not write PC-DOS; they bought QDOS (Quick
and Dirty Operating System, a blatant CP/M ripoff) from Seattle
Computer Products, hacked in some gratuitous brain damage, and
sold the result as their own.
--
/~\ cgi...@kltpzyxm.invalid (Charlie Gibbs)
\ / I'm really at ac.dekanfrus if you read it the right way.
X Top-posted messages will probably be ignored. See RFC1855.
/ \ HTML will DEFINITELY be ignored. Join the ASCII ribbon campaign!
> On Fri, 16 Feb 2007 01:49:41 -0600, David Librik wrote
> (in article <er3nml$s2e$1...@reader2.panix.com>):
>
>> If "PC" means the IBM Personal Computer, then MSDOS (not Windows)
>
> It does today. At the time, "personal computers" didn't have to come
> from IBM. Apple made some. Heathkit made some. Commodore and Vic-20
> made appearances in the same space. I don't remember when Amiga first
> showed up, and don't care enough to look it up.
Late 1985. Commodore did a presentation here in Vancouver in November
of that year; I fell in lust and bought my own A1000 in March of 1986.
And when Heathkit launched their computer line in 1977, that included
a computer using the LSI-11.
Michael
Andrew Swallow
except that was still the relatively small, insular home hobby market.
past posts in this thead:
http://www.garlic.com/~lynn/2007d.html#41 Is computer history taugh now?
http://www.garlic.com/~lynn/2007d.html#43 Is computer history taugh now?
http://www.garlic.com/~lynn/2007d.html#44 Is computer history taugh now?
http://www.garlic.com/~lynn/2007d.html#45 Is computer history taugh now?
http://www.garlic.com/~lynn/2007d.html#47 Is computer history taugh now?
this particular aspect was also somewhat touched on in these posts
in this thread:
http://www.garlic.com/~lynn/2007.html#1 "The Elements of Programming Style"
http://www.garlic.com/~lynn/2007.html#13 "The Elements of Programming Style"
including comment in this old email about collections of "small"
processors eventually starting to impact glasshouse operation:
http://www.garlic.com/~lynn/2007.html#email801006
i've frequently claimed that where the major pc market segment
developed was in huge commercial orders for dumb terminal 3270
replacement ... i.e. for about the same price as a 3270 terminal, one could get
a machine with a single desktop footprint that did both mainframe
terminal emulation and some amount of local computing. with that
enormously growing install base ... it became much more attractive to
write software applications for that install base ... as well as
significant incentive for competitive clone builders. the price
competition from clone builders also helped accelerate its
attractiveness for the home market. at some point it became a positive
feedback (snowball) effect, with the size of the install base fueling
both application development and price competition ... and the growth
in application development and price competition helping fuel the
increase in the install base.
one of the other initial market uptake silver bullets, besides
terminal emulation, was the spreadsheet application. an issue was reaching
enuf market mass to create an effectively self-sustaining
market. at some point the market was large enuf that instead of having
to borrow from other activities ... things were developed wholly
based on that specific market.
the analogous scenario was the ignition of the consumer electronics
market ... cdroms for PCs became highly attractive because of their
enormous market position. i've commented before about, in the mid-80s,
finding a $300 consumer cdrom having better technology than a $20k
device developed specifically for the computer market.
this was also somewhat the basis for the HDTV standards wars ...
circa 1990 involving the dept. of commerce and others ... there was fear
that whoever won the HDTV market segment would have such an
electronics development base ... that they would also be able to take
over the whole computer market.
misc. past posts mentioning consumer electronic $300 cdrom from mid-80s:
http://www.garlic.com/~lynn/2001f.html#35 Security Concerns in the Financial Services Industry
http://www.garlic.com/~lynn/2001j.html#23 OT - Internet Explorer V6.0
http://www.garlic.com/~lynn/2001n.html#77 a.f.c history checkup... (was What specifications will the standard year 2001 PC have?)
http://www.garlic.com/~lynn/2003o.html#54 An entirely new proprietary hardware strategy
http://www.garlic.com/~lynn/2004o.html#43 360 longevity, was RISCs too close to hardware?
http://www.garlic.com/~lynn/2004o.html#44 360 longevity, was RISCs too close to hardware?
http://www.garlic.com/~lynn/2005n.html#27 Data communications over telegraph circuits
http://www.garlic.com/~lynn/2006.html#45 IBM 610 workstation computer
http://www.garlic.com/~lynn/2006q.html#62 Cray-1 Anniversary Event - September 21st
misc. past posts mentioning hdtv and various perceived market implications
(including some overlap with the cdrom posts)
http://www.garlic.com/~lynn/2000e.html#11 Is Al Gore The Father of the Internet?^
http://www.garlic.com/~lynn/2001.html#73 how old are you guys
http://www.garlic.com/~lynn/2001b.html#2 FCC rulemakings on HDTV
http://www.garlic.com/~lynn/2001j.html#23 OT - Internet Explorer V6.0
http://www.garlic.com/~lynn/2006.html#45 IBM 610 workstation computer
http://www.garlic.com/~lynn/2006q.html#62 Cray-1 Anniversary Event - September 21st
UCSD p-System was also available
>
>
> John Savard
>
You don't think computing is science? Anyone in computing should have some
curiosity about how they got started.
Robert