: > When you think that Win95
: > needs 16 megs to run well, it really is amazing that the mac was
: > able to squeeze a GUI in 128K!!!!!!
: ^^^
: Human Interface. Windows is a GUI.
: --
: Chris Thomas, c...@best.com
human interface, GUI, whatever...
can you give one significant difference
between the two?
--
Disclaimer: Any resemblance between the above views and those of my
employer, my terminal, or the view out my window are purely
coincidental. Any resemblance between the above and my own views is
non-deterministic. The question of the existence of views in the
absence of anyone to hold them is left as an exercise for the reader.
The question of the existence of the reader is left as an exercise for
the second god coefficient. (A discussion of non-orthogonal,
non-integral polytheism is beyond the scope of this article.)
> When I am not reading this newsgroup, I design microprocessor circuits, and I
> can assure you that the world's largest microprocessor manufacturer does not
> implement everything in NAND gates.
Is that your company's marketing slogan? "We make the world's largest
microprocessors"? I kinda like that, although I'll bet it doesn't bring
customers in in droves.
: > Now admittedly the CS in my degree was for algorithms & programming, not
: > ECE, but aren't all hardware implementations of gates made solely of NAND
: > gates? Or am I misremembering that the specifications for everything is
: > done only in NAND gates?
: Any logic function can be built in NAND gates, but this is similar to using a Turing
: Machine for all computer functions. Sure, it can be done, but no current practical
: system does it that way. Back in the 60's when RCA was still building computers,
: they developed an all-NAND system. The idea was that by optimizing the NAND gate to
: the utmost, they could make a computer with fewer different types of components and
: thereby beat the competition. Remember that this was in the days when a flip-flop
: was a plug-in card by itself.
: When I am not reading this newsgroup, I design microprocessor circuits, and I can
: assure you that the world's largest microprocessor manufacturer does not implement
: everything in NAND gates.
HAHAHAHAHA!
HA!
i just had my dose of ece last semester and what i gathered was
that it was more efficient (smaller,faster) for the chip fabs
to do nand implementations than to do other gates - they may
use and or and not, but they build them with nands
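
The underlying universality claim, at least, is easy to demonstrate: any
boolean function really can be built from NAND alone. A minimal Python sketch
(function names are mine, purely for illustration) builds NOT, AND, OR, and
XOR from nothing but a two-input NAND and checks them against Python's own
operators:

    # Minimal sketch: every basic gate built from a 2-input NAND alone.
    # Function names are illustrative; they are not from any particular library.

    def nand(a, b):
        return 0 if (a and b) else 1

    def not_(a):
        return nand(a, a)              # tie both inputs together

    def and_(a, b):
        return not_(nand(a, b))        # NAND followed by an inverter

    def or_(a, b):
        return nand(not_(a), not_(b))  # De Morgan: a OR b = NOT(NOT a AND NOT b)

    def xor_(a, b):
        n = nand(a, b)                 # the classic four-NAND XOR
        return nand(nand(a, n), nand(b, n))

    for a in (0, 1):
        for b in (0, 1):
            assert and_(a, b) == (a & b)
            assert or_(a, b)  == (a | b)
            assert xor_(a, b) == (a ^ b)
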
[snip]
>i just had my dose of ece last semester and what i gathered was
>that it was more efficient (smaller,faster) for the chip fabs
>to do nand implementations than to do other gates - they may
>use and or and not, but they build them with nands
This is incorrect. When you are designing the transistor level circuit, you
design the circuit that implements the boolean function, and it is not just the
individual logic gate functions all hooked together. If your school has a VLSI
design course, ask the professor who teaches it about the relationship between
transistor level design and logic gate level design.
---
My opinions do not reflect those of my employer.
Steven M. Demko LSI Logic
Mask Designer 1501 McCarthy Blvd.
(408) 433-4398 Milpitas, CA 95035
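
To make the transistor-level point above concrete, here is a hedged sketch
(my example, not anything from LSI Logic's flow): a CMOS complex gate such as
an AND-OR-INVERT computes NOT((a AND b) OR (c AND d)) as a single network of
about eight transistors, rather than as two AND gates feeding an OR and an
inverter. The two descriptions agree logically but correspond to very
different silicon:

    # Sketch (illustrative, not from the thread): the same boolean function
    # viewed two ways.  A CMOS layout can build the "complex gate" version
    # directly as one transistor network; it need not be assembled from
    # discrete NAND/AND/OR gates wired together.

    def aoi22(a, b, c, d):
        """AND-OR-INVERT: NOT((a AND b) OR (c AND d)), as one complex gate."""
        return 0 if ((a and b) or (c and d)) else 1

    def aoi22_from_gates(a, b, c, d):
        """Same function, written as a chain of separate AND / OR / NOT gates."""
        p = 1 if (a and b) else 0
        q = 1 if (c and d) else 0
        return 0 if (p or q) else 1

    # The two agree on every input combination; only the implementation differs.
    assert all(
        aoi22(a, b, c, d) == aoi22_from_gates(a, b, c, d)
        for a in (0, 1) for b in (0, 1) for c in (0, 1) for d in (0, 1)
    )
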
>And, BTW, Windows95 doesn't run well under ANY conditions. Upgrade the machine
>to 32MB, P133 and it COULD run well (if it wasn't for the many bugs and
>flaws), but then that's the ideal machine for Windows NT...
Actually, that's pretty close to the spec for the only machines I've
ever seen that could run 95 at any acceptable speed -- Compaq 133MHz
with 48 meg of RAM. The lowly Pentium/100 with 16 meg that's rapidly
become the stock machine these days can't cut it.
Turtle
-------------------------------------------------
George Orwell was wrong. John Brunner was right.
-------------------------------------------------
Here's an excerpt from a recent column.
Excerpted from: CAREER OPPORTUNITIES IN COMPUTING -- AND MORE (1/19)
<http://nytsyn.com/live/Gates/019_011996_094929_4351.html>
By BILL GATES
c.1996 Bloomberg Business News
[...]
QUESTION: I read in a newspaper that in 1981 you said, ``640K of memory should
be enough for anybody.'' What did you mean when you said this?
ANSWER: I've said some stupid things and some wrong things, but not that. No
one involved in computers would ever say that a certain amount of memory is
enough for all time.
The need for memory increases as computers get more potent and software gets
more powerful. In fact, every couple of years the amount of memory address
space needed to run whatever software is mainstream at the time just about
doubles. This is well-known.
When IBM introduced its PC in 1981, many people attacked Microsoft for its
role. These critics said that 8-bit computers, which had 64K of address space,
would last forever. They said we were wastefully throwing out great 8-bit
programming by moving the world toward 16-bit computers.
We at Microsoft disagreed. We knew that even 16-bit computers, which had 640K
of available address space, would be adequate for only four or five years. (The
IBM PC had 1 megabyte of logical address space. But 384K of this was assigned
to special purposes, leaving 640K of memory available. That's where the
now-infamous ``640K barrier'' came from.)
A few years later, Microsoft was a big fan of Intel's 386 microprocessor chip,
which gave computers a 32-bit address space.
Modern operating systems can now take advantage of that seemingly vast
potential memory. But even 32 bits of address space won't prove adequate as
time goes on.
Meanwhile, I keep bumping into that silly quotation attributed to me that says
640K of memory is enough. There's never a citation; the quotation just floats
like a rumor, repeated again and again.
--------------------------------- end excerpt ---------------------------------
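
As an arithmetic footnote to the excerpt, the 640K figure falls straight out
of the 8086's segment:offset addressing plus the 384K reservation. A quick
sketch, assuming the standard real-mode scheme rather than anything stated in
the column:

    # Sketch of the 8086/8088 real-mode address arithmetic behind the 640K figure.
    # physical address = segment * 16 + offset, both 16-bit quantities,
    # which yields a 20-bit (1 MB) physical address space.

    def real_mode_address(segment, offset):
        return (segment << 4) + offset

    KB = 1024
    address_space = real_mode_address(0xFFFF, 0x000F) + 1   # 0x100000 = 1 MB
    reserved      = 384 * KB    # upper area: video buffers, adapter ROMs, BIOS
    conventional  = address_space - reserved

    print(address_space // KB)  # 1024 KB
    print(conventional // KB)   # 640 KB -- the "640K barrier"
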
Does anyone have the cite for the first time this statement was attributed to
Bill Gates?
--
---- Tom Betz --------- <http://www.pobox.com/~tbetz> ------ (914) 375-1510 --
tb...@pobox.com | We have tried ignorance for a very long | tb...@panix.com
------------------+ time, and it's time we tried education. +-----------------
-- Computers help us to solve problems we never had before they came along. --
> Does anyone have the cite for the first time this statement was attributed
> to Bill Gates?
I'm sure it's not the first attribution, but the oldest one I could find
was this one from a 1988 issue of InfoWorld:
"Memory is a bit different, however. Microsoft Corp. chairman Bill Gates
once said 640K of memory was more than anyone needed. He was wrong.
Nobody realized, however, that the 20 bits of addressing in the AT
wouldn't be enough . . ."
- snopes
+------------------------------------------------------------------+
| snopes reserves the right to make improvements in the article |
| packaged with this .sig at any time and without notice. |
+------------------------------------------------------------------+
: c.1996 Bloomberg Business News
: [...]
: QUESTION: I read in a newspaper that in 1981 you said, ``640K of memory should
: be enough for anybody.'' What did you mean when you said this?
:
: ANSWER: I've said some stupid things and some wrong things, but not that. No
So who is Bill Gates interviewing here?
-Joachim.
I always thought he was talking about his monthly bonus, not computer
memory...
Adam "...and it damn well should be..." Klyce
>When IBM introduced its PC in 1981, many people attacked Microsoft for its
>role. These critics said that 8-bit computers, which had 64K of address space,
>would last forever. They said we were wastefully throwing out great 8-bit
>programming by moving the world toward 16-bit computers.
And they were right, too :-) 8-bit computers are still with us; it
was pointed out in a review of the Amstrad PcW16 that the complete computer
with software costs less than a copy of Microsoft Office. And how long did
it take to get an MS OS to multitask properly? 8-bit OSs had been doing it
since MP/M.
--
John Elliott.
--
-------------------- http://sable.ox.ac.uk/~sjoh0132/ ---------------------
John Elliott |BLOODNOK: "But why have you got such a long face?"
|SEAGOON: "Heavy dentures, Sir!" - The Goon Show
:-------------------------------------------------------------------------)
>: [...]
>: QUESTION: I read in a newspaper that in 1981 you said, ``640K of memory should
>: be enough for anybody.'' What did you mean when you said this?
>:
>: ANSWER: I've said some stupid things and some wrong things, but not that. No
>So who is Bill Gates interviewing here?
I don't see how he can get out of this one now... it's been reported
from too many otherwise-reliable sources. I used to have a copy of a
very early requote of it, but cannot find it now. It was in response
to questions about the DOS memory architecture, and of course BG was
defending his/MS's choice.
Turtle
----------------------------------------------------------------------
* Visit the Weightless Dog Home Page! http://www.charm.net/~turtle *
*** Ask me about Lotus Notes, digital video, Volvos and the blues! ***
----------------------------------------------------------------------
>> very early requote of it, but cannot find it now. It was in response
>> to questions about the DOS memory architecture, and of course BG was
>> defending his/MS's choice.
>Except, of course, the 640Kb limit has nothing to do with BG or MS,
>nor in fact much to do with MS-DOS. PC-DOS (and MS-DOS for IBM PC
>clones) _enforces_ the 640Kb limit of the machine's architecture.
>The actual limit comes from the choice of the 8086/8088 and the
>hardware mapping of the machine's subsystems, notably the CGA
>addressing.
>No decision of BG or MS could have changed this on the IBM PC.
>On other hardware, such as the Victor 9000/ACT Sirius, MS-DOS
>had no 640Kb limit, if enough RAM was fitted then MS-DOS (or
>CP/M-86) could have over 900Kb available.
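
For reference, here is a rough sketch of the real-mode memory map that
produces that ceiling. The addresses are the commonly published ones rather
than anything from this thread, and the EGA/VGA window postdates the original
PC:

    # Sketch of the IBM PC real-mode memory map that produces the 640K ceiling.
    # The regions above 0xA0000 are reserved for hardware, so DOS programs only
    # ever see the conventional memory below it.

    PC_MEMORY_MAP = [
        # (start,    end,     use)
        (0x00000, 0x9FFFF, "conventional RAM (640 KB available to DOS)"),
        (0xA0000, 0xAFFFF, "EGA/VGA graphics buffer (later machines)"),
        (0xB0000, 0xB7FFF, "MDA text buffer"),
        (0xB8000, 0xBFFFF, "CGA buffer -- the mapping mentioned above"),
        (0xC0000, 0xEFFFF, "adapter ROMs, expansion board memory"),
        (0xF0000, 0xFFFFF, "system BIOS ROM"),
    ]

    for start, end, use in PC_MEMORY_MAP:
        size_kb = (end - start + 1) // 1024
        print(f"{start:05X}-{end:05X}  {size_kb:4d} KB  {use}")
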
Was the CGA available in the early PC days (the five-slot, 64k
motherboard-with-cassette-port)? I don't remember hearing about it
until mid-1982. I was aware at the time that other clones, like the
Sanyo, allowed for around 768K, but at the time, even 64k was
considered ample, so nobody questioned his comment then. Hell, a lot
of machines in 1981 were leaving the factory with 16K and people
thought that was fine... until the Commodore 64 came out.
So the President of IBM once thought there MIGHT be a use for three
or four mainframe computers throughout the world. So fifteen year old
predictions turn out to be wrong. Point being?
Donald W. McArthur
Don...@ix.netcom.com
****************************
"My wife and I tried two or three times
in the last forty years to have breakfast
together, but it was so disagreeable
we had to stop."
Winston Churchill
****************************
Visit me at:
http://www.vistech.net/users/donw/misant.html
"The Misanthropyst - Celebrating The
Evil And Stupidity Of Mankind"
: So the President of IBM once thought there MIGHT be a use for three
: or four mainframe computers throughout the world. So fifteen year old
: predictions turn out to be wrong. Point being?
: Donald W. McArthur
: Don...@ix.netcom.com
: ****************************
: "My wife and I tried two or three times
: in the last forty years to have breakfast
: together, but it was so disagreeable
: we had to stop."
: Winston Churchill
: ****************************
: Visit me at:
: http://www.vistech.net/users/donw/misant.html
: "The Misanthropyst - Celebrating The
: Evil And Stupidity Of Mankind"
point being that Wi...@microsoft.com is being a revisionist and
denying that he ever said the offending phrase. THAT is the point.
NuBus was invented well after the Mac 128K was first designed.
> I have also heard that the original Mac was designed with 128K
> of memory with NO expansion capabilities (against the designers
> protests) but they went ahead and made upgrade to 512K possible
> on their own. Is there any truth to this?
Yes.
---
Ron Nicholson mailto:r...@sgi.com http://www.nicholson.com/rhn/
#include <canonical.disclaimer> // only my own opinions, etc.
> So fifteen year old predictions turn out to be wrong. Point being?
Point being, that for the head of a company whose major selling point
is "we're the future" to have made such a serious miscalculation of
the market strikes people as a tad ironic.
I'm not claiming Bill Gates ever actually said this, but the "nobody
will ever need more than 640K" statement was outdated nearly as soon
as it was uttered and was certainly proved wrong long before fifteen
years had elapsed.
- snopes
+--------------------------------------------------------------------------+
| snopes was laboratory tested for safety. As with any other poster, |
| should irritation or discomfort result, discontinue reading immediately. |
+--------------------------------------------------------------------------+
YES! The original Mac had 128K. It was upgradeable to 512K, but
not by yourself. You had to send it back to the factory for the
upgrade. The mac abandoned the old "hacker" mentality of the
Apple II. It was supposed to be like an appliance--plug it in and
use it (I think they compared it to a toaster). Apple saw no need
for people to go "messing around in there" (A big mistake). As
far as the RAM thing goes, Creative Computing magazine recommended
the upgrade for performance reasons. When you think that Win95
needs 16 megs to run well, it really is amazing that the mac was
able to squeeze a GUI in 128K!!!!!!
--
-------------------------------------------- /\
This signature under construction! /O \
/ |\ \
/ / \@ \
> The 640K limit is a consequence of where IBM chose to memory-map
>the video adapters in the original PC, not any design decision on
>the part of the designers of DOS.
True. My DEC Rainbows had 896k of RAM in them; the video was
handled by the other CPU.
--
A host is a host from coast to coast.................wb8foz@nrk.com
& no one will talk to a host that's close...........(v)301 56 LINUX
Unless the host (that isn't close).........................pob 1433
is busy, hung or dead........vr vr vr vr.................20915-1433
: Point being, that for the head of a company whose major selling point
: is "we're the future" to have made such a serious miscalculation of
: the market strikes people as a tad ironic.
The "we're the future" argument is more than a bit ironic all in itself.
Microsoft's pattern is to enter an established but not yet stable market,
buy one of the smaller players, and turn its product into a behemoth.
--------------------------------------------------------------------------
"The original seven words were, shit, piss, fuck, cunt, cocksucker, mother-fucker, and tits. Those are the ones that will curve your spine,
grow hair on your hands and (laughter) maybe, even bring us, God help
us, peace without honor (laughter) um, and a bourbon."
--George Carlin
From FCC V. Pacifica Foundation, 438 U.S. 726 (1978)
--------------------------------------------------------------------------
Andy Walton * att...@mindspring.com * http://www.mindspring.com/~atticus/
<snip!>
> "Everything that can be invented has been invented."
> --Charles H. Duell, Commissioner, U.S. Office of Patents,
> 1899.
+++
Doesn't somebody have a cite for this one? snopes?
telneting from the Isles of Nippon...
Japanese Observation: When you go to a Japanese restaurant in
America, they play Japanese music. Go to
one in Japan, and they play Western music.
dino "been there, done that"
1) Modern processors are made using CMOS logic. NAND gates are smaller
and faster than equivalent NOR gates in CMOS. AND and OR are implemented
by putting an inverter (NOT) after a NAND or NOR gate.
2) Any logic function may be implemented using only NAND gates.
3) However, it is not true that all hardware implementations of logic
circuits use only NAND gates. For example Xilinx 4000 series Field
Programmable Gate Arrays (FPGAs) implement logic using RAM.
4) Modern hardware specifications are done using hardware description
languages, such as VHDL. Logic is expressed in the form of equations
which are synthesized to whatever basic functions are used in the
particular implementation.
For cites, there are numerous textbooks on VHDL, and see the Xilinx
databooks for the details on their particular FPGA implementation.
Mr. Drew "640k CLBs should be enough for anybody"
> Now admittedly the CS in my degree was for algorithms & programming, not
> ECE, but aren't all hardware implementations of gates made solely of NAND
> gates? Or am I misremembering that the specifications for everything is
> done only in NAND gates?
The e-theory that they're (that's the ubiquitous THEY) teaching nowadays
is mixed-logic notation, with heavy emphasis on the universally-utilized
NAND gate.
Threw me for a loop when someone asked me about it ... I *knew* that
everything was NAND, but I didn't realize that they'd actually start to
*teach* it to prospective engineers.
-- Chris (wil...@fsr.com) junior-junior ass't sysadmin/tech support
http://www.fsr.com/~wileyc/
> In article <4fot3n$cuc$1...@mhadf.production.compuserve.com>,
> 10211...@CompuServe.COM says...
>
> >When you think that Win95
> >needs 16 megs to run well, it really is amazing that the mac was
> >able to squeeze a GUI in 128K!!!!!!
>
> No, not quite. The Macintosh interface is in ROM (or at least it was back
> then), so they didn't actually "squeeze a GUI in 128K".
Pretty close, though. The ROM was only 64K, so even counting the ROM the
machine had only 192K.
Since the RAM had to include the OS, an application program, and the
screen buffer (22K all by itself), it wouldn't be too far a stretch to say
that, in fact, the GUI was squeezed into 64K of ROM.
-Ron Hunsinger
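
That 22K figure checks out. The original Mac drove a 512 x 342 display at one
bit per pixel; a quick back-of-the-envelope, using the well-known display
dimensions rather than anything stated above:

    # Back-of-the-envelope check on the "22K all by itself" screen buffer figure.
    # The original Macintosh display was 512 x 342 pixels at 1 bit per pixel.

    width, height, bits_per_pixel = 512, 342, 1

    buffer_bytes = width * height * bits_per_pixel // 8
    print(buffer_bytes)           # 21888 bytes
    print(buffer_bytes / 1024)    # ~21.4 KB, i.e. the "22K" quoted above
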
Welll.... It depends on whether or not you include the weight of the
manuals and media. To make it fair, we really should require all online
help to be available in printed form - after all, did they have any such
wuss-thing as on-line help for the ENIAC?
pdt.
> In article <4fsrdv$g...@lace.colorado.edu>,
> dino <di...@euclid.Colorado.EDU> wrote:
> >
> >> "Everything that can be invented has been invented."
> >> --Charles H. Duell, Commissioner, U.S. Office of Patents,
> >> 1899.
> >
> >Doesn't somebody have a cite for this one? snopes?
>
> Check the latest Scientific American, which debunks this. Bill Gates
> apparently cited the apocryphal quote in his book, and SciAm waxed
> hilarious at his expense.
More specifically:

    "Bill Gates Apocryphal History"
    Scientific American, February 1996, volume 274, number 2, page 32B

The author (John Horgan) references an investigative piece:

    Eber Jeffery, "Nothing Left to Invent,"
    Journal of the Patent Office Society, 1940
Drew "probably meant nothing better than digital watches" Lawson
--
| I have no genitalia
Drew Lawson | I sold my kids for cheese
dla...@aimnet.com | I love my blow-up doll, so
| bring out those cameras, please