I raised this in another newsgroup but I thought it might make for an
interesting discussion here.
I was wondering how things would have been different if Commodore had
switched gears in the 1990s and moved to becoming an all-PC
manufacturer. In other words, if they had ditched the Amiga after the
A3000, never released the A4000, and maybe turned the A4000 designs
into a PC product (Intel CPU, IDE interface, 72-pin SIMMs, SVGA
display), would they still be alive today? Could they have succeeded
purely as a PC company?
It would have been interesting to see an Intel-based A4000, with some
of the technology adapted to run on Windows NT and, later, Windows 95.
For example, using the Paula for the on-board audio. Was something
like this ever in the cards or even possible?
Maybe someone "in the know" at Commodore could share their
insights...someone like Dave Haynie?
...and don't get me wrong, I loved the Amiga line. In fact, I've owned
every single Amiga from the A1000 all the way to the A4000 (I didn't
own a CDTV, CD32, A2500, or any of the towers).
MikeC
>I raised this in another newsgroup but I thought it might make for an
>interesting discussion here.
>I was wondering how things would have been different if Commodore
>switch gears in the 1990's and moved to becoming an all PC
>manufacturer. In other words, if they ditched the Amiga after the
>A3000, never released the A4000, and maybe turned the A4000 designs
>into a PC product (Intel CPU, IDE interface, 72-pin SIMM's, SVGA
>display), would they still be alive today? Could they have succeeded
>purely as a PC company?
I don't know, but as I recall CBM tried very hard to sell PCs, and I was
told they were unsuccessful and it cost them a lot of money before they
realized their mistake.
Only a time-addled recollection, and it doesn't really answer your question.
All the best,
Angus Manwaring. (for e-mail remove ANTISPEM)
I need your memories for the Amiga Games Database: A collection of Amiga
Game reviews by Amiga players http://www.angusm.demon.co.uk/AGDB/AGDB.html
I remember seeing some "Commodore PC" ads in magazines a loooooong
time ago. If I recall correctly, they were comparably equipped to a lot
of the other clones out there, but they were slightly more expensive...
charging for the name, possibly?
I mentioned this in c.s.a.advocacy a couple of days ago... but I think
that if C= had taken a different approach after the A3000, things
would be very different.
AGA was two years too late, and four years too little. Instead of
developing AGA, they should have either a) gone to the PCI standard,
thus allowing OTHER companies to incur the cost of advancing the
graphics, or b) ported AOS to run on Intel-compatible processors, which
would have had the same effect. Other companies would incur the cost of
advancing graphics, audio, and peripherals.
Which would have left C= with (potentially) bucketloads of money for
MARKETING of a gaming system, such as the CD32... only with a different
graphics architecture, of course. They could have provided the OS and the
hardware, and left somebody like 3DFX to provide the graphics...
This might have saved BOTH companies.
Would AOS on an Intel have replaced Windows? Doubtful... BUT, it
could have (if done early enough in the game, and with proper
development and marketing) had the effect of a "Home User's OS" and a
"Business User's OS"... ie, AOS and MS-Windows.
IBM tried this with OS/2, I think, but I don't remember seeing any real
marketing for it.
Ryan
> On 24-Apr-04 04:03:13, mikec said
> >Hi Everyone,
>
> >I raised this in another newsgroup but I thought it might make for an
> >interesting discussion here.
>
> >I was wondering how things would have been different if Commodore
> >switch gears in the 1990's and moved to becoming an all PC
> >manufacturer. In other words, if they ditched the Amiga after the
> >A3000, never released the A4000, and maybe turned the A4000 designs
> >into a PC product (Intel CPU, IDE interface, 72-pin SIMM's, SVGA
> >display), would they still be alive today? Could they have succeeded
> >purely as a PC company?
>
>
> I don't know, but as I recall CBM tried very hard to sell pc's and I was
> told they were unsuccessful and it cost them a lot of money before they
> realized their mistake.
IIRC it was actually the PC side of the company which was responsible for them
going under. The Amiga side, even at their demise, was still profitable (though
not enough to finance the rest).
>Hi Everyone,
>I raised this in another newsgroup but I thought it might make for an
>interesting discussion here.
>I was wondering how things would have been different if Commodore
>switch gears in the 1990's and moved to becoming an all PC
>manufacturer. In other words, if they ditched the Amiga after the
>A3000, never released the A4000, and maybe turned the A4000 designs
>into a PC product (Intel CPU, IDE interface, 72-pin SIMM's, SVGA
>display), would they still be alive today? Could they have succeeded
>purely as a PC company?
Uhm, the only stuff C= was making money from in the end was the Amiga stuff;
they bled a LOT from the PCs, which were overpriced and underpowered
compared with other companies' PCs..
A better question is: if C= had never gone into that PC dead end and put
all their R&D money into the Amiga line instead, would they still
be alive today?
..Probably..
--
| Apollo fastslot accelerators page - Http://www.canit.se/~glenn/apollo.html |
|----------------------------------------------------------------------------|
| ___                          | Email    : Sha...@bay-watch.com          |
| / __\ __                     | Homepage : http://www.canit.se/~glenn    |
| __ / /__ / /__ ____ ____ __  | IRC      : XT600 @ IRC-net               |
| (__/ /_ // / -_) _ ) _ )__) | Amiga - Silicon Graphics - 8bit comps. |
| \___//_/\__/_//_/_//_/ | T h e K i n g d o m o f S w e d e n |
> IIRC it was actually the PC side of the company which was responsible for them
> going under. The Amiga side, even at their demise, was still profitable (though
A couple of people have made this same statement but I'm not sure if
that's really true. I'd love for some insider to provide any
information that can back that assumption.
I can accept the fact that their earlier PCs probably cost Commodore
a lot in terms of R&D and manufacturing but, at least in Canada, they
switched over to just repackaging some cheap clones from PCChips
(http://www.pcchips.com.tw). I'm sure this made things fairly
profitable for them.
This is something that even companies like IBM do (or did). For
example, I bought a Pentium 3-667 MHz NetVista Desktop from IBM a few
years ago and to my surprise, pretty much nothing in the system was
from IBM except the DeskStar hard drive. The motherboard was a
Gigabyte (with integrated video/audio), the CD-ROM was from Lite-On
and the disk drive was Alps Electronics. I forget who made the power
supply, but it wasn't IBM. A friend of mine had an older NetVista that
used a video card from Number 9.
Again, Commodore adopted a similar model (at least in Canada) and I'm
sure they made a boatload of money from selling cheap clones at
brand-name prices.
Mike
> Would AOS on an Intel have replaced Windows? Doubtful... BUT, it
I never suggested porting AOS to Intel to replace Windows. In fact,
what I was suggesting was doing things the other way around...porting
the hardware over to run off Intel and Windows.
I was wondering if Commodore could have adapted the Amiga's hardware
technology into an Intel platform and provided the appropriate drivers
for Windows. For example, could they have created Windows drivers for
the "Paula" sound processor? Maybe they could have crushed Creative
Labs' SoundBlaster, which at the time was just horrible. Same thing
with the video processors. Maybe AGA or AAA could have had Windows
drivers and later transitioned to RTG as better 3rd-party video cards
came around.
I guess what I'm asking, ultimately, is how hard or possible would it
have been to take the A4000 and replace the 68040 daughtercard with
one based on the Intel 80486, develop Windows drivers and run NT as
the OS?
Hmm...come to think of it, I think SGI tried something similar with
their 320 and 540 Visual Workstations. Would this type of move have
saved Commodore?
MikeC
> Hi John,
>
> > IIRC it was actually the PC side of the company which was responsible for them
> > going under. The Amiga side, even at their demise, was still profitable (though
>
> A couple of people have made this same statement but I'm not sure if that's
> really true. I'd love for some insider to provide any information that can
> back that assumption.
At the time Commodore started to go down there was lots of stuff stating just
this in the industry press. Some of it was from the journos and some from
interviews with people involved in the Amiga side of the company. Whether they
were telling the truth or not may be another matter but that's what they all
claimed. I would say though, that I'd find it hard to believe that they all were
lying. I've probably got many of the articles lying around in my back collection
of mags but I just haven't got the time or desire to root through them to give
specific details of who said what.
Sorry, but the whole original question is a bit of a non-starter for me anyway
since it makes many assumptions which may well not have happened even had
Commodore survived. Would we be here (or Commodore even have existed) if the
Dinosaurs hadn't died out? - Who cares; they did, we're here, C'est la vie.
>> Would AOS on an Intel have replaced Windows? Doubtful... BUT, it
>I never suggested porting AOS to Intel to replace Windows. In fact,
>what I was suggesting was doing things the other way around...porting
>the hardware over to run off Intel and Windows.
>I was wondering if Commodore could have adapted the Amiga's hardware
>technology into an Intel platform and provided the appropriate drivers
>for Windows. For example, could they have created Windows drivers for
>the "Paula" sound processor. Maybe they could have crushed Creative
>Lab's SoundBlaster, which at the time was just horrible. Same thing
The old SoundBlasters indeed were bad, but even if Paula has a very nice,
crisp sound, it's still an old 8-bit chip. C= had better stuff in the pipe
but, as usual, it got canned..
>with the video processors. Maybe AGA or AAA could have had Windows
>drivers and later transitioned to RTG as better 3rd video cards came
>around.
AGA wasn't even very good when it was new; AAA probably would have been,
but why adapt it to x86? Would you implement a new bus for it too?
(However, AGA was good for some stuff, for example side-scrolling shoot-em-ups,
and displaying photorealistic images for a nice price.. 24-bit graphics cards
for the PC were extremely expensive at this time..)
>I guess what I'm asking, ultimately, is how hard or possible would it
>have been to take the A4000 and replace the 68040 daughtercard with
>one based on the Intel 80486, develop Windows drivers and run NT as
>the OS?
What would you gain with that? You would combine the bad stuff of a PC
with the bad stuff of the Amiga.
>Hmm...come to think of it, I think SGI tried something similar with
>their 320 and 540 Visual Workstations. Would this type of move saved
>Commodore?
..And these machines never earned one cent for SGI, even if they were
impressive for some stuff. The 320 (and some other model) used
UMA just like the O2 (which is a "real" MIPS-based SGI), which could
be used for some specific stuff which would be impossible on other solutions.
The O2 is nearly 10 years old today; still, it can do some stuff that
no modern PC can do, no matter how much money you spend on it.
(btw, the x86-based machines from SGI are not well supported today..)
> AGA wasnt even very good when it was new, AAA probably would have been
> but why adapt it to x86 ? would u implement a new bus for it too ?
I dunno...I'm not an engineer or hardware specialist. I was just
wondering if any of this is "technically" possible and whether or not
this would have saved Commodore by getting the Amiga into more
people's hands.
>What would u gain with that ? you would combine the bad stuff of a PC
>with the bad stuff of the Amiga.
I'm not sure if that's all "bad stuff." I guess it depends on your
perspective. After Commodore died, a lot of people switched over to
Windows/Intel anyway.
> And theese machines never earned one cent for SGI, even if they where
They may have been priced way out of range for most people. Anyone
with SGI experience knows that anything related to SGI comes at a
heavy premium, whereas PC owners have the luxury of buying cheap
components to get something functional going. I'm sure there was
pressure from companies like Intergraph.
SGI doesn't seem to be a company that does well when facing direct
competition, and they moved themselves into a position where direct
competition doesn't really exist to the same degree as it does in most
"PC" markets (e.g. home/business). I'm sure SGI will eventually fail
in the desktop/workstation market as PCs and Macs catch up.
> The O2 is nearly 10 years old today, still it can do some stuff that
> no modern PC can do today no matter how much money you spend on it.
I used to own an O2 (225 or 250 MHz) a few years back, which I got for
really cheap. Even though it was loaded up with RAM and had a large
HD, the whole system seemed sluggish. It seemed to be on par with my
old Pentium Pro 200, and that's not saying much. I eventually traded it
for a P3-800 system with no regrets.
MikeC
>Hi Glenn,
>> AGA wasnt even very good when it was new, AAA probably would have been
>> but why adapt it to x86 ? would u implement a new bus for it too ?
>I dunno...I'm not an engineering or hardware specialist. I was just
>wondering if any of this is "technically" possible and whether or not
>this would have saved Commodore by getting the Amiga into more
>people's hands.
Everything is possible with enough money :)
>>What would u gain with that ? you would combine the bad stuff of a PC
>>with the bad stuff of the Amiga.
>I'm not sure if that's all "bad stuff." I guess it depends on your
>perspective. After Commodore died, a lot of people switched over to
>Windows/Intel anyway.
Actually, many of the stores that sold Amiga stuff had their best times
right after C= died.. so it took some years.
>> And theese machines never earned one cent for SGI, even if they where
>They may have been priced way out of range for most people. Anyone
They were aimed at the professional market, and priced accordingly;
however, the big cost was the development.
>with SGI experience knows that anything related to SGI comes at a
>heavy premium where as PC owners have the luxury of buy cheap
>components to get something functional going. I'm sure there was
>pressure from companies like Intergraph.
Anyway, the whole x86 trip was about as useful as dumping all the money
into a big hole or something..
>SGI doesn't seem to be a company that does well when facing direct
>competition and they moved themselves into a position where direct
>competition doesn't really exist to the same degree as it does in most
>"PC" markets (e.g. home/business). I'm sure SGI will eventually fail
>in the desktop/workstation market as PC's and Mac's catch up.
They've already done that, unfortunately. SGI has the same problem
C= had: they don't focus on the stuff that their customers want,
and put the R&D money into stuff that no one buys.
>> The O2 is nearly 10 years old today, still it can do some stuff that
>> no modern PC can do today no matter how much money you spend on it.
>I use to own a O2 (225 or 250 MHz) a few years back, which I got for
>really cheap. Even though it was loaded up with RAM and had a large
>HD, the whole system seemed sluggish. It seemed to be on par with my
>old Pentium Pro 200 and that's not saying much. I eventually traded it
>for a P3-800 system with no regrets.
The slower CPUs are pretty slow in the O2, especially if they lack
a secondary cache.
However, it can do some stuff that is really impressive and, as I
said, no PC can do even today, not even with extremely expensive cards.
(Not counting the SGI 320 as a PC here..)
After all, Commodore largely stumbled and bumbled into its successes. That
the PET premiered in a WOOD case maybe should have been a clue.. it
succeeded because.. well, what else WAS there? The 16-color VIC-20
stunned everyone and made Jack Tramiel look like a genius. It paved the way
for the C=64, but it also should have taught Commodore a lesson, or at least
issued a warning, about software compatibility. But nobody at Commodore
was listening. Commodore to its dying breath.. new hardware.. new
software.
And oh, such hardware. Remember the Plus4/C16 fiasco? (Tramiel's swan song,
in case you don't remember.) Or the *new* C=64c? Most all of Commodore's
hardware development was sideways.. 8-bit to 8-bit to 8-bit.. rather than
forward-looking.
(Here's some toughies.. remember the Commodore 264? 364? B-128? :-)
The 64 was never designed for a future. Two, three years tops on the
shelves.. enough time to develop another (incompatible) Commodore appliance.
That was the Commodore approach.
Nobody was more surprised than Commodore itself by the ultimate range and
durability of the C-64. At every instance we see Commodore blinking at its
own successes.. The Commodore 1600 and 1650 modems *built* CompuServe. But
unlike Steve Case (QuantumLink.. and a little thang called AOL), Commodore
didn't have a clue to what on-line computing would eventually become.
Hard drives and mass storage? Why would C=64 users NEED a standardized hard
drive solution?? DU-UH!! If a user needs a hard drive, "they should
consider getting a Commodore PC".. literally words from the mouth of a
Commodore rep.
Commodore made NO effort worth mentioning to develop software.
GEOS/Berkeley Softworks and The Mouse added five or more years to its
lifespan.. Reviewers raved: 90% of the Mac's productivity at 30% of the cost.
Commodore had no idea somebody could make their little toy computer do all
that stuff. 2MHz 80-column GEOS128 with a 512K RAM expansion was actually
sweet.
Commodore's success with the 64 was handed to them by software developers --
games, applications.. by geeks and gurus who were hungry for cheap computers
to play with. After all, this was the VERY beginning of personal computers.
Even with tens of millions of operating 64s out there (not to mention Apples
and Ataris.. and PCs), there were still MANY supposedly intelligent people
scratching their heads as to WHY most 'normal' people would ever WANT a
computer.. What the heck would they DO with it?
It was also the beginning of Commodore's doom, because now you have two
things happening.. new, better hardware being developed all over as
companies try to cash in on a proven developing personal/home computer
market -- proven by Commodore's C=64 success. And you have computer geeks
and gurus by the thousands lusting after the NEXT BEST THING to play with
(ie, to develop for).. (aside, of course, from Microsoft and their
machinations).
C=128 development paid for itself but it was too little too late and in the
wrong direction: yet another closed box appliance with no evolutionary
future.
And as already noted, their Intel-based PC stuff in terms of features and
price was similarly a day late and a dollar too much.. and there was nothing
Commodore brand loyalty could ever do to fix that. Again, can you say
"Compatibility?"
The ONLY Next Best Thing Commodore had to offer WAS Amiga, and they had to
go out and BUY that (mostly in order to spite Jack Tramiel, only a little
because the technology was so fantastic at the time.)
Only partly in hindsight, I believe the Amiga was doomed as well because the
philosophy and technology in the chip design left it inherently behind the
development curve.. Elegance comes with a cost.. while simplicity yields
cheap and easy. In a marketplace driven by most and the NEWEST features at
the lowest cost (and here and there, Microsoft compatibility).. cheap and
easy wins every time.
At its heart, the Amiga chipset was an 8/16-bit design in a world where
16/32 and maybe even 32/64-bit graphics chips were already being
brainstormed.. Like the 64-bit 68xxx design, a 32/64-bit AGA/AAA system would
have been a monstrously difficult engineering problem.. can you say HEAT?
And can you say COST? Can you say TIME? (as in, not enough?)
In hindsight, the Amiga hardware design was from its inception a dead end
biding its time.. and the insistence of TOO many Amiga developers and
pundits that "Amiga" ABSOLUTELY **MUST** mean "hardware" unfortunately
killed the Amiga OS, which was/is a TRUE *GEM* in its own right regardless
of hardware -- and very much worthy of a better fate.
In my opinion, if there were two things about the Amiga that genuinely
scared Bill Gates and Apple -- because they so very much impressed PC and
Apple software developers -- they were NTSC graphic compatibility and the
Amiga Operating System's marvelous preemptive multitasking.
Other (better) NTSC graphic solutions emerged quickly, and of course *now*
-- wasted years later -- Windows and Mac OSes feature multitasking.
d.
> They weren't cheap enough and didn't have any cutting edge features..
> Commodore PCs were turtles in the land of hares.. and the hares
Yeah, but was this because of Commodore's half-hearted approach to the PC
line? They seemed to be split right down the middle between the Amiga
and the PCs.
I guess it really wasn't that different from Commodore's thinking in the
80's, where they had the home/consumer line with the C-64/C-128 and the
business market with the CBM and B-series. They seemed to continue with
the same thinking, except the Amiga replaced the C-64/C-128 and the
CBM/B-series were replaced by the PC line.
> And oh, such hardware. Remember the Plus4/C16 fiasco? (Tramiel's swan song
> in case you don't remember) Of the *new* C=64c? Most all of Commodore's
I completely agree...the TED-based computers hurt Commodore a lot more
than people realized. However, the Amiga was a bit of a clean
break for them...or should have been. That's how I looked at things.
The Amiga allowed (or should have allowed) Commodore to start anew and
ditch all these pet projects.
> 8-bit to 8-bit to 8-bit.. rather than forward-looking.
Right, and it's this thinking, applied to the Amiga, that screwed
Commodore over near the end. Some great examples of this are: no AA in
the A3000, the release of the A600, and putting a 68020 in a fairly
crippled A1200 instead of an 030.
Imagine how things would have been different if the A3000 had AA.
> Hard drives and mass storage? Why would C=64 users NEED a standardized hard
> drive solution?? DU-UH!! If a user needs a hard drive, "they should
Hard drives were pretty expensive back in the 80's. Looking at 3rd
party units like the Xetec Lt. Kernal, I can see why Commodore wasn't
interested in pushing hard drives to C-64 owners. I believe the
1541/71's provided a reasonable amount of storage per disk at the
time. Anyone who needed extra storage seemed to buy SFD-1001's and
later, 1581's.
I didn't own a hard drive until I got sick of swapping disks on my
dual-drive A2000. Even then, I bought 3rd party (ironically, a Xetec
controller) instead of Commodore's 2090a.
I think Commodore should have just built computers and let 3rd parties
take care of the rest. That seemed to be the most effective approach for
them. Look at all the great stuff GVP produced for the Amiga line. I think
Commodore wasted a lot of money developing add-ons that were just
inferior.
Remember the difference between the 2090 and 2090a? Autobooting! Why
the heck develop a HD controller that didn't autoboot? What's the
point?
> Commodore made NO effort worth mentioning to develop software.
Again, I'm not sure this is a big deal since 3rd parties always seem
to do a better job. Besides, I was a registered CATS developer and
they were pretty good at supporting those interested in developing
Amiga software.
> 2MHz 80-column GEOS128 with a 512K RAM expansion was actually sweet.
I owned a C-128/1571 but I didn't really do anything that took
advantage of it. I ran it in C-64 mode 80% of the time. Granted, it
was a great poor man's Mac when running GEOS especially when you added
an REU.
> Commodore's success with the 64 was handed to them by software developers --
> games, applications.. by geeks and gurus who were hungry for cheap computers
Sure...but you have to admit that from a technical perspective, the
C-64 was superior to just about every other 8-bit. It's this
technology that allowed developers to create some great games and
software. For example, take any game that was developed on the C-64,
Apple and Atari and I think you'd agree that the C-64 version was the
best. Compare GEOS on the C-64 with the Apple II version and it's no
contest.
> C=128 development paid for itself but it was too little too late and in the
> wrong direction: yet another closed box appliance with no evolutionary
Most computers at the time were "closed," including the original Macs.
At least Commodore included a user port and expansion port that opened
up some serious options. Besides, this thinking was corrected by the
A2000 release.
> Only partly in hindsight, I believe the Amiga was doomed as well because the
> philosophy and technology in the chip design left it inherently behind the
The Intel/Windows combination doomed a lot of companies. Atari didn't
fare any better with the ST line. If the Macs hadn't had Adobe and
Quark, Apple would have died too.
The only other thing I might add is the whole A3000UX experience. From
what I recall, it was a killer Unix platform and there was some
interest in Sun carrying it. For various reasons, Commodore wasn't
able to capitalize on the A3000UX and I think this is what eventually
doomed Commodore: they failed to capitalize on their products and
wasted a lot of money trying to do so.
CD32 is another good example of this. It was clearly superior to
everything else on the market at the time and for a number of years
after. Such is the story of Commodore.
MikeC
[snip]
> release of the A600 and putting a 68020 in a fairly crippled A1200 instead
> of an 030.
Even some Fast RAM as standard would have doubled the speed of the A1200. As
for the A600, I think most of those who bought one at the time were rather
"miffed" at Commodore when they released the A1200 shortly thereafter.
> I didn't own a hard drive until I got sick of swapping disks on my
> dual-drive A2000. Even then, I bought 3rd party (ironically, a Xetec
> controller) instead of Commodore's 2090a.
>
> I think Commodore should have just build computers and let 3rd parties take
> care of the rest. That seemed to be the most effective for them. Look at all
> the great stuff GVP produced for the Amiga line. I think Commodore wasted a
> lot of money developing add-on's that were just inferior.
Yeah, my first HD was an HD8 for my A500. Nice bit of kit, but dearie me it was
expensive, and GVP had that thing about using custom, more expensive, memory
SIMMs.
> Even some Fast RAM as standard would have doubled the speed of the A1200. As
> for the A600,
Yes...I (almost) forgot about the lack of fast ram in the stock A1200!
I also remember something about its strange keyboard layout and a
blank key. I remember thinking to myself, "did Commodore rush this out
the door or what?"
> I think most of those who bought one at the time were rather
> "miffed" at Commodore when they released the A1200 shortly thereafter.
I'm sure developers weren't thrilled with this as they were gearing up
for AGA mode. I guess those trying to clear out their older ECS
software were delighted with the A600.
MikeC
However, after such praise, I really can't help but make this teeny
little observation, and I think I'll let Bob The Angry Flower do the
talking for me here:
http://www.ece.ucdavis.edu/~gethigh/etc_comics/itsits.gif
:)
>Hi John,
>> Even some Fast RAM as standard would have doubled the speed of the A1200.
>> As for the A600,
>Yes...I (almost) forgot about the lack of fast ram in the stock A1200!
>I also remember something about its strange keyboard layout and a
>blank key. I remember thinking to myself, "did Commodore rush this out
>the door or what?"
Well, in most European countries the A1200 had *EXACTLY* the same layout as
the A500/2000/3000 keyboards, with no blank keys.
The point is that many countries, including Sweden, have more keys than the US,
and C= decided to use only one keyboard for all countries, and just put
blank keys for countries that didn't have any use for them.
>> + On 04-May-04 00:56:51
> +mikec <mike...@hotmail.com> wrote
>
>>Yes...I (almost) forgot about the lack of fast ram in the stock
>>A1200! I also remember something about its strange keyboard layout
>>and a blank key. I remember thinking to myself, "did Commodore rush
>>this out the door or what?"
>
> Well, in most european countries the A1200 had *EXACTLY* the same
> layout as the A500/2000/3000 keyboards, with no blank keys.
>
> The point is that many countries, including Sweden, have more keys
> than US, and C= decided to use only one keyboard for all countries,
> and just put blank keys for countries that didnt have any use for
> them.
In the UK, that meant *two* blank keys! They're quite handy, actually: I
have one set up as a hotkey for toggling the audio filter. :-)
--
Duncan Snowden.
> Hi John,
>
> > Even some Fast RAM as standard would have doubled the speed of the A1200. As
> > for the A600,
>
> Yes...I (almost) forgot about the lack of fast ram in the stock A1200! I
> also remember something about its strange keyboard layout and a blank key. I
> remember thinking to myself, "did Commodore rush this out the door or what?"
AFAIK it's the same as the A500 -- didn't it have blank keys too? (IIRC these blank
keys were used for other keymaps and replaced for those countries.) The A4K
keyboard has two blank keys too.
> > I think most of those who bought one at the time were rather
> > "miffed" at Commodore when they released the A1200 shortly thereafter.
> I'm sure developers weren't thrilled with this as they were gearing up for
> AGA mode. I guess those trying to clear out their older ECS software were
> delighted with the A600.
Well, the A600 was pretty useless for those games or anything else which utilised
the numeric keypad. I remember, though, that the major criticism was that those
who bought one thought they had been sold a redundant machine just so Commodore
could recoup some money. Many of the people who complained in the mags of the
time stated that they wouldn't be buying Commodore again.
More fundamentally, they probably should have spent a good deal more on
Amiga development, so that AA would actually have been ready in time
for the A3000. As things turned out, AGA was too late to be
competitive and AA's enhanced sound never saw the light of day.
> the release of the A6OO and putting a 68020 in a fairly crippled
> A1200 instead of an 030.
What does a 68030 have to offer over a 68020, other than an MMU which
the OS wouldn't use?
<snip>
> I think Commodore should have just build computers and let 3rd parties
> take care of the rest. That seemed to be the most effective for them.
> Look at all the great stuff GVP produced for the Amiga line. I think
> Commodore wasted a lot of money developing add-on's that were just
> inferior.
Yes, development must have been spread quite thinly.
> Remember the difference between the 2090 and 2090a? Autobooting! Why
> the heck develop a HD controller that didn't autoboot? What's the
> point?
No-one could make a bootable hard drive without the hooks that were
added in Kickstart 1.3. The A2090 was one of several that required a
floppy boot disk.
<snip>
> CD32 is another good example of this. It was clearly superior to
> everything else on the market at the time and for a number of years
> after. Such is the story of Commodore.
CD32 was doomed from the word go because games publishers were already
deserting the Amiga, perhaps due to the disappointingly small advance
that the A1200 represented. The CD32 hardware was superior in some
respects but a console lives and dies by its software support.
--
Ben Hutchings
Any smoothly functioning technology is indistinguishable from a rigged demo.
> mikec wrote:
> > Hi Don,
> <snip>
> What does a 68030 have to offer over a 68020, other than an MMU which the OS
> wouldn't use?
Only full 030's have an MMU, many of the third party accelerators used the
EC030. One of the main benefits of an MMU would have been for developers and
programmers. As for benefits of an 030 it has to be speed; even a 40MHz EC030 is
considerably faster than an 020.
> As for benefits of an 030 it has to be speed; even a 40MHz EC030 is
> considerably faster than an 020.
MHz for MHz, an 030's about double the speed of an 020.
--
Duncan Snowden, back after a modem/thunderstorm crisis.
Uh, really? Surely that depends heavily on the memory system. The
only speed boosts the 68030 has are a *small* data cache and burst-
filling of both caches.
>On Sunday, John Burns wrote:
>> As for benefits of an 030 it has to be speed; even a 40MHz EC030 is
>> considerably faster than an 020.
>MHz for MHz, an 030's about double the speed of an 020.
Sorry, they are nearly exactly the same speed, however the
step 68030->68040 is huge.
However the 68030 exists in much faster clockrates (50MHz) than the 68020.
Yeah, it's a fair cop. I can admit when I'm wrong. :-) I was going by
my experience with A1200s, forgetting of course that 030 cards always
have fast RAM on-board - which would account for most of the speed boost
in itself.
--
Duncan Snowden.
Well, they don't come with onboard RAM as default. I too was making this
"leap of faith" in my assessment of the 030's performance - there isn't
much point in putting an 030 board into your 1200 unless it has some RAM
fitted.
> Only partly in hindsight, I believe the Amiga was doomed as well because the
> philosophy and technology in the chip design left it inherently behind the
> development curve.. Elegance comes with a cost.. while simplicity yields
> cheap and easy. In a marketplace driven by most and the NEWEST features at
> the lowest cost (and here and there, Microsoft compatibility).. cheap and
> easy wins every time.
But where was the cheap off-the-shelf hardware capable of delivering
Amiga like performance in 1985? Would you have been happy running a
version of Deluxe Paint on a 286 (a development of an 8-bit chip) with
EGA and MS-DOS's 640k upper memory limit? Would you describe such a
system as an inherently forward looking design or ahead of the
development curve? I'd say if that mess could evolve, the Amiga could
EASILY have held its edge until 1990 without major investment and just
a little wit from Commodore.
- First, the basic A1000 hardware should have been given a boost with
extra RAM and a hard drive as soon as the cost of this non-proprietary
hardware fell. (The A500/A1000 board could address 9 megs of RAM
without any hacks. Try that with a PC of 1985 vintage.) Similarly,
Commodore did not have to bear the cost of CPU development and they
could have produced 020 and 030 variants much, much sooner. With
extra CPU power/ram/hard drive and a non-static base machine,
programmers would have been far less tempted to squeeze performance
from crude hardware level hacks.
- The custom graphic chips are often described as an evolutionary dead
end. In fact, they had considerable room for improvement while
retaining a very high level of backward compatibility, as AA showed.
Jay Miner's team had already finished the ECS upgrade in early 1987.
Commodore's clever leaders then sacked the original Amiga team and
managed to lose the ECS blue prints. ECS saw the light of day in 1990
only after prototype chips were reverse engineered. Had Miner et al
been retained with a modest R&D budget (which was instead pissed away
on second rate PC clones) I don't think it requires a huge imaginative
leap to envision AA in 1988/9. Denise is the only thing that changes.
This would have been more than a year before VGA cards became
commonplace and, with a flicker-fixer, the Amiga would have been
comfortably ahead. (Not so by 1992 of course.) Paula was good enough
to hold its own until about 1991.
With something like Dave Haynie's A3000+ in 1989 and some committed
marketing, I think Commodore would have found enough breathing room to
transition to PowerPC/retargetable graphic cards, just as Apple did.
The fact that they lasted until 1994 making the choices they did shows
how brilliant the 1985 design was, and I think you do it a great
disservice.
Probably not. There's no need to denigrate the 286 as "a development
of an 8-bit chip" though; like the 68000, it has some features to ease
updating 8-bit system designs.
> with EGA and MS-DOS's 640k upper memory limit?
The latter is a software limitation, of course. There was the option
of using Xenix, though that was probably quite expensive.
<snip>
> - First, the basic A1000 hardware should have been given a boost with
> extra RAM and a hard drive as soon as the cost of this non-proprietary
> hardware fell. (The A500/A1000 board could address 9 megs of RAM
> without any hacks. Try that with a PC of 1985 vintage.)
Both the 68000 and the 286 have a 16 MB physical address space. You
couldn't fit 9 MB of RAM onto a PC motherboard of that era, but
neither could you on an A500/A1000. In both cases you would need an
add-on memory board (yes, these did exist for the PC). But the 286
would be a lot more expensive!
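As a quick check on that 16 MB figure (a sketch; the shared detail is that both the 68000 and the 286 drive 24 external address lines):

```python
# 24 address lines -> 2^24 addressable bytes on both the 68000 and the 286.
address_lines = 24
print(2**address_lines // (1024 * 1024))  # 16 (MB of physical address space)
```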
<snip>
> - The custom graphic chips are often described as an evolutionary dead
> end. In fact, they had considerable room for improvement while
> retaining a very high level of backward compatibility, as AA showed.
I'd believe that if it had included a compatible successor to Paula.
I know the A3000+ had a DSP but I don't believe it was backward-
compatible. (Possibly that wouldn't matter; maybe Paula and the
DSP could coexist.)
> Jay Miner's team had already finished the ECS upgrade in early 1987.
> Commodore's clever leaders then sacked the original Amiga team and
> managed to lose the ECS blue prints. ECS saw the light of day in 1990
> only after prototype chips were reverse engineered.
I've heard many myths about lost designs but not a lot of detail.
What/who's your source?
<snip>
> This would have been more than a year before VGA cards became
> commonplace and, with a flicker-fixer, the Amiga would have been
> comfortably ahead.
The very existence of flicker-fixers demonstrates the large downside
to specifying DMA and video timings so precisely.
> (Not so by 1992 of course.) Paula was good enough to hold its own
> until about 1991.
That's not how I see it. HD floppies were introduced in 1987 iirc.
I don't remember quite when 16-bit sound became common.
> With something like Dave Haynie's A3000+ in 1989 and some committed
> marketing, I think Commodore would have found enough breathing room to
> transition to PowerPC/retargetable graphic cards, just as Apple did.
<snip>
Possibly, yes. However, I believe Apple didn't document their
hardware so precisely, and changed it more often, preventing software
developers from depending on specific hardware capabilities and giving
them a lot more room to manoeuvre.
--
Ben Hutchings
Anthony's Law of Force: Don't force it, get a larger hammer.
> > But where was the cheap off-the-shelf hardware capable of delivering
> > Amiga like performance in 1985? Would you have been happy running a
> > version of Deluxe Paint on a 286 (a development of an 8-bit chip)
>
> Probably not. There's no need to denigrate the 286 as "a development
> of an 8-bit chip" though; like the 68000, it has some features to ease
> updating 8-bit system designs.
The 68000 represents a clear break from Motorola's 8 bit chips. It
doesn't run 6809 code, whereas the 286 is backwardly compatible with
the 8088. The 68000 is a MUCH cleaner design - fully 32 bit
internally.
>
> I'd believe that if it had included a compatible successor to Paula.
> I know the A3000+ had a DSP but I don't believe it was backward-
> compatible. (Possibly that wouldn't matter; maybe Paula and the
> DSP could coexist.)
They co-existed on the A3000+.
>
> > Jay Miner's team had already finished the ECS upgrade in early 1987.
> > Commodore's clever leaders then sacked the original Amiga team and
> > managed to lose the ECS blue prints. ECS saw the light of day in 1990
> > only after prototype chips were reverse engineered.
>
> I've heard many myths about lost designs but not a lot of detail.
> What/who's your source?
1988 Jay Miner interview originally published in Amiga Computing.
http://amiga.emugaming.com/jayinterview2.jpg
See the last column. Miner is describing the ECS with the Fatter
Agnus and SuperDenise, which the remnants of the Amiga team had ready
by early 1987. Maybe Commodore didn't really lose the blueprints
(can't remember where I read that now), but it's as good an
explanation as any for why the chips sat on the shelf for THREE YEARS.
They were pin-compatible with the originals, so it's not as if a new
motherboard design was needed. The rest of the interview is pretty
interesting too.
>
> <snip>
> > This would have been more than a year before VGA cards became
> > commonplace and, with a flicker-fixer, the Amiga would have been
> > comfortably ahead.
>
> The very existence of flicker-fixers demonstrates the large downside
> to specifying DMA and video timings so precisely.
Any video system depends on precise timings. The problem was that the
Amiga was expressly designed to be compatible with a TV signal (15 kHz
horizontal scan rate). This was a unique strength and the sole reason
why the Video Toaster and other DTV systems were built around an Amiga
rather than an x86 box. The downside was that you couldn't get more
than 256 or so horizontal lines without interlace. ECS and AA had
VGA-compatible 30 kHz output modes, but unfortunately there was no
hardware override to force 30 kHz output if a program was instructing
the system to generate a 15 kHz mode. (Apparently there wasn't enough
space left in ROM for the driver.) Hence the need for a flicker
fixer. A kludge, I agree, but a very workable stop-gap solution. No
other system was as flexible in 1990, that's for sure.
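A back-of-envelope sketch of that line limit (the NTSC figures are my assumed nominal values, not from the thread):

```python
# Assumed nominal NTSC timings: ~15.734 kHz horizontal scan, ~59.94 Hz fields.
h_scan_hz = 15734.0
field_hz = 59.94

lines_per_field = h_scan_hz / field_hz  # total scan lines per field
print(round(lines_per_field))           # ~262; after vertical blanking,
                                        # roughly 200-256 lines remain usable
```

Doubling the scan rate to ~31 kHz, as VGA did, doubles the per-field line budget, which is what the 30 kHz modes and flicker fixers bought.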
>
> > (Not so by 1992 of course.) Paula was good enough to hold its own
> > until about 1991.
>
> That's not how I see it. HD floppies were introduced in 1987 iirc.
> I don't remember quite when 16-bit sound became common.
I was referring to sound only. It's true that Paula had difficulty
with HD floppies, but the difference between 880K and 1.4 megs was not
the end of the world. Serious users wouldn't have been using floppies
except for back-up. (Not in my parallel universe where Commodore
shipped hard disks as standard on mid-range machines anyway.) The
A3000 could write HD disks by spinning the drive at half speed.
The 8088 is a 16-bit chip, not 8-bit. Yes, it had an 8-bit data bus, but
so did the 68008.
> The 68000 is a MUCH cleaner design - fully 32 bit internally.
Not quite - the architecture is basically 32-bit but the 68000 had
16-bit ALUs which showed in some places, e.g. multiplication. From
a software point of view it can generally be considered 32-bit.
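To illustrate the multiplication point (a Python sketch, not 68000 code): the 68000's MULU instruction only multiplies 16-bit operands, so a full 32x32-bit product has to be assembled from 16-bit partial products, roughly as below.

```python
def mul32(a, b):
    """32x32 -> 64-bit multiply built from 16x16 -> 32-bit pieces,
    the widest multiply the 68000's 16-bit ALU supports directly."""
    a_lo, a_hi = a & 0xFFFF, a >> 16
    b_lo, b_hi = b & 0xFFFF, b >> 16
    # Four partial products, shifted into position and summed.
    return ((a_hi * b_hi) << 32) + ((a_hi * b_lo) << 16) \
         + ((a_lo * b_hi) << 16) + (a_lo * b_lo)

assert mul32(123456789, 987654321) == 123456789 * 987654321
```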
<snip>
>> > Jay Miner's team had already finished the ECS upgrade in early 1987.
>> > Commodore's clever leaders then sacked the original Amiga team and
>> > managed to lose the ECS blue prints. ECS saw the light of day in 1990
>> > only after prototype chips were reverse engineered.
>>
>> I've heard many myths about lost designs but not a lot of detail.
>> What/who's your source?
>
> 1988 Jay Miner interview originally published in Amiga Computing.
>
> http://amiga.emugaming.com/jayinterview2.jpg
>
> See the last column. Miner is describing the ECS with the Fatter
> Agnus and SuperDenise, which the remnants of the Amiga team had ready
> by early 1987.
No, he's describing something rather different. ECS as we know it
uses normal DRAM, not video RAM, and it doesn't support a 1024-pixel-
wide display, at least not along with the original display modes as
Jay says these chips did.
> Maybe Commodore didn't really lose the blueprints
> (can't remember where I read that now), but it's as good an
> explanation as any for why the chips sat on the shelf for THREE YEARS.
> They were pin-compatible with the originals, so it's not as if a new
> motherboard design was needed.
The chips he described would not be pin-compatible with the
originals because of the different memory buses.
> The rest of the interview is pretty interesting too.
Yes, it fills in some bits of Amiga history that I was a bit hazy on.
>> <snip>
>> > This would have been more than a year before VGA cards became
>> > commonplace and, with a flicker-fixer, the Amiga would have been
>> > comfortably ahead.
>>
>> The very existence of flicker-fixers demonstrates the large downside
>> to specifying DMA and video timings so precisely.
>
> Any video system depends on precise timings.
The implementation needs to have precise timings. They can be subject
to change, though.
<snip>
> ECS and AA had VGA-compatible 30 kHz output modes, but unfortunately
> there was no hardware override to force 30 kHz output if a program
> was instructing the system to generate a 15 kHz mode. (Apparently
> there wasn't enough space left in ROM for the driver.)
It's not something that could have been fixed with a driver.
The Amiga hardware reference manuals documented the internal timings
which should have been subject to change. This was a strength in that
it allowed precise control of the display using the copper in
conjunction with other parts of the display generator, but also a
weakness in that it left no room to change. There were a lot of Amiga
programs (particularly, but not only, games) which banged on the
hardware and would have broken if the timings changed. Even
system-friendly apps were guaranteed quite precise control over the
display through Views and ViewPorts.
> Hence the need for a flicker fixer. A kludge, I agree, but a very
> workable stop-gap solution.
So why does my 1993 A1200 also need a flicker fixer? (Which cost a
significant amount of money and reduced the output colour-depth from
24-bit to 16-bit, annoyingly.)
> No other system was as flexible in 1990, that's for sure.
I'm not sure that's true, but I won't argue it.
>> > (Not so by 1992 of course.) Paula was good enough to hold its own
>> > until about 1991.
>>
>> That's not how I see it. HD floppies were introduced in 1987 iirc.
>> I don't remember quite when 16-bit sound became common.
>
> I was referring to sound only. It's true that Paula had difficulty
> with HD floppies, but the difference between 880K and 1.4 megs was not
> the end of the world.
Except when a PC-using customer (for example) gave you files on an HD
floppy.
> Serious users wouldn't have been using floppies except for back-up.
I take it you only exchanged files with people using the same BBS as
you, then? ;-)
> (Not in my parallel universe where Commodore shipped hard disks as
> standard on mid-range machines anyway.) The A3000 could write HD
> disks by spinning the drive at half speed.
That required a modified disk drive that was evidently too expensive
to include in low-end machines like the 1200. Another stop-gap.
--
Ben Hutchings
The most exhausting thing in life is being insincere. - Anne Morrow Lindberg
[snip]
> > See the last column. Miner is describing the ECS with the Fatter
> > Agnus and SuperDenise, which the remnants of the Amiga team had ready
> > by early 1987.
>
> No, he's describing something rather different. ECS as we know it
> uses normal DRAM, not video RAM, and it doesn't support a 1024-pixel-
> wide display, at least not along with the original display modes as
> Jay says these chips did.
>
No, I think he's talking about ECS. A display address range increase
to 2 megabytes refers to the Fatter Agnus. Video ram in this context
means chip ram. ECS's superhires mode is 1000+ pixels and co-exists
with the standard modes. Even if I'm wrong, the important point still
stands - the Amiga team had some substantial enhancements to the
chipset at an early stage and Commodore were dumb not to use them.
> <snip>
> > ECS and AA had VGA-compatible 30 kHz output modes, but unfortunately
> > there was no hardware override to force 30 kHz output if a program
> > was instructing the system to generate a 15 kHz mode. (Apparently
> > there wasn't enough space left in ROM for the driver.)
>
> It's not something that could have been fixed with a driver.
The driver wasn't all that was needed, no, but according to Dave
Haynie, it was the sticking point. See under point 2 below:
http://amiga.emugaming.com/haynie2.txt
>
> The Amiga hardware reference manuals documented the internal timings
> which should have been subject to change. This was a strength in that
> it allowed precise control of the display using the copper in
> conjunction with other parts of the display generator, but also a
> weakness in that it left no room to change. There were a lot of Amiga
> programs (particularly, but not only, games) which banged on the
> hardware and would have broken if the timings changed. Even
> system-friendly apps were guaranteed quite precise control over the
> display through Views and ViewPorts.
Views and ViewPorts are created with high level library calls and
should have been eminently patchable with OS upgrades. If you played
by the RKM rules, it worked pretty well in practice as I recall. Not
that everyone did stick to the RKMs of course...
>
> > Hence the need for a flicker fixer. A kludge, I agree, but a very
> > workable stop-gap solution.
>
> So why does my 1993 A1200 also need a flicker fixer? (Which cost a
> significant amount of money and reduced the output colour-depth from
> 24-bit to 16-bit, annoyingly.)
The flicker fixer was a stop-gap solution that was acceptable in 1990.
Commodore hadn't moved beyond the stop-gap solution 2 years later.
Ergo the A1200 needed a flicker fixer too. And yes, the stop-gap had
by then ceased to be acceptable.
>
> > No other system was as flexible in 1990, that's for sure.
>
> I'm not sure that's true, but I won't argue it.
Well, if you're not prepared to argue the point...
[snip]
>
> That required a modified disk drive that was evidently too expensive
> to include in low-end machines like the 1200. Another stop-gap.
Again, we're arguing about the merits of stop-gaps at different points
in time. It was all over for Commodore by 1992, but maybe not in
1988-90 when the available quick-fixes were not so ludicrous in the
context of the competition.
No, he described dual-ported video RAM
<http://en.wikipedia.org/wiki/Dual-ported_RAM>. However, all released
Amiga chipsets use ordinary DRAM; that's one of the reasons why chip
memory is slow, especially so in high-colour modes.
> ECS's superhires mode is 1000+ pixels and co-exists with the
> standard modes.
He used the specific figure of 1024 whereas super-hires is 1280 pixels
wide. (Coincidentally or otherwise, 1024 pixels happens to be the
width limit of the original blitter. Kickstart 1.x limited the width
of a screen to 1008 pixels to allow for barrel-shifting and later
versions have the same limit when used with an old Agnus.)
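The arithmetic behind those limits (my reading of the figures above, so treat it as a sketch): the blitter works in 16-bit words, and reserving one word for the barrel shifter leaves 1008 pixels.

```python
blit_limit_px = 1024
words = blit_limit_px // 16    # 64 sixteen-pixel words
usable_px = (words - 1) * 16   # one word reserved for barrel-shifting
print(usable_px)               # 1008, the Kickstart 1.x screen-width limit
```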
> Even if I'm wrong, the important point still stands - the Amiga team
> had some substantial enhancements to the chipset at an early stage
> and Commodore were dumb not to use them.
Yes. I just thought it was interesting that ECS does *not* seem
to include all of the improvements of that chipset.
<snip>
>> The Amiga hardware reference manuals documented the internal timings
>> which should have been subject to change. This was a strength in that
>> it allowed precise control of the display using the copper in
>> conjunction with other parts of the display generator, but also a
>> weakness in that it left no room to change. There were a lot of Amiga
>> programs (particularly, but not only, games) which banged on the
>> hardware and would have broken if the timings changed. Even
>> system-friendly apps were guaranteed quite precise control over the
>> display through Views and ViewPorts.
>
> Views and ViewPorts are created with high level library calls and
> should have been eminently patchable with OS upgrades. If you played
> by the RKM rules, it worked pretty well in practice as I recall. Not
> that everyone did stick to the RKMs of course...
<snip>
If I remember correctly, ViewPorts can have custom copper-lists. This
was somewhat useful but of course incompatible with RTG and even with
a change of display mode (though some things could perhaps be
converted).
>Don Romero <don.r...@verizon.net> wrote in message news:<BCB950B7.1E263%don.r...@verizon.net>...
>> Only partly in hindsight, I believe the Amiga was doomed as well because the
>> philosophy and technology in the chip design left it inherently behind the
>> development curve.. Elegance comes with a cost.. while simplicity yields
>> cheap and easy. In a marketplace driven by most and the NEWEST features at
>> the lowest cost (and here and there, Microsoft compatibility).. cheap and
>> easy wins every time.
>But where was the cheap off-the-shelf hardware capable of delivering
>Amiga like performance in 1985?
It didn't exist.
And the claim doesn't have any place in today's market, anyway. Look
out there.. see any simplicity? I see Pentium 4s with 125 million
transistors, nVidia and ATi chips with 70-something pipeline stages
and a similar number of chips. We're not just talkin' "more complex
than the Amiga", we're talking "more complex by many orders of
magnitude" than the Amiga.
The problems were elsewhere, and with 20/20 hindsight and an actual
involvement in this stuff, I can tell you precisely where the problems
were. On the chip front, it wasn't the complexity of the chips, not at
all. It was the simple fact that Commodore management didn't spend the
necessary R&D money to maintain development of the custom chips in any
useful way. Look at modern tech companies, like nVidia, ATi, Intel, or
AMD. See how much money they plow back into R&D. Had Commodore done
this at similar levels, there would have been no problem delivering new,
interesting chipsets.
Second problem was software: the details of the custom chips should
never have been revealed to programmers. This would have removed any
need to maintain binary compatibility. You don't worry about this
today, of course, because one ATi chip may or may not be anything like
the next, but no one cares, your drivers account for any differences.
That is the right way to do it, and it was well established as the
right way to do it before the Amiga ever shipped. That's the one area
of the whole OS that should have been done differently, before 1.0
shipped, IMHO.
>- First, the basic A1000 hardware should have been given a boost with
>extra RAM and a hard drive as soon as the cost of this non-proprietary
>hardware fell.
The big problem here: with about 150,000 or so Amiga 1000s in the
world, only some of those owners buying memory upgrades, and five or
six companies offering them, there was no way the prices could have
fallen. Not only that, but memory was wicked expensive in the
mid-to-late 80s. DRAM was in a worldwide supply crunch, and the prices
reflected it. While I don't disagree with the "it would have been
nice" factor, it also wasn't even remotely possible.
>- The custom graphic chips are often described as an evolutionary dead
>end. In fact, they had considerable room for improvement while
>retaining a very high level of backward compatibility, as AA showed.
Well, AA's compatibility was good, sure, but it was very much
evolutionary, and that showed. AA would have been great in 1988, ok in
1990, but in 1992, it was an also-ran, capabilities-wise. That goes
back to budget.. C= only had so much money to spend on improvements
(especially considering the huge salaries the top management got --
keep in mind, at this time, none of it was performance-keyed, and
folks like Gould and, eventually, Ali were making substantially more
than the top dogs at Apple, IBM, Compaq, etc.).
AAA, on the other hand, spent huge bits of technology on being
compatible. Much of that was totally impossible anyway, at least in
some configurations. For example, the 64-bit setup (one Andrea, one
Mary, two Lindas, and two Monicas.. I found this:
http://amiga.emugaming.com/amigaaaa.html with some of my drawings and
a shot from "The Deathbed Vigil" on it... seems correct, though they
missed the fact that AAA actually began in 1988, long before AA began).
>Jay Miner's team had already finished the ECS upgrade in early 1987.
Much of ECS was actually done in West Chester. Some of the delay was
the simple fact that Amiga Los Gatos wasn't using modern CAD tools.
They had to capture, simulate, and verify all the old stuff before
anything new could take place. The other problem was more serious: the
original Amiga chips, and ECS, were done in Commodore's 1.5 micron
high-speed NMOS process. They were very much pushing the limits on
what that process could achieve. Moving to CMOS was critical, but keep
in mind -- these were ALL transistor-level designs, not the gate-level
stuff people do today. Basically, you don't take an NMOS chip and
redesign it into a CMOS chip, you start over from scratch.
>Commodore's clever leaders then sacked the original Amiga team and
>managed to lose the ECS blue prints.
None of that happened. Again, ECS was done in West Chester, with some
consulting with the original team. For that matter, it was the second
major Amiga project done there, since Fat Agnus was designed in West
Chester (Bob Raible did the main architecture, Victor Andrade did the
chip, as I recall).
> ECS saw the light of day in 1990
>only after prototype chips were reverse engineered.
Also not true. There was no reverse engineering done. ECS chips were
running well before 1990, and they were phased into the A2000
production as a running change.
>Had Miner et al
>been retained with a modest R&D budget (which was instead pissed away
>on second rate PC clones) I don't think it requires a huge imaginative
>leap to envision AA in 1988/9.
Well, you're right about the budget in general. However, very little
was spent on the PC clones development. The point of that was to allow
C= to be a "one-stop shopping" supplier. One big problem Apple had in
Europe was their lack of a PC -- many large companies, in the day,
wanted to deal with a single vendor for all of their PC needs. If they
wanted PCs and Macs, and you didn't make both, they'd have to
choose... and it never went against the PC. This is also what drove
the Amiga UNIX project. PC development was done in-house, but it was
primarily systems work. The chip stuff they did was very simple
gate-array development for glue logic (done by systems engineers like
me, Greg Berlin, George Robbins, etc. -- only those in the PC group,
not chip designers). By the 1990s, they were outsourcing the
PCs, much like everyone else (Dell, Compaq, Gateway, etc).
> Denise is the only thing that changes.
Incorrect. Denise did pretty much everything Denise could ever do.
That's exactly why the Lisa chip in the AA set was the only
totally-new chip in AA. To go much beyond Denise (say, to add just the
simple upgrade from 16 to 256 LUT registers), CMOS was the only
possible choice. It couldn't have been a simple add-on to Denise. Of
course, even if it was, you would have had to tweak Agnus to deliver
an increased bandwidth (memory access in AA is 4x faster than in ECS
or OCS, if you didn't know).
>With something like Dave Haynie's A3000+ in 1989
Certainly, if C= management had treated Engineering with the same kind
of investment that, well, pretty much everyone else in high-tech did,
things could have happened differently. We never would have had AA in
1989, but with proper investment, maybe AAA (which began in 1988, but
probably could have started a year or two earlier, with the right
backing). That certainly wouldn't have sucked.
>I think Commodore would have found enough breathing room to
>transition to PowerPC/retargetable graphic cards, just as Apple did.
There's no telling. For one, it's not an instant assumption that we
would have gone to PowerPC. That was probably the right decision in
1990-1991, sure. Much before then, MIPS would have been the obvious
choice; too much beyond that, and x86 would have been the only
reasonable choice. The system architecture I designed in 1991, which
was intended to be the transitional machine (moving from dedicated
AA/AAA chips to Hombre and commodity RTG + PCI chips, moving from 68K
to "something else") was CPU independent.
>The fact that they lasted until 1994 making the choices they did shows
>how brilliant the 1985 design was, and I think you do it a great
>disservice.
Guess I missed that part.
One thing I did, because, well, I could do it (leading high-end Amiga
development had its advantages, and pretty much anything that Jeff
Porter, me, and sometimes Greg Berlin agreed on would fly, at least in
the days before Ali & Sydnes decided to kill the company with their
incompetence and micromanagement), was to ensure each Amiga system was
long lived. So we had the CPU slots (which Apple eventually copied,
but the x86 world never did, even though IBM actually published a very
good paper on this in the early 90s), video slots, expansion, etc. I
absolutely knew that the C= way (and fortunately, I learned that way
in the TED and C128 days, so as not to have to experiment on the
Amiga) was never going to get us a new model every year. Of course,
that only goes so far. My goal was to make your machines useful for at
least five years. I didn't count on 10-15 being necessary :-)
Dave Haynie | Chief Toady, Frog Pond Media Consulting
dha...@jersey.net| Take Back Freedom! Bush no more in 2004!
"Deathbed Vigil" now on DVD! See http://www.frogpondmedia.com
>James Copeland wrote:
>> Ben Hutchings <ben-publ...@decadentplace.org.uk> wrote in message
>> news:<slrncbe4bm.1k6.b...@shadbolt.i.decadentplace.org.uk>...
>>> > See the last column. Miner is describing the ECS with the Fatter
>>> > Agnus and SuperDenise, which the remnants of the Amiga team had ready
>>> > by early 1987.
>
>>> No, he's describing something rather different. ECS as we know it
>>> uses normal DRAM, not video RAM, and it doesn't support a 1024-pixel-
>>> wide display, at least not along with the original display modes as
>>> Jay says these chips did.
I don't know the context. There was some idea in Los Gatos for an
"ultra-hires" monochrome mode using VRAM and some external logic. This
was one of the things that was implemented (far as that goes) in ECS,
but no one ever did anything with it. I don't know the details. But it
wasn't terribly useful, just another crutch (like a flickerFixer). We
convinced the world they needed color in 1985 (if not before); better
monochrome in 1988 would have been boring.
>> No, I think he's talking about ECS. A display address range increase
>> to 2 megabytes refers to the Fatter Agnus. Video ram in this context
>> means chip ram.
>No, he described dual-ported video RAM
><http://en.wikipedia.org/wiki/Dual-ported_RAM>. However, all released
>Amiga chipsets use ordinary DRAM; that's one of the reasons why chip
>memory is slow, especially so in high-colour modes.
The VRAM of the day (supported by AAA, also used in the Hedley Hires
device) was, in fact, dual ported memory. You have one parallel bus,
just like a normal DRAM, and one serial bus. You run a special cycle,
which looks more or less like a refresh cycle to the parallel bus, and
you download a whole row of data (probably 256 bits in those days) to
a shift register. They typically ran at several times the normal speed
of the DRAM bus (in our case, 280ns full cycles), making it fast
enough to drive a display. It was also interesting in that it could
very easily support an alternative pixel clock, while everything else
in the Amiga chipset was done in lock-step with the chip bus cycle, so
such things weren't possible (well, they were in AAA, but that's a
very different story... AAA had both "chip" and "graphics" buses).
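To make that mechanism concrete, here's a toy Python model of such a
dual-ported VRAM (my sketch, not any actual Commodore design): the
parallel port behaves like ordinary DRAM, while a row-transfer cycle
latches a whole row into a shift register that the video side can then
clock out on its own, faster clock. The 512x256 geometry is just an
assumption for illustration.

```python
class DualPortedVRAM:
    """Toy model of 1980s dual-ported video RAM.

    One parallel port (shared with the chip bus, like normal DRAM) plus
    a serial port fed from an internal shift register ("SAM")."""

    def __init__(self, rows=512, row_bits=256):
        self.cells = [[0] * row_bits for _ in range(rows)]
        self.sam = []  # serial-access shift register

    def write(self, row, col, bit):
        # Parallel-port access; competes with everything else on the bus.
        self.cells[row][col] = bit

    def row_transfer(self, row):
        # Looks roughly like a refresh cycle on the parallel bus: a
        # single cycle copies the whole row into the shift register.
        self.sam = list(self.cells[row])

    def shift_out(self):
        # Serial port, clocked independently (and several times faster)
        # to feed the display.
        return self.sam.pop(0)
```

The point the sketch captures is that once a row is transferred, the
display drains the shift register on its own clock while the parallel
bus stays free for CPU/blitter traffic.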
>> ECS's superhires mode is 1000+ pixels and co-exists with the
>> standard modes.
>He used the specific figure of 1024 whereas super-hires is 1280 pixels
>wide. (Coincidentally or otherwise, 1024 pixels happens to be the
>width limit of the original blitter. Kickstart 1.x limited the width
>of a screen to 1008 pixels to allow for barrel-shifting and later
>versions have the same limit when used with an old Agnus.)
The standard in those days for hi-resolution was probably something
like 1024 x 768; I'm sure that's where the numbers come from, in
either case. ECS fixed the blitter size issues, as I recall; that's
why you got 1024-pixel support (rather than 1008) with the Hedley
Hires monitors (we all used them in West Chester) with the more modern
chips.
The built-in SuperHiRes was an ECS thing, though not terribly useful.
It just cut the pixel clock again, going to 35ns pixels (HiRes was
70ns, Lowres was 140ns pixels). Since the pixel clock was fixed, even
though the modes were somewhat flexible, there were not enough degrees
of freedom to make this terribly useful -- you couldn't vary
resolution independently of scan rate, for example.
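The fixed pixel clocks make that trade-off easy to see with a little
arithmetic. A minimal sketch (the 44.8 microsecond visible-line figure
is my assumption, chosen so the numbers land on the standard
320/640/1280 mode widths):

```python
# Pixel periods from the post: LowRes 140 ns, HiRes 70 ns, SuperHiRes 35 ns.
PIXEL_NS = {"lowres": 140, "hires": 70, "superhires": 35}

# Assumed visible portion of a TV-rate scan line, in nanoseconds:
# 44.8 us is what a 320-pixel LowRes line occupies at 140 ns/pixel.
VISIBLE_NS = 44_800

def pixels_per_line(mode):
    # With the scan rate fixed, halving the pixel period doubles the
    # horizontal resolution -- and that's the ONLY knob available.
    return VISIBLE_NS // PIXEL_NS[mode]
```

Because the clock was locked to the chip bus, you couldn't, say, keep
640 pixels and raise the scan rate instead; that's the missing degree
of freedom Dave describes.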
>
>> Even if I'm wrong, the important point still stands - the Amiga team
>> had some substantial enhancements to the chipset at an early stage
>> and Commodore were dumb not to use them.
They had ideas; that's not the same as finished work. Some of it
probably could have been done better, but unfortunately, C= didn't
want it to work that way. Some of it made sense: West Chester was the
design center. No one from Los Gatos wanted to move East... they had
the option. A number of the former Los Gatos staff stayed on as
contractors. They had no trouble fitting into the West Chester way of
doing things -- we all had total respect for those guys. If, say, Dave
Needle had come to West Chester and wanted to run the A2000 project
there, I would have happily taken the Second Banana seat.
>If I remember correctly, ViewPorts can have custom copper-lists. This
>was somewhat useful but of course incompatible with RTG and even with
>a change of display mode (though some things could perhaps be
>converted).
Custom copper lists would die not only on any 64-bit AAA system, but
even usually on AA systems, once you kicked it into "AA" mode. That's
exactly why 32-bit and burst come up disabled on an AA system.
There wasn't any easy way to support the old register model. That
shouldn't have been the problem -- there shouldn't be any need to
support the register model. Other than in very trivial situations
(hell, even VGA chips are set up via BIOS calls, not
direct-to-the-metal), a register map should last precisely as long as
the chip designer thinks it should. If there's a reason to change it,
it should change. When you don't, things take 10x longer and STILL
don't wind up with perfect compatibility.
Thanks for taking the time to nail some of my half-remembered myths
Dave. Just to set the record straight, when was ECS finished and when
did work on AA begin?
Of course it would be ridiculous to attribute the Amiga's success
solely to the 1985 chips. Commodore's engineers fought valiantly
against fearsome odds and produced some great systems. I just wish to
God that you'd been given a proper budget so you could have released
things like the A3000+ sooner.
I quite agree, but just to play devil's advocate, do you think the
low-end Amigas would have sold nearly so well without hardware-
bashing games [*], or that the Amiga would have been commercially viable
without these mass low-end sales?
[*] I'm not saying it's impossible to write a system-friendly game
that runs acceptably on a low-end Amiga. Sim City gives the lie to
that. However I doubt that the more graphically impressive games
such as Shadow of the Beast could have been so impressive without a
precise description of the graphics hardware.
<snip>
>>Jay Miner's team had already finished the ECS upgrade in early 1987.
>
> Much of ECS was actually done in West Chester. Some of the delay was
> the simple fact that Amiga Los Gatos wasn't using modern CAD tools.
<snip>
>> ECS saw the light of day in 1990
>>only after prototype chips were reverse engineered.
>
> Also not true. There was no reverse engineering done. ECS chips were
> running well before 1990, and they were phased into the A2000
> production as a running change.
<snip>
> things could have happened differently. We never would have had AA in
> 1989, but with proper investment, maybe AAA (which began in 1988, but
> probably could have started a year or two earlier, with the right
> backing). That certainly wouldn't have sucked.
<snip>
Could you just summarise the chronology of the development and release
of the various chipsets or individual enhanced chipsets? The timing
is unclear to me. I think it's something like this but a lot of the
dates are guesses:
1982-1985: development of OCS
1985: release of OCS
[This ignores the repackaging of Agnus and introduction of EHB.]
1985-1988: development of ECS
1988: release of ECS 1 MB Agnus
1988-1993: development of AAA (incomplete)
1990: release of full ECS
1991-1992: development of A(G)A
1992: release of AGA
1993-1994: development of Hombre (incomplete)
Do you think that perhaps AAA was too ambitious and that it might
have been better to advance the chipset in smaller, quicker steps?
--
Ben Hutchings
I haven't lost my mind; it's backed up on tape somewhere.
Do you mean something other than the Hedley display, and are you
saying that the ECS chips actually produced are capable of this (aside
from the lack of that external logic, obviously)?
<snip>
>>> No, I think he's talking about ECS. A display address range increase
>>> to 2 megabytes refers to the Fatter Agnus. Video ram in this context
>>> means chip ram.
>
>>No, he described dual-ported video RAM
>><http://en.wikipedia.org/wiki/Dual-ported_RAM>. However, all released
>>Amiga chipsets use ordinary DRAM; that's one of the reasons why chip
>>memory is slow and especially so in high-colour modes.
>
> The VRAM of the day (supported by AAA, also used in the Hedley Hires
> device) was, in fact, dual ported memory.
Do you know why AGA does not support VRAM? Incompatibility, expense,
or shortness of development time?
<snip>
> The built-in SuperHiRes was an ECS thing, though not terribly useful.
> It just cut the pixel clock again, going to 35ns pixels (HiRes was
> 70ns, Lowres was 140ns pixels). Since the pixel clock was fixed, even
> though the modes were somewhat flexible, there were not enough degrees
> of freedom to make this terribly useful -- you couldn't vary
> resolution independently of scan rate, for example.
ISTR that ECS SuperHires is a horrible hack too - colour lookup still
runs on a 70 ns clock, taking a pair of interleaved pixel values and
returning a pair of interleaved colour values from a suitably mangled
palette.
<snip>
> Other than in very trivial situations (hell, even VGA chips are set
> up via BIOS calls, not direct-to-the-metal),
<snip>
I think you're wrong there. My understanding is that VGA was the last
register-level "standard" for PC graphics, and that's why even Windows
NT, which doesn't use real-mode BIOS calls, can use a generic VGA
driver in "safe mode". However VGA is simple enough, and there is
enough variation in unspecified details between its many clones, that
register-level compatibility with it doesn't hold PC graphics back.
--
Ben Hutchings
Q. Which is the greater problem in the world today, ignorance or apathy?
A. I don't know and I couldn't care less.
>dha...@jersey.net (Dave Haynie) wrote in message news:<40bf704a....@news.jersey.net>...
>[snip]
>> One thing I did, because, well, I could do it (leading high-end Amiga
>> development had its advantages, and pretty much anything that Jeff
>> Porter, me, and sometimes Greg Berlin agreed on would fly, at least in
>> the days before Ali & Sydnes decided to kill the company with their
>> incompetence and micromanagement), was to ensure each Amiga system was
>> long lived. So we had the CPU slots (which Apple eventually copied,
>> but the x86 world never did, even though IBM actually published a very
>> good paper on this in the early 90s), video slots, expansion, etc. I
>> absolutely knew that the C= way (and fortunately, I learned that way
>> in the TED and C128 days, so as not to have to experiment on the
>> Amiga) was never going to get us a new model every year. Of course,
>> that only goes so far. My goal was to make your machines useful for at
>> least five years. I didn't count on 10-15 being necessary :-)
>Thanks for taking the time to nail some of my half-remembered myths
>Dave. Just to set the record straight, when was ECS finished and when
>did work on AA begin?
ECS phased into A2000 production, I believe it was in 1988, but I
can't swear to it. We had a testing phase, too, which even resulted in
the redesign of Denise, since some timing tweaks on the video signal
caused the Video Toaster to malfunction. And you can't have that.
AAA was started in 1988, while AA (originally called Pandora) started
in late '89 or early 1990. I built the first AA prototype system in the
winter of 1990-1991. We had first silicon in late December, but due to
a bug (I _think_ it was in the Alice chip), we didn't have a booting
AA machine until February of 1991. That was up and running AmigaOS,
with one chip revision -- not too shabby. There were actually a few AA
bugs never fixed, since I had found perfectly decent system
work-arounds before they had any chance at fixing them in the silicon
(and you didn't make unnecessary changes to a chip as "full" as
Alice).
>Of course it would be ridiculous to attibute the Amiga's success
>solely to the 1985 chips. Commodore's engineers fought valiantly
>against fearsome odds and produced some great systems.
Standing on the shoulders of giants. If not for the Amiga, what was C=
going to do?
Actually, I know that, at least a year or two out. In 1984-85, there
was the C900 project in West Chester. Bob Raible and George Robbins,
apparently the third set of engineers on this project, were also the
first to actually get it working to production standards (both
exceptional engineers, also very complementary in their skills). The
C900 was a full 16-bit machine, based on the Z8000 microprocessor. The
machine they built was like a baby Sun-2: megapixel (monochrome-only)
display, UNIX-like OS (Mark Williams "Coherent"), etc. Tragically,
when the Amiga was purchased, the C900 was cancelled, though it would
be another 7 years before the Amiga really overlapped the C900 (Amiga
3000 running AmigaUNIX, certainly a faster and better version, but
you'd have to believe that, if the C900 didn't die in '84-85, there
would have been additional models to follow).
The wise move, back then, would have been for C= to drop the PC Clone
work, OEM PC clones from some decent company without any foot in
Europe (where C= was the strongest), and keep up, perhaps even cross
license C900 and Amiga technology. Of course, hindsight and all that,
but it wouldn't have mattered -- C='s bosses were never that clever,
even if "Dave of the Future" had paid them a visit one night.
>I just wish to
>God that you'd been given a proper budget so you could have released
>things like the A3000+ sooner.
I wish I could have released the A3000+ at all. And so would everyone
working in audio or video back then: the DSP3210 in the A3000+ did
32-bit floating point 5x-10x faster than the '040; we had 16-bit audio
I/O, software modem, etc. Very cool for the day.
The problem wasn't even technical, but political. Ali hired Bill
Sydnes to replace Henri Rubin as VP of Engineering. Sydnes' main job
for six months was to kill every on-going project, lest anyone start
to suspect that maybe Henri's administration (eg, Jeff Porter) had
really been running things perfectly well (eg, the bosses who brought
you the A500, A2000, and A3000; not the guys who made the A600 or
A4000). So the A3000+ was killed, just because.
Worse yet, so was the "A1000+". This was our $800, separate-keyboard,
AA-based, two Zorro + video + CPU slot, Fast RAM support built-in,
25MHz system. The one in-between the A500/A1200 and A3000/A4000 that
many, many people had asked for. More than anything, I think THAT
machine, done the way it was being done in early 1991, could have kept
C= around, given the times. Of course, we'd also have had AA shipping
in volume in April of 1992, not in small numbers (for that "Osborne"
flavor) starting in October 1992.
>Dave Haynie wrote:
>> I don't know the context. There was some idea in Los Gatos for an
>> "ultra-hires" monochrome mode using VRAM and some external logic. This
>> was one of the things that was implemented (far as that goes) in ECS,
>> but no one ever did anything with it.
>Do you mean something other than the Hedley display, and are you
>saying that the ECS chips actually produced are capable of this (aside
>from the lack of that external logic, obviously)?
There was a special "UltraHiRes" mode, which was basically little more
than an extra kind of cycle that would trigger the VRAM shift-register
loads. So the whole "UltraHiRes" thing would have been external logic;
the minimal hook necessary to make it work was supposedly implemented
in ECS.
>> The VRAM of the day (supported by AAA, also used in the Hedley Hires
>> device) was, in fact, dual ported memory.
>
>Do you know why AGA does not support VRAM? Incompatibility, expense,
>or shortness of development time?
You have to understand VRAM... it's basically a whole ordinary DRAM,
plus "other stuff". Part of that "other stuff" basically means you
have a separate bus for the graphics data processing, you don't
process it via the normal "chip" bus. So, for example, for AA to do
anything with VRAM, the Lisa chip would have needed another
38-something pins, to access this graphics bus.
AAA actually did support VRAM, and I suppose that alone is why AA
doesn't -- simplicity. AA was very evolutionary, running a simple
modification to the old chip bus. AAA could run 100's of cycles worth
of bursts, full 32-bit chip bus, 64-bit graphics bus with VRAM, all
kinds of crazy things. AAA also had the Linda chip, which was a full
dual-ported line buffer, kind of necessary to make VRAM stuff work
without being totally insane, relative to memory alignment and other
restrictions.
Also, in those days, VRAM was wicked expensive and hard to get. The
only reason Hedley Hires was even possible was that we "pulled a C="
on the chip specs. They wrote a spec for the kind of memory chip the
hardware actually needed, knowing that the 150ns spec was the
in-demand part, and no one wanted the 200ns parts. This amounted to
roughly a 170ns chip, and many of the 200ns parts met the spec.
C= always did that kind of stuff.
><snip>
>> The built-in SuperHiRes was an ECS thing, though not terribly useful.
>> It just cut the pixel clock again, going to 35ns pixels (HiRes was
>> 70ns, Lowres was 140ns pixels). Since the pixel clock was fixed, even
>> though the modes were somewhat flexible, there were not enough degrees
>> of freedom to make this terribly useful -- you couldn't vary
>> resolution independently of scan rate, for example.
>ISTR that ECS SuperHires is a horrible hack too - colour lookup still
>runs on a 70 ns clock, taking a pair of interleaved pixel values and
>returning a pair of interleaved colour values from a suitably mangled
>palette.
Well, sure, but it actually had to -- the ECS LUT didn't run at 35ns.
The SuperHiRes stuff was basically just a MUX added on the periphery
of the LUT. You just couldn't get that kind of speed out of the old NMOS
chips.
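As a rough illustration of how a 70 ns LUT can serve 35 ns pixels (my
guess at the arrangement, not a gate-level description): pack each pair
of adjacent 2-bit SuperHiRes pixel values into one palette index, and
"mangle" the palette so each entry already holds both output colours;
the MUX then just picks one colour per 35 ns tick.

```python
def mangle_palette(base_colours):
    """base_colours: the 4 usable SuperHiRes colours.

    Build a 16-entry table where entry (p0 << 2) | p1 yields the colour
    pair for two adjacent 35 ns pixels, fetched in one 70 ns lookup."""
    return [(base_colours[i >> 2], base_colours[i & 3]) for i in range(16)]

def lookup_pair(table, p0, p1):
    # One 70 ns LUT cycle returns both interleaved pixel colours; a MUX
    # on the LUT output then selects each colour on the 35 ns clock.
    return table[(p0 << 2) | p1]
```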
><snip>
>> Other than in very trivial situations (hell, even VGA chips are set
>> up via BIOS calls, not direct-to-the-metal),
><snip>
>I think you're wrong there. My understanding is that VGA was the last
>register-level "standard" for PC graphics, and that's why even Windows
>NT, which doesn't use real-mode BIOS calls, can use a generic VGA
>driver in "safe mode".
Well, sure, "generic" VGA, I think, is all a clone of the original IBM
VGA registers, though maybe with 24-bit color (IBM originally shipped
18-bit LUTs). But anything SVGA is questionable, as the "S" parts
evolved independently of IBM, who was off building XGA and other
stuff. One of the first jobs of the VESA standards body was to define
the logical SVGA modes. That's what I'm alluding to here. And I think,
once the mode is set up, it's the same from HW to HW too, which is why
this function could live in the card's BIOS.
> There was a special "UltraHiRes" mode, which was basically little more
> than extra kind of cycle that would trigger the VRAM shift-register
> loads. So the whole "UltraHiRes" thing would have been external logic;
> the minimal hook necessary to make it work was supposedly implemented
> in ECS.
Well, so what does this UltraHiRes bit in the ECS register set
actually do? I've seen it in the specs, but found absolutely no
reasonable explanation for its function. Was the design goal something
like a resolution beyond the 1280 pixels in PAL/NTSC?
As far as I understand you, it would have required some add-on hardware
in the video slot?
So long,
Thomas
Hi Dave, I have been away for a few years now and haven't been able to
follow the Amiga scene. Still holding annual digs at your place? Did I
miss the opportunity (which I have never had the chance to take
advantage of) this year? Where is the old crowd hanging out online now?
If you prefer email: myr...@cox.net.
Myron
>Way off topic...
>
>Hi Dave, I have been away for a few years now and haven't been able to
>follow the Amiga scene. Still holding annual digs at your place?
I have been... not sure about the timing this year, but I'm thinking
of doing it again. Wouldn't be until late August or September.
> Where is the old crowd hanging out online now?
There's nothing really central for the C= crowd. I'm on the Team Amiga
list, which has a number of people from the old days, mostly talking
about things unrelated to the Amiga these days.