
Woz thinks of an Apple II improvement over 35 years later


D Finnigan

Dec 3, 2014, 3:35:04 PM
I'm posting this here for the remaining few people who haven't read this
elsewhere.

Our Mike Willegal who makes Apple I and rev 0 Apple II replicas has the
ear of Steve Wozniak and often corresponds with him. Most recently, Woz
mentioned that he had just thought of an improvement to the Apple II.

He said:

"I am totally interested in hearing such things even after all these
decades. I awoke one night in Quito, Ecuador, this year and came up with
a way to save a chip or two from the Apple II, and a trivial way to have
the 2 grays of the Apple II be different (light gray and dark gray) but
it’s 38 years too late. It did give me a good smile, since I know how
hard it is to improve on that design."

Willegal posted this in his blog:
http://www.willegal.net/blog/?p=6023#comments

I posted a comment asking if Willegal was going to press Woz to get
more details. He did. This was Willegal's interpretation of Woz's next
reply:

"Regarding the two shades of Grey, they are represented in memory by
either A or 5 which in both cases turns into a 50% duty cycle square
wave going to the video mixer (those three resistors on the right side
of the board). Changing the duty cycle of the signal (percent hi versus
low) will change the shade of grey, as well as affect other colors. Woz
suggested that it could be done with a resistor, capacitor and diode,
though he didn’t specify the exact circuit.

I did some experiments over the weekend and found I could easily get an
effect just by adding a 47pF capacitor to the output of the 74LS74 at
B10, pin 5 (rev 0 schematics). This gave a slightly different shade of
gray, but didn’t affect the other colors very much. However, that
capacitor and some other things I tried really mess with the
integrity of the signal, so I’m looking for a “cleaner” solution. Maybe
I’ll try using a 74LS123 one shot, though it kind of violates Woz’s idea
that it could be done without adding chips. Maybe the integrity of the
signal doesn’t matter that much and the simple capacitor solution would
be fine. You could use a variable cap and make the effect adjustable.

This flip flop is clocked by the system 14MHz clock, so you can’t do it
with normal digital logic. Also, fooling with the clock input to that
flip flop would likely upset timing for the entire system.

He didn’t say what he had in mind for eliminating a chip or two."
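The duty-cycle point is easy to sketch (a rough illustration, not from Woz or Willegal; the helper name is invented here): the nibble values $A (1010) and $5 (0101) are both alternating patterns, so shifting either one out repeatedly produces a square wave that is high exactly half the time.

```python
# Rough sketch of why memory values 0xA and 0x5 both give the same gray:
# each 4-bit pattern is shifted out repeatedly to form the video square
# wave, and both patterns are high exactly half the time (50% duty cycle).
# The helper name is invented for illustration.

def duty_cycle(nibble):
    """Fraction of time a repeating 4-bit pattern spends high."""
    bits = [(nibble >> i) & 1 for i in range(4)]
    return sum(bits) / len(bits)

print(duty_cycle(0xA), duty_cycle(0x5))  # 0.5 0.5 -> the same 50% square wave
print(duty_cycle(0xE))  # 0.75 -> more "high" time, i.e. a lighter shade
```

Stretching or shrinking the high portion of that waveform with an RC network, as Woz suggested, is the analog equivalent of changing that fraction.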

jWs

Dec 4, 2014, 12:29:50 PM
On Wednesday, December 3, 2014 3:35:04 PM UTC-5, D Finnigan wrote:
> I'm posting this here for the remaining few people who haven't read this
> elsewhere.
>
> Our Mike Willegal who makes Apple I and rev 0 Apple II replicas has the
> ear of Steve Wozniak and often corresponds with him. Most recently, Woz
> mentioned that he had just thought of an improvement to the Apple II.


General question(s) regarding this topic: why are fewer chips always considered better? Sure, if it does exactly what it did before and is just as easy to program, elimination makes sense.

But can anyone think of how if a few MORE chips were added to the Apple II, it could have been made much easier to program, or have been more flexible somehow?

For instance, I've always thought the high-res graphics addressing was overly complicated, and I've wondered why it needed a completely non-standard floppy disk format.

- John S.


David Schmidt

Dec 4, 2014, 12:39:18 PM
On 12/4/2014 12:29 PM, jWs wrote:
> General question(s) regarding this topic: why is less chips always considered better?

Sherman, set the Wayback machine to the 1970s. Chips were expensive.
CPUs were insanely expensive. That drove Woz's decision to adopt the
6502 (not his first choice, as I understand it). Every chip added adds
expense and reduces profit. In books I've read, he learned this from
work at other jobs - designs were accepted or rejected (or at least
critiqued) based on chip count. Plus, there's the intellectual
stimulation of getting the most out of the simplest design. The entity
that became Apple Computer, Inc. was in the business of making money -
not making it convenient to move bits around on a TV screen. They
wanted to make it possible, yes - but programmer convenience was not the
primary driver at the time.

gid...@sasktel.net

Dec 4, 2014, 1:23:40 PM
Don't forget that fewer chips also mean a smaller computer, and Woz wanted to make a Personal Computer to use at home and not just in schools. Although "programmer convenience" may not have been the primary driver, Apple somewhat made an attempt at it since they incorporated Applesoft, which made it easy to teach, easy to learn, and easy to program.

But the other side of the coin is, since gaming systems were as popular as the Apple was as a teaching aid, adding an extra chip or two would have made graphics and sound that much easier to program and sound better on the Apple. I believe the Apple computer would have sold that many more and would have been even more popular in homes. Maybe even as popular and as powerful as the Amiga with its 16-bit 68000 CPU.

open...@gmail.com

Dec 4, 2014, 2:07:09 PM

The non-standard disk format was a great choice! Having the RWTS in software allowed for enhancements to floppy storage and creative use of the disk.

And the cost argument was compelling. Apple computers had fast disks, and the overall cost was reasonable, meaning many users adopted disk storage quickly. Other computer manufacturers resolved this with cartridges and cassettes, both with strengths, but lacking the simple utility floppy disk storage made possible.

As for the goofy screen addressing, I'm sure many of us would much rather have seen something a little more sane, but the color shift bit turned out to be a great choice too.

Many machines presented a 4-color screen. An Apple could do six, and it turns out six colors are enough to do most anything. It was just enough. 7 bits per byte was a good call, but the addressing maybe wasn't.

I've always thought it would be cool to just scan more of the screen memory, linearly, to get 160 color pixels per screen. It would be more than 40 columns of text, but it would also use the memory better.

A card could do this.

Michael Black

Dec 4, 2014, 2:16:59 PM
On Thu, 4 Dec 2014, jWs wrote:

> On Wednesday, December 3, 2014 3:35:04 PM UTC-5, D Finnigan wrote:
>> I'm posting this here for the remaining few people who haven't read this
>> elsewhere.
>>
>> Our Mike Willegal who makes Apple I and rev 0 Apple II replicas has the
>> ear of Steve Wozniak and often corresponds with him. Most recently, Woz
>> mentioned that he had just thought of an improvement to the Apple II.
>
>
> General question(s) regarding this topic: why is less chips always
> considered better? Sure, if it does exactly what it did before and is
> just as easy to program, elimination makes sense.
>
Design is a criterion, and one design could be "best" while another is also
"best", each different and based on different criteria.

Steve Wozniak always seems to like to do things with minimal parts count.
So he did figure out how to change this over here so that over there is
done more simply. For commercial equipment, saving 5 cents means 5 cents
multiplied by how many units are made, which can be significant over the
full run.

It's not just the cost of the component. If the circuit board can take up
less space, that will be cheaper to make, and require a smaller case.
Again, over a lot of units, it can add up.

On the other hand, you can add parts to make things overall simpler. I
have a cheap pocket shortwave radio, sells for $30, is on par with that
horrible desktop radio I bought in 1971 for about $90 (and that was when
money was more valuable). This portable is a simple radio, except they
add an IC and a digital readout, so there is a clock and the ability to
digitally display the frequency the radio is tuned to. It doesn't improve
radio reception performance, but by adding this IC that complicates it so
much (even though the manufacturing process isn't made a whole lot more
complicated), they don't need a dial (and the space required for it), they
don't need to fuss at the factory to calibrate the dial (or just ship it
without caring about calibration), and they can have each band very small,
which makes it easier to tune with the little thumb knob.

In the days of tubes, radios tried to be minimal because tubes were
expensive, and took up a lot of space. When transistors came along, one
could add transistors without much more cost (in money or space) so the
gain could be spread over more stages, and that simplified other things.
When ICs came along, they used a whole lot more transistors to do what was
done with discrete transistors, because the cost of adding a few
transistors meant little to the cost of the IC, but adding transistors
simplified overall design.

> But can anyone think of how if a few MORE chips were added to the Apple
> II, it could have been made much easier to program, or have been more
> flexible somehow?
>
But that was his tradeoff. He did cut back on ICs, so one had that weird
spacing of the video memory. He "fixed" it by having a routine to deal
with it, so in effect the gaps in the video memory didn't matter. Other
people designed for minimal software, and while I'm not sure that's a good
example, that sort of thing often means more hardware cost.

He came up with a wonderful floppy disk controller, extremely cheap for
the parts at a time when other floppy controllers used a lot of ICs or an
expensive 40-pin IC. But the cost was that the Apple II had to do more in
software (I suspect there would have been problems if there'd ever been a
multitasking OS for the Apple II; the floppy software would have
required too much attention so the rest would slow down during floppy
operations), and they started with a less complicated floppy disk drive.
In the latter case, it didn't make the drives cheaper (despite taking away
a lot of parts), but it meant one couldn't use off the shelf floppy drives
that eventually became so very cheap. And with other computers, one could
have double sided drives, and then later swap 3.5" drives for 5.25" drives
and get more capacity, something that didn't happen so easily with the
Apple II.

> For instance, I've always thought the high res graphics addressing to be
> overly complicated, and why have a completely non-standard floppy disk
> format.
>
It seemed to help at the time. The Apple II was expensive when it came
out, yet was a fairly full blown system when others were bits and pieces
you put together to come up with a similar system. But, it was cheap when
compared to those other systems when they were fully loaded like the Apple
II. If cost had not been a factor with the Apple II, the IC count could
have gone up, and the price gone up, which at the time likely was an
issue.

Later, it mattered a whole lot less. The Apple II had to use off the
shelf components, so cost mattered. When they could use custom ICs in the
IIc and IIe, because they knew they would sell enough to warrant the
overhead of the custom ICs, that was cost cutting in a different form.
But it was only viable when you could front the money for the custom IC in
the first place, something not an option in 1976.

Michael

Michael J. Mahon

Dec 4, 2014, 3:17:00 PM
I think it's interesting that software-oriented people think that
simplifying software is a major hardware design goal. It is not.

Systems are hardware + software. Hardware costs are non-recurring (design)
costs + recurring (manufacturing) costs. Software costs are design costs +
support costs, no real manufacturing costs.

Therefore, as long as low-level software complexity is compartmentalized
and does not significantly add to design costs or contribute to support
costs, it is not an important hardware design factor.

Hardware is freed by low-level software to implement the most
cost-effective hardware by moving recurring manufacturing cost into
non-recurring software design cost--often an excellent tradeoff,
particularly if high volumes can be expected.
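A toy calculation (the numbers here are invented for illustration, not Mahon's) makes the recurring-vs-non-recurring tradeoff concrete:

```python
# Invented numbers: trade a recurring part cost for a one-time software
# design cost, and find the volume at which the trade pays off.
hw_saving_per_unit = 0.50   # recurring cost removed from each unit ($)
sw_design_cost = 20_000.0   # one-time (non-recurring) software cost ($)

break_even_units = sw_design_cost / hw_saving_per_unit
print(break_even_units)  # 40000.0 -- beyond this volume the swap is pure savings
```

At the volumes Apple ultimately shipped, such a swap pays for itself many times over, which is exactly the "excellent tradeoff" the post describes.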

Woz, for both personal and professional reasons, was relentless in his
pursuit of greater system functionality from lower hardware complexity, and
he knew how to use low-level software to present a clean system interface.

Beauty or elegance or "design merit" is directly proportional to useful
functionality and inversely proportional to the implementation complexity
required to achieve it.

An elegant design with good cost-performance was key to the success of the
Apple II, and, necessarily, to the success of Apple.

One of the lessons of computer architecture is to never let software folks
design a hardware interface alone--they misunderstand where the real
symmetries are in hardware and fail to exploit them while introducing
countless costly "features". (And they seldom understand what is getting
cheaper and what's getting more expensive.)
--
-michael - NadaNet 3.1 and AppleCrate II: http://home.comcast.net/~mjmahon

David Schmenk

Dec 4, 2014, 3:58:59 PM
Conversely, don't let hardware folks design a hardware interface alone--they assume whatever mismatch in functionality can be handled later in software. Having insight into both is fundamental to efficient software implementation and cost-effective hardware. The hires memory map of the Apple II is always singled out as convoluted and inefficient for software. But in all honesty, the design tradeoff Woz made incurred minimal software overhead and saved a few cents per motherboard and, probably more important, increased reliability. Definitely a big win in 1977.
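For the curious, the "convoluted" map is easy to state in code. This is the standard Apple II hi-res row interleave (the function name is mine): row number bits are split into three fields and scattered across the address.

```python
# Base address of Apple II hi-res row y (0-191): the row number is split
# into three fields (a = y>>6, b = (y>>3)&7, c = y&7) that land in
# non-adjacent parts of the address -- hence the "convoluted" reputation.
def hires_base(y):
    a, b, c = y >> 6, (y >> 3) & 7, y & 7
    return 0x2000 + c * 0x400 + b * 0x80 + a * 0x28

print(hex(hires_base(0)), hex(hires_base(1)), hex(hires_base(64)))
# 0x2000 0x2400 0x2028 -- consecutive rows sit 0x400 apart, not adjacent
```

The payoff of this layout is that the same counters that scan the screen also touch every DRAM row, which is the refresh trick discussed below in the thread.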

Michael J. Mahon

Dec 4, 2014, 5:06:50 PM
I agree completely--though, in my experience, experienced hardware
designers are much more likely to appreciate this than even experienced
software designers.

Ignorance of what software is actually doing in hardware has gotten
continually worse as language levels have risen and the only API
many programmers learn is tens of thousands of feet above the hardware.

HP's architecture teams were always deep with hardware, software, and
architecture expertise.

"Architecture expertise" involves system-level integration tradeoffs, but,
even more importantly, how to maintain backward compatibility for 20+ years
without constraining the constant evolution of efficient implementations.
Tricky, that. ;-)

Done properly, it allows a large and growing body of software written to
the architected interface to run compatibly and efficiently on a large
range of implementations of that architecture, even when the range of
software applications and hardware implementations is difficult to predict.


History is replete with examples of people accidentally nailing one of
their feet to the ground. Architecture is about preventing that.

> The hires memory map of the Apple II is always singled out as convoluted
> and inefficient for software. But in all honesty, the design tradeoff Woz
> made incurred minimal software overhead and saved a few cents per
> motherboard and, probably more important, increased reliability.
> Definitely a big win in 1977.

Yes, and eliminating the need for a separate DRAM refresh probably saved
several chips, not to mention the performance hit from having to steal
cycles from the 6502 to do the refresh (as the C=64 line did).

Perhaps an even bigger implication of not folding DRAM refresh into screen
refresh would have been the loss of timing determinism for the processor.
The whole scheme of using the 6502 to control the "upper end" of disk I/O
would not have been so easy if processor cycles could be stolen by refresh
circuitry.
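Back-of-the-envelope numbers (mine, chosen for illustration, not from the post) show how tight that determinism was:

```python
# The Disk II delivers one bit roughly every 4 microseconds, so at the
# 6502's ~1.023 MHz clock the software has a fixed budget of about
# 32 cycles to read, decode, and store each 8-bit nibble. Any cycles
# stolen by refresh circuitry would blow that budget.
# Figures are approximate, for illustration only.
cpu_mhz = 1.023
us_per_bit = 4
cycles_per_nibble = us_per_bit * 8 * cpu_mhz
print(cycles_per_nibble)  # about 32.7 cycles per nibble
```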

Looked at another way, Woz's refresh design is one of the things that made
the use of DRAM a good choice for the Apple II, at a time when SRAM was
still very common.

And doing it all without using a Z80, with its built-in refresh counter,
or another bunch of multiplexers and counters, each with concomitant
performance hits, was a coup.

You may recall that some machines of the era took all the hits: DRAM
refresh interference, graphics refresh interference, and more complex
hardware designs. The Apple II had none of that.

Michael Black

Dec 4, 2014, 6:11:28 PM
I thought they'd limited it to 40 characters since many were using tv sets
as monitors, and tv sets would have mangled wider lines.

Of course, 80 column cards did come later, and I suspect were pretty
common after a certain point. But there Apple didn't cut corners, if
they'd made the Apple II self-contained, one would have had to buy a new
computer to get those 80 columns. Since the bus connectors were there, it
made the whole thing terribly adaptable, right down to putting various CPU
boards in the computer.

On the other hand, the IIe is pretty neat in terms of being compatible
with the II, but giving 80 columns. Just add more memory to the memory
card, static if you only needed more columns, dynamic RAM if you wanted
more columns and more RAM.

Michael

mwillegal

Dec 4, 2014, 10:03:51 PM
It was interesting to learn that some major components of the Apple II video/DRAM refresh system were already in place in the Apple 1. In fact the video timing counters also feed the DRAM refresh system in the Apple 1. However, the refresh data did not feed the video, and it suspended the 6502 instead of stealing half the memory cycle. Even though the Apple 1 suspends the CPU during refresh, the performance difference between the Apple 1 and Apple II is not great, with the Apple II just a tiny bit faster. The memory-mapped video with color graphics was a giant leap forward in functionality. There was major expense in adding the slots to the Apple II, something I read that Woz advocated, against Jobs's wishes. So adding functionality at high cost wasn't always avoided by Woz, when it was deemed important enough.

regards,
Mike W.

Michael J. Mahon

Dec 4, 2014, 10:35:53 PM
In fact, slots provide exactly the kind of future-proofing that a durable
design needs. Without slots, a requirements shift means having to start
over.

mdj

Dec 4, 2014, 11:37:12 PM
On Friday, 5 December 2014 08:06:50 UTC+10, Michael J. Mahon wrote:

> Perhaps an even bigger implication of not folding DRAM refresh into screen
> refresh would have been the loss of timing determinism for the processor.
> The whole scheme of using the 6502 to control the "upper end" of disk I/O
> would not have been so easy if processor cycles could be stolen by refresh
> circuitry.

Not just coalesced, but out of band - or two ~1MHz sidebands either side of a ~2MHz 'carrier' :-)

It is the *quintessential* architecture decision of the Apple II, and it's interesting to note the comparative complexity of two attempts to succeed it. The Apple /// and the IIe. One of them treats the original design as a mistake ;-)

Of course the period of time where memory was faster than microprocessors was very short-lived, and it soon became essential to squeeze out every last hert of available memory bandwidth for computation.

The Apple II though is a near-perfect exploitation of that particular tradeoff, and as you note, it provided the foundation for perhaps the finest example of elegant hardware *and* software design ever devised. The Apple II is bested IMO only by the engineering masterpiece of the Disk II.

Matt

Steve Nickolas

Dec 4, 2014, 11:49:18 PM
On Thu, 4 Dec 2014, Michael J. Mahon wrote:

> In fact, slots provide exactly the kind of future-proofing that a durable
> design needs. Without slots, a requirements shift means having to start
> over.

Biggest plus of the Apple ][ over the C64.

-uso.

Bill Garber

Dec 5, 2014, 1:18:57 AM

"Steve Nickolas" <usot...@buric.co> wrote in message
news:alpine.DEB.2.02.1412050450060.70689@localhost...
You can say that 6 more times. ;)

Bill Garber - I love my
C1 F0 F0 EC E5 A0 C9 C9 E7 F3
http://www.sepa-electronics.com
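For readers puzzling over that signature: it appears to be Apple II "high ASCII" (text with bit 7 set, as the Apple II's text routines store it), and masking off the high bit decodes it:

```python
# Decoding the high-bit ("high ASCII") byte string in the signature.
# Apple II text routines set bit 7 on each character, so masking it
# off with & 0x7F recovers plain ASCII.
sig = [0xC1, 0xF0, 0xF0, 0xEC, 0xE5, 0xA0, 0xC9, 0xC9, 0xE7, 0xF3]
print("".join(chr(b & 0x7F) for b in sig))  # Apple IIgs
```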


Your Name

Dec 5, 2014, 1:25:40 AM
In article <puqdnaKU7JLN0xzJ...@giganews.com>, Bill Garber
<will...@comcast.net> wrote:
> "Steve Nickolas" <usot...@buric.co> wrote in message
> news:alpine.DEB.2.02.1412050450060.70689@localhost...
> > On Thu, 4 Dec 2014, Michael J. Mahon wrote:
> >>
> >> In fact, slots provide exactly the kind of future-
> >> proofing that a durable design needs. Without slots,
> >> a requirements shift means having to start over.
> >
> > Biggest plus of the Apple ][ over the C64.
>
> You can say that 6 more times. ;)

The C64 had *a* slot, although it was almost totally used for plugging
in games cartridges. :-)

Bill Buckels

Dec 5, 2014, 6:58:53 AM
"David Schmidt" <schm...@my-deja.com> wrote:
>The entity that became Apple Computer, Inc. was in the business of making
>money - not making it convenient to move bits around on a TV screen. They
>wanted to make it possible, yes - but programmer convenience was not the
>primary driver at the time.

The design of the Apple II graphics modes until the Apple IIgs is a thing of
beauty, and no harder to program than other computer graphics of the day
like the Commodore 64 and (a little later) the IBM PC. I can provide many
examples of programs that I've done over the last 30 years that support
that last remark.

Today programming and working around the limitations of the Apple II's small
footprint is a source of great joy and a fitting and honourable recreational
activity for an aging mind like mine, but it is really shocking to think
that this all began from the young mind of just one guy with nothing but his
wits and creativity. Some may remember that my hero is not Jobs but Woz,
although I give Jobs credit for the Big Picture, and capitalizing Woz in the
early days.

It's a good thing that Apple and others took care of the business side and
made all this a reality when they did. The industry they started and moved
along fed many families for many years including mine. Without the Apple II
it's hard to say where we would all be today.

The position of the color space of the Apple II's 16 Lo-Res NTSC colors also
used in DHGR is perfect. Adding the second grey wouldn't change this much,
but changing the position of the other colors in the color space would foul
the color balance, I think.

Bill


Gamoe

Dec 5, 2014, 9:03:10 AM
*This* is the type of thing I would like included in my Apple II Wiki
project. This covers not only the *technical* aspects of the Apple II,
but its *historical* development, the *people* and *passion* involved,
as well as the underlying design *philosophy*.

Again, please e-mail me if you're interested in this project. There's
more on it in the "share your story- Help me make an Apple II Wiki"
thread.

Anyway, thanks for posting this story, D Finnigan! (BTW, I look forward
to purchasing your book at some point once I have some funds.)

Michael Black

Dec 5, 2014, 4:50:16 PM
Yes. But one slot is not that much better than no slots. The Radio Shack
Color Computer also had a slot, and it was fine so long as I only needed
to add a floppy disk controller. When I wanted to add a serial port, I
had to do some work to get "another slot". Radio Shack sold an expander,
but I wasn't spending money on that.

Michael

D Finnigan

Dec 5, 2014, 5:12:38 PM
Gamoe wrote:
>
> Anyway, thanks for posting this story, D Finnigan! (BTW, I look forward
> to purchasing your book at some point once I have some funds.)
>

Oh yeah, thanks for reminding me. I usually lower the price right around
this time of the year.
Right now, retail price is $25 and Amazon is discounting it to $22.50.

I can set retail to $20 and then it will really fly off the shelves, as I've
found out... ;-)

Stand by.

--
]DF$
Apple II Book: http://macgui.com/newa2guide/
Usenet: http://macgui.com/usenet/ <-- get posts by email!
Apple II Web & Blog hosting: http://a2hq.com/

gid...@sasktel.net

Dec 5, 2014, 6:47:28 PM
On Friday, 5 December 2014 15:50:16 UTC-6, Michael Black wrote:
> On Fri, 5 Dec 2014, Your Name wrote:
>
> > In article <>, Bill Garber
> > <> wrote:
> >> "Steve Nickolas" <> wrote in message
> >> news:alpine.DEB.2.02.1412050450060.70689@localhost...
> >>> On Thu, 4 Dec 2014, Michael J. Mahon wrote:
> >>>>
> >>>> In fact, slots provide exactly the kind of future-
> >>>> proofing that a durable design needs. Without slots,
> >>>> a requirements shift means having to start over.
> >>>
> >>> Biggest plus of the Apple ][ over the C64.
> >>
> >> You can say that 6 more times. ;)
> >
> > The C64 had *a* slot, although it was almost totally used for pluging
> > in games cartridges. :-)
> >
> Yes. But one slot is not that much better than no slots. The Radio Shack
> Color Computer also had a slot, and it was fine so long as I only needed
> to add a floppy disk controller. When I wanted to add a serial port, I
> had to do some work to get "another slot". Radio Shack sold an expander,
> but I wasn't spending money on that.
>
> Michael


I kind of disagree with the whole slot issue.

One slot is all that is needed. Look at the Laser 128 EX/EX2. Built-in serial port with the option to get a serial-to-parallel adapter. Built-in clock, but even if it didn't have one, at one time there was the No-Slot-Clock. Built-in RAM expansion without a slot. Built-in 3.6 MHz CPU that could have been replaced with the 8 MHz Zip Chip. All I used the one slot for was a hard drive.

And even Apple was getting away from slots. Look at the IIc and IIc+. Really could have used one slot here for the hard drive.

Even the one slot on the C-64 was enough, as an IDE card for storing games on an SD card was created for it as well. You can play a lot of games without switching cartridges. A faster CPU and more memory would have almost been redundant on a C-64.

Michael J. Mahon

Dec 5, 2014, 7:17:45 PM
You miss the point.

Slots are not just to add your favorite mass market peripherals. They are
there to allow *any* peripheral to be added!

Built-ins are fine for a "packaged" system when only software
configurability is enough, but there were literally thousands of peculiar
peripherals added to the Apple II, most with no mass market, like electron
microscopes, speech trainers, and gas chromatographs.

Slots are the way a computer says "Welcome!" to the unknown future.

IBM saw it, and it was good.

Michael J. Mahon

Dec 5, 2014, 7:17:45 PM
Because of their root in the 14.3MHz dot clock, all Apple II "pure" colors
are uniformly spaced 22.5 degrees apart in chroma phase. This provides
optimal coverage of the NTSC color gamut for 16 colors.
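The arithmetic behind that spacing is short (a sketch of the numbers stated in the post; variable names are mine):

```python
# The NTSC color subcarrier is the 14.318 MHz dot clock divided by 4,
# and 16 evenly spaced phase patterns divide one subcarrier cycle into
# 360/16-degree steps -- the 22.5-degree spacing described above.
dot_clock_mhz = 14.31818
colorburst_mhz = dot_clock_mhz / 4   # ~3.58 MHz NTSC subcarrier
phase_step_deg = 360 / 16
print(round(colorburst_mhz, 4), phase_step_deg)  # 3.5795 22.5
```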

Michael J. Mahon

Dec 5, 2014, 7:17:45 PM
And the Apple slots have fully decoded select lines and address space, plus
ROM/RAM selects and address space. They even provide bank-switched
extension space, DMA and interrupt support, and onboard ROM disable.

Very capable and convenient slots, indeed.

Michael J. Mahon

Dec 5, 2014, 7:17:46 PM
My thoughts exactly--and I love "hert". ;-)

gid...@sasktel.net

Dec 5, 2014, 7:59:45 PM

> > I kind of disagree with the whole slot issue.
> >
> > One slot is all that is needed. Look at the Laser 128 EX/EX2. Built in
> > serial port with the option to get a serial to parallel adapter. Built in
> > Clock, but even if it didn't have one, at one time there was the
> > No-Slot-Clock. Built in Ram expansion without a slot. Built in 3.6 Mhz
> > CPU and could have been replaced with the 8 Mhz Zip Chip. All I used the
> > one slot for was a hard drive.
> >
> > And, even apple was getting away from slots. Look at the IIc and IIc+.
> > Really could have used one slot here for the hard drive.
> >
> > Even the one slot on the C-64 was enough, as an ide card for storing
> > games on a SD card was created for it as well. Can play a lot of games
> > without switching cartridges. A faster cpu and more memory would have
> > almost been redundant on a C-64.
>
> You miss the point.
>
> Slots are not just to add your favorite mass market peripherals. They are
> there to allow *any* peripheral to be added!
>
> Built-ins are fine for a "packaged" system when only software
> configurability is enough, but there were literally thousands of peculiar
> peripherals added to the Apple II, most with no mass market, like electron
> microscopes, speech trainers, and gas chromatographs.
>
> Slots are the way a computer says "Welcome!" To the unknown future.
>
> IBM saw it, and it was good.



Slots are only good for a select few: mostly high-end gamers, science researchers, and big-business movie and music ventures. But Apple finally saw the true advantage of the all-in-one architecture for normal home users. iMacs, iPad, iPhone, iPod - none had expansion slots. And only the iMacs had USB ports for hard drive expansion, which was all that was needed.

And Apple also saw the need for hi-end equipment as well. And offered the Power Mac in a tower for those select few who wet their bed because they have a card to install into a slot.

Apple saw it, and it IS even better.

On a similar note: on Wednesday night, the series "Stuff of America" featured both the Apple II and the C-64. They highlighted the C-64 more than the Apple II as the catalyst that brought in a new age of home users having an electronic device in their home, with over 17 million sold.

Your Name

Dec 5, 2014, 8:56:31 PM
In article <2931ead2-d79a-4600...@googlegroups.com>,
You don't really need slots. Ports work just as well for most things -
electron microscopes, etc. can just as easily be plugged in via an
Ethernet port, or whatever.

These days there's no Mac that ships with slots (not even the Mac Pro),
although you can get external boxes with slots that plug into the
computer via the Thunderbolt port.



> On a similar note. On Wednesday night, the series "Stuff of America"
> articled both the Apple II and C-64. They highlighted the C-64 more than the
> Apple II as being the catalyst that brought in a new age for home users to
> have an electronic device in their home at having over 17 million sold.

The C64 sold more largely because it was cheaper. The fact that it was
also a good computer was an added bonus. We got the VIC20 and then C64
because the Apple II was too expensive.

Of course, the C64 is like the VW Beetle of the computer world. They
were still making C64 computers not that long ago and recently there
was a "C64" being sold that was really just a PC box with emulation
software.

Your Name

Dec 5, 2014, 8:59:44 PM
In article
<1661738570439517135....@news.giganews.com>, Michael
J. Mahon <mjm...@aol.com> wrote:
> Michael Black <et...@ncf.ca> wrote:
> > On Fri, 5 Dec 2014, Your Name wrote:
> >> In article <puqdnaKU7JLN0xzJ...@giganews.com>, Bill Garber
> >> <will...@comcast.net> wrote:
> >>> "Steve Nickolas" <usot...@buric.co> wrote in message
> >>> news:alpine.DEB.2.02.1412050450060.70689@localhost...
> >>>> On Thu, 4 Dec 2014, Michael J. Mahon wrote:
> >>>>>
> >>>>> In fact, slots provide exactly the kind of future-
> >>>>> proofing that a durable design needs. Without slots,
> >>>>> a requirements shift means having to start over.
> >>>>
> >>>> Biggest plus of the Apple ][ over the C64.
> >>>
> >>> You can say that 6 more times. ;)
> >>
> >> The C64 had *a* slot, although it was almost totally used for pluging
> >> in games cartridges. :-)
> >>
> > Yes. But one slot is not that much better than no slots. The Radio
> > Shack Color Computer also had a slot, and it was fine so long as I only
> > needed to add a floppy disk controller. When I wanted to add a serial
> > port, I had to do some work to get "another slot". Radio Shack sold an
> > expander, but I wasn't spending money on that.
>
> And the Apple slots have fully decoded select lines and address space, plus
> ROM/RAM selects and address space. They even provide bank-switched
> extension space, DMA and interrupt support, and onboard ROM disable.
>
> Very capable and convenient slots, indeed.

I never said it was a better slot. Simply that the C64 did indeed have
a slot. The C64 slot was used for other things as well as game
cartridges. We had a "freeze frame" cartridge that basically dumped the
content of memory out to a file and allowed you to use cheat modes - it
was sold as a way of doing things like saving progress through a game,
but of course most people used it to simply copy games.

Steve Nickolas
Dec 5, 2014, 10:55:17 PM
On Sat, 6 Dec 2014, Your Name wrote:

> I never said it was a better slot. Simply that the C64 did indeed have
> a slot. The C64 slot was used for other things as well as game
> cartridges. We had a "freeze frame" cartridge that basically dumped the
> content of memory out to a file and allowed you to use cheat modes - it
> was sold as a way of doing things like saving progress through a game,
> but of course most people used it to simply copy games.

Didn't the Apple ][ have add-on cards like that too? I want to say
Wildcat?

-uso.

Michael Black
Dec 5, 2014, 11:11:23 PM
But that only works if the computer has enough built in. And once you
start putting in the kitchen sink, the price goes up, so the entry cost is
higher, yet you may be paying for things you don't need.

One thing that made the Apple II was the slots. So endless things could
be added. When it was time for 80 columns, you could get a board. If you
needed a serial port, you could get that, or a parallel port. If you
wanted to go further, you could get that Z80 Softcard or that 6809 board,
or even that 16-bit board. That meant one could keep on using the Apple II
longer than its original lifespan. But if there'd only been one slot,
you'd have had to choose between using a printer or running CP/M; you
couldn't do both with one slot.

And people did grumble about the IIc. It had more built in than the II,
but if you wanted something more exotic, a custom card had to be designed,
and those cost money. Then you get to the original Mac, where endless work
had to be done to expand memory or add a hard drive.

Yes, only one slot meant cost was kept down. But like I said, when I
needed more than a floppy disk controller for my Color Computer, I had to
make a sort of extension, or spend more money to buy a bus expander from
Radio Shack. For many, you did need more than one slot.

Michael


Michael Black
Dec 5, 2014, 11:15:12 PM
Yes, some thought was given to the Apple II expansion bus.

I've read that the S100 bus was really just thrown together, just a way of
expanding the 8080 pinout, rather than a serious attempt at a universal
expansion bus. Of course, at the time, it was just there to add
peripherals, it was only when it became a "standard" that the problems
came in. There were lots of problems when people wanted to add CPU cards
for something that wasn't an 8080.

It is a surprise that Steve Wozniak could see so far into the future when
he created that bus.

Michael

Bill Buckels
Dec 5, 2014, 11:28:05 PM
"Michael J. Mahon" <mjm...@aol.com> wrote:
>Because of their root in the 14.3MHz dot clock, all Apple II "pure" colors
>are uniformly spaced 22.5 degrees apart in chroma phase. This provides
>optimal coverage of the NTSC color gamut for 16 colors.

It's a thing of beauty.

By comparison, the IIgs RGB color spacing is way off. It can't possibly
display DHGR correctly, composite artifacting aside. The more I got into
converting from true-color to DHGR, the more the poor emulation capabilities
of the IIgs started to fall apart like a cheap suit. It was very poorly
done... it seems it was done to sell hardware with "sharp" pixels
displaying SHR, at the expense of a really second-rate DHGR display.

Taking a mean distance cleans up the DHGR RGB display quite a bit, but
it's still a poor match for a television. All a person can do is work
around it. This is one area where Apple didn't do well at all.
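
For anyone curious, the "mean distance" idea above amounts to
nearest-color matching in RGB space. Here is a minimal Python sketch of
the principle; the palette values are rough illustrative approximations,
not Apple's actual colors:

```python
# Nearest-color matching by squared Euclidean distance in RGB space.
# The palette values below are rough approximations for illustration
# only, not the exact Apple II colors.
PALETTE = [
    ("black",      (0x00, 0x00, 0x00)),
    ("magenta",    (0xD0, 0x00, 0x30)),
    ("dark blue",  (0x00, 0x00, 0xD0)),
    ("dark gray",  (0x40, 0x40, 0x40)),
    ("light gray", (0xC0, 0xC0, 0xC0)),
    ("white",      (0xFF, 0xFF, 0xFF)),
]

def nearest_color(rgb, palette=PALETTE):
    """Return the name of the palette entry closest to rgb."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(palette, key=lambda entry: dist2(rgb, entry[1]))[0]

print(nearest_color((0xB0, 0xB8, 0xB0)))  # prints "light gray"
```

A real converter would use all 16 DHGR colors, and probably a perceptual
color space rather than raw RGB, but the matching step is the same.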

Bill



Your Name
Dec 6, 2014, 12:55:09 AM
In article <alpine.LNX.2.02.1...@darkstar.example.org>,
Michael Black <et...@ncf.ca> wrote:

> On Fri, 5 Dec 2014, gid...@sasktel.net wrote:
>
> > On Friday, 5 December 2014 15:50:16 UTC-6, Michael Black wrote:
> >> On Fri, 5 Dec 2014, Your Name wrote:
> >>
> >>> In article <>, Bill Garber
> >>> <> wrote:
> >>>> "Steve Nickolas" <> wrote in message
> >>>> news:alpine.DEB.2.02.1412050450060.70689@localhost...
> >>>>> On Thu, 4 Dec 2014, Michael J. Mahon wrote:
> >>>>>>
> >>>>>> In fact, slots provide exactly the kind of future-
> >>>>>> proofing that a durable design needs. Without slots,
> >>>>>> a requirements shift means having to start over.
> >>>>>
> >>>>> Biggest plus of the Apple ][ over the C64.
> >>>>
> >>>> You can say that 6 more times. ;)
> >>>
> >>> The C64 had *a* slot, although it was almost totally used for pluging
> >>> in games cartridges. :-)
> >>>
> >> Yes. But one slot is not that much better than no slots. The Radio Shack
> >> Color Computer also had a slot, and it was fine so long as I only needed
> >> to add a floppy disk controller. When I wanted to add a serial port, I
> >> had to do some work to get "another slot". Radio Shack sold an expander,
> >> but I wasn't spending money on that.
> >
> >
The C64 had multiple ports specifically for some peripherals as well as
the cartridge slot:

- Cartridge expansion slot for program modules and memory
expansions, among others

- Serial bus for CBM printers and disk drives

- Cassette tape interface

- User port for modems and third-party printers, among
other things

- Two game ports for joysticks, mouse, graphics tablets,
etc.

Michael J. Mahon
Dec 6, 2014, 5:34:14 PM
The NTSC capability of the IIgs has always seemed to me like a
quick-and-dirty work-around.

They had to deal with converting native RGB to composite, so they managed
"compatibility" by leveraging a rather bad NTSC-to-RGB converter, then
converting back to NTSC again!

I wish they'd just bypassed the two conversions and delivered straight
Apple II NTSC video out.

Michael J. Mahon
Dec 6, 2014, 5:34:14 PM
Steve and his friend Allen Baum designed the Apple II peripheral bus
architecture. Between them, they had some experience with peripheral busses
and their usual shortcomings.

Michael J. Mahon
Dec 6, 2014, 5:34:15 PM
This thread is drifting into the general question of the role/benefit of
slots in today's computers.

That's way off the point of slots in the Apple II timeframe, where they
literally spawned an industry in building novel slot cards that extended
the Apple II far beyond its built-in capabilities, keeping it alive and
useful far beyond contemporaneous slot-less computers.

Your Name
Dec 6, 2014, 9:07:11 PM
In article
<1336466773439596808....@news.giganews.com>, Michael
J. Mahon <mjm...@aol.com> wrote:

Which brings it back to my original point - not all those
contemporaries were slotless. Both the Commodore Vic20 and C64 had a
slot and other ports. I think the early Ataris also had at least
one slot and extra ports. I can't recall whether or not the Texas
Instruments computer a friend had actually had any slots.

Yes, there were also a lot of crappy "computers" released around that
time, like those from Sinclair, which didn't have slots, but then they
were only really toys.

Michael Black
Dec 6, 2014, 11:28:37 PM
On Sun, 7 Dec 2014, Your Name wrote:


> Which brings it back to my original point - not all those
> contemporaries were slotless. Both the Commodore Vic20 and C64 had a
> slot and other ports. I think the the early Ataris also had at least
> one slot and extra ports. I can't recall whether or not the Texas
> Instruments computer a friend had actually had any slots.
>
The TI 99/4 had a slot. One reason it didn't sell well was that the
peripherals for the slot were expensive, and that was something to do with
the design. So it didn't sell that well, until the price dropped to
somewhere around a hundred dollars, at which point it became a relatively
successful computer (at least, enough people remember it decades later).

I think somewhere upthread the point was made that the multiple slots were
important to the Apple II. It was probably uncommon, up to a certain
point, for a "home computer" to not have some method of expansion. But
just bringing out the bus lines to a connector didn't necessarily make
expansion easy.


> Yes, there were also a lot of crappy "computers" released around that
> time, like those from Sinclair, which didn't have slots, but then they
> were only really toys.
>
The Sinclair had a slot, at least one of the Sinclairs. There were memory
expansion modules, and likely other things, that could plug in. Maybe the
ZX80 didn't have a slot, and they added it to the 81, but I thought both
had them.

On the other hand, it apparently wasn't well designed physically; I've
read comments about the flakiness of that connector, with people having to
add some support underneath the module plugged into the connector.


Michael

Your Name
Dec 7, 2014, 12:06:48 AM
In article <alpine.LNX.2.02.1...@darkstar.example.org>,
Michael Black <et...@ncf.ca> wrote:
> On Sun, 7 Dec 2014, Your Name wrote:
> >
> > Which brings it back to my original point - not all those
> > contemporaries were slotless. Both the Commodore Vic20 and C64 had a
> > slot and other ports. I think the the early Ataris also had at least
> > one slot and extra ports. I can't recall whether or not the Texas
> > Instruments computer a friend had actually had any slots.
> >
> The TI 99/4 had a slot. One reason it didn't sell well was that the
> peripherals for the slot were expensive, and that was something to do with
> the design. So it didnt' sell that well, until the price dropped to
> somewhere around a hundred dollars, at which point it became a relatively
> successful computer (at least, enough people remember it decades later).
>
> I think somewhere upthread the point was made that the multiple slots were
> important to the Apple Ii. It was probably uncommon, up to a certain
> point, for a "home computer" to not have some method of expansion. But
> just bringing out the bus lines to a connector didnt' necessarily make
> expansion easy.

They weren't uncommon. Pretty much every *real* old "home computer"
(i.e. prior to the original Mac and Windows) I've ever used has had at
least one slot and specialised ports.



> > Yes, there were also a lot of crappy "computers" released around that
> > time, like those from Sinclair, which didn't have slots, but then they
> > were only really toys.
>
> The SInclair had a slot, at least one of the SInclairs. There were memory
> expansion modules, and likely other things, that could plug in. Maybe the
> ZX80 didn't have a slot, and they added it to the 81, but I thought both
> had them.

One solely for memory expansion isn't really a "slot" in the sense being
meant in this topic, but you're right that if it had one, it may have been
put to other uses.



> On the other hand, it apparently wasn't well designed physically, I've
> read comments about the flakieness of that connector, people having to add
> some suppor underneath the module plugged into the connector.

Nothing about Sinclair's toys was well designed ... they were all cheap
'n' nasty rubbish I wouldn't touch with a million light year long barge
pole.

Michael J. Mahon
Dec 7, 2014, 12:37:30 AM
Your Name <Your...@YourISP.com> wrote:
> In article <alpine.LNX.2.02.1...@darkstar.example.org>,
> Michael Black <et...@ncf.ca> wrote:
>> On Sun, 7 Dec 2014, Your Name wrote:
>>>
>>> Which brings it back to my original point - not all those
>>> contemporaries were slotless. Both the Commodore Vic20 and C64 had a
>>> slot and other ports. I think the the early Ataris also had at least
>>> one slot and extra ports. I can't recall whether or not the Texas
>>> Instruments computer a friend had actually had any slots.
>>>
>> The TI 99/4 had a slot. One reason it didn't sell well was that the
>> peripherals for the slot were expensive, and that was something to do with
>> the design. So it didnt' sell that well, until the price dropped to
>> somewhere around a hundred dollars, at which point it became a relatively
>> successful computer (at least, enough people remember it decades later).
>>
>> I think somewhere upthread the point was made that the multiple slots were
>> important to the Apple Ii. It was probably uncommon, up to a certain
>> point, for a "home computer" to not have some method of expansion. But
>> just bringing out the bus lines to a connector didnt' necessarily make
>> expansion easy.
>
> They weren't uncommon. Pretty much every *real* old "home computer"
> (i.e. prior to the original Mac and Windows) I've ever used has had at
> least one slot and specialised ports.

Take a good look at the Apple II slot architecture.

I think you'll find that there was nothing else done so well and so
completely--before or since.

Fully decoded ROM space, expansion ROM space, fully decoded "device select"
space, interrupt support, DMA support, etc. Designing a peripheral card
for the Apple II was a piece of cake.

Even activation support (IN#s, PR#s) and software protocols were specified
and provided.

Not every card manufacturer "got it", but many did, making adding a
peripheral about as easy as it gets.
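
To make the "fully decoded" point concrete, each slot gets fixed,
documented address regions in the Apple II I/O map. A small Python
sketch of that map (the base addresses are the standard documented ones;
the helper function itself is just an illustration):

```python
# Each Apple II peripheral slot n (1-7) gets fixed, fully decoded regions:
#   - 16 "device select" soft-switch addresses at $C080 + n * $10
#   - a 256-byte card ROM page at $Cn00
# plus the shared $C800-$CFFF expansion ROM space, enabled by the
# active card.

def slot_regions(n):
    """Return the fixed device-select and ROM ranges for slot n."""
    if not 1 <= n <= 7:
        raise ValueError("Apple II peripheral slots are numbered 1-7")
    devsel = 0xC080 + n * 0x10
    rom = 0xC000 + n * 0x100
    return {
        "device_select": (devsel, devsel + 0x0F),
        "slot_rom": (rom, rom + 0xFF),
    }

# The Disk II controller traditionally lives in slot 6:
r = slot_regions(6)
print(f"devsel ${r['device_select'][0]:04X}-${r['device_select'][1]:04X}")  # $C0E0-$C0EF
print(f"rom    ${r['slot_rom'][0]:04X}-${r['slot_rom'][1]:04X}")            # $C600-$C6FF
```

Because every card can count on exactly these regions, a card's ROM
firmware works no matter which slot it is plugged into.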

I have a six degree-of-freedom robotic arm that plugs into an Apple slot.
After issuing a PR#n to the card, ROM Applesoft is extended to provide arm
control commands, courtesy of the "ARMBASIC" extensions in the card's ROM,
which uses the 6502 to directly control all the stepper motors in the arm
simultaneously!

It doesn't get easier than that--in fact, it hasn't gotten *as easy* yet...

Michael J. Mahon
Dec 7, 2014, 12:37:30 AM
Michael Black <et...@ncf.ca> wrote:
> On Sun, 7 Dec 2014, Your Name wrote:
>
>
>> Which brings it back to my original point - not all those
>> contemporaries were slotless. Both the Commodore Vic20 and C64 had a
>> slot and other ports. I think the the early Ataris also had at least
>> one slot and extra ports. I can't recall whether or not the Texas
>> Instruments computer a friend had actually had any slots.
>>
> The TI 99/4 had a slot. One reason it didn't sell well was that the
> peripherals for the slot were expensive, and that was something to do
> with the design. So it didnt' sell that well, until the price dropped to
> somewhere around a hundred dollars, at which point it became a relatively
> successful computer (at least, enough people remember it decades later).
>
> I think somewhere upthread the point was made that the multiple slots
> were important to the Apple Ii. It was probably uncommon, up to a certain
> point, for a "home computer" to not have some method of expansion. But
> just bringing out the bus lines to a connector didnt' necessarily make expansion easy.

Hear, hear!

A lot of thought and design are needed to make simple, effective interfaces
possible. Otherwise, you're just constructing an interface to a raw
microprocessor.

>> Yes, there were also a lot of crappy "computers" released around that
>> time, like those from Sinclair, which didn't have slots, but then they
>> were only really toys.
>>
> The SInclair had a slot, at least one of the SInclairs. There were
> memory expansion modules, and likely other things, that could plug in.
> Maybe the ZX80 didn't have a slot, and they added it to the 81, but I
> thought both had them.
>
> On the other hand, it apparently wasn't well designed physically, I've
> read comments about the flakieness of that connector, people having to
> add some suppor underneath the module plugged into the connector.

It was a mechanical disaster!

Michael J. Mahon
Dec 7, 2014, 12:37:31 AM
And my point is that the Apple slots are not just an _ad hoc_ "expansion
port"; they were carefully and fully designed to support truly "plug and
play" peripherals, in a way that no other computer has matched, before or
since.

Your Name
Dec 7, 2014, 1:11:26 AM
In article
<813549790439622046.2...@news.giganews.com>, Michael
J. Mahon <mjm...@aol.com> wrote:

That's not what you said, but you now want to try to move the goal
posts so you can pretend you were right all along ... whatever you want
to delude yourself with. :-\

STYNX
Dec 7, 2014, 1:46:31 AM
On Sunday, December 7, 2014 6:37:31 AM UTC+1, Michael J. Mahon wrote:
> And my point is that the Apple slots are not just an _ad hoc_ "expansion
> port", they were carefully and fully designed to support truly "plug and
> play" peripherals, in a way that no computer has done, before or since.
> --
> -michael - NadaNet 3.1 and AppleCrate II: http://home.comcast.net/~mjmahon

The slots are an integrated architecture in hardware as well as software. That is what I like about the Apple II. You can put a few pieces of hardware together to get something functional, or go the extra mile to integrate your design into the system completely. All the possibilities without the fuss... no system bus since has been so easy to use with so many possibilities. With ISA, IBM took a few ideas from the A2 and implemented a similar but simpler system-bus concept, modifying it with each model of the PC.
...
If someone calls the ZX series a toy, you can say the same about the VC20 and C64... except they had _slightly_ better concepts. My thought is that each and every computer system of the early 80s had unique concepts and designs that created specific problems in order to solve others. No system was perfect, and most were designed for a handful of specific tasks. In most cases even the designers didn't have a clue what the systems could actually be used for.

-Jonas

Bill Buckels
Dec 7, 2014, 6:55:16 AM
"Michael J. Mahon" <mjm...@aol.com> wrote:
>I wish they'd just bypassed the two conversions and delivered straight
>Apple II NTSC video out.

A far better improvement than the one that Woz thought of the other
night.

Bill


Tempest
Dec 7, 2014, 10:11:38 AM
On Sunday, December 7, 2014 6:55:16 AM UTC-5, Bill Buckels wrote:
> "Michael J. Mahon" wrote:
> >I wish they'd just bypassed the two conversions and delivered straight
> >Apple II NTSC video out.
>
> A far better improvement than the one that WOZ thought about the other
> night.
>
> Bill

Is that possible to do? The reason I still have my IIe out is because the colors of the HGR and DHGR graphics on the IIgs are off and look cruddy. Some games like Zork Zero aren't even playable because you can't read the DHGR text.

David Schmenk
Dec 7, 2014, 11:22:31 AM