Is it time for another Forth chip?


Wayne morellini

unread,
Jun 7, 2022, 5:52:48 AM6/7/22
to
I know we are waiting to hear what the 6 GHz chip Stephen has been working with will turn out like, and what GreenArrays will release for the glasses (the type of thing that demands an advanced design). But recently I saw a document on colorForth for ARM, with comparisons to SwiftForth etc., which got me wondering about a lower-end design. Now, with the passing of Dr. Ting, it reminds me of the MuP21 he had that kicked things off, and Jeff's work later. Isn't it time we had something more like these designs, upgraded? 16-bit or larger versions?

minf...@arcor.de

unread,
Jun 7, 2022, 12:38:59 PM6/7/22
to
Wayne morellini wrote on Tuesday, 7 June 2022 at 11:52:48 UTC+2:
> I know we are waiting to hear what the 6 GHz chip Stephen has been working with will turn out like, and what GreenArrays will release for the glasses (the type of thing that demands an advanced design). But recently I saw a document on colorForth for ARM, with comparisons to SwiftForth etc., which got me wondering about a lower-end design. Now, with the passing of Dr. Ting, it reminds me of the MuP21 he had that kicked things off, and Jeff's work later. Isn't it time we had something more like these designs, upgraded? 16-bit or larger versions?

A MicroPython chip will be more attractive, I guess.

Paul Rubin

unread,
Jun 7, 2022, 2:58:01 PM6/7/22
to
Wayne morellini <waynemo...@gmail.com> writes:
> Isn't it time we had something more like these designs upgraded? 16
> bit or more versions?

If you want to build it, no one is stopping you! There is a free fab
service from Google that you probably know about, for FOSS projects. I
think it would mostly be a research-ish project to see what performance
you can get with limited hw resources, compared with conventional
designs. A b16-like Forth macrocell might be more directly useful for
people making their own chips.

For anyone mostly interested in Forth as a programming language, special
chips are likely too much hassle to even think about, without some
overwhelming advantage over regular chips.

David Schultz

unread,
Jun 7, 2022, 5:13:08 PM6/7/22
to
On 6/7/22 4:52 AM, Wayne morellini wrote:
> I know we are waiting to hear what the 6 GHz chip Stephen has been working with will turn out like, and what GreenArrays will release for the glasses (the type of thing that demands an advanced design). But recently I saw a document on colorForth for ARM, with comparisons to SwiftForth etc., which got me wondering about a lower-end design. Now, with the passing of Dr. Ting, it reminds me of the MuP21 he had that kicked things off, and Jeff's work later. Isn't it time we had something more like these designs, upgraded? 16-bit or larger versions?

The first step would be to implement your design on an FPGA to see how it
works. Then, if it works and there is some perceived need for more
speed, and a market to justify the expense, go full custom silicon. Not
standard cell please. That is a close cousin of FPGA.

The tools to dabble in custom silicon are out there. I used Magic in
grad school long ago and it is still kicking and screaming on Linux.
IRSIM is the logic-level simulation tool for that, and it is also still
around. I used Magic on a 2 µm process, but the idea was that the design
rules were scalable. I think that assumption died a while back.

http://opencircuitdesign.com/


Alas the tiling tools (Lager) I used back then are harder to find.


Start with an FPGA. Jumping to full custom is going to be a big step
because you will almost certainly need a cache with all its complexity
if you use off chip memory. An FPGA might be slow enough that SRAM can
keep up.


--
http://davesrocketworks.com
David Schultz

Wayne morellini

unread,
Jun 8, 2022, 6:39:23 AM6/8/22
to
A fancier fashion accessory. Why waste time on a dead-end performance implementation that could never be as good?

Wayne morellini

unread,
Jun 8, 2022, 7:59:50 AM6/8/22
to
On Tuesday, June 7, 2022 at 7:52:48 PM UTC+10, Wayne morellini wrote:
>

I wasn't expecting such negative answers, but more community spirit. If I wanted to do my own design, I would do it under agreement; this is more about community design. I remember when I wanted to implement my own uniquely improved design, and people wanted to seize control of it. In real life money flows one way; in business money tends to flow the opposite way (that's not a Chinese proverb, that's just me writing). :)

I had thought of colorForth as a sort of model of a stripped-down x86 mode, like ARM has Thumb, where the instructions can use existing instruction circuitry. But it could also be an alternative to Thumb on the ARM too: a future basis they could make a super-lightweight computing chip (SLWC) out of. Not as light as the recent MISCs, but able to be put in arrays, and used in a similar way that ARM is used in custom chips, so that companies could incorporate it into many forms of chips.

So: a cell, ISA and interfacing spec (basically what they use for ARM chips), and pursue a manufacturer to kick off with an I/O logic processor version (also suitable in arrays), a minimal microcontroller (also suitable as an array controller) and a full system-on-a-chip (also useful to have an array it controls), in 16- and 32-bit versions. All the same two cells with different attachments. To kick things off, and attract other supporters instead of RISC-V and ARM. Basically, largely coming in at the low end, with the 32-bit being a simpler alternative to the high end. There is no reason you could not reliably emulate desktop OS administration and security. Sort of like a KaiOS/Firefox OS smartphone compared to an Android phone: the simpler one getting a larger share of the lower-end product category.

As far as my own goes: it's unaffordable, but maybe in the future, if success is already obtained, I could afford it. Isn't the Google service non-commercial open source or something? Maybe that helps here, but not for myself. Control of the design, and use of resources to enhance the design and market, is needed to maximise public good here. But, 2 micron chips? I'm talking about modern process sizes, certainly not less than the 180 nm which is currently used for MISC.

Commercially, myself, I am more interested in simple printed circuits at 2.5 micron to 1 micron. I think I finally have a clean-room solution to do this at home. That's more doable on an individual's level. $1 billion versus $100: it doesn't matter so much about making chips so small if the cost of making them is cheap enough. I tried to contact Jeff a long time ago about a separate technology, which might even be suitable for 180 nm, where they could roll-stamp a circuit. I was envisioning printing on plastic sheets, and stacking them together to make a super handheld game system. If you wanted to use normal silicon chip process structures, there are other sheets. A thin stack gives you the whole system. This sort of tech is probably more viable for GA's current node sizes. I know somebody with a number of chip ovens in his low-throughput factory. If he can do it, GA could as well. There are places out there, just like with the Swiss watch manufacturers and 180 nm process nodes. Imagine if they could make an all-in-one system board solution in 1 inch, for a fraction of the price compared to using conventional techniques and their normal process node: a big saving for them and their clients. But that is only for low-clock, low-energy applications, which is most things. High concentrations of processing require less stacking, different stack materials, and more cooling. So, you can apply this to a lot more situations requiring significant processing loads too, before you have to resort to conventional chips. My design drafting for my retro gaming computer and processing designs is turning up a lot of simple alternative ways of doing things. So, a lot of things can be done.

Anyway, I remember Jeff telling me he couldn't get any further help with money for chips from his family, unless it was the kind you get with fish. Considering what else happened to him with his prototype chip run, it was a real loss to the computer industry that he couldn't do the F series of processors. He had good intent and ideas, and his wafer arrays. Since my father unexpectedly died recently, before I could get back in to spend time with him, I am also coming to a similar dilemma. Even if the Lyme treatment seems to be working very well, I'm now neurologically handicapped compared to my past work, but at least I've got spades of past work to think about. Jeff had ability, lost money and got sick; I've lost ability, and am getting better but losing finance at the wrong time.

Anyway, this is not about what I can do and my work. It's about the community. There is a guy with a channel called the Unknown Cat that I've seen, and he was saying he's getting cranky and slowing down because he's getting older. Well, there seems to be a lot of that going around here, instead of community.

Everybody losing their edge or mind is getting negative, from what I've seen.

Why bother with the b16 (I understand that is the one which was cut down for FPGA), instead of, say, Dr. Ting's 16- or 32-bit designs, or Jeff's VLSI design? What's wrong with using those compared to the modern MISC instruction set format?

FPGA makes more sense when the product is FPGA-based, or you are prototyping towards a final product. There were MISC chip simulators out there, once you go through extending Forth to simulate the logic of the instruction set architecture.

Anyway, some thoughts.

minf...@arcor.de

unread,
Jun 8, 2022, 8:24:51 AM6/8/22
to
Just because you don't agree with what others wrote, you call them negative?

Isn't it at all positive that they "warned" you before entering a possible dead end?

Rick C

unread,
Jun 8, 2022, 8:52:08 AM6/8/22
to
On Tuesday, June 7, 2022 at 5:52:48 AM UTC-4, Wayne morellini wrote:
> I know we are waiting to hear what the 6 GHz chip Stephen has been working with will turn out like, and what GreenArrays will release for the glasses (the type of thing that demands an advanced design). But recently I saw a document on colorForth for ARM, with comparisons to SwiftForth etc., which got me wondering about a lower-end design. Now, with the passing of Dr. Ting, it reminds me of the MuP21 he had that kicked things off, and Jeff's work later. Isn't it time we had something more like these designs, upgraded? 16-bit or larger versions?

I saw your last post, but let me respond to this one. There have been many stack chips, even if virtually no Forth chips (a pedantic distinction, I know). You seem to want to fix a problem in some previous chip without indicating what needs to be addressed. If the MuP21 was a good CPU chip, why is it no longer in production? Without an understanding of that, why do you think a redo would be successful?

I also don't understand the interest in a 16 bit chip. In the microprocessor world, nearly all devices are 4/8, 16 or 32 bits. Over time the trends which have emerged are that 4/8 bit MCUs are still dominant in low end applications where cost is the determinant. 32 bit processors have come down in price enough to dominate in nearly every other application space. 16 bit processors are actually becoming less popular, both in percentages of design wins and numbers of devices shipped. There is no reason to think the applications for stack processors would be any different. A 16 bit device is the last design I would consider... well, maybe next to last, after the 18 bit design.

The next stack processor that should be marketed is a 32 bit design... IF there is a genuine interest in selling the chip to a general market. It also needs a number of features lacking in most stack processors.

1) A crystal clock oscillator, if not two: 32.768 kHz and some high frequency such as 12.0 MHz. While an asynch processor is a great idea, applications are seldom time invariant, so they need clocking at some point in the design. Possibly this clock can be included in an external interface, but if no such interface exists, then a high-speed clock is needed in the chip. A low-speed clock is useful for wake-up calls to poll the environment for activity. If the activity has a signal that can wake up the CPU, fine, but often the polling is needed to check an analog status or the state of a noisy I/O, such as a button, that you don't want to wake up for every transition.

2) 3.3V I/Os, if not 5V tolerant. Having to add parts to a design to provide level shifting is an absurdity in today's MCU world. With the vast majority of MCUs having 3.3V I/O capability, if not 5V tolerance, it is hard for a device which requires level shifting to compete.

3) There are others, but this is the real issue with a stack processor... C-based development tools. Yes, this may be blasphemy in this group, but if an MCU is going to succeed in the world today, it must support the language that the vast majority of embedded programmers use. There must be the debugging tools the world is used to. Such a chip must allow the world to use it the way *they* are accustomed to working. The conventional Forth programming paradigm is too radical, and too crude (compared to what they typically use) for widespread adoption in the rest of the world. There is no CPU architecture that is going to make programmers abandon 50 years of progress in development techniques.

Just my opinion with a few facts.

--

Rick C.

- Get 1,000 miles of free Supercharging
- Tesla referral code - https://ts.la/richard11209

Wayne morellini

unread,
Jun 8, 2022, 9:21:39 AM6/8/22
to
Don't you think that's a bit of a twist? They cut across the conversation negatively, towards incompatible directions. This is not a fantasy where down is dysfunctionally up and up is dysfunctionally down; harder and more limited is better, and actual better is harder and more limited; community becomes non-community, and non-community becomes community. It's extremely short-sighted and, in some places, the reverse.

Rick C

unread,
Jun 8, 2022, 9:55:40 AM6/8/22
to
What you say is so right, which must mean it is all wrong. Or is that backwards?

--

Rick C.

+ Get 1,000 miles of free Supercharging
+ Tesla referral code - https://ts.la/richard11209

minf...@arcor.de

unread,
Jun 8, 2022, 10:12:52 AM6/8/22
to
Second this. In addition, minimal power consumption, idle states, and flexible interrupt
handling capabilities could be some more selling points. Since Forth code can be extremely
compact, memories can be held relatively small.

Still, one would have to be able to beat e.g. the Arduino Pico.

If you leave out such "annoying edge computing facilities", one would have to race against the ATtiny4.
Guess who'll win.

Wayne morellini

unread,
Jun 8, 2022, 10:27:04 AM6/8/22
to
Facts: hardly anything you said is not accommodated in what I'm saying. You can make it level tolerant. The starting point is not the finish. How many of those 4-bit CPUs are C processors? C is doable here too. This is exclusively Forth country, a Forth-usage-building exercise, where success means work for people here, and more new Forth programmers. C didn't completely stop other programming where required in the past. In the Arduino-like space it's a novel, marketable challenge, and the space is ripe for modular future development of small-manufacturer mass CE products, which are hard to source parts for cheaply at lower volumes, or even to find somebody to talk to about buying them. Here we get towards commodity chips and modules that are like chip packages in size. But that all requires foresight in an ecosystem developer. How many of those 4-bit processors can you get that are smaller than a 1000-transistor-plus 16-bit processor, where you don't have to use the extra lines and can treat it like a 5-bit processor, or 4 bits? So, 4 bits often has no real advantage, and at times the 16 bits can offer a lot more flexibility and speed. Now, my original retro design proposal was an 8-bit processor with single-thread 16-bit-like performance; I could do it as 4 bits, as that is the target on the high side for instruction width. You could use a setting to clip the data width to a 4- or 8-bit "thumbnail", to maintain a code repository for a 4- or 8-bit version. Many ways to do things. If the community would prefer 4 or 8 bits instead, they are welcome to it. But I think a 16-bit MISC is a healthy compromise.

Now, considering Dr. Ting's recent passing, let's not get into the reasons that the original MuP21 didn't succeed, and note that I never said revive that, but something more complete, and I did point to Jeff's later work, which was a significantly useful improvement. But is there really any reason that Ting's P16 and P32 aren't good starting places? The suitability of the instruction set architecture is more important, as the underlying circuit can be completely redesigned at any point after. On the other hand, if the ISA isn't perfect, then compatibility will pay the price when it is changed. We are only talking about a usable, sellable proof of concept to start with. A first manufacturer could completely design their own circuit to fit the ISA, if they want. The actual circuit is a little moot, but good if it also is very ideal. As long as none of this is like the short-memory MISC array programming/communications model; with that I'm not happy. My own proposals completely get away from that complexity, at greatly increased throughput and with the reduced power consumption of pass-through communications. Hence, that's mine, with several other major enhancements, for my own commercial chips. I get sick of having to talk companies into improvements. To have somebody punishing people for what are still good programming practices is intolerable. Our problems often are not the architecture.
You can tell that Jeff privately told me a number of things in this field of doing business. But another thing is that I would do the video coprocessor in a radically different way than the MuP21. I would have simple nested counters, and I'm looking at a list of display lists, to do complex things better and more simply. Radically. Even compressed lossless graphics suitable for an 8-bit-era computer. I'm going to say this because there is no market for such work, and I'm not interested in pursuing it, but I'm working out how to make a simple FPGA comparable to a blitter for an Atari VCS, adding multicoloured bitmapped graphics, and some other radical stuff, where you could make a game comparable to Sonic the Hedgehog on the VCS that looks like the Master System version. That's about as far as I can push it without putting in other processing circuits for 3D. The stuff which could potentially have been done in the 1970s and 1980s is incredible. But the VCS was made down to a price, and the 64 needed just a few more changes to get that bit more.

That's why places like .. frustrate me. You see how everything could be flipped and come out good, but you get stuff that makes people resist instead, as we both know from the past. Unfortunately, modern systems are too powerful and complex for simple things to make a significant difference outside of performance. However, retro is a niche market and can help develop a viable chip for further use. Short term, up to a million-quantity product is possible within the third release year, but my end product is still in the tens-of-millions territory. Money, brains and talent are what I'm looking for. Unfortunately, I've lost some of that, but these are still relatively simple products in untapped, high-appeal market segments. That's called promise, in mass-marketing terms.

Facts and certainty.


Wayne morellini

unread,
Jun 8, 2022, 10:35:12 AM6/8/22
to
Thanks for that. I'll have to have a look at the AVR stuff. The thing we offer is better performance for less transistors on the ground, equalling less cost. I would be interested in finding out about anything that can clearly best that, or match it. But even so, if this just gets close enough then it would probably normally get picked up instead.

Paul Rubin

unread,
Jun 8, 2022, 8:40:46 PM6/8/22
to
Wayne morellini <waynemo...@gmail.com> writes:
> Commercially, myself, I am more interested in simple printed circuits
> at 2.5 micron to 1 micron. I think I finally have a clean-room
> solution to do this at home. That's more doable on an individual's
> level.

I couldn't read that whole long post, but tons of very interesting chips
were done in 2.5 micron and larger sizes. The Mead-Conway revolution
happened in the 3 to 5 micron era. There was a #homecmos channel on
freenode for a while that was trying to do stuff at, iirc, something
like 12 microns. If you have an affordable way to make 2.5 micron chips
at home, that is really quite revolutionary, especially if you can do
mixed signal.

As for a MISC-like chip though, yes of course there is skepticism: who
would want a thing like that, at least as a separate chip rather than a
macro cell? DIY satisfaction (and I'm all in favor of that) seems like
the main reason to make it, unless you've got some pretty clear and
quantitative claims about how it would outperform conventional chips at
some meaningful application.

This morning I was thinking it would be interesting to have a chip with
wide but single core SMT, to allow super fast coroutine switching
without having to save and restore registers, sort of like a Forth
multitasker (that only has a couple of registers to save and restore, so
the task switcher is fast), but allowing something more like
conventional OS's and compilers, which do use lots of registers. That
might be more interesting than a MISC chip. It could be a RISC-V with
a special instruction added to call another context, and there might be
8 or so contexts available in a core, sort of like the Parallax
Propeller.
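Here is a minimal C sketch of that idea, just to make it concrete. The
context count, register-file size, and the "ctxcall" name are made up for
illustration; they are not from any real ISA or part.

    /* Toy model of single-core SMT with banked register files: a "context
       call" only changes which bank is live, so nothing is saved or restored. */
    #include <stdio.h>
    #include <stdint.h>

    #define NCTX  8    /* hypothetical contexts per core, Propeller-style */
    #define NREGS 32   /* registers per context */

    struct context {
        uint32_t pc;
        uint32_t reg[NREGS];
    };

    static struct context ctx[NCTX];
    static int current = 0;

    /* The hypothetical "ctxcall" instruction: hardware could do this in one
       cycle, because every context keeps its own register bank. */
    static void ctxcall(int target)
    {
        current = target;   /* no register save/restore needed */
    }

    int main(void)
    {
        ctx[0].reg[1] = 42;
        ctx[3].reg[1] = 7;
        ctxcall(3);
        printf("context %d sees r1 = %u\n", current, ctx[current].reg[1]);
        ctxcall(0);
        printf("context %d sees r1 = %u\n", current, ctx[current].reg[1]);
        return 0;
    }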

Paul Rubin

unread,
Jun 8, 2022, 8:57:48 PM6/8/22
to
Wayne morellini <waynemo...@gmail.com> writes:
> Thanks for that. I'll have to have a look at the AVR stuff. The
> thing we offer is better performance for less transistors on the
> ground, equalling less cost. [2 typos fixed, I think]

ALU transistor costs in MCU's today are almost irrelevant compared to
transistors driving i/o pins, transistors in memory arrays, and mixed
signal peripherals. Raspberry Pi Foundation decided to make its own MCU
a few years ago (the RP2040). It sells for $1 retail, it has two 32-bit
ARM cores, 264KB of ram, its own weird programmable digital i/o
peripheral, A-D converters, timers, etc. The ARM cores are tiny
compared to the ram array and the other stuff. If you decreased the
transistor count in the ARM cores to zero by replacing ARM with a stack
architecture, the chip cost wouldn't decrease by enough for anyone to
care.

I wouldn't pay attention to AVR when making comparisons to your MISC
part. I do like AVR8 but really it's a legacy design. You have to
compare to ARM, RISC-V, and the like.


> I would be interested in finding out about anything that can clearly
> best that, or match it.

Before anyone can suggest things better than X, you first have to
tell us what X is. X is your MISC design and V is the conventional
thing (RISC-V, say). Does X use fewer transistors? Don't care (see
above), ALU transistors are free. If your rationale involves transistor
count then you have already declared irrelevance. Does X provide more
operations per unit of energy? That is much more interesting. GA at
least focused on that. But you have to give us some numbers for X,
since you are the one who knows anything about it.
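To make the operations-per-unit-of-energy comparison concrete, here is a
trivial C calculation; the throughput and power numbers below are
placeholders I made up, not measurements of any real part:

    /* Compare two hypothetical MCUs by operations per joule.
       All figures are illustrative assumptions, not datasheet values. */
    #include <stdio.h>

    int main(void)
    {
        double x_ops_per_s = 100e6, x_watts = 0.005;  /* "X": the MISC design  */
        double y_ops_per_s = 200e6, y_watts = 0.020;  /* "Y": conventional RISC */

        printf("X: %.2e ops/joule\n", x_ops_per_s / x_watts);
        printf("Y: %.2e ops/joule\n", y_ops_per_s / y_watts);
        return 0;
    }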

We do have the GA chips, which were done by smart people. Do they have
the advantage that GA claimed? I am not sure, but can provisionally
accept "yes". Is the advantage so overwhelming as to get people to use
GA144's in favor of the stuff they are used to? Apparently not.

Paul Rubin

unread,
Jun 8, 2022, 9:03:51 PM6/8/22
to
Wayne morellini <waynemo...@gmail.com> writes:
> Facts: hardly anything you said is not accommodated in what I'm
> saying. You can make it level tolerant. The starting point is not
> the finish. How many of those 4-bit CPUs are C processors?

I couldn't read that very long post either (could you try shorter
ones?), but there really are no 4 bit processors these days, except
maybe as sequencer cells inside ASIC's. The Padauk PA150 is a super
cheap 8 bit processor, costing 3 cents (0.03 USD) retail in small
quantities. It can run C code. I was interested in it, but I figure I
can afford to splurge and get the more powerful 10 cent version.

The AVR is a step up from the Padauk and is also 8 bits and was designed
to run C. It has 32 8-bit registers and is a convenient compiler
target. The Padauk is a minimal load-store design with a single
accumulator, so not that great for HLL's, but people do use it.

dxforth

unread,
Jun 9, 2022, 12:45:27 AM6/9/22
to
On 9/06/2022 10:57, Paul Rubin wrote:
>
> I do like AVR8 but really it's a legacy design.

'The infliction of cruelty with a good conscience is a delight to
moralists. That is why they invented AVR8.' - Bertrand Russell

Rick C

unread,
Jun 9, 2022, 3:15:10 AM6/9/22
to
The "idle states" are probably not required, IF a design technique is used like the F18, where CPUs are automatically stopped when there is no further data for them to process. The "idle state" is basically OFF with no penalty for restarting.

Forth code can be smaller, but "extremely" is a bit of an overstatement I think.


> Still, one would have to be able to beat e.g. the Arduino Pico.

Beat it in what regard exactly? Do you mean power consumption?


> If you leave out such "annoying edge computing facilities", one would have to race against the ATtiny4.
> Guess who'll win.

Not sure what you are saying with this.

--

Rick C.

-- Get 1,000 miles of free Supercharging
-- Tesla referral code - https://ts.la/richard11209

minf...@arcor.de

unread,
Jun 9, 2022, 3:34:47 AM6/9/22
to
A hypothetical Forth chip could IMO only be successful in the embedded or microcontroller domain.
There flexible IOs to communicate with the outer world (edges) are essential. This has to be reflected
in the chip design.
For instance the Arduino Pico offers
https://arduino-pico.readthedocs.io/en/latest/pins.html
Or, from the ATtiny family, the small ATtiny4
https://www.microchip.com/en-us/product/ATTINY4


Rick C

unread,
Jun 9, 2022, 3:45:24 AM6/9/22
to
Virtually every coffee maker, microwave oven, and remote control has a 4 bit processor in it. There's only one reason why anyone would use a 4 bit MCU, because it costs less. So your claims that the 4 bit devices are dead are, as Mark Twain said, "greatly exaggerated". You should try to keep in mind that your experiences are quite different from those of a Chinese designer who is buying at a 1E6-piece price.

Rolling your own processor at home is a nice idea. Lots of potential fun in that. But it is never going to compete with commercial products. While micron-level designs can have low leakage current, they will be very power hungry when active. The power consumption in CMOS devices is from charging and discharging capacitance in the circuit. Capacitance of a 1 micron feature size design is going to be more than an order of magnitude higher than the rather obsolete 180 nm process used for the GA144, and even worse in comparison to more modern processes used for MCUs.
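The first-order model behind that claim is P = alpha * C * V^2 * f. A quick
C sketch with assumed capacitance and voltage figures (illustrative only,
not characterised values for any real process) shows how strongly feature
size dominates:

    /* First-order CMOS dynamic power: P = alpha * C * V^2 * f.
       The capacitance and voltage values below are assumptions for
       illustration only, not figures for any particular process. */
    #include <stdio.h>

    static double dyn_power(double alpha, double c, double v, double f)
    {
        return alpha * c * v * v * f;
    }

    int main(void)
    {
        double alpha = 0.1;    /* assumed average switching activity */
        double f     = 50e6;   /* 50 MHz clock */

        double c_1um   = 2e-9,  v_1um   = 5.0;  /* ~1 micron era: big C, 5 V */
        double c_180nm = 1e-10, v_180nm = 1.8;  /* 180 nm: smaller C, 1.8 V  */

        printf("~1 um design:  %.3f W\n", dyn_power(alpha, c_1um,   v_1um,   f));
        printf("180 nm design: %.4f W\n", dyn_power(alpha, c_180nm, v_180nm, f));
        return 0;
    }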

--

Rick C.

-+ Get 1,000 miles of free Supercharging
-+ Tesla referral code - https://ts.la/richard11209

Rick C

unread,
Jun 9, 2022, 4:02:07 AM6/9/22
to
I rather thought that was obvious. I didn't realize that needed to be pointed out to anyone.


> There flexible IOs to communicate with the outer world (edges) are essential. This has to be reflected
> in the chip design.

Again, obvious. That's why I mentioned the need for 3.3V I/Os and 5V tolerance. I'm also picturing a GA144 like structure, so dedicated peripherals are not needed. One of the few things the GA144 got right is the ability to use processors as software based peripherals.


> For instance the Arduino Pico offers
> https://arduino-pico.readthedocs.io/en/latest/pins.html
> Or, from the ATtiny family, the small ATtiny4
> https://www.microchip.com/en-us/product/ATTINY4

Yeah, the GA144 could emulate pretty much any common peripheral with the I/O CPUs. They seem to have munged the dedicated memory interface so it would not mate well with DRAM, so instead they developed apps with very expensive and power hungry static RAM (also very low density and on the way out). I tried to see if I could figure out how to use it with DRAM (you don't actually have to clock DRAM as fast as it can possibly run, you just need to meet all the timing specs). But at one point I needed to understand the timing between the three CPUs that were handling the interface, including the timing of the internal comms between them. They would not release any internal timing info. I was told to "play" with the chips to see if the design would work. That's the exact opposite of how design work should proceed. Why would anyone invest time and effort into a design if the data sheet says it is not possible? If they won't provide that data, I'm sure as heck not going to waste my time to measure it for them.
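To illustrate the point about meeting the timing specs: the minimum timings
(tRCD, tRP, tRAS) are lower bounds in nanoseconds, so a slow controller
clock satisfies them trivially; what still binds is the refresh interval,
which is an upper bound. A quick C check with typical-looking but assumed
numbers (not taken from any particular DRAM datasheet):

    /* Sanity-check DRAM timing at a slow controller clock.  Minimum timings
       are lower bounds in ns (easy at a slow clock); the refresh interval is
       an upper bound that must still be honoured.  All figures below are
       assumed/illustrative. */
    #include <stdio.h>

    int main(void)
    {
        double clk_hz   = 10e6;          /* slow 10 MHz controller clock */
        double cycle_ns = 1e9 / clk_hz;  /* 100 ns per cycle */

        double trcd_ns = 15, trp_ns = 15, tras_ns = 40;  /* assumed minimums */
        double refresh_ms = 64;          /* assumed: 8192 rows per 64 ms */
        int    rows = 8192;

        /* a minimum of t ns needs ceil(t / cycle_ns) cycles */
        printf("tRCD: %d cycle(s)\n", (int)((trcd_ns + cycle_ns - 1) / cycle_ns));
        printf("tRP:  %d cycle(s)\n", (int)((trp_ns  + cycle_ns - 1) / cycle_ns));
        printf("tRAS: %d cycle(s)\n", (int)((tras_ns + cycle_ns - 1) / cycle_ns));

        /* the refresh spec is a maximum: about one row every 7.8 us here */
        double per_row_ns = refresh_ms * 1e6 / rows;
        printf("refresh one row every %.1f us (budget of %.0f cycles)\n",
               per_row_ns / 1000.0, per_row_ns / cycle_ns);
        return 0;
    }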

--

Rick C.

-+- Get 1,000 miles of free Supercharging
-+- Tesla referral code - https://ts.la/richard11209

none albert

unread,
Jun 9, 2022, 4:19:48 AM6/9/22
to
In article <1cff2504-294a-4be1...@googlegroups.com>,
Rick C <gnuarm.del...@gmail.com> wrote:
<SNIP>
>Yeah, the GA144 could emulate pretty much any common peripheral with the
>I/O CPUs. They seem to have munged the dedicated memory interface so it
>would not mate well with DRAM, so instead they developed apps with very
>expensive and power hungry static RAM (also very low density and on the
>way out). I tried to see if I could figure out how to use it with DRAM
>(you don't actually have to clock DRAM as fast as it can possibly run,
>you just need to meet all the timing specs). But at one point I needed
>to understand the timing between the three CPUs that were handling the
>interface, including the timing of the internal comms between them.
>They would not release any internal timing info. I was told to "play"
>with the chips to see if the design would work.

At one point I had the same idea, but no expertise to pursue it.
Your experience (and mine) is exactly why the GA's fail and
are going to continue failing.

> waste my time

That sums it up nicely.

>Rick C.
--
"in our communism country Viet Nam, people are forced to be
alive and in the western country like US, people are free to
die from Covid 19 lol" duc ha
albert@spe&ar&c.xs4all.nl &=n http://home.hccnet.nl/a.w.m.van.der.horst

Anton Ertl

unread,
Jun 9, 2022, 4:27:44 AM6/9/22
to
Rick C <gnuarm.del...@gmail.com> writes:
>Virtually every coffee maker, microwave oven, and remote control has a 4 bit processor in it.

Citation needed

When googling for "4-bit microcontroller market", I only find
references that don't give 4-bit market share. E.g.,
<https://www.verifiedmarketresearch.com/product/microcontroller-market/>
says

|[A microcontroller] is capable of processing a word length that ranges
|between 4-bit up to 64-bit.

But when it comes to market segmentation, one of the segmentations
they provide is into 8-bit, 16-bit, and 32-bit microcontrollers. It
seems that 4-bit microcontrollers are no longer relevant.

>There's only one reason why anyone would use a 4 bit MCU, because it costs less.

These days, the main reason for 8-bit probably is that you don't need
to redesign the thing. And for 4-bit the prime is sufficiently far in
the past that even that is not a good reason for most uses: the
designs with 4-bit controllers from maybe the 1980s have been
redesigned with 8-bit or bigger microcontrollers in the meantime.

- anton
--
M. Anton Ertl http://www.complang.tuwien.ac.at/anton/home.html
comp.lang.forth FAQs: http://www.complang.tuwien.ac.at/forth/faq/toc.html
New standard: https://forth-standard.org/
EuroForth 2022: http://www.euroforth.org/ef22/cfp.html

Jurgen Pitaske

unread,
Jun 9, 2022, 10:32:00 AM6/9/22
to
I at least found one with a quick google:
https://www.emmicroelectronic.com/product
https://www.emmicroelectronic.com/about
SWATCH might ring a bell ...
But the market is extremely narrow, and probably counted under custom silicon,
as they are probably masked, so only for very high volumes and not for the general market.

Anton Ertl

unread,
Jun 9, 2022, 11:06:03 AM6/9/22
to
Jurgen Pitaske <jpit...@gmail.com> writes:
>I at least found one with a quick google:
>https://www.emmicroelectronic.com/product
>https://www.emmicroelectronic.com/about

No mention of "4 bit" or "4-bit" (except in the context of "64 bit")
on these pages (I did not follow any links).

Wayne morellini

unread,
Jun 9, 2022, 12:36:40 PM6/9/22
to
Your following post wasn't that much shorter, Paul. Time is passing on.

Wayne morellini

unread,
Jun 9, 2022, 12:42:08 PM6/9/22
to
I was talking about a macrocell that manufacturers could use, and finding one to kick it off. Also large-I/O microcontroller and CE versions based on the same core.

Wayne morellini

unread,
Jun 9, 2022, 12:53:04 PM6/9/22
to
Paul, the transistors are consuming the energy doing the processing. That's how the GA gets its advantage. The problem is not what you are saying, it's the lack of a revolution of good design. The designs you are comparing the conventional ones to are strictly limited and difficult, not to be compared. What I'm saying is strictly more conventional outside of the Forth, making it easier to use. What I mentioned is things like Jeff's and Dr. Ting's, but at more conventional 16- and 32-bit sizes (and conventional or parasitic on-chip memory). Reading gets you all the good bits. At university I read a lot, to learn what was up. :) I think I made it clear I wasn't in favour of the reduced MISC model that the x18s now use.

Wayne morellini

unread,
Jun 9, 2022, 1:03:32 PM6/9/22
to
Well, it's just handy for a small company or hobbyist to print their complete product or circuit in-house, rather than go through the hassle of trying to get an order of the lowest-end or fastest ARM, which they then have to pick-and-place or send off to a third-party manufacturer to get made, when they don't need the low energy, small size or high speed of the commercial chip. Having said that, a MISC is so small that a machine that could pop one in is more affordable than an ARM as an alternative. There is a market for both. What I'm saying here is that you could buy a tube of 10,000 MISC chips and put a pick-and-place feeder attachment on a 3D printer. Print over the chip and circuit and you are done. There is room for both styles of technology.

Jurgen Pitaske

unread,
Jun 9, 2022, 1:14:53 PM6/9/22
to
see the data sheet:
https://www.emmicroelectronic.com/sites/default/files/products/datasheets/em6607_ds.pdf

Features
- Low Power: typical 1.8 µA active mode,
  typical 0.5 µA standby mode,
  typical 0.1 µA sleep mode,
  @ 1.5 V, 32 kHz, 25 °C
- Low Voltage: 1.2 to 3.3 V
- ROM: 2k × 16 (Mask Programmed)
- RAM: 96 × 4 (User Read/Write)
- 2 clocks per instruction cycle
- RISC architecture
- 5 software-configurable 4-bit ports
- 1 high-drive output port
- Up to 20 inputs (5 ports)
- Up to 16 outputs (4 ports)
- Buzzer, three tone
- Serial Write Buffer (SWB)
- Supply Voltage Level Detection (SVLD)
- Analogue and timer watchdog
- 8-bit timer / event counter
- Internal interrupt sources (timer, event counter, prescaler)
- External interrupt sources (port A + port C)
Description
The EM6607 is a single chip low power, mask
programmed CMOS 4-bit microcontroller. It contains
ROM, RAM, watchdog timer, oscillation detection circuit,
combined timer / event counter, prescaler, voltage level
detector and a number of clock functions. Its low voltage
and low power operation make it the most suitable
controller for battery, stand alone and mobile equipment.
The EM6607 microcontroller is manufactured using EM’s
Advanced Low Power CMOS Process.
In 24 Pin package it is direct replacement for EM6603.

Wayne morellini

unread,
Jun 9, 2022, 10:20:13 PM6/9/22
to
So, are you saying we should use this chip instead, or a 24-pin 4-bit MISC chip, or piggyback this chip on a MISC chip to give a high-energy mode, or get EM's advanced low-power CMOS process, or that EM should take over MISC and buy out GA? It's potentially so confusing! Sorry, hard to resist giving a misconstrued reply that mistakes everything, like I get around here. :)

I'd like to see somebody try to make a Linux desktop computer out of these, since people say that problems can be broken down into smaller bit sizes. But how would they fit the resulting bus worth of computer on the desk alongside an equivalent top-end PC configuration? Imagine the pipelining you would need. You might be able to run alongside the data as it propagates down to the other end of the bus, as it goes through a million-deep pipeline. I can actually do this with normal chips and beat the signal to the end. I can do that with normal high-end processors: just by standing on the chip, I can beat the signal to the other side of the chip before I even start. It's all that spooky quantum stuff, action at a distance maybe? :)

Rick C

unread,
Jun 9, 2022, 10:26:38 PM6/9/22
to
On Thursday, June 9, 2022 at 4:27:44 AM UTC-4, Anton Ertl wrote:
> Rick C <gnuarm.del...@gmail.com> writes:
> >Virtually every coffee maker, microwave oven, and remote control has a 4 bit processor in it.
>
> Citation needed

Yeah, wait here while I get that for you.


> When googling for "4-bit microcontroller market", I only find
> references that don't give 4-bit market share. E.g.,
> <https://www.verifiedmarketresearch.com/product/microcontroller-market/>
> says
>
> |[A microcontroller] is capable of processing a word length that ranges
> |between 4-bit up to 64-bit.
>
> But when it comes to market segmentation, one of the segmentations
> they provide is into 8-bit, 16-bit, and 32-bit microcontrollers. It
> seems that 4-bit microcontrollers are no longer relevant.

Not sure who "they" is. I don't put all my faith in any individual report. These reports have a target audience. If you understand the market you would understand why 4-bit processors are not very well reported.

I'm not sure what you are trying to say.


> >There's only one reason why anyone would use a 4 bit MCU, because it costs less.
>
> These days, the main reason for 8-bit probably is that you don't need
> to redesign the thing. And for 4-bit the prime is sufficiently far in
> the past that even that is not a good reason for most uses: the
> designs with 4-bit controllers from maybe the 1980s have been
> redesigned with 8-bit or bigger microcontrollers in the meantime.

You don't need to redesign what "thing"? I don't understand what you are saying. There are NEW designs that use 4-bit MCUs... the ones where they omit resistors that are not absolutely essential and cost a fraction of a penny, because of the impact on profit margin.

There is literally no reason on earth to use an 8-bit processor in a $20 coffee maker. Every piece of this appliance is cost optimized. The software running on the 4 bit processor is probably not more than 200 instructions. Zero reason to use anything other than the cheapest 4-bit MCU they can find.

https://www.walmart.com/ip/Mainstays-12-Cup-Programmable-Coffee-Maker-1-8-Liter-Capacity-Black/476343008

They also use 4-bit MCUs in remote controls, microwave ovens and many, many other relatively low cost appliances and devices. You don't have to believe it, but it is reality. When they sell millions of an item, they work hard to squeeze literally every penny out of the cost. Oh, yeah, I forgot toys! Profit margins are king there. If a product is a loser, they want to save every penny and if a product is a winner, they want to milk every penny from the many units sold.

--

Rick C.

-++ Get 1,000 miles of free Supercharging
-++ Tesla referral code - https://ts.la/richard11209

Rick C

unread,
Jun 9, 2022, 10:34:57 PM6/9/22
to
There was a 4-bit stack processor designed for such devices as keyfobs, the MARC4 by Atmel. It lasted a few years and is no longer made. I'm not sure why. The companies selling 4-bitters are mostly Chinese, no-name foundries. They are very hard to compete with on price. In return, they don't give you the time of day unless you are buying 100s of thousands to start with. That's why they don't show up on the radar.

--

Rick C.

+-- Get 1,000 miles of free Supercharging
+-- Tesla referral code - https://ts.la/richard11209

Paul Rubin

unread,
Jun 9, 2022, 11:42:08 PM6/9/22
to
Rick C <gnuarm.del...@gmail.com> writes:
> There is literally no reason on earth to use an 8-bit processor in a
> $20 coffee maker.

I have a $20 coffee maker and afaict it has no processor at all. There
is a tank of water, a heating element, and a thermostat. You close the
cover over the water tank, which seals it with a gasket, and then turn
on the heat. Steam builds up in the tank which pushes the water up
through a siphon-like tube where it drops through the coffee grounds
into the carafe. When all the water is pushed out that way, the heating
element gets above 100C, and a thermostat shuts off the power. Rice
cookers work roughly the same way.

> The software running on the 4 bit processor is probably not more than
> 200 instructions. Zero reason to use anything other than the cheapest
> 4-bit MCU they can find.

Given that a flash programmable 8 bit cpu is 3 cents retail (i.e. you
can order 100 of them for 3 bucks from lcsc.com), I don't see how much
they can save by using 4 bitters. Stuff made in really huge quantity
likely uses ASIC's rather than MCU's.

> They also use 4-bit MCUs in remote controls, microwave ovens and many,
> many other relatively low cost appliances and devices.

I'd want to see a teardown before being convinced of this. It has to be
a model introduced in the past 10 years, not something from the 1980's,
since we're talking about new designs.

The TV remote my mom uses has voice recognition. It wouldn't surprise
me if it has a 32 bit cpu.

Rick C

unread,
Jun 10, 2022, 12:00:30 AM6/10/22
to
On Thursday, June 9, 2022 at 11:42:08 PM UTC-4, Paul Rubin wrote:
> Rick C <gnuarm.del...@gmail.com> writes:
> > There is literally no reason on earth to use an 8-bit processor in a
> > $20 coffee maker.
> I have a $20 coffee maker and afaict it has no processor at all. There
> is a tank of water, a heating element, and a thermostat. You close the
> cover over the water tank, which seals it with a gasket, and then turn
> on the heat. Steam builds up in the tank which pushes the water up
> through a siphon-like tube where it drops through the coffee grounds
> into the carafe. When all the water is pushed out that way, the heating
> element gets above 100C, and a thermostat shuts off the power. Rice
> cookers work roughly the same way.

Yeah, there's no MCU in a fork either. I would ask what your point is, but you would probably try to explain it.


> > The software running on the 4 bit processor is probably not more than
> > 200 instructions. Zero reason to use anything other than the cheapest
> > 4-bit MCU they can find.
> Given that a flash programmable 8 bit cpu is 3 cents retail (i.e. you
> can order 100 of them for 3 bucks from lcsc.com), I don't see how much
> they can save by using 4 bitters. Stuff made in really huge quantity
> likely uses ASIC's rather than MCU's.

You are right. You don't see.


> > They also use 4-bit MCUs in remote controls, microwave ovens and many,
> > many other relatively low cost appliances and devices.
> I'd want to see a teardown before being convinced of this. It has to be
> a model introduced in the past 10 years, not something from the 1980's,
> since we're talking about new designs.
>
> The TV remote my mom uses has voice recognition. It wouldn't surprise
> me if it has a 32 bit cpu.

LOL! More likely it also has a wifi connection to the Internet!!! LOL

Sometimes you are so funny. You crack me up!

--

Rick C.

+-+ Get 1,000 miles of free Supercharging
+-+ Tesla referral code - https://ts.la/richard11209

Jurgen Pitaske

unread,
Jun 10, 2022, 2:24:30 AM6/10/22
to
No, this was for Anton to find this 4 bit documentation at EM, and for others who want to see products out there now.

4 bit is useful where needed, and mostly for cost and volume.
Any Forth chip should have at least 16 bits.

dxforth

unread,
Jun 10, 2022, 3:13:05 AM6/10/22
to
On 10/06/2022 13:42, Paul Rubin wrote:
>
> The TV remote my mom uses has voice recognition. It wouldn't surprise
> me if it has a 32 bit cpu.

I'd settle for rubber buttons that function in the cold. Shouldn't have
to treat remotes like pampered poodles.

Anton Ertl

unread,
Jun 10, 2022, 3:28:26 AM6/10/22
to
Rick C <gnuarm.del...@gmail.com> writes:
>On Thursday, June 9, 2022 at 4:27:44 AM UTC-4, Anton Ertl wrote:
>> Rick C <gnuarm.del...@gmail.com> writes:
>> >Virtually every coffee maker, microwave oven, and remote control has a 4 bit processor in it.
>>
>> Citation needed
>
>Yeah, wait here while I get that for you.
>
>
>> When googling for "4-bit microcontroller market", I only find
>> references that don't give 4-bit market share. E.g.,
>> <https://www.verifiedmarketresearch.com/product/microcontroller-market/>
>> says
>>
>> |[A microcontroller] is capable of processing a word length that ranges
>> |between 4-bit up to 64-bit.
>>
>> But when it comes to market segmentation, one of the segmentations
>> they provide is into 8-bit, 16-bit, and 32-bit microcontrollers. It
>> seems that 4-bit microcontrollers are no longer relevant.
>
>Not sure who "they" is.

In this case https://www.verifiedmarketresearch.com

>I don't put all my faith in any individual report. These reports have a
>target audience. If you understand the market you would understand why
>4-bit processors are not very well reported.

So you claim that 4-bit processors are relevant, don't want to provide
any evidence for that, dismiss evidence against it out of hand, and
then say that one has to be in the know to understand why there is no
evidence for your claims. Maybe you should found the church of 4-bit
processing.

>> These days, the main reason for 8-bit probably is that you don't need
>> to redesign the thing. And for 4-bit the prime is sufficiently far in
>> the past that even that is not a good reason for most uses: the
>> designs with 4-bit controllers from maybe the 1980s have been
>> redesigned with 8-bit or bigger microcontrollers in the meantime.
>
>You don't need to redesign what "thing"?

Whatever things still use 8-bit MCUs. Maybe "every coffee maker,
microwave oven, and remote control".

>There are NEW designs that use 4-bit MCUs...

That's what you claim.

>You don't have to believe it, but it is reality.

You don't provide any evidence, so the only choice is to believe or
not to believe.

none albert

unread,
Jun 10, 2022, 4:41:01 AM6/10/22
to
In article <9b86fdb3-8c42-4954...@googlegroups.com>,
Jurgen Pitaske <jpit...@gmail.com> wrote:
<SNIP>
>Description
>The EM6607 is a single chip low power, mask
>programmed CMOS 4-bit microcontroller. It contains
>ROM, RAM, watchdog timer, oscillation detection circuit,
>combined timer / event counter, prescaler, voltage level
>detector and a number of clock functions. Its low voltage
>and low power operation make it the most suitable
>controller for battery, stand alone and mobile equipment.
>The EM6607 microcontroller is manufactured using EM’s
>Advanced Low Power CMOS Process.
>In 24 Pin package it is direct replacement for EM6603.

This is certainly a 3 eurocent processor.

Groetjes Albert

Jurgen Pitaske

unread,
Jun 10, 2022, 8:42:34 AM6/10/22
to
Not many know EM in Switzerland or have been there.

Wikipedia gives a nice overview of the 4 bit history with more known names.
How many of these are still available is a question I am not really interested in.
I have my own 16-bit MISC processor in an FPGA.
A similar version had been done about 20 years ago as an ASIC, as a student project.
It uses minimum resources, so it is not optimised for speed, but it is flexible.

List of 4-bit processors at wikipedia, copy and paste from there


[Image captions from the Wikipedia article: Intel C4004; NEC D63GS 4-bit microcontroller for infrared remote control transmission; Olympia CD700 desktop calculator using the National Semiconductor MAPS MM570X bit-serial 4-bit microcontroller; National Semiconductor MM5700CA/D bit-serial 4-bit microcontroller, 16-pin DIP]
Intel 4004
Intel 4040
TMS 1000
Atmel MARC4 core[26][27] – (discontinued: "Last ship date: 7 March 2015"[28])
Samsung S3C7 (KS57 Series) 4-bit microcontrollers (RAM: 512 to 5264 nibbles, 6 MHz clock)
Toshiba TLCS-47 series
HP Saturn
NEC μPD75X
NEC μCOM-4
NEC (now Renesas) μPD612xA (discontinued), μPD613x, μPD6x[23][29] and μPD1724x[30] infrared remote control transmitter microcontrollers[31][32]
EM Microelectronic-Marin EM6600 family,[33] EM6580,[34][35] EM6682,[36] etc.
Epson S1C63 family
National Semiconductor "COPS I" and "COPS II" ("COP400") 4-bit microcontroller families[37]
National Semiconductor MAPS MM570X
Sharp SM590/SM591/SM595[38]: 26–34 
Sharp SM550/SM551/SM552[38]: 36–48 
Sharp SM578/SM579[38]: 49–64 
Sharp SM5E4[38]: 65–74 
Sharp LU5E4POP[38]: 75–82 
Sharp SM5J5/SM5J6[38]: 83–99 
Sharp SM530[38]: 100–109 
Sharp SM531[38]: 110–118 
Sharp SM500[38]: 119–127  (ROM 1197×8 bit, RAM 40×4 bit, a divider and 56-segment LCD driver circuit)
Sharp SM5K1[38]: 128–140 
Sharp SM4A[38]: 141–148 
Sharp SM510[38]: 149–158  (ROM 2772×8 bit, RAM 128×4 bit, a divider and 132-segment LCD driver circuit)
Sharp SM511/SM512[38]: 159–171  (ROM 4032×8 bit, RAM 128/142×4 bit, a divider and 136/200-segment LCD driver circuit)
Sharp SM563[38]: 172–186 

S Jack

unread,
Jun 10, 2022, 10:21:30 AM6/10/22
to
On Thursday, June 9, 2022 at 9:26:38 PM UTC-5, gnuarm.del...@gmail.com wrote:
>
> There is literally no reason on earth to use an 8-bit processor in a $20
> coffee maker. Every piece of this appliance is cost optimized. The
> software running on the 4 bit processor is probably not more than 200
> instructions. Zero reason to use anything other than the cheapest 4-bit
> MCU they can find.

It may take a few more instructions to get the coffee maker to talk to the cup.
--
me

Wayne morellini

unread,
Jun 10, 2022, 12:14:24 PM6/10/22
to
Sorry, it was humour at the type of post you get around here, and on other similar technical forums, which just mistakes things in a chaotic fashion rather than appropriately assessing them.

Wayne morellini

unread,
Jun 10, 2022, 12:28:39 PM6/10/22
to
Anyway, I'm going to have to leave you guys. My father passed away unexpectedly last month, after I was not able to see him more than a handful of times in the last 6 months due to covid restrictions, and not able to talk with him on the phone due to his hearing. Now there are unnecessary petty will games, and the stress is getting too much, after a number of other overlapping stressful things trying to take advantage of my disability, which has partly lifted a bit for now, to address this. Anybody know of a civil causes body that might want to invest the odd million? Then I might be able to sort out everything which has been deliberately done to me.

But keep up the conversation; it almost seems like people favour a 4-bit MISC version so far. :)

Thanks.

Wayne morellini

unread,
Jun 10, 2022, 12:32:41 PM6/10/22
to
I'm basically not going to get any commercial work done with all these people deliberately interfering with my life. What a waste. They cost much more than they gain.

Jurgen Pitaske

unread,
Jun 10, 2022, 2:48:09 PM6/10/22
to
I hope you can sort your private issues soon. All the best.

Rick C

unread,
Jun 10, 2022, 4:35:59 PM6/10/22
to
That is simply an opinion. There's no reason why an 8- or even 4-bit CPU can't be a stack processor and be useful to those who build things with 4- and 8-bit CPUs. Heck, it might turn out that a 5-bit CPU is preferable. My stack processor is data path agnostic. It can have any size instruction (if you recode them to suit), data path, or address path. There is no direct dependence between the three.

--

Rick C.

++- Get 1,000 miles of free Supercharging
++- Tesla referral code - https://ts.la/richard11209

Rick C

unread,
Jun 10, 2022, 4:46:36 PM6/10/22
to
Yes,

> don't want to provide
> any evidence for that,

Correct, because anyone familiar with the world of 4-bit processors understands there is little evidence, unless, as I've said, you want to actually build a product and contact the makers directly. People like TI don't want to bother with 4-bit MCUs because the profit margins are too small for them to make a profit.


> dismiss evidence against it out of hand,

Except no evidence was provided. I explained how they don't show on the same radars as 8, 16 and 32 bit MCUs.


> then say that one has to be in the know to understand why there is no
> evidence for your claims. Maybe you should found the church of 4-bit
> processing.

Indeed. I simply have been paying attention to that world long enough to know. You seem to be a total newbie. Believe what you want. Most people here do.


> >> These days, the main reason for 8-bit probably is that you don't need=20
> >> to redesign the thing. And for 4-bit the prime is sufficiently far in=20
> >> the past that even that is not a good reason for most uses: the=20
> >> designs with 4-bit controllers from maybe the 1980s have been=20
> >> redesigned with 8-bit or bigger microcontrollers in the meantime.=20
> >
> >You don't need to redesign what "thing"?
> Whatever things still use 8-bit MCUs. Maybe "every coffee maker,
> microwave oven, and remote control".

You aren't making sense. Whatever.


> >There are NEW designs that use 4-bit MCUs...
> That's what you claim.

So, are you claiming that 4-bit processors are no longer designed into new products? Show me a simple product that could make use of a 4-bit processor, is made in quantities over a million, and uses an 8-bit processor. The coffee maker is a perfect example. Can you show me an under-$25 coffee maker that uses an 8-bit processor? They don't exist, because the chip may only cost a penny more, but across a million units that's $10,000 in lost profits. Why would they give that up? Why do you refuse to believe the 4-bit processors are still viable at the very low cost end of the market?


> >You don't have to belie=
> >ve it, but it is reality.
> You don't provide any evidence, so the only choice is to believe or
> not to believe.

You only need to look. Juergen has already provided links to such devices. I'm not your search engine. I'm also a bit busy.

--

Rick C.

+++ Get 1,000 miles of free Supercharging
+++ Tesla referral code - https://ts.la/richard11209

Jurgen Pitaske

unread,
Jun 10, 2022, 4:47:00 PM6/10/22
to
You have an opinion, I have an opinion.
My opinion that 4-bit is very special and not for the general public seems to be supported by reality and market overviews.
Apart from this, anybody can have any processor.
Let's have as many Forth processors as there are Forths out there.
And none is successful. As it is now.

Rick C

unread,
Jun 10, 2022, 4:51:40 PM6/10/22
to
Wikipedia is not very useful on this issue. This is largely an historical list of devices made by mainstream companies. I see a mention of a remote control MCU. That might be current and any of the Sharp devices may be current. The Intel 4004 was the first single chip CPU ever made, no? I don't see the Atmel MARC4. Maybe I need to add that.

--

Rick C.

---- Get 1,000 miles of free Supercharging
---- Tesla referral code - https://ts.la/richard11209

Rick C

unread,
Jun 10, 2022, 4:57:43 PM6/10/22
to
Sorry, I don't understand some of what you say.

There is no disadvantage for a stack processor to have a 4 bit data path if it is being used in applications where this is appropriate, such as clocks, microwave ovens, coffee makers, remote controls, etc.

Why do you think a stack processor has to have a 16 bit data path?

The successful stack CPUs are in FPGAs and custom chips. We just don't know so much about them, something shared with the MIPS processor.

--

Rick C.

---+ Get 1,000 miles of free Supercharging
---+ Tesla referral code - https://ts.la/richard11209

Paul Rubin

unread,
Jun 10, 2022, 6:19:09 PM6/10/22
to
Rick C <gnuarm.del...@gmail.com> writes:
> Why do you think a stack processor has to have a 16 bit data path?

If we're talking about Forth, we usually expect data cells to be able to
hold addresses. 4 bits is awfully small for that. 8 bits, maybe.
Also, I had thought we were talking about MCU's, which are packaged,
multipurpose programmable devices that control the rest of the product
through i/o pins. Stuff inside an ASIC or FPGA doesn't count as that.

Paul Rubin

unread,
Jun 10, 2022, 6:36:07 PM6/10/22
to
Rick C <gnuarm.del...@gmail.com> writes:
> Indeed. I simply have been paying attention to that world long enough
> to know. You seem to be a total newbie.

I would say paying attention to the world for a long time means you
witnessed some stuff that happened in the distant past, which can be the
basis of much wisdom. But it doesn't make you an authority about what
is or isn't happening in the present.

> So, are you claiming that 4-bit processors are no longer designed into
> new products?

I don't think such a claim was made. Only that there hasn't been
convincing evidence shown against it.

> Show me a simple product, that could make use of a 4-bit processor, is
> made in qty over a million and has an 8-bit processor.

Every PC keyboard including in the pre-USB era had an 8035-style
processor even though it could have used a 4 bitter. In USB keyboards,
using 4 bitters may be unfeasible so I won't count them.

Paul Rubin

unread,
Jun 10, 2022, 6:49:34 PM6/10/22
to
albert@cherry.(none) (albert) writes:
>>The EM6607 is a single chip ... 4-bit microcontroller.
> This is certainly a 3 eurocent processor.

I like how it comes with 2kx16 bits (4k bytes) of code space (mask
rom). Somehow GA thought that 64 18-bit words was enough.

It has 96 nibbles of ram, no mention of a hardware stack. In a Forth
chip with 4-bit data, how is the return stack supposed to be stored?

Paul Rubin

unread,
Jun 10, 2022, 6:51:59 PM6/10/22
to
Wayne morellini <waynemo...@gmail.com> writes:
> Anyway, I'm going to have to leave you guys. My father passed away
> unexpectedly last month, after not being able to see him more than a
> handful of times in the last 6 months due to covid restrictions, and
> not being able to talk with him on the phone due to his hearing.

I'm very sorry to hear this about your father. My condolences. My mom
is hard of hearing and I've been meaning to set up a text chat system
that she can use.

James Brakefield

unread,
Jun 10, 2022, 6:59:18 PM6/10/22
to
I have an FPGA 4-bit accumulator ISA core that supports return and data stacks
via general-purpose index registers. It can be widened to 8/9 or 16/18 bits.
https://github.com/jimbrake/lem1_9min
If you need an FPGA chip anyway, this will take ~200 6LUTs and as little as half a block RAM.
~100 lines of fully commented, straightforward VHDL (sans initialization & overhead).
There are a number of 8- and 16-bit designs with under 200 LUTs, even 32-bit if done serially.
Then again, there are very few true Forth or stack designs under 200 LUTs.

Paul Rubin

unread,
Jun 10, 2022, 7:07:06 PM6/10/22
to
James Brakefield <jim.bra...@ieee.org> writes:
> Then again very few true Forth or stack designs under 200 LUTs?

I think once any FPGA is involved, we're outside the realm of super low
cost parts. There are lots of sub-10-cent 8-bit MCU's. I don't know of
any sub-10-cent FPGA's even with 200 LUT-4's, much less LUT-6's.

Rick C

unread,
Jun 11, 2022, 1:21:03 AM6/11/22
to
On Friday, June 10, 2022 at 6:19:09 PM UTC-4, Paul Rubin wrote:
> Rick C <gnuarm.del...@gmail.com> writes:
> > Why do you think a stack processor has to have a 16 bit data path?
> If we're talking about Forth, we usually expect data cells to be able to
> hold addresses. 4 bits is awfully small for that. 8 bits, maybe.

I think you are confused. Many 8 bit processors have 8 bit data paths and 16 bit addresses. The cell size is what you want it to be, not dictated by the hardware. The same applies to a 4-bit processor. Don't confuse cells with the address unit.
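
To make that concrete, here is a minimal sketch in plain Forth (NIBBLE+ and BYTE+ are invented names, purely illustrative; a real 4-bit MCU would do this with an add-with-carry instruction sequence rather than Forth) showing how arithmetic on values wider than the ALU is just a narrow add rippled with carry:

  \ one 4-bit-wide add step: nibble sum out, carry out
  : NIBBLE+ ( a b cin -- sum cout )   + +  DUP 15 AND  SWAP 4 RSHIFT ;

  \ an 8-bit add built from two chained nibble adds; a 16-bit cell add
  \ would chain four of these.  Assumes x and y fit in 8 bits.
  : BYTE+ ( x y -- x+y )
     2DUP 15 AND SWAP 15 AND 0 NIBBLE+              ( x y lo carry )
     2SWAP 4 RSHIFT SWAP 4 RSHIFT ROT NIBBLE+ DROP  ( lo hi )
     4 LSHIFT OR ;

  8 80 BYTE+ .   \ prints 88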


> Also, I had thought we were talking about MCU's, which are packaged,
> multipurpose programmable devices that control the rest of the product
> through i/o pins. Stuff inside an ASIC or FPGA doesn't count as that.

You didn't quote what you are replying to. You can define an MCU any way you want. Juergen's comment wasn't about MCUs, it was about "Forth chips". I simply pointed out one of the successful uses of "Forth chips", even though there's no such thing as a "Forth chip" other than in the Humpty Dumpty definition.

Don't get your knickers in a knot.

--

Rick C.

--+- Get 1,000 miles of free Supercharging
--+- Tesla referral code - https://ts.la/richard11209

Rick C

unread,
Jun 11, 2022, 1:30:30 AM6/11/22
to
Actually, it couldn't at the time. For one, they simply copied the design from the previous designs, as they did with so many parts of the PC, firmware and all. There was always a huge amount of paranoia about compatibility in the PC world. Secondly, at the time there were no 4-bit processors with the same I/O capabilities. At least, I never saw any. This is most likely because every I/O pin adds test cost, which can dominate the price of a low-end device. In the 40-pin packages used in keyboards, the cost advantage of a 4-bit processor probably evaporated.

Then lastly, a keyboard doesn't really fit the definition of a simple product. I'm referring to things like remote controls and coffee makers that have minimal requirements (obviously, if you need an 8-bit CPU for any of various reasons, you need an 8-bit CPU). A $50 keyboard doesn't fit that description. I'm not sure you could get many keyboards for that price at the time. They used to be designed with excellent mechanical switches that cost around a buck apiece.

--

Rick C.

--++ Get 1,000 miles of free Supercharging
--++ Tesla referral code - https://ts.la/richard11209

Rick C

unread,
Jun 11, 2022, 1:34:09 AM6/11/22
to
On Friday, June 10, 2022 at 6:49:34 PM UTC-4, Paul Rubin wrote:
> albert@cherry.(none) (albert) writes:
> >>The EM6607 is a single chip ... 4-bit microcontroller.
> > This is certainly a 3 eurocent processor.
> I like how it comes with 2kx16 bits (4k bytes) of code space (mask
> rom). Somehow GA thought that 64 18-bit words was enough.
Not sure how you can compare the two. The GA144 has an unlimited code space since the storage is all external to the chip.


> It has 96 nibbles of ram, no mention of a hardware stack. In a Forth
> chip with 4-bit data, how is the return stack supposed to be stored?

??? What does the data path have to do with the return stack? I think you are too accustomed to bigger processors where it's all the same number. As I said in another post, the cell size in Forth, just like the address size in a CPU has nothing to do with the data path. An 8080 has 8 bit data paths, and 16 bit addresses. Why do you keep getting confused about this?

--

Rick C.

-+-- Get 1,000 miles of free Supercharging
-+-- Tesla referral code - https://ts.la/richard11209

Jurgen Pitaske

unread,
Jun 11, 2022, 2:03:33 AM6/11/22
to
No need to add.
It is included in the list I copied, just next to the TMS1000.

Anton Ertl

unread,
Jun 11, 2022, 2:37:03 AM6/11/22
to
Rick C <gnuarm.del...@gmail.com> writes:
>??? What does the data path have to do with the return stack?

In Forth, the words >R, R>, and R@ connect the two.
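
A one-line illustration in standard Forth (THIRD is just a stock example word, nothing chip-specific): a full data cell makes the round trip over the return stack, so on a system with 16-bit cells the return-stack entries have to be a cell wide even if the ALU data path is narrower.

  : THIRD ( a b c -- a b c a )   >R OVER R> SWAP ;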

Anton Ertl

unread,
Jun 11, 2022, 3:34:13 AM6/11/22
to
Rick C <gnuarm.del...@gmail.com> writes:
>People like TI don't want to bother with 4-bit MCUs because the profit
>margins are too small for them to make a profit.

Do you want to say that TI cannot make a profit from them, or that MCU
manufacturers in general cannot? If the former, why can other MCU
manufacturers make a profit? If the latter, why would any MCU maker
bother making a 4-bit MCU? Or if customers demand 4-bit MCUs, why do
MCU manufacturers not raise the prices?

>> >There are NEW designs that use 4-bit MCUs...
>> That's what you claim.
>
>So, are you claiming that 4-bit processors are no longer designed into new
>products?

I wrote that you made a claim without providing evidence.

>Show me a simple product, that could make use of a 4-bit processor, is made
>in qty over a million and has an 8-bit processor. The coffee maker is a
>perfect example. Can you show me an under $25 coffee maker that uses an
>8-bit processor? They don't exist because the chip may only cost a penny
>more, but that's $10,000 in lost profits.

And what if it costs 0 pennies more? In
<https://bernd-paysan.de/b16-presentation.pdf> (from 2005), Bernd
Paysan gives the size of the (small) b16 in the XC035 process (a
0.35um (350nm) process) as 0.16mm^2. Intel and AMD were at 90nm by
that time. These days, my guess is that it takes 100x less area in a
process used now for embedded controllers. How much does 0.0016mm^2
in such a process cost? How much do you save by using, say 0.0000mm^2
(a 4-bit CPU will certainly need more area than that)? Will it save a
penny? I doubt it.

Unfortunately, I have not found anything on cost per area, but I have
found
<https://www.fabricatedknowledge.com/p/the-rising-tide-of-semiconductor>,
which states that the cost per 100M gates is <$2 for processes more
recent than 45/50nm, and $1.3 for 28nm; so for new designs MCU
manufacturers are likely to use 28nm. The number of gates in a
b16-small is not directly stated, but I remember a number of 300 LUTs
for an FPGA, so that may be 1200 gates. At the $2/100M gates cost,
that would be $0.000024 or 0.0024 cents. Even if this underestimates
the actual cost by a factor of 10, the cost would still be only 0.024
cents, and eliminating it completely would save only $240 in your 1M
production run. How much extra time does a programmer need to deal
with the 4-bit MCU? Are the $240 sufficient to pay for that?
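
For what it is worth, the arithmetic above can be checked mechanically. A throwaway sketch in Forth (the constants are the figures quoted above; GATE-COST is an invented name; assumes cells of at least 32 bits), working in nanodollars to stay in integer arithmetic:

  \ $2 per 100M gates = 2,000,000,000 nanodollars per 100,000,000 gates
  : GATE-COST ( gates -- nanodollars )  2000000000 100000000 */ ;
  1200 GATE-COST .   \ prints 24000, i.e. $0.000024 per die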

> I'm not your search engine.

Search engines have not given me any evidence for your claims.

> I'm also a bit busy.

You have enough time to post claims, and enough time to vigorously
defend them, but are too busy to present evidence for them.

Paul Rubin

unread,
Jun 11, 2022, 3:41:29 AM6/11/22
to
Rick C <gnuarm.del...@gmail.com> writes:
> Not sure how you can compare the two. The GA144 has an unlimited code
> space since the storage is all external to the chip.

It has 64 words of program rom available at each node.

> ??? What does the data path have to do with the return stack? ....
> An 8080 has 8 bit data paths, and 16 bit addresses. Why do you keep
> getting confused about this?

Idk about "data paths" which I thought was a feature of the hardware
implementation, not the cpu architecture. E.g. the 8088 also had 8 bit
data paths iirc, but it was a 16 bit architecture (size of the registers
and operands for most instructions). The PDP 8/S was a 12-bit
architecture with 1-bit data path (S stood for serial). It was very
slow.

We're talking about stack CPUs in a Forth context; one would hope that
the idea is to program that CPU in Forth pretty much directly. So the
word size (accumulator size, register size, whatever) would be 4 bits
in a 4-bit arch. What is the return stack going to look like in the
Forth for that CPU? Will there be instructions like >R R> for the stack
juggling of Forth tradition? Will the return stack hold return
addresses?

Paul Rubin

unread,
Jun 11, 2022, 4:05:13 AM6/11/22
to
Rick C <gnuarm.del...@gmail.com> writes:
> The coffee maker is a perfect example. Can you show me an under $25
> coffee maker that uses an 8-bit processor? They don't exist because
> the chip may only cost a penny more,

Anton made a similar point, but "a penny more" is another claim I'm
skeptical of. Would the 4-bit processor cost less by some amount x>0?
Let's assume yes. Is x > $1? Obviously not. Is x > $0.0000001? I'm
ok assuming yes. You are claiming x is around $0.01 but I would like to
see evidence that it is that large. I'll toss out $0.001 as an
alternative number. Most of the cost of that part is likely to be in
the package, pin drivers, on-chip peripherals, etc. Not the ALU. The
memory arrays are similar in size to those on small 8 bitters.

One data point: the EM6607 mentioned earlier has 4k bytes (2k 16-bit
words) of mask rom, and 96 nibbles of ram. Do they bother making
versions with less ram and rom, since a coffee maker shouldn't need that
much code? If not, it is because the cost of those extra transistors is
insignificant compared to the other stuff in the chip.

I couldn't easily find a teardown of the Walmart coffee maker you
linked, but here is a Mr. Coffee:

https://www.eetimes.com/mr-coffee-teardown-simple-effective-design/

The board in that thing isn't wasteful but you can see they were ok with
passing costs along to the consumer.

There was BOLTR video where AvE took apart an Instant Pot pressure
cooker, which is more than $25 but is sort of a glorified coffee maker.
IIRC it had an 8 bit cpu inside.

Cheap FM broadcast receivers of 30 years ago had analog tuners and
discriminators, but that stuff is all DSP now. Cost is low enough that
basic 8 bit MCU's must be close to free.

I made an error about the 3-cent PA150 earlier. Its program memory is
OTP rather than reprogrammable flash. The more expensive versions (5
cents etc.) are reprogrammable. For huge quantities you'd use mask rom,
of course.

https://jaycarlson.net/2019/09/06/whats-up-with-these-3-cent-microcontrollers/

Marcel Hendrix

unread,
Jun 11, 2022, 4:55:31 AM6/11/22
to
On Saturday, June 11, 2022 at 9:34:13 AM UTC+2, Anton Ertl wrote:
[..]
> And what if it costs 0 pennies more? In
> <https://bernd-paysan.de/b16-presentation.pdf> (from 2005), Bernd
> Paysan gives the size of the (small) b16 in the XC035 process (a
> 0.35um (350nm) process) as 0.16mm^2. Intel and AMD were at 90nm by
> that time. These days, my guess is that it takes 100x less area in a
> process used now for embedded controllers. How much does 0.0016mm^2
> in such a process cost? How much do you save by using, say 0.0000mm^2
> (a 4-bit CPU will certainly need more area than that)? Will it save a
> penny? I doubt it.

I think the (possible) reason is not so much the number of bits in an
instruction, but the number of pins on the package (or the effort
needed to check all pins). At least, that is the way it works for analog
discrete stuff: all small-signal bipolar transistors have the same
minimum cost. That's why I was surprised to see a 4-bit processor with
24 pins in the list -- that should make absolutely no sense.

AFAIK some 4-bitters are produced for extreme environmental
conditions (250 deg C ambient) for which my argument does
not hold (and evidently nothing holds for the current 4-bit
superconducting processors).

-marcel

Jan Coombs

unread,
Jun 11, 2022, 7:54:26 AM6/11/22
to
On Fri, 10 Jun 2022 15:49:32 -0700
Paul Rubin <no.e...@nospam.invalid> wrote:

> albert@cherry.(none) (albert) writes:
> >>The EM6607 is a single chip ... 4-bit microcontroller.
> > This is certainly a 3 eurocent processor.
>
> I like how it comes with 2kx16 bits (4k bytes) of code space (mask
> rom). Somehow GA thought that 64 18-bit words was enough.
>
> It has 96 nibbles of ram, no mention of a hardware stack.
The stack pointer is two bits, so likely the stack is 4 x 12b.
Need for storing return addresses is reduced by having three
program counters. [1]

> In a Forth
> chip with 4-bit data, how is the return stack supposed to be stored?

In a stack of registers having at least the width of the program counter.
In the j1[2] the program counter is 12b wide, but the return stack is 16b
wide to allow it to be also used for temporary data storage.

Jan Coombs
--

[1] EM6607 Ultra-low power microcontroller with 4 high drive outputs
https://www.emmicroelectronic.com/sites/default/files/products/datasheets/em6607_ds.pdf

[2] J1: a small Forth CPU Core for FPGAs
http://euroforth.org/ef10/papers/bowman.pdf


Rick C

unread,
Jun 11, 2022, 9:41:21 AM6/11/22
to
On Saturday, June 11, 2022 at 2:37:03 AM UTC-4, Anton Ertl wrote:
> Rick C <gnuarm.del...@gmail.com> writes:
> >??? What does the data path have to do with the return stack?
> In Forth, the words >R, R>, and R@ connect the two.

You are confusing the data path of the chip with the data path of the Forth. Not at all the same.

--

Rick C.

-+-+ Get 1,000 miles of free Supercharging
-+-+ Tesla referral code - https://ts.la/richard11209

Rick C

unread,
Jun 11, 2022, 9:42:01 AM6/11/22
to
On Saturday, June 11, 2022 at 3:34:13 AM UTC-4, Anton Ertl wrote:
> Rick C <gnuarm.del...@gmail.com> writes:
> >People like TI don't want to bother with 4-bit MCUs because the profit
> >margins are too small for them to make a profit.
>
> Do you want to say that TI cannot make a profit from them, or that MCU
> manufacturers in general cannot? If the former, why can other MCU
> manufacturers make a profit? If the latter, why would any MCU maker
> bother making a 4-bit MCU?

The same reason TI and most other semi companies got out of the DRAM business many years ago (and many got back in through acquisitions, only to exit again by spin offs). The semiconductor business is actually a number of different business areas with different business models. Intel has their business model that doesn't include most types of devices. They can't operate their foundries effectively while making billion transistor ICs and SSI logic ICs. They can't sell both types efficiently, the two models are so different, they would be like two different companies. So they don't bother with anything that isn't related to their main focus. Same for every semi company.

The 4-bit MCUs are made by smaller companies with fully depreciated foundries running highly efficiently, achieving the lowest possible cost for such devices. They have virtually no more overhead than a roofing company with a single salesman taking phone calls from people with leaking roofs. They may be small and unsophisticated, but they are able to crank out all the chips needed for coffee makers the world around.

Talk to someone who designs toys. These people know who to talk to for these devices. As I've said, they literally will scour their designs looking for individual resistors that are not essential to the operation of the toy, to save that fraction of a penny.


> Or if customers demand 4-bit MCUs, why do
> MCU manufacturers not raise the prices?

Which MCU manufacturers? For most, their prices were already too high and they dropped their 4-bit products because of the lower priced competition.


> >> >There are NEW designs that use 4-bit MCUs...
> >> That's what you claim.
> >
> >So, are you claiming that 4-bit processors are no longer designed into new
> >products?
>
> I wrote that you made a claim without providing evidence.

You mean the claim that 4-bit processors are in use?


> >Show me a simple product, that could make use of a 4-bit processor, is made
> >in qty over a million and has an 8-bit processor. The coffee maker is a
> >perfect example. Can you show me an under $25 coffee maker that uses an
> >8-bit processor? They don't exist because the chip may only cost a penny
> >more, but that's $10,000 in lost profits.
> And what if it costs 0 pennies more? In
> <https://bernd-paysan.de/b16-presentation.pdf> (from 2005), Bernd
> Paysan gives the size of the (small) b16 in the XC035 process (a
> 0.35um (350nm) process) as 0.16mm^2. Intel and AMD were at 90nm by
> that time. These days, my guess is that it takes 100x less area in a
> process used now for embedded controllers. How much does 0.0016mm^2
> in such a process cost? How much do you save by using, say 0.0000mm^2
> (a 4-bit CPU will certainly need more area than that)? Will it save a
> penny? I doubt it.

You are suggesting that the chips are small enough that die size is no longer a cost factor? I can't follow your logic above. I don't believe anyone uses even 90nm technology for 4-bit MCUs. The cost is related to the equipment used, the capital cost. By using fully amortized equipment they get the lowest cost. But die size is still relevant. Your example of a 0.0016mm^2 chip is not realistic. Find an 8 bit device that is made on a 35nm process node you are talking about. You won't.


> Unfortunately, I have not found anything on cost per area, but I have
> found
> <https://www.fabricatedknowledge.com/p/the-rising-tide-of-semiconductor>,
> which states that the cost per 100M gates is <$2 for processes more
> recent than 45/50nm, and $1.3 for 28nm; so for new designs MCU
> manufacturers are likely to use 28nm.

None of these processes are used for low end MCUs that work with 5 volt tolerant I/Os.


> The number of gates in a
> b16-small is not directly stated, but I remember a number of 300 LUTs
> for an FPGA, so that may be 1200 gates. At the $2/100M gates cost,
> that would be $0.000024 or 0.0024 cents. Even if this underestimates
> the actual cost by a factor of 10, the cost would still be only 0.024
> cents, and eliminating it completely would save only $240 in your 1M
> production run. How much extra time does a programmer need to deal
> with the 4-bit MCU? Are the $240 sufficient to pay for that?

Since your starting assumptions are incorrect, your conclusion is also incorrect.


> > I'm not your search engine.
> Search engines have not given me any evidence for your claims.
> > I'm also a bit busy.
> You have enough time to post claims, and enough time to vigorously
> defend them, but are too busy to present evidence for them.

Yes, because the 4-bit MCU foundries don't do a lot of advertising. I know about them because of similar conversations with those who do. Try looking in sci.electronics.design. Heck, I'll post there and see if there are still any toy makers around. People in that group tend to be in the business of electronic design rather than, like many here, just talking about electronic design. Although I shouldn't disparage. I would never attempt to make an IC in my garage, but it sounds like someone here is on that path. Who knows? Maybe he will end up producing a device.

--

Rick C.

-++- Get 1,000 miles of free Supercharging
-++- Tesla referral code - https://ts.la/richard11209

Rick C

unread,
Jun 11, 2022, 9:52:44 AM6/11/22
to
On Saturday, June 11, 2022 at 3:41:29 AM UTC-4, Paul Rubin wrote:
> Rick C <gnuarm.del...@gmail.com> writes:
> > Not sure how you can compare the two. The GA144 has an unlimited code
> > space since the storage is all external to the chip.
> It has 64 words of program rom available at each node.

It has 64 words of program RAM at each node. It can also execute instructions as they arrive through a pipe, such as from an off-chip flash or other source.


> > ??? What does the data path have to do with the return stack? ....
> > An 8080 has 8 bit data paths, and 16 bit addresses. Why do you keep
> > getting confused about this?
> Idk about "data paths" which I thought was a feature of the hardware
> implementation, not the cpu architecture. E.g. the 8088 also had 8 bit
> data paths iirc, but it was a 16 bit architecture (size of the registers
> and operands for most instructions).

In the 8080 only *some* of the registers were 16 bit. The A register was only 8 bit and handled all ALU operations. The data path was only 8 bits through the ALU, to/from memory, etc.

The 8088 was a 16 bit design with an 8 bit memory interface. Internally, it still had 16 bit data paths.


> The PDP 8/S was a 12-bit
> architecture with 1-bit data path (S stood for serial). It was very
> slow.
>
> We're talking about stack cpu's in a Forth context, one would hope that
> the idea is to program that cpu in Forth pretty much directly. So the
> word size (accumulator size, register size, whatever) would be 4 bits,
> in a 4 bit arch. What is the return stack going to look like in the
> Forth for that cpu? Will there be instructions like >R R> for the stack
> juggling of Forth tradition? Will the return stack hold return
> addresses?

Not sure what you are trying to say. No one designs a 4 bit architecture with a 4 bit address width. A Forth on a 4 bit machine would have a wider data path (most likely) and a wider address path (certainly). Why not look at the data sheet for some 4-bit processors? 8 bit processors typically have 16 bit address widths. Why not look at some Forths on 8 bit MCUs to see how they did it?

--

Rick C.

-+++ Get 1,000 miles of free Supercharging
-+++ Tesla referral code - https://ts.la/richard11209

Rick C

unread,
Jun 11, 2022, 10:09:44 AM6/11/22
to
On Saturday, June 11, 2022 at 4:05:13 AM UTC-4, Paul Rubin wrote:
> Rick C <gnuarm.del...@gmail.com> writes:
> > The coffee maker is a perfect example. Can you show me an under $25
> > coffee maker that uses an 8-bit processor? They don't exist because
> > the chip may only cost a penny more,
> Anton made a similar point, but "a penny more" is another claim I'm
> skeptical of. Would the 4-bit processor cost less by some amount x>0?
> Let's assume yes. Is x > $1? Obviously not. Is x > $0.0000001? I'm
> ok assuming yes. You are claiming x is around $0.01 but I would like to
> see evidence that it is that large. I'll toss out $0.001 as an
> alternative number. Most of the cost of that part is likely to be in
> the package, pin drivers, on-chip peripherals, etc. Not the ALU. The
> memory arrays are similar in size to those on small 8 bitters.

Peripherals??? We don't need no stinkin' peripherals! The memory is literally at the very low end of 8 bit devices and below. EVERYTHING about the 4-bit MCUs is smaller and less capable and used in designs where that doesn't matter. You don't need much RAM in a coffee maker. A stack processor could likely do without RAM at all, just the data stack and a couple of registers like in the F18A. I recall Chuck did some interesting things in the GA144 without using memory. One of his examples needed constants. He let the stack over/underflow to position the constants in the right spot (TOS was a register while a circular buffer was the rest of the stack) at the right time. Creative, if hard to use.


> One data point: the EM6607 mentioned earlier has 4k bytes (2k 16-bit
> words) of mask rom, and 96 nibbles of ram. Do they bother making
> versions with less ram and rom, since a coffee maker shouldn't need that
> much code? If not, it is because the cost of those extra transistors is
> insignificant compared to the other stuff in the chip.

I think you mean they don't make a lower RAM version because they aren't selling to the coffee pot makers of the world. Their markup is too high.


> I couldn't easily find a teardown of the Walmart coffee maker you
> linked, but here is a Mr. Coffee:
>
> https://www.eetimes.com/mr-coffee-teardown-simple-effective-design/
>
> The board in that thing isn't wasteful but you can see they were ok with
> passing costs along to the consumer.

Sorry, I don't follow at all. What costs are they passing on to the consumer?

BTW, please keep in mind that while the coffee maker may sell some large number of units, they typically have models with slight differences which would use the same CPU chip, so the importance of saving a fraction of a penny is all the more important.

I was hoping for a picture of the CPU chip.


> There was BOLTR video where AvE took apart an Instant Pot pressure
> cooker, which is more than $25 but is sort of a glorified coffee maker.
> IIRC it had an 8 bit cpu inside.
>
> Cheap FM broadcast receivers of 30 years ago had analog tuners and
> discriminators, but that stuff is all DSP now. Cost is low enough that
> basic 8 bit MCU's must be close to free.
>
> I made an error about the 3-cent PA150 earlier. Its program memory is
> OTP rather than reprogrammable flash. The more expensive versions (5
> cents etc.) are reprogrammable. For huge quantities you'd use mask rom,
> of course.
>
> https://jaycarlson.net/2019/09/06/whats-up-with-these-3-cent-microcontrollers/

Those are indeed low cost chips. I suppose that's one way of preventing counterfeits, price so low the counterfeiters can't compete. lol

--

Rick C.

+--- Get 1,000 miles of free Supercharging
+--- Tesla referral code - https://ts.la/richard11209

Rick C

unread,
Jun 11, 2022, 2:04:48 PM6/11/22
to
Here's one of the companies I was referring to as supplying 4-bit processors for the many low-cost, low-power applications that don't need much performance.

http://upt-ic.com/

I can't find pricing, you would have to contact the company. Here's some data on the part.

https://bbs.21ic.com/icview-2816080-1-1.html

They don't hide, but they don't jump out in front of your car either. They don't have an advertising budget. They barely have a web site. But you can believe they sell a lot of these chips. This is the sort of outfit where, if you give them your program for an 8051, for example, they will translate it into their chip's instruction set. Or so I've been told.

--

Rick C.

+--+ Get 1,000 miles of free Supercharging
+--+ Tesla referral code - https://ts.la/richard11209

none albert

unread,
Jun 11, 2022, 5:00:52 PM6/11/22
to
In article <c1656c4f-44b0-4d04...@googlegroups.com>,
Marcel Hendrix <m...@iae.nl> wrote:
>On Saturday, June 11, 2022 at 9:34:13 AM UTC+2, Anton Ertl wrote:
>[..]
>> And what if it costs 0 pennies more? In
>> <https://bernd-paysan.de/b16-presentation.pdf> (from 2005), Bernd
>> Paysan gives the size of the (small) b16 in the XC035 process (a
>> 0.35um (350nm) process) as 0.16mm^2. Intel and AMD were at 90nm by
>> that time. These days, my guess is that it takes 100x less area in a
>> process used now for embedded controllers. How much does 0.0016mm^2
>> in such a process cost? How much do you save by using, say 0.0000mm^2
>> (a 4-bit CPU will certainly need more area than that)? Will it save a
>> penny? I doubt it.
>
>I think the (possible) reason is not so much the number of bits in an
>instruction, but the number of pins on the package (or the effort
>needed to check all pins). At least, that is the way it works for analog
>discrete stuff: all small-signal bipolar transistors have the same
>minimum cost. That's why I was surprised to see a 4-bit processor with
>24 pins in the list -- that should make absolutely no sense.

+1. I made the 3 eurocent comment with that intent.

There are no 4/8-bit processors any more, only 4/8-bit microcontrollers.
(Barring legacy products: in 2008 I had to port a 6809 processor
to a newer product. It was cancelled because a batch of chips
was found. Old processors never die.)
The pin count and the peripherals built in are more important
than the processor itself, and contribute most to the cost.
As Anton Ertl has explained, the transistors used up by the
processor itself are a diminishing part of the cost.
In new developments the software is a cost factor too,
and development on processors that lack power is more
costly.

<SNIP>
>
>-marcel

Paul Rubin

unread,
Jun 11, 2022, 6:49:53 PM6/11/22
to
Marcel Hendrix <m...@iae.nl> writes:
> AFAIK some 4-bitters are produced for extreme environmental conditions
> (250 deg C ambient)

ZOMG, I didn't know that existed. A quick web search found this:

https://www.extremetech.com/extreme/179594-new-extreme-computer-chip-can-withstand-temperatures-up-to-300-degrees-celsius

It doesn't say the word size (or even that it is a microprocessor), but
it does say the line width is 0.35 micron. That page is from 2014 so I
wonder if parts are available now and what they cost. Maybe I can
figure out some applications.

This is a pretty old TI C2000 (32 bit realtime processor with DSP
features) part:

https://www.ti.com/product/SM320F28335-HT

It says it can operate up to 100 mhz at 210 celsius. Costs $335 per
unit in 100 qty. Heh.

> for which my argument does not hold (and evidently nothing holds for
> the current 4-bit superconducting processors).

I had no idea that these existed either. Wow!

Paul Rubin

unread,
Jun 11, 2022, 7:05:30 PM6/11/22
to
Rick C <gnuarm.del...@gmail.com> writes:
> Your example of a 0.0016mm^2 chip is not realistic. Find an 8 bit
> device that is made on a 35nm process node you are talking about. You
> won't.

That means that the cost of fabbing transistors is not a driving factor
in making such parts, compared with packaging etc. That also says not
much cost advantage for 4 bits over 8 bits. It's just transistors after
all.

IIRC there are AVR 8-bit processors with 384KB of program flash,
hardware multipliers and crypto instructions, etc. These can't be using
super ancient fab processes.

I believe the RP2040 is made in 40nm. It is quite a powerful chip, with
two ARM cores and 264KB of ram. My impression was that its purpose was
to provide Arduino-like control capabilities to the Raspberry Pi
ecosystem which was previously made of small Linux computers that were
computationally powerful but without much control capability. In other
words they could have used an 8 bit design and probably thought about
it, but the RP2040 turned out to be sufficiently cheap for their
purposes. The RP2040 is about $1 retail, similar to an AVR.

Paul Rubin

unread,
Jun 11, 2022, 7:08:45 PM6/11/22
to
Rick C <gnuarm.del...@gmail.com> writes:
>> [GA144] It has 64 words of program rom available at each node.
> It has 64 words of program RAM at each node.

Yes, it can run code from ram, but the rom is also there. RAM used for
program code is of course unavailable for data.

> It can also execute instructions arriving through a pipe

Yes, that is mostly a feature of the implementation. Nothing stops
other architectures from getting such an ability if there was a use for
it.

Paul Rubin

unread,
Jun 11, 2022, 7:35:04 PM6/11/22
to
Rick C <gnuarm.del...@gmail.com> writes:
> https://bbs.21ic.com/icview-2816080-1-1.html

Thanks, this is interesting, one of the embedded charts says it has 1K
instruction words (12 bit words) and 128 nibbles of ram. In fact it
looks like the program memory is OTP PROM rather than mask rom. I don't
see a description of the instruction set, but it says "4 bit risc",
and runs at up to 4 MIPS.

As for peripherals, the SOP8 version has five i/o pins (1 I), 3 timers,
2 PWM, WDT, and a 32khz xtal connection with an RTC. There is an
SOT23-6 version with less stuff connected to the pins. I don't see a
whole lot of difference from what you find in cheap 8-bit processors.

Actually here is another page about that part, with maybe more info:

http://www.upt-ic.com/en/products.aspx?id=0

It mentions that the "4-bit MCU can evade the patent of sop-8 package,
because it adopts 4-bit bus architecture wire." No idea what that is
about. There are quite a few configurations listed there. But, 1) I'd
like to know how the cost compares with the Padauk stuff, 2) how old the
design is. Certainly the market for super low cost stuff will never go
away. So we should continue to see new designs once in a while. What
do they look like?

> if you give them your program for an 8051, for example, they will
> translate it into their chip's instruction set.

Their chip runs at 16 mhz while the original 8051 was a heck of a lot
slower, besides using more cycles per instruction. That suggests a more
modern fab process than was used in the early 8-bit era.

Those 4-bit parts all seem to have program OTP, some of them up to 2K*16
bits. That is enough code space that if I expected to ship multiple
generations or variants of a product, recurring software development
would become a significant cost component, so a C environment (or Forth,
given where we are) would start looking more attractive than assembler
despite possibly being slightly less hardware-efficient.

Rick C

unread,
Jun 11, 2022, 9:03:31 PM6/11/22
to
On Saturday, June 11, 2022 at 7:05:30 PM UTC-4, Paul Rubin wrote:
> Rick C <gnuarm.del...@gmail.com> writes:
> > Your example of a 0.0016mm^2 chip is not realistic. Find an 8 bit
> > device that is made on a 35nm process node you are talking about. You
> > won't.
> That means that the cost of fabbing transistors is not a driving factor
> in making such parts, compared with packaging etc. That also says not
> much cost advantage for 4 bits over 8 bits. It's just transistors after
> all.

It says nothing of the sort. It says the more modern processes are not appropriate for making MCU devices that interface to 3.3 and 5 volt circuits. Nobody wants an MCU with 1.2V interfaces.


> IIRC there are AVR 8-bit processors with 384KB of program flash,
> hardware multipliers and crypto instructions, etc. These can't be using
> super ancient fab processes.

No, but they aren't competing with 4-bit processors.


> I believe the RP2040 is made in 40nm. It is quite a powerful chip, with
> two ARM cores and 264KB of ram. My impression was that its purpose was
> to provide Arduino-like control capabilities to the Raspberry Pi
> ecosystem which was previously made of small Linux computers that were
> computationally powerful but without much control capability. In other
> words they could have used an 8 bit design and probably thought about
> it, but the RP2040 turned out to be sufficiently cheap for their
> purposes. The RP2040 is about $1 retail, similar to an AVR.

I like the way you invent a narrative based on virtually no facts. If this part actually contains 264 kB of RAM, that is what drives the die size. That's a lot of RAM for an MCU.

--

Rick C.

+-+- Get 1,000 miles of free Supercharging
+-+- Tesla referral code - https://ts.la/richard11209

Rick C

unread,
Jun 11, 2022, 9:07:28 PM6/11/22
to
ROM exists, but has very little purpose. It was intended for small, common functions, like macros, not applications. Moore stated many times that the real power of the chip came from being able to install new software rapidly, hence the port execution.

Nothing stops anyone from doing anything. What's your point?

--

Rick C.

+-++ Get 1,000 miles of free Supercharging
+-++ Tesla referral code - https://ts.la/richard11209

Rick C

unread,
Jun 11, 2022, 9:13:35 PM6/11/22
to
On Saturday, June 11, 2022 at 7:35:04 PM UTC-4, Paul Rubin wrote:
> Rick C <gnuarm.del...@gmail.com> writes:
> > https://bbs.21ic.com/icview-2816080-1-1.html
>
> Thanks, this is interesting,

Interesting for what? Do you have an application that needs 100,000 devices?


> one of the embedded charts says it has 1K
> instruction words (12 bit words) and 128 nibbles of ram. In fact it
> looks like the program memory is OTP PROM rather than mask rom. I don't
> see a description of the instruction set, but it says "4 bit risc",
> and runs at up to 4 MIPS.
>
> As for peripherals, the SOP8 version has five i/o pins (1 I), 3 timers,
> 2 PWM, WDT, and a 32khz xtal connection with an RTC. There is an
> SOT23-6 version with less stuff connected to the pins. I don't see a
> whole lot of difference from what you find in cheap 8-bit processors.
>
> Actually here is another page about that part, with maybe more info:
>
> http://www.upt-ic.com/en/products.aspx?id=0
>
> It mentions that the "4-bit MCU can evade the patent of sop-8 package,
> because it adopts 4-bit bus architecture wire." No idea what that is
> about.

I don't either. I can only assume they are talking about some patent having to do with getting data in and out of the chip maybe. Don't know.


> There are quite a few configurations listed there. But, 1) I'd
> like to know how the cost compares with the Padauk stuff, 2) how old the
> design is. Certainly the market for super low cost stuff will never go
> away. So we should continue to see new designs once in a while. What
> do they look like?

They look like coffee makers and remote controls.


> > if you give them your program for an 8051, for example, they will
> > translate it into their chip's instruction set.
> Their chip runs at 16 mhz while the original 8051 was a heck of a lot
> slower, besides using more cycles per instruction. That suggests a more
> modern fab process than was used in the early 8-bit era.

Or it's faster because it's a smaller design with shorter paths. It's not hard to use a newer process than the 8051 was made on. That was what, 30 years ago?


> Those 4-bit parts all seem to have program OTP, some of them up to 2K*16
> bits. That is enough code space that if I expected to ship multiple
> generations or variants of a product, recurring software development
> would become a significant cost component, so a C environment (or Forth,
> given where we are) would start looking more attractive than assembler
> despite possibly being slightly less hardware-efficient.

I have no idea what you are talking about. Software development costs have to do with the software development. What does the memory on the chip have to do with it?

--

Rick C.

++-- Get 1,000 miles of free Supercharging
++-- Tesla referral code - https://ts.la/richard11209

Paul Rubin

unread,
Jun 11, 2022, 10:49:14 PM6/11/22
to
Rick C <gnuarm.del...@gmail.com> writes:
>> Thanks, this is interesting,
> Interesting for what? Do you have an application that needs 100,000 devices?

Interesting as info about the state of current hardware, which is part
of the state of the universe. I just read something in the news today
about the discovery of the first known black hole without a known
companion star (not counting the SMBH in galactic nuclei). Other people
thought that was interesting, or else it wouldn't be news. Do any
of us have a use for it? Probably not. Things can be interesting
without being useful.

Yes I've worked on products made by the millions (a family of payment
terminals built around an ASIC). The ASIC had an ARM core and various
other stuff specific to the product family. Yes they were cost driven.
My boss told me they would sometimes launch significant engineering
projects with the sole purpose of getting 50 cents out of the unit cost
of those things. That 50 cents per unit figure suggested that they
would not do the same for 1 cent per unit.

The largest quantity of a real product I remember a regular here being
involved with is Bernd Paysan's b16-based battery controller, which
supposedly lives in a corner of an ASIC inside of around 1e8 Apple
phones. That is a 16 bit processor, programmed in Forth. Do you think
he could have saved by using 4 bits? His 16 bit cpu actually replaced
an 8051-based one.

>> So we should continue to see new designs once in a while. What
>> do they look like?
> They look like coffee makers and remote controls.

No I mean what do the microprocessors look like. 4 bits? 8 bits? New
architecture? Existing? Part of a remote control SOC? Many of those
Phaeton chips had seemingly special features like 150 volt switches.

> I have no idea what you are talking about. Software development costs
> have to do with the software development. What does the memory on the
> chip have to do with it?

If an MCU has 64 words of code space, it can only run an extremely
simple program. You can write such a program in assembly language and
maintain it across multiple versions without going crazy. If it has a
megabyte of code space, it is made to run complex software. Using
assembler would be nuts. Languages and dev tools to manage the
complexity will be indispensable.

That particular chip had 2K words of code space which is around the
point where complexity becomes significant enough that you benefit from
managing it using HLL's etc. If I'm going to use the whole 2Kw, I'd
certainly prefer using C or Forth instead of assembler. There are tons
of C and Forth implementations that work nicely on 8 bitters of that
size. I don't know if there are any for 4 bitters.

If I had a product feature list that I thought would take 2Kw of code to
implement, the software cost difference between C and assembler might
easily be in the 10k dollar range over the program's lifecycle. So with
1e6 units, that is 1 cent per unit all by itself. Therefore if the 4
bit cpu has to be programmed in asm while the 8 bitter can use C, then
the 4 bit hardware has to be at least 1 cent cheaper to be worth
thinking about. Is it that much cheaper? Only actual data can tell us
that, not philosophy.
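
The same back-of-the-envelope figure as a one-liner (plain Forth; PER-UNIT is an invented name and the inputs are the hypothetical numbers above, not measured data):

  \ spread a lifecycle cost in dollars over a production run, in cents per unit
  : PER-UNIT ( dollars units -- cents )  100 SWAP */ ;
  10000 1000000 PER-UNIT .   \ prints 1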

Paul Rubin

unread,
Jun 11, 2022, 11:15:55 PM6/11/22
to
Rick C <gnuarm.del...@gmail.com> writes:
> It says nothing of the sort. It says the more modern processes are
> not appropriate for making MCU devices that interface to 3.3 and 5
> volt circuits. Nobody wants an MCU with 1.2V interfaces.

The RP2040 and ESP32 are both made in 40nm. I don't think they use 1.2V
interfaces. RP2040 might have been doable in a larger process:

https://semiwiki.com/forum/index.php?threads/cost-tradeoffs-at-28nm-vs-40nm-arm-m0.13887/

> I like the way you invent a narrative based on virtually no facts. If
> this part actually contains 264 kB of RAM, that is what drives the die
> size. That's a lot of RAM for an MCU.

ISTM that they had a budget and that told them the die size, and then
they put on whatever amount of ram would fit. They certainly didn't
need two cores or 264kB. I earlier said they wanted control
capabilities similar to an AVR, but it occurs to me, they likely wanted
to program it in MicroPython rather than just in C, so they needed 32
bits. But, we know from the BBC micro:bit v1 that MicroPython can limp
along with 16KB of ram, and from the Adafruit SAMD21 boards that it is
pleasant to use with 32KB. The RP2040 would have been quite comfortable
with 64KB. They put in 4x that much because they could afford to, not
because they had to.

Regarding coffee pots: here is a Cuisinart DCC 1200 teardown. No MCU
visible but it looks expensive inside. It is around $100 retail. Yet I
don't see significant features that the super cheap pots don't also
have.

https://fullychargd.blogspot.com/2015/06/cuisinart-dcc-1200-teardown-and-power_7.html

dxforth

unread,
Jun 11, 2022, 11:30:09 PM6/11/22
to
On 12/06/2022 12:49, Paul Rubin wrote:
>
> If I had a product feature list that I thought would take 2Kw of code to
> implement, the software cost difference between C and assembler might
> easily be in the 10k dollar range over the program's lifecycle. So with
> 1e6 units, that is 1 cent per unit all by itself. Therefore if the 4
> bit cpu has to be programmed in asm while the 8 bitter can use C, then
> the 4 bit hardware has to be at least 1 cent cheaper to be worth
> thinking about. Is it that much cheaper? Only actual data can tell us
> that, not philosophy.

Did Commodore charge a premium because somebody had to program the software
in asm? It was essentially free compared to the millions of units pumped out.
It's humans that are reprogrammable and disposable as current world events
make clear.

Paul Rubin

unread,
Jun 11, 2022, 11:43:53 PM6/11/22
to
dxforth <dxf...@gmail.com> writes:
> Did Commodore charge a premium because somebody had to program the
> software in asm? It was essentially free compared to the millions of
> units pumped out.

This I don't know. Hardware back then (cpu and memory) was stupendously
expensive by today's standards. Being able to stretch it a little
further made a real difference in your product's profitability. And
compilers at the time were not very good. That era was a bit before my
time, but I think asm language programming then was at least basically
sane.

If you go back even further, you get the story of Mel:

http://www.pbm.com/~lindahl/mel.html

So you could imagine some kind of engineering meeting within Commodore,
where they debated programming in "regular style" assembler on the
hardware that they used, vs "Mel style" assembler to save a few more
cents on memory. ISTM that "regular style" assembler by then would
have, or should have, won the day.

Wayne morellini

unread,
Jun 11, 2022, 11:48:18 PM6/11/22
to
I'll drop in here. It is strangely calming to catch up on others' debates (except when the protagonists are getting too long-winded, resulting in everybody else matching them).

The obvious point is that a 4- or 1-bit MISC chip is doable with whatever size of data inside, and that we have nearly that already, which could be adopted in Dr Ting's designs. If people want 4-bit, get together and do it. I looked at using the MARC4 long ago, but it wasn't sufficient. But a 26-bit mips with a 4-bit internal memory (or some other size), or a 1- or 4-bit memory interface, sounds OK enough. I was meaning to contact a compact, low-energy, high-speed, low-cost memory supplier with an in-chip module scheme, so chips like MISC could easily have access to high-bandwidth internal memory, without the survival processes. Just the type of thing I do in the background from time to time.

However, here is one of my other schemes for cost savings compared to 4 bits: a MISC processor in a single pin that attaches to the main board, using a broadcast protocol like Ethernet does. In my OS design, I came up with strategies to micromanage conflict avoidance and maximise efficiency. I think modern protocols have developed something similar now. Most of the time the real-time flows are predictable, and here limited, so it's very good for a low-usage scheme. What people probably don't know is that I intended to shift to an optical broadcast version. There is a way for even a 32-bit MISC to get into a coffee maker.

BTW, pull apart the coffee maker's control panel and display and see if there is a controlling 4-bit processor hidden in there. Bye!

Paul Rubin

unread,
Jun 12, 2022, 12:30:07 AM6/12/22
to
Wayne morellini <waynemo...@gmail.com> writes:
> However, here is one of my other schemes for cost savings compared to
> 4 bits, a misc processor in a single pin that attaches to the main
> board, using a broadcast protocols like ethernet does.

There is or was something called the Maxim 1-wire protocol but I think
its purpose was simplifying rugged hardware packaging, rather than cost
reduction per se. That stuff wasn't expensive but it wasn't super cheap
either.

But, I think these discussions of low end cpu costs are kind of silly.
Here is a thought experiment: call up some low end chip vendor and ask
for a quote for 100 million of the cheapest microprocessor they make.
They tell you $X. Then ask for a quote of 100 million of a different
chip in the same package, where instead of a microprocessor it only has
a couple of NAND gates like an old 7400 (i.e. there still has to be
silicon inside, wired to pins that do something). They tell you $Y.

Hopefully, Y < X. But I suspect that Y > X/2. In other words, most of
the cost of a cheap microprocessor today is packaging, testing,
shipping, sales overhead, yada yada, not the gates and transistors on
the die. It wasn't anything like that in the 8080 era.

dxforth

unread,
Jun 12, 2022, 12:44:53 AM6/12/22
to
On 12/06/2022 13:43, Paul Rubin wrote:
> dxforth <dxf...@gmail.com> writes:
>> Did Commodore charge a premium because somebody had to program the
>> software in asm? It was essentially free compared to the millions of
>> units pumped out.
>
> This I don't know. Hardware back then (cpu and memory) was stupendously
> expensive by today's standards.

Component count was expensive. Why Tramiel pushed at every step to reduce
it. It's the reason VIC and 64 had dedicated graphics ships and software
UART. By the time production ceased, board component count had reduced
further. I no longer recall details (fewer DRAM chips and ASICs to replace
discretes?) but can certainly attest to later boards looking "empty" in
comparison.

> Being able to stretch it a little
> further made a real difference in your product's profitability. And
> compilers at the time were not very good. That era was a bit before my
> time, but I think asm language programming then was at least basically
> sane.

The 80's needed asm programmers. It may be less so today. Programmer
costs (whatever they may be) remain.

Paul Rubin

unread,
Jun 12, 2022, 1:12:46 AM6/12/22
to
dxforth <dxf...@gmail.com> writes:
> Component count was expensive. Why Tramiel pushed at every step to reduce
> it. It's the reason VIC and 64 had dedicated graphics chips and software
> UART.

I'd be interested to know if the C64 factory software did much more than
the VIC and CBM software from earlier on. I thought the C64 was a
response to the then-amazing 64 kilobit DRAMs, giving vast amounts of
unused memory space to its users, i.e. coming with a similar BASIC to
what the VIC had. If yes, Commodore may not have used the C64's
inherently greater software flexibility.

> The 80's needed asm programmers. It may be less so today. Programmer
> costs (whatever it may be) remain.

Asm is super niche now. Even C is niche, outside of the embedded
sector, which itself is niche. I've certainly worked around some very
good fulltime web programmers who have never seen C code.

dxforth

unread,
Jun 12, 2022, 2:53:57 AM6/12/22
to
On 12/06/2022 15:12, Paul Rubin wrote:
> dxforth <dxf...@gmail.com> writes:
>> Component count was expensive. Why Tramiel pushed at every step to reduce
>> it. It's the reason VIC and 64 had dedicated graphics chips and software
>> UART.
>
> I'd be interested to know if the C64 factory software did much more than
> the VIC and CBM software from earlier on. I thought the C64 was a
> response to the then-amazing 64 kilobit DRAMs, giving vast amounts of
> unused memory space to its users, i.e. coming with a similar BASIC to
> what the VIC had. If yes, Commodore may not have used the C64's
> inherently greater software flexibility.

With the VIC20, CBM was competing with Atari. Originally planned as a
games console (Ultimax), the C64 improved on the VIC with better graphics,
screen resolution, memory etc.

>> The 80's needed asm programmers. It may be less so today. Programmer
>> costs (whatever it may be) remain.
>
> Asm is super niche now. Even C is niche, outside of the embedded
> sector, which itself is niche. I've certainly worked around some very
> good fulltime web programmers who have never seen C code.

Do they do less or cost less? Those who made the transition such as Tom
Zimmer did it for the work.

"Well, I describe myself as a C programmer, who is really a Forth programmer.
C has provided employment, and Forth provides tools for hardware and software
debugging."

Paul Rubin

unread,
Jun 12, 2022, 4:05:09 AM6/12/22
to
dxforth <dxf...@gmail.com> writes:
>> good fulltime web programmers who have never seen C code.
> Do they do less or cost less?

In the old days, good sailors knew how to navigate by the stars. Now
they use GPS. It's like that. One less thing to deal with, so you can
direct your energy elsewhere.

Rick C

unread,
Jun 12, 2022, 10:30:08 AM6/12/22
to
On Saturday, June 11, 2022 at 10:49:14 PM UTC-4, Paul Rubin wrote:
> Rick C <gnuarm.del...@gmail.com> writes:
> >> Thanks, this is interesting,
> > Interesting for what? Do you have an application that needs 100,000 devices?
> Interesting as info about the state of current hardware, which is part
> of the state of the universe. I just read something in the news today
> about the discovery of the first known black hole without a known
> companion star (not counting the SMBH in galactic nuclei). Other people
> other thought that was interesting, or else it wouldn't be news. Do any
> of us have a use for it? Probably not. Things can be interesting
> without being useful.
>
> Yes I've worked on products made by the millions (a family of payment
> terminals built around an ASIC). The ASIC had an ARM core and various
> other stuff specific to the product family. Yes they were cost driven.
> My boss told me they would sometimes launch significant engineering
> projects with the sole purpose of getting 50 cents out of the unit cost
> of those things. That 50 cents per unit figure suggested that they
> would not do the same for 1 cent per unit.

You said, "launch significant engineering projects". That's not remotely the same as picking a processor as a part of development.


> The largest quantity of a real product I remember a regular here being
> involved with is Bernd Paysan's b16-based battery controller, which
> supposedly lives in a corner of an ASIC inside of around 1e8 Apple
> phones. That is a 16 bit processor, programmed in Forth. Do you think
> he could have saved by using 4 bits? His 16 bit cpu actually replaced
> an 8051-based one.

His CPU was in an ASIC. We are talking about discrete 4 bit MCUs vs. 8 bit MCUs. I hope you understand the difference. You also know nothing of the requirements. Obviously they were designing a new chip for a reason, such as asking it to do something new or additional. The devil is in the details.

But if you don't have an application, you probably don't understand the requirements that drive such a selection. I believe that you don't do much hardware design, correct?


> >> So we should continue to see new designs once in a while. What
> >> do they look like?
> > They look like coffee makers and remote controls.
> No I mean what do the microprocessors look like. 4 bits? 8 bits? New
> architecture? Existing? Part of a remote control SOC? Many of those
> Phaeton chips had seemingly special features like 150 volt switches.
> > I have no idea what you are talking about. Software development costs
> > have to do with the software development. What does the memory on the
> > chip have to do with it?
> If an MCU has 64 words of code space, it can only run an extremely
> simple program. You can write such a program in assembly language and
> maintain it across multiple versions without going crazy. If it has a
> megabyte of code space, it is made to run complex software. Using
> assembler would be nuts. Languages and dev tools to manage the
> complexity will be indispensable.
>
> That particular chip had 2K words of code space which is around the
> point where complexity becomes significant enough that you benefit from
> managing it using HLL's etc. If I'm going to use the whole 2Kw, I'd
> certainly prefer using C or Forth instead of assembler. There are tons
> of C and Forth implementations that work nicely on 8 bitters of that
> size. I don't know if there are any for 4 bitters.

I still don't get your point. Forth is traditionally a self-written tool. If one doesn't exist, it is not so hard to write. It would need to be a cross compiler, which is even easier in many respects, since you can start with an existing Forth and add to it. I've done that for a custom stack processor.
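
A rough sketch of the idea -- hypothetical names only, not any particular
tool: the host Forth keeps the target image in a buffer and defines words
that compile into it.

\ Host-side cross-compiler skeleton (all names made up for the sketch).
2048 CONSTANT #TCELLS               \ assumed 2K-cell target code space
CREATE TIMG #TCELLS CELLS ALLOT     \ host buffer holding the target image
VARIABLE THERE  0 THERE !           \ target dictionary pointer (in cells)

: T!  ( x taddr -- )  CELLS TIMG + ! ;      \ store into the target image
: T,  ( x -- )  THERE @ T!  1 THERE +! ;    \ append one cell to the image

\ Placeholder opcodes; a real tool uses the target's instruction set.
1 CONSTANT #CALL   2 CONSTANT #RET
: TCALL,  ( taddr -- )  #CALL T,  T, ;      \ compile a call to taddr
: TRET,   ( -- )  #RET T, ;                 \ compile a return

\ A target definition remembers its entry point on the host; using the
\ name inside a later definition compiles a call into the image.
: T:  ( "name" -- )  CREATE THERE @ ,  DOES> @ TCALL, ;
: T;  ( -- )  TRET, ;

From there it is mostly a matter of adding target literals, branches, and
an assembler for code words, then dumping TIMG as the ROM image.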


> If I had a product feature list that I thought would take 2Kw of code to
> implement, the software cost difference between C and assembler might
> easily be in the 10k dollar range over the program's lifecycle. So with
> 1e6 units, that is 1 cent per unit all by itself. Therefore if the 4
> bit cpu has to be programmed in asm while the 8 bitter can use C, then
> the 4 bit hardware has to be at least 1 cent cheaper to be worth
> thinking about. Is it that much cheaper? Only actual data can tell us
> that, not philosophy.

That's your thinking. Most of these devices have zero software maintenance. I know of products that are shipped with defects that were known when the product was introduced to the market and never fixed in production.

I was hoping to hear from people who actually use these devices. Whatever. This is not the thread to discuss it. Too much drift.

--

Rick C.

++-+ Get 1,000 miles of free Supercharging
++-+ Tesla referral code - https://ts.la/richard11209

Rick C

unread,
Jun 12, 2022, 10:35:19 AM6/12/22
to
What are you talking about??? What do you think drives the digital time display?

--

Rick C.

+++- Get 1,000 miles of free Supercharging
+++- Tesla referral code - https://ts.la/richard11209

Anton Ertl

unread,
Jun 12, 2022, 10:52:34 AM6/12/22
to
Marcel Hendrix <m...@iae.nl> writes:
>I think the (possible) reason is not so much the number of bits in an
>instruction, but the number of pins on the package (or the effort
>needed to check all pins). At least, that is the way it works for analog
>discrete stuff: all small-signal bipolar transistors have the same
>minimum cost. That's why I was surprised to see a 4-bit processor with
>24 pins in the list -- that should make absolutely no sense.

For the kinds of small MCUs that we are discussing, both RAM and ROM
are on-chip, so the width of the data path has no influence on the
number of pins. The only pins are power, ground, and I/O.
And if 22 pins of I/O are needed, a 24-pin package makes a lot of
sense.

- anton
--
M. Anton Ertl http://www.complang.tuwien.ac.at/anton/home.html
comp.lang.forth FAQs: http://www.complang.tuwien.ac.at/forth/faq/toc.html
New standard: https://forth-standard.org/
EuroForth 2022: http://www.euroforth.org/ef22/cfp.html

S Jack

unread,
Jun 12, 2022, 11:31:00 AM6/12/22
to
On Sunday, June 12, 2022 at 3:05:09 AM UTC-5, Paul Rubin wrote:

> In the old days, good sailors knew how to navigate by the stars. Now
> they use GPS. It's like that. One less thing to deal with, so you can
> direct your energy elsewhere.

I'm often out under the night sky for extended periods and use the
stars to orient myself on arrival at new locations. Polaris is a
hand high, so I know I'm in Texas and where north is. Scorpio sits on
the southern horizon, with the Milky Way behind its tail running up
into Cygnus overhead, which points its head into the middle of the
Summer Triangle.
For someone who seldom gets his butt outside, the night sky may look
daunting, but it really isn't.
--
me

Stephen Pelc

unread,
Jun 12, 2022, 3:30:41 PM6/12/22
to
On 7 Jun 2022 at 11:52:46 CEST, "Wayne morellini" <waynemo...@gmail.com>
wrote:

> But recently, I saw a document on Colorforth for ARM, and comparisons to Swift
> Forth etc.

Reference? Link?

Stephen
--
Stephen Pelc, ste...@vfxforth.com
MicroProcessor Engineering, Ltd. - More Real, Less Time
133 Hill Lane, Southampton SO15 5AF, England
tel: +44 (0)23 8063 1441, +44 (0)78 0390 3612, +34 649 662 974
http://www.mpeforth.com - free VFX Forth downloads

Paul Rubin

unread,
Jun 12, 2022, 9:27:20 PM6/12/22
to
Rick C <gnuarm.del...@gmail.com> writes:
> You said, "launch significant engineering projects". That's not
> remotely the same as picking a processor as a part of development.

Yeah fair enough. The processor was already on an ASIC and had been
since long before I got there. I don't know if earlier versions of the
product used commodity processors or how they were picked.

> His CPU was in an ASIC. We are talking about discrete 4 bit MCUs
> vs. 8 bit MCUs. I hope you understand the difference. You also know
> nothing of the requirements. Obviously they were designing a new chip
> for a reason, such as asking it to do something new or additional.

He'd know better, but I don't think they were designing a new chip, as
opposed to iterating an existing design. The old chip had an 8051 core
doing the battery stuff, and the new chip had a b16 (maybe -small).

> But if you don't have an application, you probably don't understand
> the requirements that drive such a selection. I believe that you
> don't do much hardware design, correct?

I think you are obfuscating this. I'm using your picture of a coffee
pot engineer choosing an MCU to deploy in 1e6 coffee pots. She or he
chooses between an 8-bit chip that costs $X and a 4-bit chip that costs
$Y. The main issue here would be the cost difference $(X-Y). I can
accept the idea that if $(X-Y) > $0.01 and there aren't consequential
additional costs from the choice, then the 4 bit chip wins. I would
like to see evidence that the difference, today, in 2022, not 30 years
ago, is that large. If it's $0.0001 then I don't know what happens. If
it's negative, which AFAIK it might be, then 8 bits win.
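
To put numbers on that break-even (these are just the example figures
from this thread, not market data, and the word names are made up):

\ All amounts in integer cents; the saving in hundredths of a cent.
1000000 CONSTANT #UNITS         \ the 1e6-unit volume from the example
1000000 CONSTANT SW-PENALTY     \ the ~$10,000 extra software cost, in cents

: PER-UNIT    ( -- cents )  SW-PENALTY #UNITS / ;   \ 1 cent per unit
: 4BIT-WINS?  ( saving -- flag )                    \ saving in 1/100 cent
   PER-UNIT 100 * > ;

150 4BIT-WINS? .   \ 1.5 cents/unit cheaper: prints -1 (worth it)
 50 4BIT-WINS? .   \ 0.5 cents/unit cheaper: prints 0  (not worth it)

If the real hardware saving per unit is down around a hundredth of a
cent, or negative, the comparison never gets off the ground.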

I'm not even convinced that a coffee pot with a timer would have an MCU,
as opposed to e.g. some kind of dedicated power timer chip containing a
high voltage switch. I'm not convinced that a designer in 2022 would
face such a choice.

You are right that I don't design hardware, but I studied a lot of
mathematics and I can look at two numbers and figure out which one is
bigger. All the insinuation about hardware design is misdirection. I
need to see the actual numbers in order to be convinced.

> I still don't get your point. Forth is traditionally a self written
> tool. If it doesn't exist, it is not so hard to write. It would need
> to be a cross compiler, which is even easier in many respects

I have never heard of a Forth (or C) cross compiler targeted to a 4 bit
MCU. Do you know of any? For 8 bit MCUs, both are plentiful.

In either case, porting or writing a Forth compiler for a small MCU and
then writing an application with it has to be more work than programming
the application in assembler directly. "Small" here = 2k code words.
If you're going to write a lot of such applications, developing the
compiler as a one-time project might be worthwhile, but you'd budget it
differently.

>> the software cost difference between C and assembler might easily be
>> in the 10k dollar range over the program's lifecycle.
>
> That's your thinking. Most of these devices have zero software
> maintenance.

The device itself has no sw maintenance once shipped, but the company
will keep making new variants of the product, each with slightly
different features. White label coffee pot = customers keep coming up
with new requests. The software has to be modified for each of those.

> I was hoping to hear from people who actually use these
> devices. Whatever. This is not the thread to discuss it. Too much
> drift.

Comp.arch.embedded maybe.

Rick C

unread,
Jun 12, 2022, 9:38:39 PM6/12/22
to
On Sunday, June 12, 2022 at 9:27:20 PM UTC-4, Paul Rubin wrote:
> Rick C <gnuarm.del...@gmail.com> writes:
> > You said, "launch significant engineering projects". That's not
> > remotely the same as picking a processor as a part of development.
> Yeah fair enough. The processor was already on an ASIC and had been
> since long before I got there. I don't know if earlier versions of the
> product used commodity processors or how they were picked.
> > His CPU was in an ASIC. We are talking about discrete 4 bit MCUs
> > vs. 8 bit MCUs. I hope you understand the difference. You also know
> > nothing of the requirements. Obviously they were designing a new chip
> > for a reason, such as asking it to do something new or additional.
> He'd know better, but I don't think they were designing a new chip, as
> opposed to iterating an existing design. The old chip had an 8051 core
> doing the battery stuff, and the new chip had a b16 (maybe -small).

Sometimes we are in different worlds. "Iterating" a chip means making a new chip in my world. You don't "iterate" a chip unless the requirements are different, or there are bugs that must be fixed.


> > But if you don't have an application, you probably don't understand
> > the requirements that drive such a selection. I believe that you
> > don't do much hardware design, correct?
> I think you are obfuscating this. I'm using your picture of a coffee
> pot engineer choosing an MCU to deploy in 1e6 coffee pots. She or he
> chooses between an 8-bit chip that costs $X and a 4-bit chip that costs
> $Y. The main issue here would be the cost difference $(X-Y). I can
> accept the idea that if $(X-Y) > $0.01 and there aren't consequential
> additional costs from the choice, then the 4 bit chip wins. I would
> like to see evidence that the difference, today, in 2022, not 30 years
> ago, is that large. If it's $0.0001 then I don't know what happens. If
> it's negative, which AFAIK it might be, then 8 bits win.

Ok, so are you looking for the evidence?


> I'm not even convinced that a coffee pot with a timer would have an MCU,
> as opposed to e.g. some kind of dedicated power timer chip containing a
> high voltage switch. I'm not convinced that a designer in 2022 would
> face such a choice.

Again, I have no idea where you live. Coffee makers in my world have a digital clock with a 4-digit display and a handful of buttons that make it very hard to set the timer and use the damn thing. Why so few buttons? Because they cost something like $0.01 each, and if they aren't essential, they get thrown away.


> You are right that I don't design hardware, but I studied a lot of
> mathematics and I can look at two numbers and figure out which one is
> bigger. All the insinuation about hardware design is misdirection. I
> need to see the actual numbers in order to be convinced.

You have to know what numbers to calculate. Any boob can use a calculator.


> > I still don't get your point. Forth is traditionally a self written
> > tool. If it doesn't exist, it is not so hard to write. It would need
> > to be a cross compiler, which is even easier in many respects
> I have never heard of a Forth (or C) cross compiler targeted to a 4 bit
> MCU. Do you know of any? For 8 bit MCU's, both are plentiful.

I thought the MARC4 was mentioned in this thread, no? It was intended for the key fob market and was programmed in Forth.


> In either case, porting or writing a Forth compiler for a small MCU and
> then writing an application with it has to be more work than programming
> the application in assembler directly. "Small" here = 2k code words.
> If you're going to write a lot of such applications, developing the
> compiler as a one-time project might be worthwhile, but you'd budget it
> differently.

I'm just tired of discussing this with you. Whatever. This is not in any way consequential and you forget the things that have been mentioned earlier in the conversation, like the digital time display.


> >> the software cost difference between C and assembler might easily be
> >> in the 10k dollar range over the program's lifecycle.
> >
> > That's your thinking. Most of these devices have zero software
> > maintenance.
> The device itself has no sw maintenance once shipped, but the company
> will keep making new variants of the product, each with slightly
> different features. White label coffee pot = customers keep coming up
> with new requests. The software has to be modified for each of those.

If the company makes a new product, that's a new product.


> > I was hoping to hear from people who actually use these
> > devices. Whatever. This is not the thread to discuss it. Too much
> > drift.
> Comp.arch.embedded maybe.

Or just not in someone else's thread.

--

Rick C.

++++ Get 1,000 miles of free Supercharging
++++ Tesla referral code - https://ts.la/richard11209

dxforth

unread,
Jun 12, 2022, 10:44:02 PM6/12/22
to
What's changed is the focus and rate. The more technology one surrounds
oneself with, the greater the demands and pressure, because now one is
reliant upon it. Diogenes may have had a point.

dxforth

unread,
Jun 12, 2022, 11:07:55 PM6/12/22
to
On 13/06/2022 11:38, Rick C wrote:
>
> I thought the MARK4 was mentioned in this thread, no? It was intended for the key fob market and was programmed in Forth.

I wasn't aware of any of that. Does this mean I have to change my opinion
of Atmel and its designs? :)

https://en.wikichip.org/w/images/4/44/MARC4_4-bit_Microcontrollers_Programmer%27s_Guide.pdf

dxforth

unread,
Jun 12, 2022, 11:13:55 PM6/12/22
to

Paul Rubin

unread,
Jun 12, 2022, 11:28:05 PM6/12/22
to
Rick C <gnuarm.del...@gmail.com> writes:
> Sometimes we are in different worlds. "Iterating" a chip means making
> a new chip in my world. You don't "iterate" a chip unless the
> requirements are different, or there are bugs that must be fixed.

Yes, and in the world of real products, requirements keep changing. And
if the product has a million lines of software and you change a few
thousand of them and recompile, I wouldn't call that writing a new
program. If you publish a novel and then fix some typos in the second
printing, I wouldn't call that writing a new novel. And if you fix some
errata in a chip, well, you see where this is going. I suppose from the
fab perspective it's a new chip, but from the design perspective it
isn't. There are millions of lines of VHDL and you change a few.

And you constantly optimize and tweak stuff even if the requirements
don't change. Do you seriously think today's 555 timer is made from the
same masks as 50 years ago?

> Ok, so are you looking for the evidence?

I'm interested in seeing it. I'm not going crazy searching for it but
I've looked around a little bit and not found much. I'm not the one
making claims. I've expressed skepticism to your claims. I'm open to
being convinced by evidence.

> Coffee makers in my world have a digital clock with a 4 digit display
> and a handful of buttons that make it very hard to set the timer and
> use the damn thing. Why so few buttons? Because they cost something
> like $0.01 each and if they aren't essential, they get thrown away.

My coffee maker has no timer, but I use a rice cooker with a setup like
that, and it doesn't seem all that cheap. The buttons and display are
fairly large. Its marketroids also like to claim that the software
inside it is sophisticated, though idk what it is really doing. It
supposedly uses "fuzzy logic" to figure out how to cook the rice more
consistently than a traditional thermostat cooker would.

I also have a pressure cooker that has a timer but no digital clock. It
has quite a lot of buttons for cooking different kinds of stuff (soup,
grains, etc.) though afaict they really just run the cooker for
different amounts of time. So the extra buttons aren't really all that
necessary and if they cared about the $.01/button they could have just
left them out.

As mentioned, AvE took one apart and IIRC he found an 8 bit MCU inside,
but I would have to watch the video again. They do have more expensive
models with clocks so I guess they saved a few cents by leaving the
clock out of mine.

How about a temperature-controlled soldering iron? Is that simple
enough? This one has a 32 bit cpu and a multitasking OS:
https://www.pine64.org/pinecil/

> You have to know what numbers to calculate. Any boob can use a
> calculator.

Yes, and so far you have consistently failed to put up any numbers that
have any convincing connection with reality.

> I thought the MARC4 was mentioned in this thread, no? It was intended
> for the key fob market and was programmed in Forth.

I remember mention of the MARC4 but didn't realize it was programmed in
Forth. That is interesting and I'll see what I can find about it. It
has been discontinued for a while, from what I understand. I guess its
cost advantage over 8 bitters didn't persist.

> mentioned earlier in the conversation, like the digital time display.

That would be one of the functions of the timer chip, if that's what you
are getting at.

> If the company makes a new product, that's a new product.

If the new product can be made by slightly modifying the old product,
they're not going to start a new design from scratch. There is a whole
engineering discipline about dealing with lots of product variants in
the field, at least for large products like aircraft. No idea about how
it is done for coffee pots but it's a safe bet that they aren't idiots.