Is it time for another Forth chip?


Wayne morellini

unread,
Jun 7, 2022, 5:52:48 AMJun 7
to
I know we are waiting to hear what the 6 GHz chip Stephen has been working with will turn out like, and what GreenArrays will release for the glasses (the kind of thing that demands an advanced design). But recently I saw a document on colorForth for ARM, with comparisons to SwiftForth etc., which got me wondering about a lower-end design. Now, with the passing of Dr. Ting, it reminds me of the MuP21 he had that kicked things off, and Jeff's work later. Isn't it time we had something more like these designs, upgraded? 16-bit or larger versions?

minf...@arcor.de

unread,
Jun 7, 2022, 12:38:59 PMJun 7
to
Wayne morellini wrote on Tuesday, 7 June 2022 at 11:52:48 UTC+2:
> I know we are waiting to hear what the 6Ghz chip Stephen has been working with will turn out like, and what Green Arrays will release for the glasses (which type of thing demands an advanced design). But recently, I saw a document on Colorforth for ARM, and comparisons to Swift Forth etc. Which got me wondering about a lower end design. Now, with the passing of Doctor Ting, it reminds me of the Mup21 he had that kicked things off, and Jeff's work latter. Isn't it time we had something more like these designs upgraded? 16 bit or more versions?

A MicroPython chip would be more attractive, I guess.

Paul Rubin

unread,
Jun 7, 2022, 2:58:01 PMJun 7
to
Wayne morellini <waynemo...@gmail.com> writes:
> Isn't it time we had something more like these designs upgraded? 16
> bit or more versions?

If you want to build it, no one is stopping you! There is a free fab
service from Google that you probably know about, for FOSS projects. I
think it would mostly be a research-ish project to see what performance
you can get with limited hw resources, compared with conventional
designs. A b16-like Forth macrocell might be more directly useful for
people making their own chips.

For anyone mostly interested in Forth as a programming language, special
chips are likely too much hassle to even think about, without some
overwhelming advantage over regular chips.

David Schultz

unread,
Jun 7, 2022, 5:13:08 PMJun 7
to
On 6/7/22 4:52 AM, Wayne morellini wrote:
> I know we are waiting to hear what the 6Ghz chip Stephen has been working with will turn out like, and what Green Arrays will release for the glasses (which type of thing demands an advanced design). But recently, I saw a document on Colorforth for ARM, and comparisons to Swift Forth etc. Which got me wondering about a lower end design. Now, with the passing of Doctor Ting, it reminds me of the Mup21 he had that kicked things off, and Jeff's work latter. Isn't it time we had something more like these designs upgraded? 16 bit or more versions?

The first step would be to implement your design on an FPGA to see how it
works. Then, if it works and there is some perceived need for more
speed, and a market to justify the expense, go full custom silicon. Not
standard cell please. That is a close cousin of FPGA.

The tools to dabble in custom silicon are out there. I used Magic in
grad school long ago and it is still kicking and screaming on Linux.
IRSIM is the logic level simulation tool for that and it is also still
around. I used Magic on a 2 µm process, but the idea was that the design
rules were scalable. I think that assumption died a while back.

http://opencircuitdesign.com/


Alas the tiling tools (Lager) I used back then are harder to find.


Start with an FPGA. Jumping to full custom is going to be a big step
because you will almost certainly need a cache, with all its complexity,
if you use off-chip memory. An FPGA might be slow enough that SRAM can
keep up.
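
A rough back-of-the-envelope of that last point, with assumed (not measured) access times and clock rates: at FPGA speeds an asynchronous SRAM read fits in a cycle or so, while at full-custom speeds it costs many cycles, which is what pushes you toward a cache.

/* Illustrative numbers only: assumed SRAM access time and clock rates,
   not vendor data.  Shows why a slow FPGA core can skip the cache. */
#include <math.h>
#include <stdio.h>

int main(void) {
    double sram_ns    = 10.0;     /* assumed async SRAM access time */
    double fpga_mhz   = 100.0;    /* assumed FPGA core clock */
    double custom_mhz = 1000.0;   /* assumed full-custom core clock */

    double fpga_cycle   = 1000.0 / fpga_mhz;    /* ns per cycle */
    double custom_cycle = 1000.0 / custom_mhz;

    printf("FPGA:   %.0f ns cycle, SRAM read = %.0f cycle(s)\n",
           fpga_cycle, ceil(sram_ns / fpga_cycle));
    printf("custom: %.0f ns cycle, SRAM read = %.0f cycles (hence a cache)\n",
           custom_cycle, ceil(sram_ns / custom_cycle));
    return 0;
}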


--
http://davesrocketworks.com
David Schultz

Wayne morellini

unread,
Jun 8, 2022, 6:39:23 AMJun 8
to
A fancier fashion accessory. Why waste the time on a dead-end performance implementation that could never be as good?

Wayne morellini

unread,
Jun 8, 2022, 7:59:50 AMJun 8
to
On Tuesday, June 7, 2022 at 7:52:48 PM UTC+10, Wayne morellini wrote:
>

I wasn't expecting such negative answers, but more community spirit. If I wanted to do my own design, I would do it under agreement; this is more about a community design. I remember when I wanted to implement my own uniquely improved design, and people wanted to seize control of it. In real life money flows one way, in business money tends to flow the opposite way (that's not a Chinese proverb, that's just me writing). :)

I had thought of colorForth as a sort of model of a stripped-down x86 mode, like ARM has Thumb, where the instructions can use existing instruction circuitry. But it could also be an alternative to Thumb on ARM. A future basis they could make a super-lightweight computing chip (SLWC) out of. Not as light as the recent MISCs, but also able to be put in arrays, and used in a similar way to how ARM is used in custom chips, so that companies could incorporate it into many forms of chips.

So: a cell, ISA and interfacing spec (basically what they use for ARM chips), and pursue a manufacturer to kick off with an I/O logic processor version (also suitable in arrays), a minimal microcontroller (also suitable as an array controller) and a full system-on-a-chip (also useful with an array it controls), in 16- and 32-bit versions. All the same two cells with different attachments. To kick things off, and attract other supporters instead of RISC-V and ARM. Basically, coming in largely at the low end, with the 32-bit part being a simpler alternative to the high end. There is no reason you could not reliably emulate desktop OS administration and security. Sort of like a KaiOS/Firefox OS smartphone compared to an Android phone, with the simpler one getting a larger share of the lower-end product category.

As for my own design: it's unaffordable, but maybe in the future, once success has already been obtained, I could afford it. Isn't the Google fab service non-commercial open source or something? Maybe that helps here, but not for me. Control of the design, and use of resources to enhance the design and the market, is needed to maximise the public good here. But 2 micron chips? I'm talking about modern process sizes, certainly nothing coarser than the 180 nm currently used for MISC.

Commercially, myself, I am more interested in simple printed circuits at 2.5 micron to 1 micron. I think I finally have a clean-room solution to do this at home. That's more doable on an individual's level. From a billion dollars down to $100: it doesn't matter so much about making chips so small if the cost of making them is cheap enough. I tried to contact Jeff a long time ago about a separate technology, which might even be suitable for 180 nm, where they could roll-stamp a circuit. I was envisioning printing on plastic sheets, and stacking them together in a stack to make a super handheld game system. If you wanted to use normal silicon chip process structures, there are other sheets. A thin stack gives you the whole system.

This sort of tech is probably more viable for GA's current node sizes. I know somebody with a number of chip ovens in his low-throughput factory. If he can do it, GA could as well. There are places out there, just like with the Swiss watch manufacturers and 180 nm process nodes. Imagine if they could make an all-in-one system board solution in one inch, for a fraction of the price compared to using conventional techniques and their normal process node: a big saving for them and their clients. But that is only for low-clock, low-energy applications, which is most things. High concentrations of processing require less stacking, different stack materials, and more cooling. So you can apply this to a lot more situations requiring significant processing loads too, before you have to resort to conventional chips. My design drafting for my retro gaming computer and processing designs is turning up a lot of simple alternative ways of doing things. So, a lot of things can be done.

Anyway, I remember Jeff telling me he couldn't get any further help with money for chips from his family, unless it was the kind you get with fish. Considering what else happened to him with his prototype chip run, it was a real loss to the computer industry that he couldn't do the F series of processors. He had good ideas and intent, and his wafer arrays. Since my father unexpectedly died recently, before I could get back in to spend time with him, I am also coming to a similar dilemma. Even if the Lyme treatment seems to be working very well, I'm now neurologically handicapped compared to my past work, but at least I've got spades of past work to think about. Jeff had ability, lost money and got sick; I've lost ability, and am getting better but losing finance at the wrong time.

Anyway, this is not about what I can do and my work. It's about the community. There is a guy with a channel called The Unknown Cat that I've seen, and he was saying he's getting cranky and slowing down because he's getting older. Well, there seems to be a lot of that going around here, instead of community.

Everybody losing their edge or mind is getting negative. I've seen it.

Why bother with the b16, which I understand is the one that was cut down for FPGA, instead of, say, Dr. Ting's 16- or 32-bit designs, or Jeff's VLSI design? What's wrong with using those compared to the modern MISC instruction set format?

An FPGA makes more sense when the product is FPGA-based, or you are prototyping towards a final product. There were MISC chip simulators out there, once you go through extending Forth to simulate the logic of the instruction set architecture.

Anyway, some thoughts.

minf...@arcor.de

unread,
Jun 8, 2022, 8:24:51 AMJun 8
to
Just because you don't agree with what others wrote, you call them negative?

Isn't it at all positive that they "warned" you before you entered a possible dead end?

Rick C

unread,
Jun 8, 2022, 8:52:08 AMJun 8
to
On Tuesday, June 7, 2022 at 5:52:48 AM UTC-4, Wayne morellini wrote:
> I know we are waiting to hear what the 6Ghz chip Stephen has been working with will turn out like, and what Green Arrays will release for the glasses (which type of thing demands an advanced design). But recently, I saw a document on Colorforth for ARM, and comparisons to Swift Forth etc. Which got me wondering about a lower end design. Now, with the passing of Doctor Ting, it reminds me of the Mup21 he had that kicked things off, and Jeff's work latter. Isn't it time we had something more like these designs upgraded? 16 bit or more versions?

I saw your last post, but let me respond to this one. There have been many stack chips, even if virtually no Forth chips (a pedantic distinction, I know). You seem to want to fix a problem in some previous chip without indicating what needs to be addressed. If the MuP21 was a good CPU chip, why is it no longer in production? Without an understanding of that, why do you think a redo would be successful?

I also don't understand the interest in a 16 bit chip. In the microprocessor world, nearly all devices are 4/8, 16 or 32 bits. Over time the trends which have emerged are that 4/8 bit MCUs are still dominant in low end applications where cost is the determinant. 32 bit processors have come down in price enough to dominate in nearly every other application space. 16 bit processors are actually becoming less popular, both in percentages of design wins and numbers of devices shipped. There is no reason to think the applications for stack processors would be any different. A 16 bit device is the last design I would consider... well, maybe next to last, after the 18 bit design.

The next stack processor that should be marketed is a 32 bit design... IF there is a genuine interest in selling the chip to a general market. It also needs a number of features lacking in most stack processors.

1) A crystal clock oscillator, if not two. 32.768 kHz and some high frequency such as 12.0 MHz. While an asynch processor is a great idea, applications are seldom time invariant, so need clocking at some point in the design. Possibly this clock can be included in an external interface, but if no such interface exists, then a high speed clock is needed in the chip. A low speed clock is useful for wake up calls to poll the environment for activity. If the activity has a signal that can wake up the CPU, fine, but often the polling is needed to check an analog status or the state of a noisy I/O, such as a button, that you don't want to wake up for every transition.

2) 3.3V I/Os, if not 5V tolerant. Having to add parts to a design to provide level shifting is an absurdity in today's MCU world. With the vast majority of MCUs having 3.3V I/O capability, if not 5V tolerance, it is hard for a device to compete which requires level shifting.

3) There are others, but this is the real issue with a stack processor... C based development tools. Yes, this may be blasphemy in this group, but if an MCU is going to succeed in the world today, it must support the language that the vast majority of embedded programmers use. There must be the debugging tools the world is used to. Such a chip must allow the world to use it the way *they* are accustomed to working. The conventional Forth programming paradigm is too radical, and too crude (compared to what they typically use), for widespread adoption in the rest of the world. There is no CPU architecture that is going to make programmers abandon 50 years of progress in development techniques.

Just my opinion with a few facts.

--

Rick C.

- Get 1,000 miles of free Supercharging
- Tesla referral code - https://ts.la/richard11209

Wayne morellini

unread,
Jun 8, 2022, 9:21:39 AMJun 8
to
Don't you think that's a bit of a twist? They cut across the conversation negatively, towards incompatible directions. This is not a fantasy where down is dysfunctionally up and up is dysfunctionally down, where harder and more limited is better and actual better is harder and more limited, where community becomes non-community and non-community becomes community. It's extremely short-sighted and, in some places, reversed.

Rick C

unread,
Jun 8, 2022, 9:55:40 AMJun 8
to
What you say is so right, which must mean it is all wrong. Or is that backwards?

--

Rick C.

+ Get 1,000 miles of free Supercharging
+ Tesla referral code - https://ts.la/richard11209

minf...@arcor.de

unread,
Jun 8, 2022, 10:12:52 AMJun 8
to
Second this. In addition, minimal power consumption, idle states, and flexible interrupt
handling capabilities could be some more selling points. Since Forth code can be extremely
compact, memories can be kept relatively small.

Still, one would have to be able to beat e.g. the Arduino Pico.

Leave out such "annoying edge computing facilities", and one would have to race against the ATtiny4.
Guess who'll win.

Wayne morellini

unread,
Jun 8, 2022, 10:27:04 AMJun 8
to
Facts: hardly anything you said is not accommodated in what I'm saying. You can make it level tolerant. The starting point is not the finish. How many of those 4-bit CPUs are C processors? C is doable here too. This is exclusively a Forth-community, Forth-usage-building exercise, where success means work for people here, and more new Forth programmers. C didn't completely stop other programming where required in the past. In the Arduino-like space it's a novel, marketable challenge, and the space is ripe for modular future development of small-manufacturer mass CE products, which are hard to source parts for cheaply at lower volumes, even to talk to somebody about buying. Here we get towards commodity chips and modules that are like chip packages in size. But that all requires foresight in an ecosystem developer.

How many of those 4-bit processors can you get that are smaller than a 1000-transistor-plus 16-bit processor, where you don't have to use the extra lines and treat it like a 5-bit processor, or 4 bits? So 4 bits often has no real advantage, and at times the 16 bits can offer a lot more flexibility and speed. Now, my original retro design proposal was an 8-bit processor with single-thread 16-bit-like performance; I could do it as 4 bits, as that is the target on the high side for instruction width. You could use a setting to clip the data width to 4 or 8 bits, Thumb-like, to maintain a code repository for a 4- or 8-bit version. Many ways to do things. If the community would prefer 4 or 8 bits instead, they are welcome to it. But I think a 16-bit MISC is a healthy compromise.

Now, considering Dr. Ting's recent passing, let's not get into the reasons that the original MuP didn't succeed, and note that I never said revive that, but something more complete; I did point to Jeff's later work, which was a significantly useful improvement. But is there really any reason that Ting's P16 and P32 aren't good starting places? The suitability of the instruction set architecture is more important, as the underlying circuit can be completely redesigned at any point after. On the other hand, if the ISA isn't perfect, then compatibility will pay the price when it's changed. We are only talking about a usable, sellable proof of concept to start with. A first manufacturer could completely design their own circuit to fit the ISA, if they want. The actual circuit is a little moot, but good if it also is very ideal. As long as none of this is like the short-memory MISC array programming and communications model; with that I'm not happy. My own proposals completely get away from that complexity, at greatly increased throughput and reduced power consumption for pass-through communications. Hence, that's mine, with several other major enhancements, for my own commercial chips. I get sick of having to talk companies into improvements. To have somebody punishing people for what are still good programming practices is intolerable. Our problems often are not the architecture.

You can tell that Jeff privately told me of a number of things in this field of doing business. But another thing is that I would do the video coprocessor in a radically different way than the MuP. I would have simple nested counters, and I'm looking at a list of display lists, to do complex things better and more simply. Radically. Even compressed lossless graphics suitable for an 8-bit-era computer. I'm going to say this because there is no market for such work, and I'm not interested in pursuing it, but I'm working out how to make a simple FPGA comparable to a blitter for an Atari VCS, and adding multicoloured bitmapped graphics, and some other radical stuff where you could make a game comparable to Sonic the Hedgehog on the VCS that looks like the Master System version. That's about as far as I can push it without putting in other processing circuits for 3D. The stuff which could have potentially been done in the 1970s and 1980s is incredible. But the VCS was made down to a price, and the 64 needed just a few more changes to get that bit more.

That's why places like .. frustrate me. You see how everything could be flipped and come out good, but you get stuff that makes people resist instead, as we both know from the past. Unfortunately, modern systems are too powerful and complex for simple things to make a significant difference outside of performance. However, retro is a niche market and can help develop a viable chip for further use. Short term, up to a million-unit-quantity product is possible within the third release year, but my end product is still in the tens-of-millions territory. Money, brains and talent is what I'm looking for. Unfortunately, I've lost some of that, but these are still relatively simple products in untapped, high-appeal market segments. That's called promise, in mass-marketing terms.

Facts and certainty.


Wayne morellini

unread,
Jun 8, 2022, 10:35:12 AMJun 8
to
Thanks for that. I'll have to have a look at the AVR stuff. The thing we offer is better performance for less transistors on the ground, equalling less cost. I would be interested in finding out about anything that can clearly best that, or match it. But even so, if this just gets close enough then it would probably normally get picked up instead.

Paul Rubin

unread,
Jun 8, 2022, 8:40:46 PMJun 8
to
Wayne morellini <waynemo...@gmail.com> writes:
> Commercially myself, I am more interested in simple printed circuits
> at 2.5 micro to 1 micron. I think I finally have a clean room
> solution to do this at home. That's more doable on an individual's
> level.

I couldn't read that whole long post, but tons of very interesting chips
were done in 2.5 micron and larger sizes. The Mead-Conway revolution
happened in the 3 to 5 micron era. There was a #homecmos channel on
freenode for a while that was trying to do stuff at, iirc, something
like 12 microns. If you have an affordable way to make 2.5 micron chips
at home, that is really quite revolutionary, especially if you can do
mixed signal.

As for a MISC-like chip though, yes of course there is skepticism: who
would want a thing like that, at least as a separate chip rather than a
macro cell? DIY satisfaction (and I'm all in favor of that) seems like
the main reason to make it, unless you've got some pretty clear and
quantitative claims about how it would outperform conventional chips at
some meaningful application.

This morning I was thinking it would be interesting to have a chip with
wide but single core SMT, to allow super fast coroutine switching
without having to save and restore registers, sort of like a Forth
multitasker (that only has a couple of registers to save and restore, so
the task switcher is fast), but allowing something more like
conventional OS's and compilers, which do use lots of registers. That
might be more interesting than a MISC chip. It could be a RISC-V with
a special instruction added to call another context, and there might be
8 or so contexts available in a core, sort of like the Parallax
Propeller.
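
A software sketch of that idea, using POSIX ucontext rather than anything hardware-specific: eight coroutine contexts yielding round-robin to the next one, loosely the way a Forth PAUSE multitasker or a hardware "call another context" instruction would behave. The context count and names are illustrative, not a chip proposal.

/* Sketch only: 8 software "contexts" yielding round-robin, loosely modelling
   the hardware-context idea above.  POSIX ucontext, not real silicon. */
#include <stdio.h>
#include <ucontext.h>

#define NCTX 8
static ucontext_t ctx[NCTX], main_ctx;
static char stacks[NCTX][16384];

static void yield_next(int self) {           /* "call another context" */
    swapcontext(&ctx[self], &ctx[(self + 1) % NCTX]);
}

static void worker(int self) {
    for (int round = 0; round < 3; round++) {
        printf("context %d, round %d\n", self, round);
        yield_next(self);
    }
    swapcontext(&ctx[self], &main_ctx);      /* done: fall back to main */
}

int main(void) {
    for (int i = 0; i < NCTX; i++) {
        getcontext(&ctx[i]);
        ctx[i].uc_stack.ss_sp = stacks[i];
        ctx[i].uc_stack.ss_size = sizeof stacks[i];
        ctx[i].uc_link = &main_ctx;
        makecontext(&ctx[i], (void (*)(void))worker, 1, i);
    }
    swapcontext(&main_ctx, &ctx[0]);         /* start the ring at context 0 */
    return 0;
}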

Paul Rubin

unread,
Jun 8, 2022, 8:57:48 PMJun 8
to
Wayne morellini <waynemo...@gmail.com> writes:
> Thanks for that. I'll have to have a look at the AVR stuff. The
> thing we offer is better performance for less transistors on the
> ground, equalling less cost. [2 typos fixed, I think]

ALU transistor costs in MCU's today are almost irrelevant compared to
transistors driving i/o pins, transistors in memory arrays, and mixed
signal peripherals. Raspberry Pi Foundation decided to make its own MCU
a few years ago (the RP2040). It sells for $1 retail, it has two 32-bit
ARM cores, 264KB of ram, its own weird programmable digital i/o
peripheral, A-D converters, timers, etc. The ARM cores are tiny
compared to the ram array and the other stuff. If you decreased the
transistor count in the ARM cores to zero by replacing ARM with a stack
architecture, the chip cost wouldn't decrease by enough for anyone to
care.

I wouldn't pay attention to AVR when making comparisons to your MISC
part. I do like AVR8 but really it's a legacy design. You have to
compare to ARM, RISC-V, and the like.


> I would be interested in finding out about anything that can clearly
> best that, or match it.

Before anyone can suggest things better than X, you first have to
tell us what X is. X is your MISC design and V is the conventional
thing (RISC-V, say). Does X use fewer transistors? Don't care (see
above), ALU transistors are free. If your rationale involves transistor
count then you have already declared irrelevance. Does X provide more
operations per unit of energy? That is much more interesting. GA at
least focused on that. But you have to give us some numbers for X,
since you are the one who knows anything about it.

We do have the GA chips, which were done by smart people. Do they have
the advantage that GA claimed? I am not sure, but can provisionally
accept "yes". Is the advantage so overwhelming as to get people to use
GA144's in favor of the stuff they are used to? Apparently not.

Paul Rubin

unread,
Jun 8, 2022, 9:03:51 PMJun 8
to
Wayne morellini <waynemo...@gmail.com> writes:
> Facts, hardly anything you said is not accommodated in what I'm
> saying. You can make it level tolerant. The starting point is not
> the finish. How many of those 4 bit CPU's are C processors.

I couldn't read that very long post either (could you try shorter
ones?), but there really are no 4 bit processors these days, except
maybe as sequencer cells inside ASIC's. The Padauk PA150 is a super
cheap 8 bit processor, costing 3 cents (0.03 USD) retail in small
quantities. It can run C code. I was interested in it, but I figure I
can afford to splurge and get the more powerful 10 cent version.

The AVR is a step up from the Padauk and is also 8 bits and was designed
to run C. It has 32 8-bit registers and is a convenient compiler
target. The Padauk is a minimal load-store design with a single
accumulator, so not that great for HLL's, but people do use it.

dxforth

unread,
Jun 9, 2022, 12:45:27 AMJun 9
to
On 9/06/2022 10:57, Paul Rubin wrote:
>
> I do like AVR8 but really it's a legacy design.

'The infliction of cruelty with a good conscience is a delight to
moralists. That is why they invented AVR8.' - Bertrand Russell

Rick C

unread,
Jun 9, 2022, 3:15:10 AMJun 9
to
The "idle states" are probably not required, IF a design technique is used like the F18, where CPUs are automatically stopped when there is no further data for them to process. The "idle state" is basically OFF with no penalty for restarting.

Forth code can be smaller, but "extremely" is a bit of an overstatement I think.


> Still one would have to be able to beat eg Arduino Pico.

Beat it in what regard exactly? Do you mean power consumption?


> Leave out such "annoying edge computing facilities", one would have to race against atiny4.
> Guess who'll win.

Not sure what you are saying with this.

--

Rick C.

-- Get 1,000 miles of free Supercharging
-- Tesla referral code - https://ts.la/richard11209

minf...@arcor.de

unread,
Jun 9, 2022, 3:34:47 AMJun 9
to
A hypothetical Forth chip could IMO only be successful in the embedded or microcontroller domain.
There, flexible I/Os to communicate with the outer world (the edges) are essential. This has to be reflected
in the chip design.
For instance the Arduino Pico offers
https://arduino-pico.readthedocs.io/en/latest/pins.html
Or, from the ATtiny family, the small ATtiny4:
https://www.microchip.com/en-us/product/ATTINY4


Rick C

unread,
Jun 9, 2022, 3:45:24 AMJun 9
to
Virtually every coffee maker, microwave oven, and remote control has a 4 bit processor in it. There's only one reason why anyone would use a 4 bit MCU: because it costs less. So your claims that the 4 bit devices are dead are, as Mark Twain said, "greatly exaggerated". You should try to keep in mind that your experiences are quite different from those of a Chinese designer who is buying at a 1E6 piece price.

Rolling your own processor at home is a nice idea. Lots of potential fun in that. But it is never going to compete with commercial products. While micron-level designs can have low leakage current, they will be very power hungry when active. The power consumption in CMOS devices is from charging and discharging capacitance in the circuit. Capacitance of a 1 micron feature size design is going to be more than an order of magnitude higher than the rather obsolete 180 nm process used for the GA144, and even worse in comparison to more modern processes used for MCUs.
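
The scaling being described is the usual dynamic-power relation P = alpha * C * V^2 * f. A toy calculation with guessed (not characterised) switched-capacitance and voltage figures shows how quickly a coarse home process loses when active:

/* Toy numbers only: the capacitances and voltages are illustrative guesses,
   not characterised process data.  Shows P = alpha * C * V^2 * f scaling. */
#include <stdio.h>

static double dyn_power(double alpha, double cap_f, double volts, double freq_hz) {
    return alpha * cap_f * volts * volts * freq_hz;   /* watts */
}

int main(void) {
    /* switched capacitance per design, assumed >10x higher at 1 micron */
    double c_180nm = 50e-12;    /* 50 pF, guess */
    double c_1um   = 800e-12;   /* 800 pF, guess */
    double f = 20e6, alpha = 0.2;

    printf("180 nm at 1.8 V: %.2f mW\n", dyn_power(alpha, c_180nm, 1.8, f) * 1e3);
    printf("1 um   at 5.0 V: %.2f mW\n", dyn_power(alpha, c_1um,   5.0, f) * 1e3);
    return 0;
}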

--

Rick C.

-+ Get 1,000 miles of free Supercharging
-+ Tesla referral code - https://ts.la/richard11209

Rick C

unread,
Jun 9, 2022, 4:02:07 AMJun 9
to
I rather thought that was obvious. I didn't realize that needed to be pointed out to anyone.


> There flexible IOs to communicate with the outer world (edges) are essential. This has to be reflected
> in the chip design.

Again, obvious. That's why I mentioned the need for 3.3V I/Os and 5V tolerance. I'm also picturing a GA144 like structure, so dedicated peripherals are not needed. One of the few things the GA144 got right is the ability to use processors as software based peripherals.


> For instance the Arduino Pico offers
> https://arduino-pico.readthedocs.io/en/latest/pins.html
> Or of the atiny family the small atiny4
> https://www.microchip.com/en-us/product/ATTINY4

Yeah, the GA144 could emulate pretty much any common peripheral with the I/O CPUs. They seem to have munged the dedicated memory interface so it would not mate well with DRAM, so instead they developed apps with very expensive and power hungry static RAM (also very low density and on the way out). I tried to see if I could figure out how to use it with DRAM (you don't actually have to clock DRAM as fast as it can possibly run, you just need to meet all the timing specs). But at one point I needed to understand the timing between the three CPUs that were handling the interface, including the timing of the internal comms between them. They would not release any internal timing info. I was told to "play" with the chips to see if the design would work. That's the exact opposite of how design work should proceed. Why would anyone invest time and effort into a design if the data sheet says it is not possible? If they won't provide that data, I'm sure as heck not going to waste my time to measure it for them.
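
For what it's worth, "meet the timing specs at a slow clock" mostly reduces to rounding the DRAM's minimum delays up to whole (long) cycles and checking the refresh budget. A sketch with placeholder datasheet numbers, not figures for the GA144 or any particular DRAM:

/* Placeholder timing numbers: illustrative, not from any real datasheet. */
#include <math.h>
#include <stdio.h>

int main(void) {
    double clk_mhz  = 10.0;                 /* assumed slow controller clock */
    double cycle_ns = 1000.0 / clk_mhz;

    /* Minimum delays are easy at a slow clock: round up to whole cycles. */
    double tRCD_ns = 15.0, tRP_ns = 15.0, tRAS_ns = 40.0;   /* placeholders */
    printf("tRCD: %.0f cycle(s), tRP: %.0f, tRAS: %.0f\n",
           ceil(tRCD_ns / cycle_ns), ceil(tRP_ns / cycle_ns),
           ceil(tRAS_ns / cycle_ns));

    /* The hard constraint is the refresh budget: all rows within 64 ms. */
    double rows = 8192.0, refresh_ms = 64.0;
    double per_row_us = refresh_ms * 1000.0 / rows;
    printf("one refresh every %.2f us = every %.0f cycles at %.0f MHz\n",
           per_row_us, per_row_us * 1000.0 / cycle_ns, clk_mhz);
    return 0;
}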

--

Rick C.

-+- Get 1,000 miles of free Supercharging
-+- Tesla referral code - https://ts.la/richard11209

none albert

unread,
Jun 9, 2022, 4:19:48 AMJun 9
to
In article <1cff2504-294a-4be1...@googlegroups.com>,
Rick C <gnuarm.del...@gmail.com> wrote:
<SNIP>
>Yeah, the GA144 could emulate pretty much any common peripheral with the
>I/O CPUs. They seem to have munged the dedicated memory interface so it
>would not mate well with DRAM, so instead they developed apps with very
>expensive and power hungry static RAM (also very low density and on the
>way out). I tried to see if I could figure out how to use it with DRAM
>(you don't actually have to clock DRAM as fast as it can possibly run,
>you just need to meet all the timing specs). But at one point I needed
>to understand the timing between the three CPUs that were handling the
>interface, including the timing of the internal comms between them.
>They would not release any internal timing info. I was told to "play"
>with the chips to see if the design would work.

At one point I had the same idea, but no expertise to pursue it.
Your experience (and mine) is exactly why the GA's fail and
are going to continue failing.

> waste my time

That sums it up nicely.

>Rick C.
--
"in our communism country Viet Nam, people are forced to be
alive and in the western country like US, people are free to
die from Covid 19 lol" duc ha
albert@spe&ar&c.xs4all.nl &=n http://home.hccnet.nl/a.w.m.van.der.horst

Anton Ertl

unread,
Jun 9, 2022, 4:27:44 AMJun 9
to
Rick C <gnuarm.del...@gmail.com> writes:
>Virtually every coffee maker, microwave oven, and remote control has a 4 bit processor in it.

Citation needed

When googling for "4-bit microcontroller market", I only find
references that don't give 4-bit market share. E.g.,
<https://www.verifiedmarketresearch.com/product/microcontroller-market/>
says

|[A microcontroller] is capable of processing a word length that ranges
|between 4-bit up to 64-bit.

But when it comes to market segmentation, one of the segmentations
they provide is into 8-bit, 16-bit, and 32-bit microcontrollers. It
seems that 4-bit microcontrollers are no longer relevant.

>There's only one reason why anyone would use a 4 bit MCU, because it costs less.

These days, the main reason for 8-bit probably is that you don't need
to redesign the thing. And for 4-bit the prime is sufficiently far in
the past that even that is not a good reason for most uses: the
designs with 4-bit controllers from maybe the 1980s have been
redesigned with 8-bit or bigger microcontrollers in the meantime.

- anton
--
M. Anton Ertl http://www.complang.tuwien.ac.at/anton/home.html
comp.lang.forth FAQs: http://www.complang.tuwien.ac.at/forth/faq/toc.html
New standard: https://forth-standard.org/
EuroForth 2022: http://www.euroforth.org/ef22/cfp.html

Jurgen Pitaske

unread,
Jun 9, 2022, 10:32:00 AMJun 9
to
I at least found one with a quick google:
https://www.emmicroelectronic.com/product
https://www.emmicroelectronic.com/about
SWATCH might ring a bell ...
But the market is extremely narrow, and probably counted under custom silicon,
as they are probably masked, so only for very high volumes and not for the general market.

Anton Ertl

unread,
Jun 9, 2022, 11:06:03 AMJun 9
to
Jurgen Pitaske <jpit...@gmail.com> writes:
>I at least found one with a quick google:
>https://www.emmicroelectronic.com/product
>https://www.emmicroelectronic.com/about

No mention of "4 bit" or "4-bit" (except in the context of "64 bit")
on these pages (I did not follow any links).

Wayne morellini

unread,
Jun 9, 2022, 12:36:40 PMJun 9
to
Your following post wasn't that much shorter, Paul. Time is passing on.

Wayne morellini

unread,
Jun 9, 2022, 12:42:08 PMJun 9
to
I was talking about a macrocell that manufacturers could use, and finding a manufacturer to kick it off. Also large-I/O microcontroller and CE versions based on the same core.

Wayne morellini

unread,
Jun 9, 2022, 12:53:04 PMJun 9
to
Paul, the transistors are consuming the energy doing the processing. That's how the GA gets its advantage.
The problem is not what you are saying, it's the lack of a revolution of good design. The designs you are comparing conventional ones to are strictly limited and difficult, not to be compared. What I'm saying is strictly more conventional outside of Forth, making it easier to use. What I mentioned is things like Jeff's and Dr. Ting's, but at more conventional 16- and 32-bit sizes (and conventional or parasitic on-chip memory). Reading gets you all the good bits. At university I read a lot, to learn what was up. :) I think I made it clear I wasn't in favour of the reduced
MISC model that the x18s now use.

Wayne morellini

unread,
Jun 9, 2022, 1:03:32 PMJun 9
to
Well, it's just handy for a small company or hobbyist to print their complete product or circuit in-house, rather than go through the hassle of trying to get an order of the lowest-cost or fastest ARM, which they then have to pick-and-place or send off to a third-party manufacturer to get made, when they don't need the low energy, small size or high speed of the commercial chip. Having said that, a MISC is so small that a machine that could stamp one in might be more affordable than an ARM as an alternative. There is a market for both. I'm saying here that you could buy a tube of 10,000 MISC chips and put a pick-and-place feeder attachment on a 3D printer; print over the chip and circuit and you are done. There is room for both styles of technology.

Jurgen Pitaske

unread,
Jun 9, 2022, 1:14:53 PMJun 9
to
see the data sheet:
https://www.emmicroelectronic.com/sites/default/files/products/datasheets/em6607_ds.pdf

Features
- Low Power: typical 1.8 µA active mode, 0.5 µA standby mode, 0.1 µA sleep mode (@ 1.5 V, 32 kHz, 25 °C)
- Low Voltage: 1.2 to 3.3 V
- ROM 2k × 16 (Mask Programmed)
- RAM 96 × 4 (User Read/Write)
- 2 clocks per instruction cycle
- RISC architecture
- 5 software configurable 4-bit ports
- 1 high drive output port
- Up to 20 inputs (5 ports)
- Up to 16 outputs (4 ports)
- Three-tone buzzer
- Serial Write Buffer (SWB)
- Supply Voltage Level Detection (SVLD)
- Analogue and timer watchdog
- 8 bit timer / event counter
- Internal interrupt sources (timer, event counter, prescaler)
- External interrupt sources (port A + port C)
Description
The EM6607 is a single chip low power, mask
programmed CMOS 4-bit microcontroller. It contains
ROM, RAM, watchdog timer, oscillation detection circuit,
combined timer / event counter, prescaler, voltage level
detector and a number of clock functions. Its low voltage
and low power operation make it the most suitable
controller for battery, stand alone and mobile equipment.
The EM6607 microcontroller is manufactured using EM’s
Advanced Low Power CMOS Process.
In 24 Pin package it is direct replacement for EM6603.

Wayne morellini

unread,
Jun 9, 2022, 10:20:13 PMJun 9
to
So, are you saying we should use this chip instead, or a 24-pin 4-bit MISC chip, or piggyback this chip on a MISC chip to give a high-energy mode, or get EM's advanced low-power CMOS process, or that EM should take over MISC and buy out GA? It's potentially so confusing! Sorry, hard to resist giving a misconstrued reply that mistakes everything, like I get around here. :)

I'd like to see somebody try to make a Linux desktop computer out of these, since people say that problems can be broken down into smaller bit sizes. But how would they fit the resulting bus's worth of computer on the desk alongside an equivalent top-end PC configuration? Imagine the pipelining you would need. You might be able to run alongside the data as it propagates down to the other end of the bus, as it goes through a million-deep pipeline. I can actually do this with normal chips and beat the signal to the end. I can do that with normal high-end processors: just by standing on the chip, I can beat the signal to the other side of the chip before I even started. It's all that spooky quantum stuff, action at a distance maybe? :)

Rick C

unread,
Jun 9, 2022, 10:26:38 PMJun 9
to
On Thursday, June 9, 2022 at 4:27:44 AM UTC-4, Anton Ertl wrote:
> Rick C <gnuarm.del...@gmail.com> writes:
> >Virtually every coffee maker, microwave oven, and remote control has a 4 bit processor in it.
>
> Citation needed

Yeah, wait here while I get that for you.


> When googling for "4-bit microcontroller market", I only find
> references that don't give 4-bit market share. E.g.,
> <https://www.verifiedmarketresearch.com/product/microcontroller-market/>
> says
>
> |[A microcontroller] is capable of processing a word length that ranges
> |between 4-bit up to 64-bit.
>
> But when it comes to market segmentation, one of the segmentations
> they provide is into 8-bit, 16-bit, and 32-bit microcontrollers. It
> seems that 4-bit microcontrollers are not longer relevant.

Not sure who "they" is. I don't put all my faith in any individual report. These reports have a target audience. If you understand the market you would understand why 4-bit processors are not very well reported.

I'm not sure what you are trying to say.


> >There's only one reason why anyone would use a 4 bit MC=
> >U, because it costs less.
>
> These days, the main reason for 8-bit probably is that you don't need
> to redesign the thing. And for 4-bit the prime is sufficiently far in
> the past that even that is not a good reason for most uses: the
> designs with 4-bit controllers from maybe the 1980s have been
> redesigned with 8-bit or bigger microcontrollers in the meantime.

You don't need to redesign what "thing"? I don't understand what you are saying. There are NEW designs that use 4-bit MCUs... the ones where they omit resistors that aren't absolutely essential, costing a fraction of a penny, because of the impact on profit margin.

There is literally no reason on earth to use an 8-bit processor in a $20 coffee maker. Every piece of this appliance is cost optimized. The software running on the 4 bit processor is probably not more than 200 instructions. Zero reason to use anything other than the cheapest 4-bit MCU they can find.

https://www.walmart.com/ip/Mainstays-12-Cup-Programmable-Coffee-Maker-1-8-Liter-Capacity-Black/476343008

They also use 4-bit MCUs in remote controls, microwave ovens and many, many other relatively low cost appliances and devices. You don't have to believe it, but it is reality. When they sell millions of an item, they work hard to squeeze literally every penny out of the cost. Oh, yeah, I forgot toys! Profit margins are king there. If a product is a loser, they want to save every penny and if a product is a winner, they want to milk every penny from the many units sold.

--

Rick C.

-++ Get 1,000 miles of free Supercharging
-++ Tesla referral code - https://ts.la/richard11209

Rick C

unread,
Jun 9, 2022, 10:34:57 PMJun 9
to
There was a 4-bit stack processor, designed for such devices as keyfobs: the MARC4 by Atmel. It lasted a few years and is no longer made. I'm not sure why. The companies selling 4-bitters are mostly Chinese, no-name foundries. They are very hard to compete with on price. In return, they don't give you the time of day unless you are buying 100s of thousands to start with. That's why they don't show up on the radar.

--

Rick C.

+-- Get 1,000 miles of free Supercharging
+-- Tesla referral code - https://ts.la/richard11209

Paul Rubin

unread,
Jun 9, 2022, 11:42:08 PMJun 9
to
Rick C <gnuarm.del...@gmail.com> writes:
> There is literally no reason on earth to use an 8-bit processor in a
> $20 coffee maker.

I have a $20 coffee maker and afaict it has no processor at all. There
is a tank of water, a heating element, and a thermostat. You close the
cover over the water tank, which seals it with a gasket, and then turn
on the heat. Steam builds up in the tank which pushes the water up
through a siphon-like tube where it drops through the coffee grounds
into the carafe. When all the water is pushed out that way, the heating
element gets above 100C, and a thermostat shuts off the power. Rice
cookers work roughly the same way.

> The software running on the 4 bit processor is probably not more than
> 200 instructions. Zero reason to use anything other than the cheapest
> 4-bit MCU they can find.

Given that a flash programmable 8 bit cpu is 3 cents retail (i.e. you
can order 100 of them for 3 bucks from lcsc.com), I don't see how much
they can save by using 4 bitters. Stuff made in really huge quantity
likely uses ASIC's rather than MCU's.

> They also use 4-bit MCUs in remote controls, microwave ovens and many,
> many other relatively low cost appliances and devices.

I'd want to see a teardown before being convinced of this. It has to be
a model introduced in the past 10 years, not something from the 1980's,
since we're talking about new designs.

The TV remote my mom uses has voice recognition. It wouldn't surprise
me if it has a 32 bit cpu.

Rick C

unread,
Jun 10, 2022, 12:00:30 AMJun 10
to
On Thursday, June 9, 2022 at 11:42:08 PM UTC-4, Paul Rubin wrote:
> Rick C <gnuarm.del...@gmail.com> writes:
> > There is literally no reason on earth to use an 8-bit processor in a
> > $20 coffee maker.
> I have a $20 coffee maker and afaict it has no processor at all. There
> is a tank of water, a heating element, and a thermostat. You close the
> cover over the water tank, which seals it with a gasket, and then turn
> on the heat. Steam builds up in the tank which pushes the water up
> through a siphon-like tube where it drops through the coffee grounds
> into the carafe. When all the water is pushed out that way, the heating
> element gets above 100C, and a thermostat shuts off the power. Rice
> cookers work roughly the same way.

Yeah, there's no MCU in a fork either. I would ask what your point is, but you would probably try to explain it.


> > The software running on the 4 bit processor is probably not more than
> > 200 instructions. Zero reason to use anything other than the cheapest
> > 4-bit MCU they can find.
> Given that a flash programmable 8 bit cpu is 3 cents retail (i.e. you
> can order 100 of them for 3 bucks from lcsc.com), I don't see how much
> they can save by using 4 bitters. Stuff made in really huge quantity
> likely uses ASIC's rather than MCU's.

You are right. You don't see.


> > They also use 4-bit MCUs in remote controls, microwave ovens and many,
> > many other relatively low cost appliances and devices.
> I'd want to see a teardown before being convinced of this. It has to be
> a model introduced in the past 10 years, not something from the 1980's,
> since we're talking about new designs.
>
> The TV remote my mom uses has voice recognition. It wouldn't surprise
> me if it has a 32 bit cpu.

LOL! More likely it also has a wifi connection to the Internet!!! LOL

Sometimes you are so funny. You crack me up!

--

Rick C.

+-+ Get 1,000 miles of free Supercharging
+-+ Tesla referral code - https://ts.la/richard11209

Jurgen Pitaske

unread,
Jun 10, 2022, 2:24:30 AMJun 10
to
No, this was for Anton to find this 4 bit documentation at EM, and for others who want to see products out there now.

4 bit is useful where needed, and mostly for cost and volume.
Any Forth chip should have at least 16 bits.

dxforth

unread,
Jun 10, 2022, 3:13:05 AMJun 10
to
On 10/06/2022 13:42, Paul Rubin wrote:
>
> The TV remote my mom uses has voice recognition. It wouldn't surprise
> me if it has a 32 bit cpu.

I'd settle for rubber buttons that function in the cold. Shouldn't have
to treat remotes like pampered poodles.

Anton Ertl

unread,
Jun 10, 2022, 3:28:26 AMJun 10
to
Rick C <gnuarm.del...@gmail.com> writes:
>On Thursday, June 9, 2022 at 4:27:44 AM UTC-4, Anton Ertl wrote:
>> Rick C <gnuarm.del...@gmail.com> writes:
>> >Virtually every coffee maker, microwave oven, and remote control has a 4 bit processor in it.
>>
>> Citation needed
>
>Yeah, wait here while I get that for you.
>
>
>> When googling for "4-bit microcontroller market", I only find
>> references that don't give 4-bit market share. E.g.,
>> <https://www.verifiedmarketresearch.com/product/microcontroller-market/>
>> says
>>
>> |[A microcontroller] is capable of processing a word length that ranges
>> |between 4-bit up to 64-bit.
>>
>> But when it comes to market segmentation, one of the segmentations
>> they provide is into 8-bit, 16-bit, and 32-bit microcontrollers. It
>> seems that 4-bit microcontrollers are not longer relevant.
>
>Not sure who "they" is.

In this case https://www.verifiedmarketresearch.com

>I don't put all my faith in any individual report.
> These reports have a target audience. If you understand the market you would understand why 4-bit processors are not very well reported.

So you claim that 4-bit processors are relevant, don't want to provide
any evidence for that, dismiss evidence against it out of hand, and
then say that one has to be in the know to understand why there is no
evidence for your claims. Maybe you should found the church of 4-bit
processing.

>> These days, the main reason for 8-bit probably is that you don't need
>> to redesign the thing. And for 4-bit the prime is sufficiently far in
>> the past that even that is not a good reason for most uses: the
>> designs with 4-bit controllers from maybe the 1980s have been
>> redesigned with 8-bit or bigger microcontrollers in the meantime.
>
>You don't need to redesign what "thing"?

Whatever things still use 8-bit MCUs. Maybe "every coffee maker,
microwave oven, and remote control".

>There are NEW designs that use 4-bit MCUs...

That's what you claim.

>You don't have to believe it, but it is reality.

You don't provide any evidence, so the only choice is to believe or
not to believe.

none albert

unread,
Jun 10, 2022, 4:41:01 AMJun 10
to
In article <9b86fdb3-8c42-4954...@googlegroups.com>,
Jurgen Pitaske <jpit...@gmail.com> wrote:
<SNIP>
>Description
>The EM6607 is a single chip low power, mask
>programmed CMOS 4-bit microcontroller. It contains
>ROM, RAM, watchdog timer, oscillation detection circuit,
>combined timer / event counter, prescaler, voltage level
>detector and a number of clock functions. Its low voltage
>and low power operation make it the most suitable
>controller for battery, stand alone and mobile equipment.
>The EM6607 microcontroller is manufactured using EM’s
>Advanced Low Power CMOS Process.
>In 24 Pin package it is direct replacement for EM6603.

This is certainly a 3 eurocent processor.

Groetjes Albert

Jurgen Pitaske

unread,
Jun 10, 2022, 8:42:34 AMJun 10
to
Not many know EM in Switzerland or have been there.

Wikipedia gives a nice overview of the 4 bit history with more known names.
How many of these are still available is a question I am not really interested in.
I have my own 16-bit MISC processor in FPGA.
A similar version had been done about 20 years ago as an ASIC as a student project.
It uses minimum resources, so it is not optimised for speed, but it is flexible.

List of 4-bit processors at wikipedia, copy and paste from there


Intel 4004
Intel 4040
TMS 1000
Atmel MARC4 core (discontinued: "Last ship date: 7 March 2015")
Samsung S3C7 (KS57 Series) 4-bit microcontrollers (RAM: 512 to 5264 nibbles, 6 MHz clock)
Toshiba TLCS-47 series
HP Saturn
NEC μPD75X
NEC μCOM-4
NEC (now Renesas) μPD612xA (discontinued), μPD613x, μPD6x and μPD1724x infrared remote control transmitter microcontrollers
EM Microelectronic-Marin EM6600 family, EM6580, EM6682, etc.
Epson S1C63 family
National Semiconductor "COPS I" and "COPS II" ("COP400") 4-bit microcontroller families
National Semiconductor MAPS MM570X
Sharp SM590/SM591/SM595
Sharp SM550/SM551/SM552
Sharp SM578/SM579
Sharp SM5E4
Sharp LU5E4POP
Sharp SM5J5/SM5J6
Sharp SM530
Sharp SM531
Sharp SM500 (ROM 1197×8 bit, RAM 40×4 bit, a divider and 56-segment LCD driver circuit)
Sharp SM5K1
Sharp SM4A
Sharp SM510 (ROM 2772×8 bit, RAM 128×4 bit, a divider and 132-segment LCD driver circuit)
Sharp SM511/SM512 (ROM 4032×8 bit, RAM 128/142×4 bit, a divider and 136/200-segment LCD driver circuit)
Sharp SM563

S Jack

unread,
Jun 10, 2022, 10:21:30 AMJun 10
to
On Thursday, June 9, 2022 at 9:26:38 PM UTC-5, gnuarm.del...@gmail.com wrote:
>
> There is literally no reason on earth to use an 8-bit processor in a $20 coffee maker. Every piece of this appliance is cost optimized. The software running on the 4 bit processor is probably not more than 200 instructions. Zero reason to use anything other than the cheapest 4-bit MCU they can find.

It may take a few more instructions to get the coffee maker to talk to the cup.
--
me

Wayne morellini

unread,
Jun 10, 2022, 12:14:24 PMJun 10
to
Sorry, it was humour at the type of post you get around here, and on similar technical forums, which just mistakes things in a chaotic fashion rather than appropriately assessing them.

Wayne morellini

unread,
Jun 10, 2022, 12:28:39 PMJun 10
to
Anyway, I'm going to have to leave you guys. My father passed away unexpectedly last month, after I had not been able to see him more than a handful of times in the last 6 months due to COVID restrictions, and had not been able to talk with him on the phone due to his hearing. Now there are unnecessary petty will games, and the stress is getting too much, after a number of other overlapping stressful things trying to take advantage of my disability, which has partly lifted a bit for now, to address this. Does anybody know of a civil-causes body that might want to invest the odd million? Then I might be able to sort out everything which has been deliberately done to me.

But keep up the conversation; it almost seems like people favour a 4-bit MISC version so far. :)

Thanks.

Wayne morellini

unread,
Jun 10, 2022, 12:32:41 PMJun 10
to
I'm basically not going to get any commercial work done with all these people deliberately interfering with my life. What a waste. They cost much more than they gain.

Jurgen Pitaske

unread,
Jun 10, 2022, 2:48:09 PMJun 10
to
I hope you can sort your private issues soon. All the best.

Rick C

unread,
Jun 10, 2022, 4:35:59 PMJun 10
to
That is simply an opinion. There's no reason why an 8 or even 4 bit CPU can't be a stack processor and be useful to those who build things with 4 and 8 bit CPUs. Heck, it might turn out that a 5 bit CPU is preferable. My stack processor is data path agnostic. It can have any size instruction (if you recode them to suit), data path, or address path. There is no direct dependence between the three.

--

Rick C.

++- Get 1,000 miles of free Supercharging
++- Tesla referral code - https://ts.la/richard11209

Rick C

unread,
Jun 10, 2022, 4:46:36 PMJun 10
to
Yes,

> don't want to provide
> any evidence for that,

Correct, because anyone familiar with the world of 4-bit processors understands there is little evidence, unless, as I've said, you want to actually build a product and contact the makers directly. People like TI don't want to bother with 4-bit MCUs because the profit margins are too small for them to make a profit.


> dismiss evidence against it out of hand,

Except no evidence was provided. I explained how they don't show on the same radars as 8, 16 and 32 bit MCUs.


> then say that one has to be in the know to understand why there is no
> evidence for your claims. Maybe you should found the church of 4-bit
> processing.

Indeed. I simply have been paying attention to that world long enough to know. You seem to be a total newbie. Believe what you want. Most people here do.


> >> These days, the main reason for 8-bit probably is that you don't need=20
> >> to redesign the thing. And for 4-bit the prime is sufficiently far in=20
> >> the past that even that is not a good reason for most uses: the=20
> >> designs with 4-bit controllers from maybe the 1980s have been=20
> >> redesigned with 8-bit or bigger microcontrollers in the meantime.=20
> >
> >You don't need to redesign what "thing"?
> Whatever things still use 8-bit MCUs. Maybe "every coffee maker,
> microwave oven, and remote control".

You aren't making sense. Whatever.


> >There are NEW designs that use 4-bit MCUs...
> That's what you claim.

So, are you claiming that 4-bit processors are no longer designed into new products? Show me a simple product that could make use of a 4-bit processor, is made in quantities over a million, and has an 8-bit processor. The coffee maker is a perfect example. Can you show me an under-$25 coffee maker that uses an 8-bit processor? They don't exist, because the chip may only cost a penny more, but that's $10,000 in lost profits. Why would they give that up? Why do you refuse to believe that 4-bit processors are still viable at the very low cost end of the market?


> >You don't have to belie=
> >ve it, but it is reality.
> You don't provide any evidence, so the only choice is to believe or
> not to believe.

You only need to look. Juergen has already provided links to such devices. I'm not your search engine. I'm also a bit busy.

--

Rick C.

+++ Get 1,000 miles of free Supercharging
+++ Tesla referral code - https://ts.la/richard11209

Jurgen Pitaske

unread,
Jun 10, 2022, 4:47:00 PMJun 10
to
You have an opinion, I have an opinion.
My opinion that 4 bit is very special-purpose and not for the general public seems to be supported by reality and market overviews.
Apart from this, anybody can have any processor.
Let's have as many Forth processors as there are Forths out there.
And none is successful. As it is now.

Rick C

unread,
Jun 10, 2022, 4:51:40 PMJun 10
to
Wikipedia is not very useful on this issue. This is largely an historical list of devices made by mainstream companies. I see a mention of a remote control MCU. That might be current and any of the Sharp devices may be current. The Intel 4004 was the first single chip CPU ever made, no? I don't see the Atmel MARC4. Maybe I need to add that.

--

Rick C.

---- Get 1,000 miles of free Supercharging
---- Tesla referral code - https://ts.la/richard11209

Rick C

unread,
Jun 10, 2022, 4:57:43 PMJun 10
to
Sorry, I don't understand some of what you say.

There is no disadvantage for a stack processor to have a 4 bit data path if it is being used in applications where this is appropriate, such as clocks, microwave ovens, coffee makers, remote controls, etc.

Why do you think a stack processor has to have a 16 bit data path?

The successful stack CPUs are in FPGAs and custom chips. We just don't know so much about them, something shared with the MIPS processor.

--

Rick C.

---+ Get 1,000 miles of free Supercharging
---+ Tesla referral code - https://ts.la/richard11209

Paul Rubin

unread,
Jun 10, 2022, 6:19:09 PMJun 10
to
Rick C <gnuarm.del...@gmail.com> writes:
> Why do you think a stack processor has to have a 16 bit data path?

If we're talking about Forth, we usually expect data cells to be able to
hold addresses. 4 bits is awfully small for that. 8 bits, maybe.
Also, I had thought we were talking about MCU's, which are packaged,
multipurpose programmable devices that control the rest of the product
through i/o pins. Stuff inside an ASIC or FPGA doesn't count as that.

Paul Rubin

unread,
Jun 10, 2022, 6:36:07 PMJun 10
to
Rick C <gnuarm.del...@gmail.com> writes:
> Indeed. I simply have been paying attention to that world long enough
> to know. You seem to be a total newbie.

I would say paying attention to the world for a long time means you
witnessed some stuff that happened in the distant past, which can be the
basis of much wisdom. But it doesn't make you an authority about what
is or isn't happening in the present.

> So, are you claiming that 4-bit processors are no longer designed into
> new products?

I don't think such a claim was made. Only that there hasn't been
convincing evidence shown against it.

> Show me a simple product, that could make use of a 4-bit processor, is
> made in qty over a million and has an 8-bit processor.

Every PC keyboard including in the pre-USB era had an 8035-style
processor even though it could have used a 4 bitter. In USB keyboards,
using 4 bitters may be unfeasible so I won't count them.

Paul Rubin

unread,
Jun 10, 2022, 6:49:34 PMJun 10
to
albert@cherry.(none) (albert) writes:
>>The EM6607 is a single chip ... 4-bit microcontroller.
> This is certainly a 3 eurocent processor.

I like how it comes with 2kx16 bits (4k bytes) of code space (mask
rom). Somehow GA thought that 64 18-bit words was enough.

It has 96 nibbles of ram, no mention of a hardware stack. In a Forth
chip with 4-bit data, how is the return stack supposed to be stored?

Paul Rubin

unread,
Jun 10, 2022, 6:51:59 PMJun 10
to
Wayne morellini <waynemo...@gmail.com> writes:
> Anyway, I'm going have to leave you guys. My father passed away
> unexpectedly last month, after not been able to see him more than a
> handful.of times I'm the last 6 months due to covid restrictions, and
> not been able to talk with him on the phone due to his hearing.

I'm very sorry to hear this about your father. My condolences. My mom
is hard of hearing and I've been meaning to set up a text chat system
that she can use.

James Brakefield

unread,
Jun 10, 2022, 6:59:18 PMJun 10
to
I have an FPGA 4-bit accumulator ISA core that supports return and data stacks
via general purpose index registers. It can be widened to 8/9 or 16/18 bits.
https://github.com/jimbrake/lem1_9min
If you need to have an FPGA chip anyway, this will take ~200 6LUTs and as little as half of a block RAM.
~100 lines of fully commented straight-forward VHDL (sans initialization & overhead).
There are a number of 8 & 16-bit designs with under 200 LUTs. Even 32-bit if done serially.
Then again, very few true Forth or stack designs under 200 LUTs?
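
For readers who haven't seen the "stacks via general-purpose index registers" trick, a tiny C model of what such a core does in hardware; the names and sizes here are made up for illustration and are not taken from the lem1_9min sources.

/* Tiny model of data/return stacks kept in main RAM via two index registers,
   as in small accumulator-style cores.  Names are illustrative, not from the VHDL. */
#include <stdint.h>
#include <stdio.h>

static uint8_t ram[256];          /* a 4-bit core would use nibbles; bytes here */
static uint8_t dsp = 0xEF;        /* data-stack index register, grows down */
static uint8_t rsp = 0xFF;        /* return-stack index register, grows down */

static void dpush(uint8_t v) { ram[dsp--] = v; }
static uint8_t dpop(void)    { return ram[++dsp]; }
static void rpush(uint8_t v) { ram[rsp--] = v; }
static uint8_t rpop(void)    { return ram[++rsp]; }

int main(void) {
    dpush(3); dpush(4);
    uint8_t b = dpop(), a = dpop();
    dpush(a + b);                 /* "+" as two pops and a push */
    rpush(0x42);                  /* a return address, saved the same way */
    printf("sum=%u ret=0x%02X\n", dpop(), rpop());
    return 0;
}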

Paul Rubin

unread,
Jun 10, 2022, 7:07:06 PMJun 10
to
James Brakefield <jim.bra...@ieee.org> writes:
> Then again very few true Forth or stack designs under 200 LUTs?

I think once any FPGA is involved, we're outside the realm of super low
cost parts. There are lots of sub-10-cent 8-bit MCU's. I don't know of
any sub-10-cent FPGA's even with 200 LUT-4's, much less LUT-6's.

Rick C

unread,
Jun 11, 2022, 1:21:03 AMJun 11
to
On Friday, June 10, 2022 at 6:19:09 PM UTC-4, Paul Rubin wrote:
> Rick C <gnuarm.del...@gmail.com> writes:
> > Why do you think a stack processor has to have a 16 bit data path?
> If we're talking about Forth, we usually expect data cells to be able to
> hold addresses. 4 bits is awfully small for that. 8 bits, maybe.

I think you are confused. Many 8 bit processors have 8 bit data paths and 16 bit addresses. The cell size is what you want it to be, not dictated by the hardware. The same applies to a 4-bit processor. Don't confuse cells with the address unit.


> Also, I had thought we were talking about MCU's, which are packaged,
> multipurpose programmable devices that control the rest of the product
> through i/o pins. Stuff inside an ASIC or FPGA doesn't count as that.

You didn't quote what you are replying to. You can define an MCU any way you want. Juergen's comment wasn't about MCUs, it was about "Forth chips". I simply pointed out one of the successful uses of "Forth chips", even though there's no such thing as a "Forth chip" other than in the Humpty Dumpty definition.

Don't get your knickers in a knot.

--

Rick C.

--+- Get 1,000 miles of free Supercharging
--+- Tesla referral code - https://ts.la/richard11209

Rick C

unread,
Jun 11, 2022, 1:30:30 AMJun 11
to
Actually, it couldn't at the time. For one, they simply copied the design from the previous designs, as they did with so many parts of the PC, firmware and all. There was always a huge amount of paranoia about compatibility in the PC world. Secondly, at the time, there were no 4-bit processors with the same I/O capabilities. At least, I never saw any. This is most likely because every I/O pin costs in test which can dominate the price of a low end device. In the 40 pin packages used in keyboards, the cost advantage of a 4-bit processor probably evaporated. Then lastly, a keyboard doesn't really fit the definition of a simple product. I'm referring to things like remote controls and coffee makers that have minimal requirements (obviously if you need an 8-bit CPU for any of various reasons, you need an 8-bit CPU). A $50 keyboard doesn't fit that description. I'm not sure you could get many keyboards for that price at the time. They used to be designed with excellent mechanical switches that cost around a buck apiece.

--

Rick C.

--++ Get 1,000 miles of free Supercharging
--++ Tesla referral code - https://ts.la/richard11209

Rick C

unread,
Jun 11, 2022, 1:34:09 AMJun 11
to
On Friday, June 10, 2022 at 6:49:34 PM UTC-4, Paul Rubin wrote:
> albert@cherry.(none) (albert) writes:
> >>The EM6607 is a single chip ... 4-bit microcontroller.
> > This is certainly a 3 eurocent processor.
> I like how it comes with 2kx16 bits (4k bytes) of code space (mask
> rom). Somehow GA thought that 64 18-bit words was enough.
Not sure how you can compare the two. The GA144 has an unlimited code space since the storage is all external to the chip.


> It has 96 nibbles of ram, no mention of a hardware stack. In a Forth
> chip with 4-bit data, how is the return stack supposed to be stored?

??? What does the data path have to do with the return stack? I think you are too accustomed to bigger processors where it's all the same number. As I said in another post, the cell size in Forth, just like the address size in a CPU has nothing to do with the data path. An 8080 has 8 bit data paths, and 16 bit addresses. Why do you keep getting confused about this?
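
Concretely, a narrow data path carrying wide cells is just multi-precision arithmetic: the ALU walks the cell a nibble at a time and propagates carry. A software illustration of the idea (not any particular chip's microcode):

/* A 16-bit cell added on a 4-bit data path: four ALU passes with carry.
   Software illustration of the idea, not any specific chip's microcode. */
#include <stdint.h>
#include <stdio.h>

static uint16_t add16_via_nibbles(uint16_t a, uint16_t b) {
    uint16_t result = 0;
    unsigned carry = 0;
    for (int i = 0; i < 4; i++) {                 /* 4 nibbles per 16-bit cell */
        unsigned na = (a >> (4 * i)) & 0xF;
        unsigned nb = (b >> (4 * i)) & 0xF;
        unsigned sum = na + nb + carry;           /* one pass through a 4-bit ALU */
        result |= (uint16_t)((sum & 0xF) << (4 * i));
        carry = sum >> 4;
    }
    return result;
}

int main(void) {
    printf("0x1234 + 0x0FCD = 0x%04X\n", add16_via_nibbles(0x1234, 0x0FCD));
    return 0;
}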

--

Rick C.

-+-- Get 1,000 miles of free Supercharging
-+-- Tesla referral code - https://ts.la/richard11209

Jurgen Pitaske

unread,
Jun 11, 2022, 2:03:33 AMJun 11
to
No need to add it;
it is included in the list I copied, just next to the TMS1000.

Anton Ertl

unread,
Jun 11, 2022, 2:37:03 AMJun 11