Why use non-free compilers (Keil, etc) for architectures supported by SDCC?


Philipp Klaus Krause

Jul 20, 2022, 7:49:28 AM
I wonder why some developers choose non-free compilers (Keil, IAR,
Cosmic, Raisonance, etc) when targeting architectures supported by the
free Small Device C Compiler (SDCC). Answers that also mention the
architecture alongside the reasons would be particularly helpful.

Peter Heitzer

Jul 20, 2022, 10:23:43 AM
To develop a non-trivial program you not only need a compiler but also
a good debugger or simulator that allows for testing any part of the
microcontroller. Most free simulators are either text-based (gdb-style) or
support only a few parts and a subset of the controller's peripherals.
A further reason for choosing a non-free toolchain might be support for
older designs created before SDCC was an alternative.

--
Dipl.-Inform(FH) Peter Heitzer, peter....@rz.uni-regensburg.de

David Brown

Jul 20, 2022, 12:34:01 PM
I rarely use microcontrollers that work with SDCC, but others at the
same company have. There are a few reasons, I think, that lead to SDCC
not being chosen despite its obvious advantages (cost being the main
one). These are not in order, and I don't know how relevant they are in
the grand scheme of things.

1. Manufacturers often recommend Keil or IAR, rarely SDCC.

2. Demo code is often for Keil or IAR. With these kinds of devices,
there is invariably non-standard code or extensions so code can't easily
be ported between compilers.

3. Pre-written code - either within the company, or from outside - is
hard to port to SDCC. You usually have to stick with the previous tools.

4. New developers get familiar with Keil or IAR from university.

5. There is a perception (I can't say if it is fair or not) that SDCC
gives less efficient results than the big name compilers.


Phil Hobbs

Jul 20, 2022, 3:56:02 PM
Plus,

6. When you hit a bug, there's somebody being paid to answer your phone
calls.

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs
Principal Consultant
ElectroOptical Innovations LLC / Hobbs ElectroOptics
Optics, Electro-optics, Photonics, Analog Electronics
Briarcliff Manor NY 10510

http://electrooptical.net
http://hobbs-eo.com

David Brown

Jul 21, 2022, 6:58:43 AM
On 20/07/2022 21:55, Phil Hobbs wrote:
> David Brown wrote:
>> On 20/07/2022 13:49, Philipp Klaus Krause wrote:
>>> I wonder why some developers choose non-free compilers (Keil, IAR,
>>> Cosmic, Raisonance, etc) when targeting architectures supported by
>>> the free Small Device C Compiler (SDCC). Answers that also mention
>>> the architecture alongside the reasons would be particularly helpful.
>>
>> I rarely use microcontrollers that work with SDCC, but others at the
>> same company have.  There are a few reasons, I think, that lead to
>> SDCC not being chosen despite its obvious advantages (cost being the
>> main one).  These are not in order, and I don't know how relevant they
>> are in the grand scheme of things.
>>
>> 1. Manufacturers often recommend Keil or IAR, rarely SDCC.
>>
>> 2. Demo code is often for Keil or IAR.  With these kinds of devices,
>> there is invariably non-standard code or extensions so code can't
>> easily be ported between compilers.
>>
>> 3. Pre-written code - either within the company, or from outside - is
>> hard to port to SDCC.  You usually have to stick with the previous tools.
>>
>> 4. New developers get familiar with Keil or IAR from university.
>>
>> 5. There is a perception (I can't say if it is fair or not) that SDCC
>> gives less efficient results than the big name compilers.
>>
>>
>
> Plus,
>
> 6. When you hit a bug, there's somebody being paid to answer your phone
> calls.
>

My experience with big commercial toolchains is that this does not
always help - often the support people have very little technical
experience or knowledge. Maybe I just don't ask stupid enough
questions. But I have heard (reliably) of toolchain support departments
where the people dedicated to helping with dongles and software license
locking problems outnumber the technical toolchain support staff by a
factor of 3. Of course there are exceptions - some big name toolchains
have excellent support staff.

My experience with open source toolchains is that you don't have a
number to call, but you can find good help fast from forums, mailing
lists, etc. And you can quickly get in contact with people involved in
the development of the toolchains, not just a support monkey that won't
listen to your question until you have installed all the Windows service
packs and rebooted your PC.

I don't know about SDCC, but for gcc there are several ways to get
commercial support - including from companies heavily involved in the
development of the toolchains. Since you are paying for the support,
not the software, it always has to be good quality.

But none of that contradicts the fact that "there is someone on the
phone to help and/or yell at" is a significant reason for people to
choose big name commercial toolchains over free and open source solutions.


Grant Edwards

Jul 21, 2022, 11:05:35 AM
On 2022-07-21, David Brown <david...@hesbynett.no> wrote:

>> 6. When you hit a bug, there's somebody being paid to answer your
>> phone calls.

As pointed out below, "answering your phone call" and "fixing your
problem" are two very different things. I don't care about the
former. I do care about the latter.

> My experience with big commercial toolchains is that this does not
> always help - often the support people have very little technical
> experience or knowledge. [...]
>
> My experience with open source toolchains is that you don't have a
> number to call, but you can find good help fast from forums, mailing
> lists, etc. [...]

Same here. Over the decades, my experiences with getting questions
answered and bugs fixed have been far, far better with open-source
tools than with commercial tools. However, that won't stop the
anti-open-source people from using "there's no tech support phone
number" as a reason to ignore open source tools.

> But none of that contradicts the fact that "there is someone on the
> phone to help and/or yell at" is a significant reason for people
> to choose big name commercial toolchains over free and open source
> solutions.

It is indeed a popular reason. It's just not a good reason.

--
Grant

Phil Hobbs

Jul 21, 2022, 1:19:03 PM
My experience is different, though it wasn't with Keil et al. For
instance, one time long ago, I found a fairly horrible linker bug in
Microchip C17--it was loading segments misaligned by one byte IIRC. I
sent in a support email at lunchtime, and the debugged linker executable
was in my email the following morning.

Of course I've had the same sort of thing happen with open source, e.g.
the late lamented ZeroBugs debugger for Linux, written by the estimable
Christian Vlasceanu. Nice piece of code, that, but he ran out of gas
and/or money and got a day job. A pity--it was very nearly as good as
the VisualAge C++ 3.08 debugger (of song and story).

I expect that the distinction is mainly the size of the teams and the
user bases.

Paul Rubin

Jul 21, 2022, 2:10:45 PM
David Brown <david...@hesbynett.no> writes:
> 5. There is a perception (I can't say if it is fair or not) that SDCC
> gives less efficient results than the big name compilers.

I haven't used Keil or IAR, but comparing GCC to SDCC, it seems to me
that SDCC is a more primitive compiler. There's a bunch of features
absent and the error diagnostics were often unhelpful. I've used SDCC
twice. Once was starting a small project from scratch, which wasn't too
bad. I just dealt with issues as they arose. The other was attempting
to port a midsized project (around 5K SLOC) from GCC to SDCC. It became
clear pretty quickly that getting an SDCC version working at all would
be considerable effort and the resulting binary probably wouldn't fit on
the target CPUs I was thinking of.

I'm not at all trying to bag on SDCC since it is obviously useful, but I
can understand why people sometimes look for more featureful compilers.

Finally, although both programs mentioned were written in C, GCC can
also compile C++, which has some attractions. I don't know if IAR or
Keil compile C++. I also remember thinking that it would be interesting
to write embedded applications in Ada, and GCC compiles that too. Right
now I think there are no non-GCC compilers for Ada 2012 or later.

Niklas Holsti

Jul 21, 2022, 2:56:08 PM
On 2022-07-21 21:10, Paul Rubin wrote:
>
> Finally, although both programs mentioned were written in C, GCC can
> also compile C++, which has some attractions. I don't know if IAR or
> Keil compile C++.


Both Keil and IAR support both C and C++, according to their webpages.
But perhaps you have to pay separately for a C compiler and for a C++
compiler, and probably separately per target architecture, too.


> I also remember thinking that it would be interesting
> to write embedded applications in Ada, and GCC compiles that too. Right
> now I think there are no non-GCC compilers for Ada-2012 or later.


There are some:

The GNAT Ada compiler from AdaCore, which initially used the GCC
back-end, will have a variant based on the LLVM back-end. Currently this
is still experimental, I believe. This compiler is the most up to date
in terms of language features.

ObjectAda from PTC is an Ada 2012 compiler, but I believe it supports
only x86 and x86_64 platforms, so not comparable to SDCC.

Janus/Ada from RR Software "supports the complete syntax and selected
features of" Ada 2012. However, it lacks a few Ada 95 features, and is
currently only available on and for x86 Windows. In the past, it
supported also embedded targets such as the Z80.

There may be others; I have not made a thorough survey.

David Brown

Jul 21, 2022, 2:56:59 PM
One key point here is that both IAR and Keil have toolchains targeting
"big" processors, such as ARM. These are more advanced toolchains, and
support C++ (I don't know what standard versions - but I'd be surprised
if they were fully up-to-date).

So when comparing features, SDCC features should be compared to those of
Keil or IAR for the same target - and I doubt if anyone is using much
C++ for the 8051 or 68HC05 processors.

As for Ada, the only "big name" commercial toolchain I know of for Ada
is Green Hills, and it is only for ARM, PowerPC, and perhaps a few other
32-bit devices. There is GCC Ada for the 8-bit AVR, though I believe
the run-time and library are somewhat incomplete.

There is no doubt that GCC is a vastly more feature-filled project than
SDCC. It is a world apart in terms of the languages it supports, the
standards it follows, the static error checking, the diagnostics, the
optimisations, etc. But while they are both free (and open) compilers,
the targets they support are very different.

Clifford Heath

Jul 21, 2022, 11:00:49 PM
Until Microchip bought the company, I think basically any substantive
question about Hitech C went directly to the founder (whose interesting
name I can't recall). They were based at 45 Colebard Street West
Acacia Ridge QLD 4110, next to the Archerfield airport in the south of
Brisbane.

There's something about small teams that ensures high-quality responses
- if you can get a response. Perhaps the opposites are true for large teams.

Clifford Heath

chris

Jul 22, 2022, 7:04:17 AM
I generally dislike proprietary tools, but back in the day, say
for the 8051 series, the architecture was so dire that it was hard
work to program anything other than trivial projects in assembler.
Any vendor that could deliver a reasonably functional C compiler
and ICE adapter was on to a winner. Also, for the 8051 series, the
Keil toolchain had support for code banking, an essential for the
limited address space. You just had to hold your nose at the code produced,
never look at it, but it did at least work. Later 8051 series from
Silicon Labs and the free toolchain were actually pretty good, but again,
just don't look at the asm output.

Now we have a luxury of options, IDEs and debug options, but I still
prefer the simplicity of a Makefile environment, with gdb for the
odd times where debug by module testing and inspection isn't
enough...

Chris

Phil Hobbs

Jul 22, 2022, 9:27:47 AM
C17 was actually their previous in-house effort, which was so buggy that
they bought Hitech and then quietly killed off their own compiler.

I bit the bullet and ported the code over to Hitech a few months later.
The C17 series never sold well, I don't think, but you can still get the
PIC17C456, twenty-odd years later. Microchip really rocks if you want
product longevity.

boB

Aug 25, 2022, 10:14:36 PM
Wasn't Hitech the small company (that one guy?) from Australia or NZ?

I remember around 2007, give or take a year or two, when there was a big
Microchip conference up here in the Seattle-Everett area, he came by my
work at the time and, sitting there in our lab, made some changes to his
compiler or was helping us with some issue. This was before Hitech
was sold, of course. VERY good person, and I can't remember his name
either.

boB

chris

Aug 26, 2022, 12:18:04 PM
Hitech (sp?) here in the UK were agents for Keil compilers, including
the 8051 8-bit series. We used that for a project around 1999 and it
produced consistently working code. Something like 6 x 32 K banks and
we never found a serious compiler bug.

You would not want to look at the asm source from it though; typically
pages of impenetrable asm just for a hello world...

Chris


David Brown

Aug 26, 2022, 12:46:03 PM
On 26/08/2022 18:17, chris wrote:

>
> Hitech (sp?) here in the UK were agents for Keil compilers, including
> the 8051 8-bit series. We used that for a project around 1999 and it
> produced consistently working code. Something like 6 x 32 K banks and
> we never found a serious compiler bug.
>
> You would not want to look at the asm source from it though; typically
> pages of impenetrable asm just for a hello world...
>
> Chris
>

I have never used Keil's 8051 compiler myself, but I did once help
someone who was using it. It turned out to be a compiler issue - the
compiler was not correctly handling integer promotion rules. I don't
know if it was a bug as such, or an intentional non-conformance aimed at
giving users more useful code generation. (After all, the integer
promotion rules are often a PITA for 8-bit devices - on a device like
the 8051, when 8-bit arithmetic is all you need for a calculation, using
16-bit can take 5 to 10 times as long.)

I've known a number of commercial compilers for embedded systems that
break the normal working of the C language in order to give more
efficient results or simpler coding for users. That's not necessarily a
bad thing - compilers don't have to be conforming - but it's a serious
pain when it is the default behaviour and the documentation is poor.

Examples of this include skipping the zeroing of implicitly initialised
statically allocated data (i.e., the ".bss" segment) in the name of
faster startup, and abusing "const" to mean "put this in the flash
address space rather than the ram address space".

Don Y

Sep 2, 2022, 2:25:42 AM
Inertia and "expectations" of support -- can I call someone, today, and
get my problem serviced (cuz I can't sit on my hands waiting for someone
to "make spare time" to address my needs).

Note that embedded devices differ from desktop applications in that there
are often hooks to hardware, interfaces to ASM "helpers", etc. A company
may have developed a set of these from other products and wants to just
"drop them in" -- without worrying about keywords, pragmas, calling/return
conventions, crt0.s, etc.

Finally, one often needs/wants a debugger that "knows about" the rest of
the toolchain and any quirks it may have. E.g., I typically hook the
"debugger console" with a DEBUG() macro in my code. So, I can see
messages like:

    Task05: Starting with arguments '123' and 'hello bob'
    Task09: Waiting for memory
    Task02: Opening output device 'tty03'
    Task01: Waiting for user input
    Task05: Initialization complete

without having to watch a "memory buffer".

(And I don't have to reinvent these mechanisms for the next project!)

Michael Schwingen

Sep 4, 2022, 4:40:02 PM
On 2022-07-20, Philipp Klaus Krause <p...@spth.de> wrote:
> I wonder why some developers choose non-free compilers (Keil, IAR,
> Cosmic, Raisonance, etc) when targeting architectures supported by the
> free Small Device C Compiler (SDCC).

For the 8051, Keil seems to generate better code than SDCC - I am currently
doing some work on an old TI CC2511 (8051-core) chip, and tend to run
into data size issues because SDCC statically allocates variables for all
function parameters - Keil has better optimizations for that (and
probably also a better code generator, but I don't have much experience with
Keil).

Also, at work, we have used IAR because TI only supplied binary libraries
for the CC2541 for that compiler (we had to get the correct compiler
version, the latest-and-greatest would not do).

If I can choose the chip, I tend to choose something that has working gcc
support if possible.

cu
Michael

Philipp Klaus Krause

Sep 5, 2022, 11:33:41 AM
Thanks for all the replies, here and elsewhere. Since new replies are
now arriving only slowly, I'd like to give a quick summary.

I'll quote just one reply in full, since in just a few lines it
illustrates the main points:

"In my case the customer requested SDCC based project but it failed to
compile into the small flash size. Debugging was quite difficult. Using
the Simplicity Studio and Keil Compiler pairing made the code small
enough to fit into the device and made debugging much easier."

The 3 most-cited reasons to not use SDCC were:

* Lack of efficiency of the code generated by SDCC.
* Better debug support and integration in non-free toolchains.
* Availability of paid support for non-free compilers.

In my opinion, the best way forward from here to make SDCC more
competitive vs. non-free compilers is:

0) Improve machine-independent optimizations
1) Improve machine-dependent optimizations for mcs51
2) Improve debug support and integration
3) Find and fix bugs

I'd estimate the total effort at a full-time position for slightly more
than a year, though even less effort should allow some improvements.

Philipp


Don Y

Sep 5, 2022, 1:33:10 PM
On 9/5/2022 8:33 AM, Philipp Klaus Krause wrote:
> Thanks for all the replies, here and elsewhere. Since new replies are now
> arriving only slowly, I'd like to give a quick summary.
>
> I'll quote just one reply in full, since in just a few lines it illustrates the
> main points:
>
> "In my case the customer requested SDCC based project but it failed to
> compile into the small flash size. Debugging was quite difficult. Using
> the Simplicity Studio and Keil Compiler pairing made the code small
> enough to fit into the device and made debugging much easier."
>
> The 3 most-cited reasons to not use SDCC were:
>
> * Lack of efficiency of the code generated by SDCC.
> * Better debug support and integration in non-free toolchains.
> * Availability of paid support for non-free compilers.

I've rarely worried about code *size* and only seldom worried about
efficiency (execution speed).

But, I *do* get annoyed if the generated code doesn't do what it
was supposed to do! Or, does it with unexpected side-effects, etc.

To that end, the biggest win was vendor responsiveness; knowing
that reporting a bug will result in prompt attention to fix *that*
bug (so I don't have to explore alternative ways of writing the code
to avoid triggering it -- and then leave a "FIXME" to remind myself
to restore the code to its "correct" form once the compiler is fixed).

When I was doing small processors (early 80's thru 90's), I developed
relationships with a few vendors that let me get overnight turnaround
on bug reports. In addition to the quick response, I *knew* that
the changes in the tools were only oriented towards my reported bug;
I didn't have to worry about some "major rewrite" that likely introduced
NEW bugs, elsewhere!

[I abandoned MS's tools when I reported a bug -- a pointer to a member
function -- and was offered a completely new version of the compiler,
"for free" (what, so I can debug THIS compiler, too??)]

Unfortunately (for you, supporting a product), the only way to get that
sort of responsiveness is to make "support" your full-time job. <frown>

The other big win I found in tools of that era was how well the "under
the hood" aspects of the code generator and support routines were
documented. As I would have to modify the generated code to exist in
a multitasking environment, I wanted to know where helper routines
stored any static data on which they relied. Or, rewrite standard
libraries to support reentrancy. Or, hook the debugger so I could
see *a* task's evolving state regardless of the actions of other tasks
(this isn't always trivial).

[The devices I used were unlike current offerings in that they didn't require
large "vendor/manufacturer libraries" to implement basic functionality
of on-chip components]

> In my opinion, the best way forward from here to make SDCC more competitive vs.
> non-free compilers is:
>
> 0) Improve machine-independent optimizations
> 1) Improve machine-dependent optimizations for mcs51
> 2) Improve debug support and integration
> 3) Find and fix bugs

If "uptake" is your goal, you might focus on just a single processor (8051
family seems a common application) and be known for how well you address
*that* segment of the market -- rather than trying to bring the quality
of all code generators up simultaneously.

Good luck!

Philipp Klaus Krause

Sep 6, 2022, 3:22:42 AM
Am 05.09.22 um 19:32 schrieb Don Y:
>
> I've rarely worried about code *size* and only seldom worried about
> efficiency (execution speed).
>
> But, I *do* get annoyed if the generated code doesn't do what it
> was supposed to do!  Or, does it with unexpected side-effects, etc.

However, the replies so far show that code size, not wrong code, is the
problem. IMO, that is not surprising for mcs51: the mcs51 port in SDCC
is old, bug reports come in rarely, and in recent years, most work on
mcs51 has been bugfixes. IMO, the mcs51 port is very stable. Improving
code generation always comes with the risk of introducing bugs. Still,
if time allows, it might be worth it (and I hope that most of the new
bugs will be found before a release).

> To that end, the biggest win was vendor responsiveness; knowing
> that reporting a bug will result in prompt attention to fix *that*
> bug […]
>
> Unfortunately (for you, supporting a product), the only way to get that
> sort of responsiveness is to make "support" your full-time job.  <frown>

Unpaid support with fixed response times for a free compiler doesn't
look like a good full-time job to me. IMO, in general, the SDCC support
channels (ticket trackers, mailing lists) are quite responsive; most of
the time, there is a reply within hours, but sometimes it takes much longer.

> […]
>
> [The devices I used were unlike current offerings in that they didn't
> require
> large "vendor/manufacturer libraries" to implement basic functionality
> of on-chip components]
>

That is still true for many 8-bit devices, which are the targets of SDCC.

>> In my opinion, the best way forward from here to make SDCC more
>> competitive vs. non-free compilers is:
>>
>> 0) Improve machine-independent optimizations
>> 1) Improve machine-dependent optimizations for mcs51
>> 2) Improve debug support and integration
>> 3) Find and fix bugs
>
> If "uptake" is your goal, you might focus on just a single processor (8051
> family seems a common application) and be known for how well you address
> *that* segment of the market -- rather than trying to bring the quality
> of all code generators up simultaneously.

Well, I asked for reasons why people are using non-free compilers
instead of SDCC. Many of the replies were indeed for mcs51. IMO, this is
because the mcs51 is a common µC where SDCC has fallen behind vs. the
non-free compilers.
SDCC has other ports, which got far fewer replies, because the
architectures are less common (e.g. ds390) or because SDCC is already
the leading compiler for them (e.g. stm8).
0)-3) were chosen in a way that I hope will make SDCC more competitive
for mcs51, while not neglecting other ports.


Don Y

Sep 6, 2022, 4:59:45 AM
On 9/6/2022 12:22 AM, Philipp Klaus Krause wrote:
> Am 05.09.22 um 19:32 schrieb Don Y:
>>
>> I've rarely worried about code *size* and only seldom worried about
>> efficiency (execution speed).
>>
>> But, I *do* get annoyed if the generated code doesn't do what it
>> was supposed to do! Or, does it with unexpected side-effects, etc.
>
> However, the replies so far show that code size, not wrong code, is the problem.

Understood. I was merely relaying my experiences (e.g., abandoning MS
because of their approach to bug fixes). Most of my "smaller" projects
have had large codebases (it wasn't uncommon to have a 250KB binary running
on an 8b MCU; sewing various "bank switching" schemes into the toolkit
was a prerequisite)

> IMO, that is not surprising for mcs51: The mcs51 port in SDCC is old, bug
> reports come in rarely, and in recent years, most work on mcs51 has been
> bugfixes. IMO, the mcs51 port is very stable. Improving code generation always
> comes with the risk of introducing bugs. Still, if time allows, it might be
> worth it (and I hope that most of the new bugs will be found before a release).
>
>> To that end, the biggest win was vendor responsiveness; knowing
>> that reporting a bug will result in prompt attention to fix *that*
>> bug […]
>>
>> Unfortunately (for you, supporting a product), the only way to get that
>> sort of responsiveness is to make "support" your full-time job. <frown>
>
> Unpaid support with fixed response times for a free compiler doesn't look like
> a good full-time job to me.

Exactly. FOSS projects that thrive seem to rely on lots of eyes and hands
so the "load" isn't too great on any one individual. And many projects
are relatively easy to contribute to without requiring specific knowledge
beyond "this code fragment looks broken". E.g., I have no problem committing
patches for drivers and many services -- but don't bother doing so with gcc
as the "admission fee" is too high.

> IMO, in general, the SDCC support channels (ticket
> trackers, mailing lists) are quite responsive; most of the time, there is a
> reply within hours, but sometimes it takes much longer.

My experience with tools for small processors predates "internet forums".
I would typically have had to log on (with a modem) to a vendor's "BBS"
and leave a message there, picking up a new binary (from there) when
available and transferring it via X/Y/ZMODEM to my own host.

One typically didn't see other correspondence from other customers.
Nor do I imagine they saw my bug reports or the vendors' announcements
of new binaries built in response to those (unless the vendor deliberately
reached out to them).

>>> In my opinion, the best way forward from here to make SDCC more competitive
>>> vs. non-free compilers is:
>>>
>>> 0) Improve machine-independent optimizations
>>> 1) Improve machine-dependent optimizations for mcs51
>>> 2) Improve debug support and integration
>>> 3) Find and fix bugs
>>
>> If "uptake" is your goal, you might focus on just a single processor (8051
>> family seems a common application) and be known for how well you address
>> *that* segment of the market -- rather than trying to bring the quality
>> of all code generators up simultaneously.
>
> Well, I asked for reasons why people are using non-free compilers instead of
> SDCC. Many of the replies were indeed for mcs51. IMO, this is because the mcs51
> is a common µC where SDCC has fallen behind vs. the non-free compilers.

It could also be that many of the 8b devices are just not seeing much
market share (or have fallen out of production). How many 68xx devices
win designs nowadays? Does Zilog even make processors anymore? Etc.

Other "small CPU" vendors often offer their own toolchains thus removing the
burden of that expense (free competing with free).

OTOH, the '51 (et al.) is a pretty ubiquitous architecture offered by
a variety of vendors. And, at relatively high levels of integration
(compared to 8b processors of days gone by)

> SDCC has other ports, which got far fewer replies, because the architectures are
> less common (e.g. ds390) or because SDCC is already the leading compiler for
> them (e.g. stm8).
> 0)-3) were chosen in a way that I hope will make SDCC more competitive for
> mcs51, while not neglecting other ports.

Again, good luck!

David Brown

Sep 6, 2022, 5:09:46 AM
On 06/09/2022 09:22, Philipp Klaus Krause wrote:
> Am 05.09.22 um 19:32 schrieb Don Y:
>>
>> I've rarely worried about code *size* and only seldom worried about
>> efficiency (execution speed).
>>
>> But, I *do* get annoyed if the generated code doesn't do what it
>> was supposed to do!  Or, does it with unexpected side-effects, etc.
>
> However, the replies so far show that code size, not wrong code, is the
> problem. IMO, that is not surprising for mcs51: The mcs51 port in SDCC
> is old, bug reports come in rarely, and in recent years, most work on
> mcs51 has been bugfixes. IMO, the mcs51 port is very stable. Improving
> code generation always comes with the risk of introducing bugs. Still,
> if time allows, it might be worth it (and I hope that most of the new
> bugs will be found before a release).
>
<snip>
>
> Well, I asked for reasons why people are using non-free compilers
> instead of SDCC. Many of the replies were indeed for mcs51. IMO, this is
> because the mcs51 is a common µC where SDCC has fallen behind vs. the
> non-free compilers.
> SDCC has other ports, which got far fewer replies, because the
> architectures are less common (e.g. ds390) or because SDCC is already
> the leading compiler for them (e.g. stm8).
> 0)-3) were chosen in a way that I hope will make SDCC more competitive
> for mcs51, while not neglecting other ports.
>
>

One important question, which I certainly can't answer myself, is
whether this is worth the effort.

For the most part, 8-bit microcontrollers are a dying breed. The only
real exception is the AVR, which is a very different kind of processor
and well supported by gcc (and maybe clang/llvm?).

It used to be the case that whenever a chip manufacturer wanted a small
processor in their device - radio chip, complex analogue converter,
etc., - they put in an 8051. Now they put in an ARM Cortex-M device.

So these kinds of brain-dead 8-bit CISC cores are almost only for legacy
use - when a company already has so much time and money invested in
hardware or software that is tied tightly to such cores, that they
cannot easily change to something from this century. How many of these
users would switch toolchains, even if SDCC were made hugely better than
whatever they have now? I'd expect almost none; they'd stick to what
they have - most would not even upgrade to newer versions of the same
tools that they already use.

I would expect existing SDCC users to be more interested in upgrading,
and they would always be happy with better code generation. But I do
not imagine there are many /new/ users - either people starting working
on 8051 projects today, or moving from commercial toolchains.

It's great that there are still people interested in improving this
venerable toolchain. But when you start talking about a person-year of
work, that's a lot of effort - it is not going to happen unless there is
a clear justification for the cost. (Maybe it is possible to make this
a student project for someone studying compiler design?)


Philipp Klaus Krause

Sep 6, 2022, 6:12:42 AM
Am 06.09.22 um 10:59 schrieb Don Y:
>
> It could also be that many of the 8b devices are just not seeing much
> market share (or have fallen out of production).  How many 68xx devices
> win designs nowadays?  Does Zilog even make processors anymore?  Etc.
>

However, there are still plenty of people compiling code for the Z80 and
SM83. But practically no one uses non-free compilers to do that. Most
use SDCC either directly or via the z88dk fork. A few use zcc or ack.
All of these are free, so not covered by the question that started the
thread.

It is mostly a retrocomputing / -gaming crowd. Since many of them are
willing to try development snapshots, and report bugs, their use of SDCC
helps a lot in spotting bugs in SDCC early, so they can be fixed before
a release.

Philipp


Philipp Klaus Krause

Sep 6, 2022, 6:41:44 AM
Am 06.09.22 um 11:09 schrieb David Brown:
>
> One important question, which I certainly can't answer myself, is
> whether this is worth the effort.

That clearly depends on many aspects. What is the higher goal? What are
the available resources? IMO, improving the free toolchain for 8-bit
devices is worth it at this time.

> […]How many of these
> users would switch toolchains, even if SDCC were made hugely better than
> whatever they have now?  I'd expect almost none, they'd stick to what
> they have - most would not even upgrade to newer versions of the same
> tools that they already use.
>
> I would expect existing SDCC users to be more interested in upgrading,
> and they would always be happy with better code generation.  But I do
> not imagine there are many /new/ users - either people starting working
> on 8051 projects today, or moving from commercial toolchains.

Indeed there is a question of putting in effort to match the needs of
different user groups, such as current SDCC users targeting µC, current
SDCC users doing retrocomputing and retrogaming, current users of
non-free tools, etc.
Naturally, SDCC developers do have an idea about the needs and wants of
current SDCC users from the mailing lists, issue trackers, etc.
On the other hand, such information was not readily available about
users that currently use a non-free compiler for architectures supported
by SDCC. Knowing how much overlap there is between what could be done
for different user groups is already useful information. In particular,
improving the machine-independent optimizations and debug support is
something that will benefit both current and potential new users.


Don Y

Sep 6, 2022, 7:26:20 AM
On 9/6/2022 3:12 AM, Philipp Klaus Krause wrote:
> Am 06.09.22 um 10:59 schrieb Don Y:
>>
>> It could also be that many of the 8b devices are just not seeing much
>> market share (or have fallen out of production). How many 68xx devices
>> win designs nowadays? Does Zilog even make processors anymore? Etc.
>
> However, there are still plenty of people compiling code for the Z80 and SM83.
> But practically no one uses non-free compilers to do that.

I think much of that has to do with *when* those devices came on the market.
The choices for toolchains in the 68xx(x) and 808x/Z8x eras were barely more
than manufacturer-supplied tools (e.g., under ISIS on the Intellec, RIO on the
ZDS, Versados on the EXORmacs, etc.). Recall that PCs only came into being
in the early '80s; CP/M boxen were more common for nonproprietary platforms.
I didn't use PC-hosted tools until the NS32K -- and even those weren't
actually hosted on an x86!

> Most use SDCC either
> directly or via the z88dk fork. A few use zcc or ack. All of these are free, so
> not covered by the question that started the thread.

I'm sure every device I designed is still using the toolchain that I selected
at the time -- hence my comment of "inertia" in my initial post in this thread.
There are a fair number of products that have very long lifetimes where the
cost of making a significant change (i.e., complete redesign) drags in so
many externalities that it becomes prohibitive. "If it ain't broke, don't
fix it!" (I have some devices that are still being supported 30+ years
after the initial design)

ISTR the US military still uses 6502's in some of their armaments. And I
know there was a radhard 8085 some time ago...

> It is mostly a retrocomputing / -gaming crowd. Since many of them are willing
> to try development snapshots, and report bugs, their use of SDCC helps a lot in
> spotting bugs in SDCC early, so they can be fixed before a release.

Most of the arcade pieces that I'm familiar with were developed in ASM
(though I have no idea what the design methodology was for consoles).

Often, the "OS" (more of an "executive") was tailored to a very
low-overhead implementation that doesn't lend itself to the use of HLLs
(e.g., a single stack, so any multitasking has to ensure the stack
protocol isn't violated across a task switch).

[There was also a lot of proprietary hardware to manipulate the video
out-of-band as the processors of that era couldn't update displays
as fast as they were refreshed!]

David Brown

Sep 6, 2022, 9:01:04 AM
On 06/09/2022 12:41, Philipp Klaus Krause wrote:
> Am 06.09.22 um 11:09 schrieb David Brown:
>>
>> One important question, which I certainly can't answer myself, is
>> whether this is worth the effort.
>
> That clearly depends on many aspects. What is the higher goal? What are
> the available resources? IMO, improving the free toolchain for 8-bit
> devices is worth it at this time.
>

Fair enough. You have a far better idea of the users, of the effort
involved, and the developer commitment than I do.