
LPC900/80C51 Compiler Toolchain


euge...@gmail.com

Jun 20, 2007, 4:48:49 PM
Hello,

I am starting a small project on the LPC932. There seem to be quite a few
toolchain options out there:
Keil
Rigel
Raisonance
IAR
SDCC (open-source)

Which one would you recommend?
In my case, it is a very small project (I might even fit into the
4K-limited commercial kits), but if SDCC is a decent compiler, it would
be useful for future projects on this platform.

Thanks
Eugene

Tilmann Reh

Jun 21, 2007, 2:23:58 AM
euge...@gmail.com wrote:

We are using the Raisonance tools, and are fairly satisfied with them.
They have a good price/performance ratio compared to other brands.

You might also have a look at <http://www.wickenhaeuser.de/> for their
uC/51 package. It is also rather cheap, and the demo is limited to 8k
(which is enough for most LPC900 devices, given their flash sizes). I
haven't looked at that software in depth yet, however.

Tilmann

--
http://www.autometer.de - Elektronik nach Maß (custom-made electronics).

Chris Hills

Jun 21, 2007, 3:54:36 AM
In article <1182372529.7...@q19g2000prn.googlegroups.com>,
euge...@gmail.com writes

I would suggest in this order

Keil
IAR
Raisonance
Rigel
SDCC

Keil and IAR are very much more advanced than the rest. Keil in
particular can do aggressive data overlaying, which is often more
important than the eval version's limit on memory addressing.

In fact, I have known the Keil eval compiler to get programs to fit and
run where the full versions of other compilers could not.

SDCC is not what I would call "decent". It may be unlimited, but as I
said, the lack of optimisation means you will run out of data space and
possibly code space.

Also, there is a lot more to the 8051 variants than just changing the
header files. To start with, the LPC932 has internal "external" data
(XDATA) memory... The Keil compiler will handle this correctly.

Start with the Keil compiler.
I would put the SDCC as a last resort.


--
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
\/\/\/\/\ Chris Hills Staffs England /\/\/\/\/
/\/\/ ch...@phaedsys.org www.phaedsys.org \/\/\
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/

Paul Taylor

Jun 21, 2007, 12:36:48 PM
On Thu, 21 Jun 2007 08:54:36 +0100, Chris Hills wrote:

> I would suggest in this order
>
> Keil
> IAR
> Raisonance
> Rigel
> SDCC
>
> Keil and IAR are very much more advanced than the rest. Keil
> particularly can do aggressive data overlaying which is often more
> crucial than the limit on memory addressing the eval version.

Has anyone got any figures comparing any of these with gcc? Maybe for a
bit of open-source software (e.g. Lewin's DOSFS, or another
similar-sized project)? How big is it when compiled with gcc, and with
any of the above compilers? I'm more interested in code size than in
run-time performance.

Regards,

Paul Taylor.

Paul Taylor

Jun 21, 2007, 12:53:49 PM
On Thu, 21 Jun 2007 17:36:48 +0100, Paul Taylor wrote:

> Has anyone got any figures comparing any of these with gcc? Maybe for a
> bit of open-source software (e.g. Lewin's DOSFS, or another
> similar-sized project)? How big is it when compiled with gcc, and with
> any of the above compilers? I'm more interested in code size than in
> run-time performance.

Doesn't have to be 8051 target.

Chris Hills

Jun 21, 2007, 1:10:13 PM
In article <pan.2007.06.21....@tiscali.co.uk>, Paul Taylor
<paul_ng...@tiscali.co.uk> writes

Then it depends on the target. GCC is closer to commercial performance
in the 32-bit area than in the 8- or 16-bit ones.

Chris Hills

Jun 21, 2007, 1:09:22 PM


For code size (I assume you mean both code and data, data usually being
the critical one), the only options are Keil and IAR, in that order.
GCC is not in the running.

Paul Taylor

Jun 21, 2007, 2:20:13 PM
On Thu, 21 Jun 2007 18:09:22 +0100, Chris Hills wrote:

> For code size (I assume you mean both code and data, data usually being
> the critical one)

yes

> the only options are Keil and IAR in that order.

> GCC is not in the running.

Oops - I haven't used or looked at the 8051 for years, and just assumed
that gcc had been ported to the 8051.

Chris Hills

Jun 21, 2007, 4:46:49 PM

As far as I know it is, but it would not be a practical answer.

Message has been deleted

CBFalconer

Jun 21, 2007, 9:29:27 PM
Paul Taylor wrote:
> On Thu, 21 Jun 2007 18:09:22 +0100, Chris Hills wrote:
>
>> For code size (I assume you mean both code and data, data usually
>> being the critical one) the only options are Keil and IAR in
>> that order. GCC is not in the running.
>
> oops - I haven't used/looked at the 8051 for years, and just
> assumed that gcc was ported to 8051.

Remember, this is Chris Hills, operating from his anti-open-source
bias position. He might even be accurate here.

--
<http://www.cs.auckland.ac.nz/~pgut001/pubs/vista_cost.txt>
<http://www.securityfocus.com/columnists/423>
<http://www.aaxnet.com/editor/edit043.html>
cbfalconer at maineline dot net

--
Posted via a free Usenet account from http://www.teranews.com

Chris Hills

Jun 22, 2007, 4:14:14 AM
In article <467B25F7...@yahoo.com>, CBFalconer
<cbfal...@yahoo.com> writes

>Paul Taylor wrote:
>> On Thu, 21 Jun 2007 18:09:22 +0100, Chris Hills wrote:
>>
>>> For code size (I assume you mean both code and data, data usually
>>> being the critical one) the only options are Keil and IAR in
>>> that order. GCC is not in the running.
>>
>> oops - I haven't used/looked at the 8051 for years, and just
>> assumed that gcc was ported to 8051.
>
>Remember, this is Chris Hills, operating from his anti-open-source
>bias position. He might even be accurate here.

No, I have to work with hard facts, not the fanciful notions of the FOSS
devotee.

GCC is a general-purpose system. It is better suited to 32-bit systems
than to the 8-bit ones, especially the 8051.

As has been pointed out, GCC uses technology that is about 10-15 years
behind the top commercial compilers in the 8-16 bit field.

The problem with the 8051 family is the memory map and the DATA space -
in this case, the internal "external" (XDATA) data space.

David Brown

Jun 22, 2007, 5:50:43 AM
Chris Hills wrote:
> In article <pan.2007.06.21....@tiscali.co.uk>, Paul Taylor
> <paul_ng...@tiscali.co.uk> writes
>> On Thu, 21 Jun 2007 18:09:22 +0100, Chris Hills wrote:
>>
>>> For code size (I assume you mean both code and data, data usually being
>>> the critical one)
>>
>> yes
>>
>>> the only options are Keil and IAR in that order.
>>
>>> GCC is not in the running.
>>
>> oops - I haven't used/looked at the 8051 for years, and just assumed that
>> gcc was ported to 8051.
>
> As far as I know it is but It would not be a practical answer.
>

As far as I know, there is no 8051 gcc port. If there were, I'm sure it
would be pretty poor at code generation, since gcc (as you said) is
aimed more at 32-bit, or at least 16-bit, processors with plenty of
registers. 8-bit accumulator-based architectures like the 8051 are not
a good match.

The "standard" open source compiler for the 8051 is SDCC. I've never
used the 8051, so I can't really comment on the tools.

Chris Hills

Jun 22, 2007, 5:40:38 AM
In article <467b95c2$0$1455$8404...@news.wineasy.se>, David Brown
<da...@westcontrol.removethisbit.com> writes

Then there is no open source option for the 8051 at the moment.

David Brown

Jun 22, 2007, 7:30:05 AM
Chris Hills wrote:
> In article <467B25F7...@yahoo.com>, CBFalconer
> <cbfal...@yahoo.com> writes
>> Paul Taylor wrote:
>>> On Thu, 21 Jun 2007 18:09:22 +0100, Chris Hills wrote:
>>>
>>>> For code size (I assume you mean both code and data, data usually
>>>> being the critical one) the only options are Keil and IAR in
>>>> that order. GCC is not in the running.
>>>
>>> oops - I haven't used/looked at the 8051 for years, and just
>>> assumed that gcc was ported to 8051.
>>
>> Remember, this is Chris Hills, operating from his anti-open-source
>> bias position. He might even be accurate here.
>
> No, I have to work with hard facts, not the fanciful notions of the FOSS
> devotee.
>
> GCC is a general-purpose system. It is better suited to 32-bit systems
> than to the 8-bit ones, especially the 8051.
>
> As has been pointed out, GCC uses technology that is about 10-15 years
> behind the top commercial compilers in the 8-16 bit field.
>

That's apples and oranges. gcc uses some of the latest compiler
technologies, and is way ahead of many (but not all) commercially
available compilers *within its field*. There are no gcc ports for the
8051 or similar types of cpu - it is not "behind" or using out-of-date
technology, or anything of the kind. In the same way, no one is going
to claim ByteCraft's compilers are "out of date" because they don't
auto-vectorise loops or re-organise structures to minimise the cost of
cache misses - such features are not relevant to targets like the 8051.

Perhaps the only target where gcc and the compilers specialised for
small targets overlap is the AVR - it is 8-bit, with separate memory
spaces, but has multiple registers and a gcc port. It would be
fascinating to know how the sorts of techniques used by ByteCraft, IAR,
and other commercial embedded compiler specialists compare to those of
gcc.

Chris Hills

Jun 22, 2007, 7:29:47 AM
In article <467bad0e$0$1455$8404...@news.wineasy.se>, David Brown
<da...@westcontrol.removethisbit.com> writes

>Chris Hills wrote:
>> In article <467B25F7...@yahoo.com>, CBFalconer
>><cbfal...@yahoo.com> writes
>>> Paul Taylor wrote:
>>>> On Thu, 21 Jun 2007 18:09:22 +0100, Chris Hills wrote:
>>>>
>>>>> For code size (I assume you mean both Code and Data data usually
>>>>> being the critical one ) the only options are Keil and IAR in
>>>>> that order. GCC is not in the running.
>>>>
>>>> oops - I haven't used/looked at the 8051 for years, and just
>>>> assumed that gcc was ported to 8051.
>>>
>>> Remember, this is Chris Hills, operating from his anti-open-source
>>> bias position. He might even be accurate here.
>> No, I have to work with hard facts, not the fanciful notions of the
>> FOSS devotee.
>> GCC is a general-purpose system. It is better suited to 32-bit
>> systems than to the 8-bit ones, especially the 8051.
>> As has been pointed out, GCC uses technology that is about 10-15
>> years behind the top commercial compilers in the 8-16 bit field.
>>
>
>That's apples and oranges. gcc uses some of the latest compiler
>technologies, and is way ahead of many (but not all) commercially
>available compilers *within its field*.

Not at all... not according to the compiler people I know. The most
generous comment I heard was that the latest GCC is at least 5 years
behind commercial compilers, and they weren't looking at the 8-bit ones
either. Others will tell you it is 10 years behind.

However they are not going to tell you how they do it for obvious
reasons.

David Brown

Jun 22, 2007, 8:59:20 AM

What do you expect a commercial compiler vendor to tell you - "we've got
this great expensive compiler, but it is not as advanced as a free and
open-source one"? And where does this mythical time-line come from, anyway?

There are many compilers available for many targets, and some are
stronger than others in certain ways and for certain types of code. If
you need to know details, do a benchmark yourself using your own code -
anything else is just ramblings.

Michael N. Moran

Jun 22, 2007, 9:10:31 AM
Chris Hills wrote:
> In article <467bad0e$0$1455$8404...@news.wineasy.se>,
> David Brown <da...@westcontrol.removethisbit.com> writes
>> Chris Hills wrote:
>>> As has been pointed out GCC uses technology that is
>>> about 10-15 years behind the top commercial compilers
>>> in the 8-16 bit field.
>>
>> That's apples and oranges. gcc uses some of the latest
>> compiler technologies, and is way ahead of many (but
>> not all) commercially available compilers *within its
>> field*.
>
> Not at all... not according to the compiler people I
> know. The most generous comment I heard was that the
> latest Gcc is at least 5 years behind commercial
> compilers. They weren't looking at the 8 bit ones either.
> Others will tell you it is 10 years behind.

FUD alert. Facts please. No hearsay and/or innuendo.

> However they are not going to tell you how they do it for
> obvious reasons.

Perhaps because they would need to admit their
claims are baseless or exaggerated?

On the other hand, there is a price to be paid for
supporting a large number of targets of different
types, and vendors that specialize in particular
targets (e.g. 8051) will tend to be more optimized
in that narrow field.

--
Michael N. Moran (h) 770 516 7918
5009 Old Field Ct. (c) 678 521 5460
Kennesaw, GA, USA 30144 http://mnmoran.org

"So often times it happens, that we live our lives in chains
and we never even know we have the key."
The Eagles, "Already Gone"

The Beatles were wrong: 1 & 1 & 1 is 1

Eric

Jun 22, 2007, 2:05:54 PM
On Jun 22, 4:14 am, Chris Hills <c...@phaedsys.org> wrote:

> GCC is a general-purpose system. It is better suited to 32-bit systems
> than to the 8-bit ones, especially the 8051.

I know you meant to say SDCC instead of gcc. But you need to consider
that SDCC, although similar to gcc in some respects, was specifically
designed to target small devices, hence its name. It also understands
the memory architecture of the 8051.

gcc has been ported to generate H8 code, HC11 code, and probably others,
so it does support some 8-bit targets. I agree that it's not a "good"
code generator for 8-bit targets. gcc likes to see a real software
stack and no excessive memory paging, which is why it doesn't support
the 8051 and the PIC.

But gcc does OK with the PIC24/dsPIC devices because those have a
"serious" 16 bit architecture that is quite different from the lowly
PIC everyone knows (and loves?).

> As has been pointed out GCC uses technology that is about 10-15 years
> behind the top commercial compilers in the 8-16 bit field.

I'm sure this is true in some cases. But I expect that gcc and SDCC
are ahead of commercial compilers in some cases. Commercial compilers
are developed and improved whenever there is a reasonable expectation
that paying customers will come forward, or continue to come forward.
Sometimes decent commercial compilers are not kept up to date because
the customer demand might be "soft" for a particular target family of
devices.

I'm not against commercial software because I enjoy a pay check as
much as anyone else! Open source serves a niche that can't easily be
served by commercial software in some cases.

Eric

Eric

Jun 22, 2007, 2:08:28 PM
On Jun 22, 5:50 am, David Brown <d...@westcontrol.removethisbit.com>
wrote:

> 8-bit accumulator based architectures like the 8051 are not
> a good match.

The H8 and HC11 also have 8-bit accumulators, and they are supported by
gcc. It's the blasted memory paging and poor stack that make the 8051
such a difficult device to target. The same limitations are seen in
the PIC.

Eric

Chris Hills

Jun 22, 2007, 5:24:45 PM
In article <1182535554.0...@c77g2000hse.googlegroups.com>, Eric
<engle...@yahoo.com> writes

>On Jun 22, 4:14 am, Chris Hills <c...@phaedsys.org> wrote:
>
>> GCC is a general-purpose system. It is better suited to 32-bit systems
>> than to the 8-bit ones, especially the 8051.
>
>I know you meant to say SDCC instead of gcc.

No, I didn't.

> But you need to consider
>that SDCC, although similar to gcc in some respects, was specifically
>designed to target small devices, hence its name. It also understands
>the memory architecture of the 8051.

I bet it doesn't do it that well.

>> As has been pointed out GCC uses technology that is about 10-15 years
>> behind the top commercial compilers in the 8-16 bit field.
>
>I'm sure this is true in some cases. But I expect that gcc and SDCC
>lead ahead of commercial compilers in some cases.

Absolutely not. As many commercial companies have looked at both, they
know how far *behind* GCC and SDCC are, particularly in the case of
SDCC for the 8051.

> Commercial compilers
>are developed and improved whenever there is a reasonable expectation
>that paying customers will come forward, or continue to come forward.

Yes - which includes all the targets you have mentioned so far. Are you
suggesting that GCC and SDCC are written for MCUs that have no market?

>Sometimes decent commercial compilers are not kept up to date because
>the customer demand might be "soft" for a particular target family of
>devices.

Usually because the particular MCU has been discontinued.

>I'm not against commercial software because I enjoy a pay check as
>much as anyone else! Open source serves a niche that can't easily be
>served by commercial software in some cases.

Yes, but compilers aren't one of those niches.

Stephen Pelc

Jun 22, 2007, 6:36:33 PM
On Fri, 22 Jun 2007 22:24:45 +0100, Chris Hills <ch...@phaedsys.org>
wrote:

>>I'm not against commercial software because I enjoy a pay check as
>>much as anyone else! Open source serves a niche that can't easily be
>>served by commercial software in some cases.
>
>Yes but compilers isn't one of those niches

Yes it is! It's just that the business model of OSS is totally
different. With proprietary tools, you usually purchase the
compiler (cheap through expensive) and then pay a relatively
low cost for support and annual upgrades. In the the open source
world, you get the compiler free, but support is paid for at
the going rate. Richard Stallman charges in dollars per minute,
according to a page I once browsed.

Other models include the software rental model, which one of
our most successful clients uses. This model gets continuous
development money and permits "small and often" upgrades. Another
model, e.g. the one used by Rowley and others for ARM, is to take
the gcc compiler and then to provide added value in terms of
IDEs and libraries focussed on embedded systems - not to mention
a real installer.

There are lots of software business models, and use of OSS in
business is just based on one that the conventional commercial
toolmakers do not use. Once you realise that all software
developers have to earn a living, it's just a question of
looking for the business model they use.

Stephen


--
Stephen Pelc, steph...@mpeforth.com
MicroProcessor Engineering Ltd - More Real, Less Time
133 Hill Lane, Southampton SO15 5AF, England
tel: +44 (0)23 8063 1441, fax: +44 (0)23 8033 9691
web: http://www.mpeforth.com - free VFX Forth downloads

Guy Macon

Jun 23, 2007, 12:24:20 PM


Chris Hills wrote:

>Eric writes:
>
>> But you need to consider
>>that SDCC, although similar to gcc in some respects, was specifically
>>designed to target small devices, hence its name. It also understands
>>the memory architecture of the 8051.
>
>I bet it doesn't do it that well

And you know this...how?

>Absolutely not. As many commercial companies have looked at both they
>know how far *behind* GCC and SDCC are. Particularly in the case of
>SDCC for 8051.

No offense intended, but you have a long posting history of claiming
that Open Source Software is inferior to commercial software, and
your website ( http://www.phaedsys.org/ ) appears to be that of a
vendor selling commercial software, and thus reading your opinions
is a lot like asking a realtor whether this is a good time to buy a
house. Do you have any references or examples that support the
claim you made above? I am not saying you are wrong; I just want
to test the claim myself as best I can.

(Disclaimer: I personally have no use for anything called a "Small
Device C Compiler for 8051." I program 8051s in Assembly and Forth,
as God intended us to do when he gave the 8051 design spec to Moses.
I have nothing against those folks who like PDP-8 assembly lang...
er, I mean C, but I personally prefer Forth and Assembly.)

Chris Hills

Jun 23, 2007, 8:39:36 PM
In article <C6GdnVrx5dK...@giganews.com>, Guy Macon
<http@?.guymacon.com/.invalid> writes

>Chris Hills wrote:
>>Eric writes:
>>
>>> But you need to consider
>>>that SDCC, although similar to gcc in some respects, was specifically
>>>designed to target small devices, hence its name. It also understands
>>>the memory architecture of the 8051.
>>
>>I bet it doesn't do it that well
>And you know this...how?

1. I have a copy of it here.
2. Benchmarks (which can of course be used to (dis)prove anything :-)
3. I know what it doesn't do that some of the commercial compilers do.

>>Absolutely not. As many commercial companies have looked at both they
>>know how far *behind* GCC and SDCC are. Particularly in the case of
>>SDCC for 8051.
>
>No offense intended, but you have a long posting history of claiming
>that Open Source Software is inferior to commercial software, and
>your website ( http://www.phaedsys.org/ ) appears to be that of a
>vendor selling commercial software, and thus reading your opinions
>is a lot like asking a realtor whether this is a good time to buy a
>house. Do you have any references or examples that support the
>claim you made above? I am not saying you are wrong; I just want
>to test the claim myself as best I can.

Yes, I sell tools. Therefore my opinion is as good as that of any user
of FOSS who promotes FOSS. Why is it assumed that anyone who sells
commercial SW is unduly biased, but anyone who promotes FOSS is not?

I have generally found the reverse to be more true. I have to work in
engineering reality, not some utopian belief. I got cynical early.

However, over many years I have discussed this in private with many
commercial compiler developers and code-analysis developers, and seen
the results of their tests, so I have been able to form a good picture.
Most of what I have seen or been able to examine is under NDA.

The problem is that FOSS software is just that: open. So any development
is seen by all the commercial tool developers; however, the reverse is
not true. So anything FOSS has, the commercial people also have, and
therefore FOSS cannot be more advanced. At best it can only be equal.

On top of that, the commercial developers get access to the silicon
companies and to sophisticated development tools and test suites. How
many GCC or SDCC compilers have been run against Plum Hall or Perennial?

Rigorous test suites can cost more resources to develop than the
compiler itself.

There are techniques used in the commercial tools that are not used in
GCC or SDCC and the like. To maintain their advantage, the commercial
tool companies are not going to tell FOSS developers what these are.
Though in some cases, like data overlaying on the 8051, knowing what is
done is of little help: there is a lot of effort involved in data
overlaying and other forms of optimisation.

There is mention above that SDCC handles the architecture of the
8051... WHICH architecture? There are several. Having worked for an ICE
manufacturer, I know there are over 40 different cores and, I think,
about 10 different timing models spread over the 600-odd variants.

Just compare the variants SDCC supports with, say, those of the Keil
PK51. Incidentally, the Keil suite uses, under the IDE, two different
compilers and linkers to support the full range of 8051s.

Some parts use more than one memory model. I recall the fun when Philips
brought out the MX range: an 8051 with 256K of contiguous linear memory
(internal "external" data, etc.). This caused compiler companies a lot
of... opportunities. However, they worked directly with, in this case,
Philips for many months before the part was launched, and could
therefore discuss the mechanisms inside the chip, and how they would
work, in a way the FOSS developers can't.


So add together what the commercial compilers get:
a full view of the SDCC and/or GCC compilers and how they work;
full testing with industry-standard test suites;
the full co-operation of the silicon vendors;
resources for sophisticated development and test tools;
additional in-house techniques;
a lot of in-house knowledge.

The FOSS compilers get:
nothing.

Recently someone challenged ByteCraft on one of their claims, saying it
was not possible. They showed it was possible, but did not explain
exactly how they did it. The basic books on compiler theory take a lot
of reading and understanding. The problem is that the commercial tools
companies have invested heavily in advancing the science; they just
have not published. Only FOSS does that. So any advances FOSS makes,
the commercial guys get, but not vice versa.

Then there are the customers. When you deal with the safety-critical
world, you find the customers run their own tests. This also gives a
good indication of what compilers can do. However, again, these are
usually under NDAs.

FOSS devotees will have to get over it. ALL their stuff is open, in a
world where no one else plays by those rules. It is like playing bridge
in a team of three against one, where you always have the disclosed hand.

CBFalconer

Jun 24, 2007, 10:10:15 AM
Chris Hills wrote:
>
... snip ...

>
> There are techniques used in the commercial tools that are not
> used in Gcc or SDCC and the like. To maintain their advantage the
> commercial tool companies are not going to tell FOSS developers
> what these are. Though in some cases like DATA overlaying on the
> 8051 knowing what is done is of little help. There is a lot of
> effort involved in data overlaying and other forms of optimisation.

That is not so, because most open source is released under the GPL or
an equivalent licence. Copying such code is outright theft; using it,
and releasing source, is not.

... snip ...


>
> Recently someone challenged ByteCraft on one of their claims,
> saying it was not possible. They showed it was possible, but did
> not explain exactly how they did it. The basic books on compiler
> theory take a lot of reading and understanding. The problem is
> that the commercial tools companies have invested heavily in
> advancing the science; they just have not published. Only FOSS
> does that. So any advances FOSS makes, the commercial guys get,
> but not vice versa.

I suspect you are referring to my quizzing Andrew on a
(mis)statement he made on compiler checking. He cleared that up.
It was NOT an explanation of how he did it. Your claim above is a
gross distortion.

Chris Hills

Jun 24, 2007, 11:24:22 AM
In article <467E7B47...@yahoo.com>, CBFalconer
<cbfal...@yahoo.com> writes

>Chris Hills wrote:
>>
>... snip ...
>>
>> There are techniques used in the commercial tools that are not
>> used in Gcc or SDCC and the like. To maintain their advantage the
>> commercial tool companies are not going to tell FOSS developers
>> what these are. Though in some cases like DATA overlaying on the
>> 8051 knowing what is done is of little help. There is a lot of
>> effort involved in data overlaying and other forms of optimisation.
>
>That is not so, because most open-source is released under the GPL
>or an equivalent license. Copying such is outright theft. Using
>it, and releasing source, is not.

Looking at the source, you understand what it is doing. You can then
use the same idea in your own system... I didn't mean that you would
literally copy the source. In any event, software is not patentable.

>... snip ...
>>
>> Recently someone challenged ByteCraft on one of their claims,
>> saying it was not possible. They showed it was possible, but did
>> not explain exactly how they did it. The basic books on compiler
>> theory take a lot of reading and understanding. The problem is
>> that the commercial tools companies have invested heavily in
>> advancing the science; they just have not published. Only FOSS
>> does that. So any advances FOSS makes, the commercial guys get,
>> but not vice versa.
>
>I suspect you are referring to my quizzing Andrew on a
>(mis)statement he made on compiler checking. He cleared that up.
>It was NOT an explanation of how he did it. Your claim above is a
>gross distortion.

Andrew who?

OK then, can someone PROVE that FOSS compilers are more advanced than,
or *at least* as good as, the commercial ones?

FreeRTOS.org

Jun 24, 2007, 4:31:41 PM
>
> OK then, can someone PROVE that FOSS compilers are more advanced than,
> or *at least* as good as, the commercial ones?
>

Wanting to remain an observer of this debate - however, I am
curious... what is this advancement buying you? Considering the law of
diminishing returns, what does last year's advancement get me? If it's
an extra few microseconds per control loop, and this is important to
me, then I chose the wrong processor - not the wrong compiler.

--
Regards,
Richard.

+ http://www.FreeRTOS.org
A free real-time kernel for 8, 16 and 32-bit systems.

+ http://www.SafeRTOS.com
An IEC 61508 certified real time kernel for safety related systems.


Hans-Bernhard Bröker

Jun 24, 2007, 4:56:49 PM
Chris Hills wrote:

> In any event, software is not patentable.

You should tell the USPTO that. And the EPO too, while you're at it.
And the hundreds of developers who found themselves threatened with
lawsuits over the LZW patent.

Software is explicitly declared non-patentable in the documents
establishing the European Patent Office. Yet the GIF patent prevailed.

> OK then can some one PROVE how FOSS compilers are more advanced or *at
> least* as good as commercial ones?

Not unless we all agree on a quantifiable measure of "advanced" or
"good" first. Which, of course, is the core of the problem.

Proprietary tools pose certain risks to their users, primarily by tying
your project's long-term survival to that of the tool vendor, possibly
extending to a hostage-taking kind of situation. This is a risk that,
once expressed as a financial risk factor, may forbid using the tool.

It's practically impossible for GCC to ever stop working. Proprietary
compilers can do that, some did, and some will do it in the future.

Chris Hills

Jun 24, 2007, 6:01:17 PM
In article <NwAfi.11116$p8.1...@text.news.blueyonder.co.uk>,
FreeRTOS.org <noe...@address.com> writes

>>
>> OK then can some one PROVE how FOSS compilers are more advanced or *at
>> least* as good as commercial ones?
>>
>
>Wanting to remain an observer of this debate - however, I am
>curious... what is this advancement buying you? Considering the law of
>diminishing returns, what does last year's advancement get me? If it's
>an extra few microseconds per control loop, and this is important to
>me, then I chose the wrong processor - not the wrong compiler.

Yes: much better code compression and execution speed.
I have seen several programs that the Keil 2/4/8k-limited compilers
could get to fit and run on a '51 where the unlimited SDCC could not.

The problem is when they want to add "just one more feature" without
changing the whole design - for example, smart cards, mobile SIMs and
many other things. Especially when, by law, you need to add something
to an old system, or to change some I/O because there are new sensors.

Ideally you would scrap the whole system and start again to add a small
change to an end-of-life product.

The other problem is that there are new extended 8051 family members:
the ones with 8MB of address space in both code and data areas,
internal (on-chip) "external" data space, and all sorts of things in
the new parts that the better commercial compilers will cope with but
the FOSS ones don't.

I do find it strange that people argue so strongly for using
second-rate tools in their profession. What would you think of a
doctor, dentist or aeronautical engineer who argued the same?

Chris Hills

Jun 24, 2007, 6:10:47 PM
In article <f5mlqf$o0f$02$1...@news.t-online.com>, Hans-Bernhard Bröker
<HBBr...@t-online.de> writes

>Chris Hills wrote:
>
>> In any event Sw is not pantentable.
>
>You should tell the USPTO that. And the EPO, too, while at it. And
>the hundreds of developers who found themselves threatened with
>lawsuits over the LZW patent, too.
>
>Software is explicitly declared non-patentable in the documents
>establishing the European Patent Office.

Yes. There are no patents for software.

> Yet the GIF patent prevailed.

That was not the law that prevailed but commerce. It is not likely to
have the same effect in a year or two.

>> OK then can someone PROVE how FOSS compilers are more advanced or
>>*at least* as good as commercial ones?
>
>Not unless we all agree on a quantifiable measure of "advanced" or
>"good" first. Which, of course, is the core of the problem.
>
>Proprietary tools cause certain risks to their users, primarily by
>tying your project's long-term survival to that of the tool vendor,

Not seen that happen so far. So far there always seems to be a way
around that.

> possibly extending to a hostage-taking kind of situation.

Not seen that happen. Do you have a case of that? Or is this just making
up FUD, like everyone is saying MS are doing?

Also that can happen with FOSS too... Several FOSS companies are
starting to sound more like MS than MS do over trademarks and their IP.

> This is a risk that, once expressed as a financial risk factor, may
>forbid using the tool.

Well, you make up the scenarios... I can make up equally ludicrous ones.

>It's practically impossible for GCC to ever stop working. Proprietary
>compilers can do that, some did, and some will do it in the future.

I think that is called "scare tactics" and is not a real argument.
Incidentally, now that several FOSS companies are doing cross-licensing
with the Great Satan, I would be interested to see how accurate your
scenario is.

*IF* (and I think it is somewhat unlikely, but no more unlikely than
your scenarios) these MS cross-licensing and patent things take hold,
GCC might be deemed to have broken various laws and could not then be
used legally...

Robert Adsett

unread,
Jun 24, 2007, 7:17:53 PM6/24/07
to
In article <UYAfqtGn...@phaedsys.demon.co.uk>, Chris Hills says...

> In article <f5mlqf$o0f$02$1...@news.t-online.com>, Hans-Bernhard Bröker
> <HBBr...@t-online.de> writes
> >Chris Hills wrote:
> >
> >> In any event SW is not patentable.
> >
> >You should tell the USPTO that. And the EPO, too, while at it. And
> >the hundreds of developers who found themselves threatened with
> >lawsuits over the LZW patent, too.
> >
> >Software is explicitly declared non-patentable in the documents
> >establishing the European Patent Office.
>
> Yes. There are no patents for software.
>
> > Yet the GIF patent prevailed.
>
> That was not the law that prevailed but commerce. It is not likely to
> have the same effect in a year or two.
>
> >> OK then can someone PROVE how FOSS compilers are more advanced or
> >>*at least* as good as commercial ones?
> >
> >Not unless we all agree on a quantifiable measure of "advanced" or
> >"good" first. Which, of course, is the core of the problem.
> >
> >Proprietary tools cause certain risks to their users, primarily by
> >tying your project's long-term survival to that of the tool vendor,
>
> Not seen that happen so far. So far there always seems to be a way
> around that.

I've not seen it yet for compilers but I have seen it for PCB CAD
software.

Case 1 - Product (PCB CAD) had a HW dongle and refused to work when the HW
was updated. The product had been abandoned at that point (I seem to
remember takeovers being involved). A fix for the new HW/OS was not available.

Case 2 - This one was a compiler, abandoned by the original company.
Thankfully there was no HW or SW dongle, so it could continue to be used
through multiple OS and HW updates.

Unfortunately compiler companies have moved to SW dongles (particularly
FLEXlm (spit! Ack!)), so case 2 looks like an increasingly doubtful
precedent to rely on.

Bottom line: micros with a GCC port will get used before micros without.
If a micro has no open-source compiler of reasonable (best not necessary)
quality and no undongled commercial SW, then it has to be a lot
better/cheaper to be worth considering. Inexpensive ARM7s provide a
large hurdle for any proprietary micro with no GCC port to clear.

Since I've seen compilers add dongling after purchase, even undongled
compilers get second billing.

Robert

FreeRTOS.org

unread,
Jun 25, 2007, 2:15:10 AM6/25/07
to
>>Proprietary tools cause certain risks to their users, primarily by tying
>>your project's long-term survival to that of the tool vendor,
>
> Not seen that happen so far. SO far there always seems to be a way around
> that.


Umm - how about the Keil ARM compiler - now defunct? I'm just being
pedantic here; I don't actually see this as relevant. Switching from one
compiler to another is usually a very simple exercise.

I don't have too much experience in the 8051 market, but am asked a lot by
customers about ARM compilers. Once I go through the various pros and cons
of each, there is one thing that normally swings the decision, and this one
thing is not actually a technical attribute. Officially I'm compiler
neutral, so I'm not going to say what the one thing is :o)

>> possibly extending to a hostage-taking kind of situation.
>
> Not seen that happen do you have a case of that? Or is this just making up
> FUD like everyone is saying MS are doing.

I know of cases of this in the OS market. Could not comment on the compiler
market.

David Brown

unread,
Jun 25, 2007, 4:29:27 AM6/25/07
to

I would normally expect a doctor, dentist or aeronautical engineer to
use "second rate" tools when appropriate. I expect a doctor to demand a
very different level of quality from the tools used for brain surgery
than from the tools used to sew up a small cut in a finger.

What people here are arguing for is not that we should prefer low
quality tools, but that we should prefer *appropriate* quality tools.
We need tools that are good enough for the job - beyond that, you can
compare on things like price, availability, support, additional
features, and whatever else interests you.

As Richard says, what do these advancements give you? If I have an 8051
with 8K Flash, and SDCC compiles my program to 4K, then what benefit is
a Keil compiler that compiles it to 2.5K ? It's a different matter if
the figures are 10K for SDCC and 7K for Keil. In the first case, SDCC
is good enough, and therefore a perfectly viable choice - in the second
case, it is *not* good enough.

Walter Banks

unread,
Jun 25, 2007, 9:12:44 AM6/25/07
to

David Brown wrote:

> If I have an 8051
> with 8K Flash, and SDCC compiles my program to 4K, then what benefit is
> a Keil compiler that compiles it to 2.5K ?

Lower EMI and power consumption to name two benefits.

Walter..

David Brown

unread,
Jun 25, 2007, 9:57:24 AM6/25/07
to

Yes, I know those are other benefits of having faster programs, but
that's irrelevant here. My point is that if SDCC (or some other free or
low-cost tool) does a good enough job for what you need, then is there
any reason for buying something more expensive and more advanced? The
answer is, of course, no. It seems, in this thread, that Chris Hills is
having a great deal of difficulty in understanding the concept of "good
enough", and fails to understand why anybody would ever be happy with a
tool that is not the absolute best available solution.

I *know* there are many advantages in a compiler that produces smaller
and faster code - but often that is irrelevant. I *know* there are
advantages in having a compiler that has been through all sorts of
industry standard torture tests, but that too can be irrelevant. I
*know* there are advantages in a compiler that is easy to install, or
easy to work with - but again, that may not matter (and it's a
subjective issue anyway).

"The best is the enemy of the good" (Voltaire, if google got it right.)

Chris Hills

unread,
Jun 25, 2007, 9:51:06 AM6/25/07
to
In article <467FBF4C...@bytecraft.com>, Walter Banks
<wal...@bytecraft.com> writes

Also, the program may still not run with the SDCC compiler. Code
size is only one component on a Harvard architecture.

The usual problem with SDCC is running out of DATA space long before
the CODE space limit of the Keil compiler becomes a problem. I keep
saying this. It is important.

So your SDCC compiler produces 4K of CODE but still cannot get the data
to fit. What then?

If Keil fits twice as much code into the space as SDCC, you will
run out of CODE space very much faster using SDCC. (Code ALWAYS
expands :-) What then?

SDCC is OK if you are using a standard 8051 with a small program, little
data, and no power or EMI limitations, where the code will not expand or
need to be ported to another 8051. The program has to be simple... what
are you going to debug it with? AFAIK SDCC has no simulator/debugger.

Chris

wilco.d...@ntlworld.com

unread,
Jun 25, 2007, 10:20:03 AM6/25/07
to
On 24 Jun, 01:39, Chris Hills <c...@phaedsys.org> wrote:

> However as over many years I have discussed in private with many
> commercial compiler developers, code analysis developers and seen the
> results of their tests I have been able to form a good picture. Most of
> what I have seen or been able to examine is under NDA.

I'm with you on this one. In 2006 the latest GCC compiler was not able
to match the codesize or performance of an ARM compiler that was
released in 1995. The only target GCC is good at is x86, partly
because it gets a lot more attention than any other target, partly
because most of the optimizations happen in hardware anyway.

> The problem is the FOSS Sw is just that. Open. So any development is
> seen by all the commercial tool developers. However the reverse is not
> true. So anything FOSS has the commercial people also have and
> therefore FOSS can not be more advanced. At best it can only be equal.

I don't believe commercial companies look closely at GCC source code.
Many companies have strict policies that stop people from even looking
at open source code to avoid accidentally copying stuff. However any
good compiler expert only needs to look at the generated code in order
to "borrow" optimizations.

> On top of that the commercial developers get access to the silicon
> companies and some sophisticated development tools and test suites. How
> many GCC or SDCC compilers have been run against Plum-Hall or Perennial?

Full access to silicon/FPGA and specs is indeed essential for a
well-tuned compiler. I don't think that is a barrier to open source;
there are many well-respected open source companies that would be
given access to these things (and I know that happens). I wouldn't
call the commercial test suites very good; most compiler teams develop
their own test suites, as none of the commercial ones are good enough.

Test suites that are rigorous can cost more resources to develop than
the compiler itself.

Quite possibly.

> There are techniques used in the commercial tools that are not used in
> Gcc or SDCC and the like. To maintain their advantage the commercial
> tool companies are not going to tell FOSS developers what these are.
> Though in some cases like DATA overlaying on the 8051 knowing what is
> done is of little help. There is a lot of effort involved in data
> overlaying and other forms of optimisation.

As I said above, one doesn't need to know what the techniques are;
it's relatively easy to deduce them from the generated code.
Interestingly, open source compilers use more modern techniques than
commercial compilers (which were created in the 1980s). However,
technology alone doesn't make a good compiler; you need to know how to
use it. The main difference between a good compiler and a mediocre one
is the amount of effort that has gone into fine-tuning it.

And this is where I think commercial development has the advantage.
Compiler vendors find and pay the best people to do whatever it takes
to make the compiler as good as possible (with the idea that this
large investment will pay itself off later). I don't see this kind of
dedication in open source compilers, as developers usually don't get
paid for their work and so don't attract the best people. The ARM port
of GCC for example was neglected for years (with Thumb not working at
all) until ARM paid for it to be fixed and brought up to date with the
latest architectures.

That said, I don't believe one or the other is inherently better. I
would be happy to put a few years into turning GCC into a really good
compiler that beats most commercial ones. However, who is going to pay
me my consulting rate? Big companies with their own compilers... I'm
currently in Seattle :-)

Wilco

Michael N. Moran

unread,
Jun 25, 2007, 10:50:00 AM6/25/07
to
wilco.d...@ntlworld.com wrote:
> And this is where I think commercial development has the
> advantage. Compiler vendors find and pay the best people
> to do whatever it takes to make the compiler as good as
> possible (with the idea that this large investment will
> pay itself off later). I don't see this kind of
> dedication in open source compilers, as developers
> usually don't get paid for their work and so don't
> attract the best people.

This is nonsense. I think you'll find that most of the GCC
developers are being paid to work on GCC.

Just look at the GCC steering committee:
<http://gcc.gnu.org/steering.html>

Next, you may want to look at the list of contributors:
<http://gcc.gnu.org/onlinedocs/gcc/Contributors.html>

While GCC work *may* be done by anyone, serious
development and maintenance of this cornerstone of
Free Software is mostly done by paid skilled professionals,
whose employers understand the value of the GCC.

> The ARM port of GCC for example was neglected for years
> (with Thumb not working at all) until ARM paid for it to
> be fixed and brought up to date with the latest
> architectures.

Demand drives GCC just like demand drives the commercial
compilers.

Chris Hills

unread,
Jun 25, 2007, 10:53:22 AM6/25/07
to
In article <1182781203.1...@m36g2000hse.googlegroups.com>,
wilco.d...@ntlworld.com writes

>On 24 Jun, 01:39, Chris Hills <c...@phaedsys.org> wrote:
>
>> However as over many years I have discussed in private with many
>> commercial compiler developers, code analysis developers and seen the
>> results of their tests I have been able to form a good picture. Most of
>> what I have seen or been able to examine is under NDA.
>
>I'm with you on this one. In 2006 the latest GCC compiler was not able
>to match the codesize or performance of an ARM compiler that was
>released in 1995. The only target GCC is good at is x86, partly
>because it gets a lot more attention than any other target, partly
>because most of the optimizations happen in hardware anyway.

Thanks. How come, if you and I (and Walter) can see it, no one else can?
This is my argument: FOSS devotees are blinded by their "religion".

>And this is where I think commercial development has the advantage.
>Compiler vendors find and pay the best people to do whatever it takes
>to make the compiler as good as possible (with the idea that this
>large investment will pay itself off later). I don't see this kind of
>dedication in open source compilers, as developers usually don't get
>paid for their work and so don't attract the best people.

Now the FOSS people seem to argue the opposite. They say that with
closed-source commercial compilers no one really cares about code
standards, because the outside world can't see the code; but because
open source can be seen by all, far more care is taken...

> The ARM port
>of GCC for example was neglected for years (with Thumb not working at
>all) until ARM paid for it to be fixed and brought up to date with the
>latest architectures.

However they brought it up to date and left it there... which is not the
same as actively supporting it.

>That's said, I don't believe one or the other is inherently better. I
>would be happy to put a few years into turning GCC into a really good
>compiler that beats most commercial ones. However who is going to pay
>me my consulting rate? Big companies with their own compilers... I'm
>currently in Seattle :-)
>Wilco

I know that several companies have put an engineer onto GCC for 5-10
days "for fun"* to see what they could do with it, and all managed to
get huge increases in code density and speed. (Not on x86, though.)

* Actually in all cases it was to see how competitive GCC *might* be if
it was properly developed.

However, much FOSS is effectively becoming commercial now anyway. The
only difference is that the core programmers don't get paid, yet the
FOSS devotees love it.

Turkeys voting for Christmas.

Chris Hills

unread,
Jun 25, 2007, 11:13:59 AM6/25/07
to
In article <FCQfi.7283$G9....@bignews6.bellsouth.net>, Michael N. Moran
<mnm...@bellsouth.net> writes

>wilco.d...@ntlworld.com wrote:
>> And this is where I think commercial development has the
>> advantage. Compiler vendors find and pay the best people
>> to do whatever it takes to make the compiler as good as
>> possible (with the idea that this large investment will
>> pay itself off later). I don't see this kind of dedication in open
>>source compilers, as developers
>> usually don't get paid for their work and so don't
>> attract the best people.
>
>This is nonsense. I think you'll find that most of the GCC
>developers are being paid to work on GCC.

Like Linux, GCC is now commercial.

>> The ARM port of GCC for example was neglected for years
>> (with Thumb not working at all) until ARM paid for it to
>> be fixed and brought up to date with the latest
>> architectures.
>
>Demand drives GCC just like demand drives the commercial
>compilers.

The problem is that the FOSS devotees think that everyone who does any
work on GCC is a devotee. There are many cynical non-believers who are
not helping in order to help the faith... they have other objectives
and don't care if FOSS sinks or swims.

FreeRTOS.org

unread,
Jun 25, 2007, 12:06:28 PM6/25/07
to
>> The ARM port
>>of GCC for example was neglected for years (with Thumb not working at
>>all) until ARM paid for it to be fixed and brought up to date with the
>>latest architectures.
>
> However they brought it up to date and left it there... which is not the
> same as actively supporting it.


I was under the impression - possibly mistakenly - that CodeSourcery were
the official guardians of ARM GCC, amongst other ports. Cortex-M3 support
has been added very recently.

Michael N. Moran

unread,
Jun 25, 2007, 12:35:41 PM6/25/07
to
Chris Hills wrote:
> In article <FCQfi.7283$G9....@bignews6.bellsouth.net>,
> Michael N. Moran <mnm...@bellsouth.net> writes
>> wilco.d...@ntlworld.com wrote:
>>> And this is where I think commercial development has
>>> the advantage. Compiler vendors find and pay the best
>>> people to do whatever it takes to make the compiler
>>> as good as possible (with the idea that this large
>>> investment will pay itself off later). I don't see
>>> this kind of dedication in open source compilers, as
>>> developers usually don't get paid for their work and
>>> so don't attract the best people.
>>
>> This is non-sense. I think you'll find that most of the
>> GCC developers are being paid to work on GCC.
>
> Like Linux, GCC is now commercial.

Free-as-in-speech, not (necessarily) free-as-in-beer.
Contrary to popular FUD, Free Software is not about
preventing commerce.

>>> The ARM port of GCC for example was neglected for
>>> years (with Thumb not working at all) until ARM paid
>>> for it to be fixed and brought up to date with the
>>> latest architectures.
>>
>> Demand drives GCC just like demand drives the
>> commercial compilers.
>
> The problem is that the FOSS Devotees think that everyone
> who does any work on GCC is a Devotee.

<pictures a guy singing with a flower pot on his head>
"Correct at will..." :-)

What difference does that make?

> There are many cynical non-believers who are not helping
> in order to help the faith... they have other
> objectives and don't care if FOSS sinks or swims.

Huh? Do you mean:

There are many cynical non-believers who are not
helping in order to keep the faith, but are instead
helping because they have other objectives. These
same cynical non-believers don't care if FOSS sinks
or swims.

If this is what you meant then ...

"The enemy of my enemy is my friend" comes to mind.

Regardless of their objectives, these people must
find some advantage in helping GCC. That's OK
with the Free Software people and well within their
"belief system."

Michael N. Moran

unread,
Jun 25, 2007, 12:43:05 PM6/25/07
to
Chris Hills wrote:
> However much FOSS is effectively becoming commercial now
> anyway.

Whatever that means. As long as the source code remains open
that's great!

> The only difference is the core programmers don't get
> paid

Chris, that's just crap. Perhaps you should follow the
development mailing list g...@gcc.gnu.org for a while
and get a clue, instead of spewing nonsense like that.

Chris Hills

unread,
Jun 25, 2007, 1:30:10 PM6/25/07
to
In article <p9Sfi.3521$s8....@bignews1.bellsouth.net>, Michael N. Moran

To give you some idea: the US military used that strategy in Afghanistan
and trained Al-Qaeda and others... Look where it got them!

--

Michael N. Moran

unread,
Jun 25, 2007, 2:01:38 PM6/25/07
to
Chris Hills wrote:
> In article <p9Sfi.3521$s8....@bignews1.bellsouth.net>, Michael N. Moran
> <mnm...@bellsouth.net> writes
>> Huh? Do you mean:
>>
>> There are many cynical non-believers who are not
>> helping in order to keep the faith, but are instead
>> helping because they have other objectives. These
>> same cynical non-believers don't care if FOSS sinks
>> or swims.
>>
>> If this is what you meant then ...
>>
>> "The enemy of my enemy is my friend" comes to mind.
>>
>> Regardless of their objectives, these people must
>> find some advantage in helping GCC. That's OK
>> with the Free Software people and well within their
>> "belief system."
>
> To give you some idea: the US military used that strategy in Afghanistan
> and trained Al-Qaeda and others... Look where it got them!

Are you suggesting that "these people" are actively
sabotaging GCC?

Chris Hills

unread,
Jun 25, 2007, 2:22:49 PM6/25/07
to
In article <%pTfi.2716$da....@bignews4.bellsouth.net>, Michael N. Moran
<mnm...@bellsouth.net> writes
>Chris Hills wrote:
>> In article <p9Sfi.3521$s8....@bignews1.bellsouth.net>, Michael N.
>>Moran <mnm...@bellsouth.net> writes
>>> Huh? Do you mean:
>>>
>>> There are many cynical non-believers who are not
>>> helping in order to keep the faith, but are instead
>>> helping because they have other objectives. These
>>> same cynical non-believers don't care if FOSS sinks
>>> or swims.
>>>
>>> If this is what you meant then ...
>>>
>>> "The enemy of my enemy is my friend" comes to mind.
>>>
>>> Regardless of their objectives, these people must
>>> find some advantage in helping GCC. That's OK
>>> with the Free Software people and well within their
>>> "belief system."
>> To give you some idea: the US military used that strategy in
>>Afghanistan and trained Al-Qaeda and others... Look where it got them!
>
>Are you suggesting that "these people" are actively
>sabotaging GCC?

Not in the slightest. I did not say or even suggest that.

In Afghanistan, when the US trained AQ, they worked for the US very well
for some years. Both wanted the USSR out of Afghanistan. However AQ also
wanted the US out of other places, but it put that on hold for the
duration.

Just because for a short time they travel the same path does not mean
they are on your side or even agree with your principles. Later, in a
different situation, they may be actively against you.

Just look at the situation with FOSS at the moment. Linux and the FSF
can't agree on GPL3, whilst other Linux distros are actually signing
agreements with the Great Satan (MS :-)

Not everything is as it may seem at first sight.

Michael N. Moran

unread,
Jun 25, 2007, 3:17:44 PM6/25/07
to

OK. So there may be people contributing to Free Software who
will one day turn against it, or already have. How does this
affect the quality of GCC?

> Just look at the situation with FOSS at the moment. Linux and the FSF
> can't agree on GPL3, whilst other Linux distros are actually signing
> agreements with the Great Satan (MS :-)

So what is your point? The existing source code will remain
available. You are free to maintain the source and use it
according to the terms of the current license.

Again, how does this affect the quality of GCC?

FreeRTOS.org

unread,
Jun 25, 2007, 3:42:18 PM6/25/07
to
"Chris Hills" <ch...@phaedsys.org> wrote in message
news:RVYOp7J5...@phaedsys.demon.co.uk...


Do you really think this is a relevant angle for your argument? Where are
you getting your shrooms from :o)

wilco.d...@ntlworld.com

unread,
Jun 26, 2007, 12:15:44 AM6/26/07
to
On 25 Jun, 17:06, "FreeRTOS.org" <noem...@address.com> wrote:
> >> The ARM port
> >>of GCC for example was neglected for years (with Thumb not working at
> >>all) until ARM paid for it to be fixed and brought up to date with the
> >>latest architectures.
>
> > However they brought it up to date and left it there... which is not the
> > same as actively supporting it.
>
> I was under the impression - possibly mistakenly - that CodeSourcery were
> the official guardians of ARM GCC, amongst other ports. Cortex-M3 support
> has been added very recently.

You're entirely right. It is basic support though, not comparable with
the amount of effort that went into the Thumb-2 backend in the ARM
compiler.

Wilco

wilco.d...@ntlworld.com

unread,
Jun 26, 2007, 1:28:05 AM6/26/07
to
On 25 Jun, 15:50, "Michael N. Moran" <mnmo...@bellsouth.net> wrote:

> wilco.dijks...@ntlworld.com wrote:
> > And this is where I think commercial development has the
> > advantage. Compiler vendors find and pay the best people
> > to do whatever it takes to make the compiler as good as
> > possible (with the idea that this large investment will
> > pay itself off later). I don't see this kind of
> > dedication in open source compilers, as developers
> > usually don't get paid for their work and so don't
> > attract the best people.
>
> This is non-sense. I think you'll find that most of the GCC
> developers are being paid to work on GCC.
>
> Just look at the GCC steering committee:
> <http://gcc.gnu.org/steering.html>
>
> Next, you may want to look at the list of contributers:
> <http://gcc.gnu.org/onlinedocs/gcc/Contributors.html>

I'd be surprised if the number of paid contributors is larger than the
number of unpaid ones - or are you counting employees of companies
whose main business is not open source? How many companies are there
whose main business is developing or maintaining GCC? Do they pay
competitive rates to hire top compiler experts? Big businesses have
their reasons for contributing, but most have their own commercial
compilers already - and that is where much of the effort goes.

> While GCC work *may* be done by anyone, serious
> development and maintenance of this cornerstone of
> Free Software is mostly done by paid skilled professionals,
> whose employers understand the value of the GCC.

If that was true I would expect GCC to be far better than it is today.
From what I've seen, issues can take a long time to fix. I remember the
__irq issue on ARM that went unfixed for quite a while (numerous
people encountered that one), and I think the register allocator still
has problems with generating many unnecessary move instructions since a
change several years ago. In a commercial environment these would be
"must fix before release" kinds of bugs.

Other things, like -O0 generating ridiculously inefficient code and
emitting frame pointers when few compilers do so today, do not instill
a professional image. I once read a paper that showed a 5% codesize
improvement on ARM by changing a few defaults. Again, that was a few
years ago; has it been implemented yet?

> > The ARM port of GCC for example was neglected for years
> > (with Thumb not working at all) until ARM paid for it to
> > be fixed and brought up to date with the latest
> > architectures.
>
> Demand drives GCC just like demand drives the commercial
> compilers.

It would be good if GCC was developed more like a commercial compiler
indeed. Maybe that is what will happen in the future, but I don't
think it is anywhere near yet. GCC may have lots of fashionable
optimizations but I'd prefer stuff to work reliably and efficiently
first.

Wilco

Colin Paul Gloster

unread,
Jun 26, 2007, 3:38:36 AM6/26/07
to
In news:1182835685.7...@m36g2000hse.googlegroups.com
timestamped Mon, 25 Jun 2007 22:28:05 -0700,
wilco.d...@ntlworld.com posted:

"On 25 Jun, 15:50, "Michael N. Moran" <mnmo...@bellsouth.net> wrote:
> wilco.dijks...@ntlworld.com wrote:
> > And this is where I think commercial development has the
> > advantage. Compiler vendors find and pay the best people
> > to do whatever it takes to make the compiler as good as
> > possible (with the idea that this large investment will
> > pay itself off later). I don't see this kind of
> > dedication in open source compilers, as developers
> > usually don't get paid for their work and so don't
> > attract the best people.
>
> This is non-sense. I think you'll find that most of the GCC
> developers are being paid to work on GCC."

Even if all the people who contribute to GCC are being paid to do so,
Wilco Dijkstra's point is still valid: a dedicated team being paid to
work full-time on a proprietary compiler will probably make quite a
good compiler. (Exceptions have existed.)

">
> Just look at the GCC steering committee:
> <http://gcc.gnu.org/steering.html>
>
> Next, you may want to look at the list of contributers:
> <http://gcc.gnu.org/onlinedocs/gcc/Contributors.html> "

Many people who contribute to GCC are not listed on any of those webpages.



"I'd be surprised if the number of paid contributors is larger than the
unpaid ones,"

I expect that few of these people are unemployed. They may be paid to
do something else, deem GCC useful for performing some of their jobs'
duties, and so choose to use some of their paid time to contribute to
GCC.

" or are you counting employees of companies whose main
business is not open source?"

Of course such employees are counted.

" How many companies are there whose main

business is developing or maintaining GCC? [..]"

I suspect very few.



"> While GCC work *may* be done by anyone, serious
> development and maintenance of this cornerstone of
> Free Software is mostly done by paid skilled professionals,
> whose employers understand the value of the GCC.

If that was true I would expect GCC to be far better than it is today.

From what I've seen issues can take a long time to fix. [..]

[..]"

Patches for AVR-GCC can take over a year to get through to the main
GCC repository, chiefly because the then-active maintainers of the
AVR-GCC port did not have permission to write into the main GCC
repository, and most people who did have permission did not care. As
mentioned by the less inactive of the two official AVR-GCC
maintainers of the time on
HTTP://lists.GNU.org/archive/html/avr-gcc-list/2006-09/msg00010.html
:"[..]

Generally, they (GCC peoples) don't bothered about AVR port. It's just
an 8-bits microcontroller. They are right.

[..]"

Regards,
Colin Paul Gloster

David Brown

unread,
Jun 26, 2007, 4:07:20 AM6/26/07
to
Michael N. Moran wrote:
> Chris Hills wrote:
>> In article <FCQfi.7283$G9....@bignews6.bellsouth.net>,
>> Michael N. Moran <mnm...@bellsouth.net> writes
>>> wilco.d...@ntlworld.com wrote:
>>>> And this is where I think commercial development has
>>>> the advantage. Compiler vendors find and pay the best
>>>> people to do whatever it takes to make the compiler
>>>> as good as possible (with the idea that this large
>>>> investment will pay itself off later). I don't see
>>>> this kind of dedication in open source compilers, as
>>>> developers usually don't get paid for their work and
>>>> so don't attract the best people.
>>>
>>> This is non-sense. I think you'll find that most of the
>>> GCC developers are being paid to work on GCC.
>>
>> Like Linux. Gcc is now commercial
>
> Free-as-in-speech, not (necessarily) free-as-in-beer.
> Contrary to popular FUD, Free Software is not about
> preventing commerce.
>

As an example, so that Chris can understand the principle involved, Code
Sourcery sell gcc toolchains for the ARM and the ColdFire (and possibly
others). You have three options - free downloadable versions, paid
subscription versions (which lead the free versions by about 6 months,
and come with better hardware debugger support and integrated Eclipse
setup), and professional subscription versions (which come with full
support contracts). Each is available in windows and linux versions,
both as easily install binary and source. So Code Sourcery makes money
out of selling open source tools along with support contracts and added
extras. This money pays for their business, and it pays the salaries of
their programmers - who work on gcc (and related tools). Code Sourcery
is the official maintainer of the ARM and ColdFire ports, and if you
look at the changelogs of gcc you'll see their names scattered over wide
ranges of gcc.

This gives the customer a wide choice of how they want to work, and how
much they want to pay for it. You get everything from free to
top-quality commercial service, and you can choose from
compile-from-source to gui install and IDE, and there is a solid and
serious commercial company behind it all.

There are, of course, lots of other major companies backing gcc and
paying developers (Intel, AMD, IBM, Atmel, Red Hat, Novel, etc., etc.) -
Code Sourcery is just one example.

Colin Paul Gloster

Jun 26, 2007, 7:38:26 AM
In news:7a70arEK...@phaedsys.demon.co.uk timestamped Mon, 25 Jun
2007 14:51:06 +0100, Chris Hills <ch...@phaedsys.org> posted:
"[..]

[..] AFAIK SDCC has no simulator/debugger"

SDCC does have its own simulator and debugger. They however were not
integrated very well by default with the C debugging information from
a version of SDCC (sic) from 2007: the debugger would show assembly
instructions near but away from the instructions which were really
being stepped through.

One of the compilers which is apparently supported very well by
the 8051 simulation and debugging facilities of BoxView IDE is SDCC:
WWW.DomainTec.com/BoxViewIDEDSP.html

Michael N. Moran

Jun 26, 2007, 9:23:46 AM
wilco.d...@ntlworld.com wrote:
> On 25 Jun, 15:50, "Michael N. Moran" <mnmo...@bellsouth.net> wrote:
>> I think you'll find that most of the GCC
>> developers are being paid to work on GCC.
>>
>> Just look at the GCC steering committee:
>> <http://gcc.gnu.org/steering.html>
>>
>> Next, you may want to look at the list of contributers:
>> <http://gcc.gnu.org/onlinedocs/gcc/Contributors.html>
>
> I'd be surprised if the number of paid contributors is larger than the
> unpaid ones, or are you counting employees of companies whose main
> business is not open source?

Why wouldn't I count those whose main business is not
open source? Many have an interest in having their products
supported by GCC, and so they invest.

> How many companies are there whose main
> business is developing or maintaining GCC? Do they pay competitive
> rates to hire top compiler experts?

Do you have evidence to the contrary? I suppose
you could e-mail and ask them. However, my impression
is that the GCC maintainers are well respected. I have
been following the g...@gcc.gnu.org mailing list for
years, and I can tell you that my perception is there
are plenty of compiler experts that guide and contribute
and plenty of others to do the more "mundane."

> Big businesses have their reasons
> for contributing, but most have their own commercial compilers already
> - and that is where much of the effort goes.

Or perhaps they have found that having their own compiler is
unjustified when they could instead simply invest in the
community and have a comparable or better product by drawing
on a larger expertise.

>> While GCC work *may* be done by anyone, serious
>> development and maintenance of this cornerstone of
>> Free Software is mostly done by paid skilled professionals,
>> whose employers understand the value of the GCC.
>
> If that was true I would expect GCC to be far better than it is today.

Apparently you've had a bad experience. My experience
has been much better. Are there bugs? Sure. Have I seen
bugs in commercial compilers? Sure. I have received a
*very* fast bug fix for GCC H8. Compilers have bugs.

> From what I've seen issues can take a long time to fix. I remember the
> __irq issue on ARM that went unfixed for quite a while (numerous
> people encountered that one), and I think the register allocator still
> has problems in generating many unnecessary move instructions since a
> change several years ago. In a commercial environment these would be
> "must fix before release" kind of bugs.

OK, GCC is not perfect. What compiler is? And yes, I
have used GCC for ARM, building the Linux kernel and
many user-space applications on two different ARM
platforms without issues.

> Other things like -O0 generating ridiculously inefficient code and
> emitting frame pointers when few compilers do so today do not instill
> a professional image.

Uhhh... -O0 is turning all optimization off. Why would
you expect efficient code?

> I once read a paper that showed a 5% codesize
> improvement on ARM by changing a few defaults. Again, that was a few
> years ago, has it been implemented yet?

In the "words" of my son ... idk ;-)

>>> The ARM port of GCC for example was neglected for years
>>> (with Thumb not working at all) until ARM paid for it to
>>> be fixed and brought up to date with the latest
>>> architectures.
>> Demand drives GCC just like demand drives the commercial
>> compilers.
>
> It would be good if GCC was developed more like a commercial compiler
> indeed. Maybe that is what will happen in the future, but I don't
> think it is anywhere near yet. GCC may have lots of fashionable
> optimizations but I'd prefer stuff to work reliably and efficiently
> first.

Unlike commercial compilers, GCC supports a huge number
of targets including 64,32,16 and 8 bit processors. Evolving
an infrastructure capable of doing this is time consuming,
and as a result GCC *may* not always have the absolute best
code generation for any particular target, but the compiler
is passionately maintained and is constantly evolving.

wilco.d...@ntlworld.com

Jun 26, 2007, 11:24:18 AM
On 26 Jun, 14:23, "Michael N. Moran" <mnmo...@bellsouth.net> wrote:

> wilco.dijks...@ntlworld.com wrote:
> > On 25 Jun, 15:50, "Michael N. Moran" <mnmo...@bellsouth.net> wrote:
> >> I think you'll find that most of the GCC
> >> developers are being paid to work on GCC.
>
> >> Just look at the GCC steering committee:
> >> <http://gcc.gnu.org/steering.html>
>
> >> Next, you may want to look at the list of contributers:
> >> <http://gcc.gnu.org/onlinedocs/gcc/Contributors.html>
>
> > I'd be surprised if the number of paid contributors is larger than the
> > unpaid ones, or are you counting employees of companies whose main
> > business is not open source?
>
> Why wouldn't I count those whose main business is not
> open source? Many have an interest in having their products
> supported by GCC, and so they invest.

The companies that hire fulltime staff to work on GCC are often just
supporting their own products (eg. a backend for their proprietary
CPU) and don't improve competing targets or GCC as a whole. Few large
companies hire fulltime staff to improve the core of GCC, especially
if they already have their in-house compiler. If resources are
constrained, which is going to win?

> > Big businesses have their reasons
> > for contributing, but most have their own commercial compilers already
> > - and that is where much of the effort goes.
>
> Or perhaps they have found that having their own compiler is
> unjustified when they could instead simply invest in the
> community and have a comparable or better product by drawing
> on a larger expertise.

That is true for smaller companies who cannot afford to put a full
compiler team in place. I know GCC is very popular with startups.
However when you dig deeper many would create their own compiler if
they could afford it as they are not that happy with the code quality
they get. I do not believe that if you want comparable quality to
commercial compilers that GCC would ultimately be a cheaper option.

> >> While GCC work *may* be done by anyone, serious
> >> development and maintenance of this cornerstone of
> >> Free Software is mostly done by paid skilled professionals,
> >> whose employers understand the value of the GCC.
>
> > If that was true I would expect GCC to be far better than it is today.
>
> Apparently you've had a bad experience. My experience
> has been much better. Are there bugs? Sure. Have I seen
> bugs in commercial compilers? Sure. I have received a
> *very* fast bug fix for GCC H8. Compiler's have bugs.

I wouldn't call it a bad experience. I am simply used to the best
compilers, as I've worked on them and improved them myself. What I'm saying
is that GCC looks bad if you compare it to commercial compilers. A
while ago I wrote some optimised C code for a client and found that
GCC produced 40% larger code... If you don't care about this then you
can be perfectly happy with GCC.

> > From what I've seen issues can take a long time to fix. I remember the
> > __irq issue on ARM that went unfixed for quite a while (numerous
> > people encountered that one), and I think the register allocator still
> > has problems in generating many unnecessary move instructions since a
> > change several years ago. In a commercial environment these would be
> > "must fix before release" kind of bugs.
>
> OK, GCC is not perfect. What compiler is? And yes, I
> have used GCC for ARM, building the Linux kernel and
> many user-space applications on two different ARM
> platforms without issues.

Sure, I'm not expecting it to be perfect or even as good as commercial
compilers. But open source advocates often claim that they can go in
and fix bugs much faster than in a commercial environment. This is
simply untrue in most cases - if anything, the timescales for bugfixes
in GCC are worse. Of course if you *pay* for a support contract then
your experience may be better.

> > Other things like -O0 generating ridiculously inefficient code and
> > emitting frame pointers when few compilers do so today do not instill
> > a professional image.
>
> Uhhh... -O0 is turning all optimization off. Why would
> you expect efficient code?

Turning off all optimizations achieves what exactly? Usually the goals
of -O0 are fast compilation and code that is easy to debug.
Turning off all optimizations achieves neither goal. It may be
counterintuitive, but compiling optimised code is often faster than
compiling unoptimized code. When I debug code, I'd like to see local
variables in registers rather than being distracted by all the spill
code.

> > It would be good if GCC was developed more like a commercial compiler
> > indeed. Maybe that is what will happen in the future, but I don't
> > think it is anywhere near yet. GCC may have lots of fashionable
> > optimizations but I'd prefer stuff to work reliably and efficiently
> > first.
>
> Unlike commercial compilers, GCC supports a huge number
> of targets including 64,32,16 and 8 bit processors. Evolving
> an infrastructure capable of doing this is time consuming,
> and as a result GCC *may* not always have the absolute best
> code generation for any particular target, but the compiler
> is passionately maintained and is constantly evolving.

Agreed. I hope it does improve further.

Wilco

David Brown

Jun 26, 2007, 12:53:30 PM
wilco.d...@ntlworld.com wrote:
> On 26 Jun, 14:23, "Michael N. Moran" <mnmo...@bellsouth.net> wrote:
>> wilco.dijks...@ntlworld.com wrote:
>>> On 25 Jun, 15:50, "Michael N. Moran" <mnmo...@bellsouth.net> wrote:
>>>> I think you'll find that most of the GCC
>>>> developers are being paid to work on GCC.
>>>> Just look at the GCC steering committee:
>>>> <http://gcc.gnu.org/steering.html>
>>>> Next, you may want to look at the list of contributers:
>>>> <http://gcc.gnu.org/onlinedocs/gcc/Contributors.html>
>>> I'd be surprised if the number of paid contributors is larger than the
>>> unpaid ones, or are you counting employees of companies whose main
>>> business is not open source?
>> Why wouldn't I count those whose main business is not
>> open source? Many have an interest in having their products
>> supported by GCC, and so they invest.
>
> The companies that hire fulltime staff to work on GCC are often just
> supporting their own products (eg. a backend for their proprietary
> CPU) and don't improve competing targets or GCC as a whole. Few large
> companies hire fulltime staff to improve the core of GCC, especially
> if they already have their in-house compiler. If resources are
> constrained, which is going to win?
>

Companies with a particular interest in the performance of gcc for a
given backend will support improvements to that backend, and to gcc as a
whole, as that's what benefits them. You are correct that they have
little interest in improving other backends, but front-end improvements
help them too.

>>> Big businesses have their reasons
>>> for contributing, but most have their own commercial compilers already
>>> - and that is where much of the effort goes.
>> Or perhaps they have found that having their own compiler is
>> unjustified when they could instead simply invest in the
>> community and have a comparable or better product by drawing
>> on a larger expertise.
>
> That is true for smaller companies who cannot afford to put a full
> compiler team in place. I know GCC is very popular with startups.
> However when you dig deeper many would create their own compiler if
> they could afford it as they are not that happy with the code quality
> they get. I do not believe that if you want comparable quality to
> commercial compilers that GCC would ultimately be a cheaper option.
>

I'm sure Altera, Xilinx, and Atmel, amongst others, appreciate you
referring to them as "startups" or implying they have gone for the
cheapo option because they are unwilling or incapable of "digging deeper".

Of course, they may perhaps have actively chosen to work on gcc ports on
the basis of past successes, expected future successes, value for their
investment in time and money, customer pressure, and supported source
code (such as a linux port to the architecture in question). In
particular, it is extremely unlikely that both Altera and Xilinx would
have made such total commitments to their gcc ports (there are, as far
as I know, no non-gcc compilers for their soft processors) if they
thought that a non-gcc compiler (in-house or external) would be
significantly better. Their competitiveness runs too deep to miss out
on such an opportunity - especially if, as you claim, it would be
cheaper overall.

<snip>

>>> Other things like -O0 generating ridiculously inefficient code and
>>> emitting frame pointers when few compilers do so today do not instill
>>> a professional image.
>> Uhhh... -O0 is turning all optimization off. Why would
>> you expect efficient code?
>
> Turning off all optimizations achieves what exactly? Usually the goal
> of -O0 is fast compilation and generate code that is easy to debug.
> Turning off all optimizations does not achieve either goal. It may be
> counter intuitive, but compiling optimised code is often faster than
> compiling unoptimized code. When I debug code, I'd like to see local
> variables in registers rather than being distracted by all the spill
> code.
>

The answer is quite simple - don't run with all optimisations turned off
if you want some optimisations turned on. Like you, I prefer some
minimal optimisations, such as variables in registers, when looking at
the generated code. Normally I'd always use -O2 (or -Os, depending on
the target), but for some debugging -O1 is convenient. No one who knows
what they are doing compiles with -O0 on any compiler, gcc or otherwise.

mvh.,

David

Paul Taylor

Jun 26, 2007, 4:44:54 PM
On Mon, 25 Jun 2007 22:28:05 -0700, wilco.dijkstra wrote:

> and
> emitting frame pointers when few compilers do so today do not instill
> a professional image.

The embedded code I have written is mostly C code, and I very rarely look
at the assembler code. But that comment caught my attention because I
thought that almost all compilers, particularly 32-bit ones, use stack
frames and with frame pointers - or am I interpreting that comment
incorrectly?

Regards,

Paul.


Paul Taylor

Jun 26, 2007, 5:10:30 PM
On Mon, 25 Jun 2007 22:28:05 -0700, wilco.dijkstra wrote:

> Other things like -O0 generating ridiculously inefficient code

I'm possibly showing my naivety here - I'm certainly not an expert on
these matters, but isn't there a step in the compilation process just
before optimisation (after tokenizing/parsing) that produces an internal
representation of the source code (RTL with GCC?), and isn't that step
essentially a "dumb" part of the process, with the really clever bit,
the optimisations, occurring *after* it?

If so, then a compiler writer surely is going to need access to this
unoptimised code at least for unit tests (or whatever gcc uses)? In which
case -O0 is how you get at it. Unoptimised code is always going to be
ridiculously inefficient, and you really wouldn't want to use it. Again
I'm not an expert on the compilation process, but the above is my
understanding as of now - please enlighten me :-)

Regards,

Paul.


CBFalconer

Jun 26, 2007, 6:10:46 PM
Paul Taylor wrote:
> wilco.dijkstra wrote:
>
>> and emitting frame pointers when few compilers do so today do not
>> instill a professional image.
>
> The embedded code I have written is mostly C code, and I very
> rarely look at the assembler code. But that comment caught my
> attention because I thought that almost all compilers, particularly
> 32-bit ones, use stack frames and with frame pointers - or am I
> interpreting that comment incorrectly?

If the machine uses a stack, and the compiler keeps careful track
of the state of that stack, it can generate SP relative addresses.
However this normally requires other restrictions on the generated
code.

--
<http://www.cs.auckland.ac.nz/~pgut001/pubs/vista_cost.txt>
<http://www.securityfocus.com/columnists/423>
<http://www.aaxnet.com/editor/edit043.html>
cbfalconer at maineline dot net

--
Posted via a free Usenet account from http://www.teranews.com

Michael N. Moran

Jun 26, 2007, 10:29:24 PM
wilco.d...@ntlworld.com wrote:
> On 26 Jun, 14:23, "Michael N. Moran" <mnmo...@bellsouth.net> wrote:
>> Apparently you've had a bad experience. My experience
>> has been much better. Are there bugs? Sure. Have I seen
>> bugs in commercial compilers? Sure. I have received a
>> *very* fast bug fix for GCC H8. Compiler's have bugs.
>
> I wouldn't call it a bad experience. I am simply used to the best
> compilers as I've worked on them and improved myself. What I'm saying
> is that GCC looks bad if you compare it to commercial compilers. A
> while ago I wrote some optimised C code for a client and found that
> GCC produced 40% larger code...

I'm not sure what "optimized C code" is. Source that
is optimized for one compiler may be pessimized to another.
BTW, 40% is a bad experience. ;-)

>> Uhhh... -O0 is turning all optimization off. Why would
>> you expect efficient code?
>
> Turning off all optimizations achieves what exactly? Usually the goal
> of -O0 is fast compilation and generate code that is easy to debug.

Yep.

> Turning off all optimizations does not achieve either goal. It may be
> counter intuitive, but compiling optimised code is often faster than
> compiling unoptimized code.

Maybe.

> When I debug code, I'd like to see local
> variables in registers rather than being distracted by all the spill
> code.

If you're debugging in assembler mode then I suppose that's
reasonable, but at the source code level it wouldn't be
an advantage.

>>> It would be good if GCC was developed more like a commercial compiler
>>> indeed. Maybe that is what will happen in the future, but I don't
>>> think it is anywhere near yet. GCC may have lots of fashionable
>>> optimizations but I'd prefer stuff to work reliably and efficiently
>>> first.
>> Unlike commercial compilers, GCC supports a huge number
>> of targets including 64,32,16 and 8 bit processors. Evolving
>> an infrastructure capable of doing this is time consuming,
>> and as a result GCC *may* not always have the absolute best
>> code generation for any particular target, but the compiler
>> is passionately maintained and is constantly evolving.
>
> Agreed. I hope it does improve further.

Roger
> Wilco

CBFalconer

Jun 26, 2007, 11:12:56 PM
CBFalconer wrote:
> Paul Taylor wrote:
>> wilco.dijkstra wrote:
>>
>>> and emitting frame pointers when few compilers do so today do not
>>> instill a professional image.
>>
>> The embedded code I have written is mostly C code, and I very
>> rarely look at the assembler code. But that comment caught my
>> attention because I thought that almost all compilers, particularly
>> 32-bit ones, use stack frames and with frame pointers - or am I
>> interpreting that comment incorrectly?
>
> If the machine uses a stack, and the compiler keeps careful track
> of the state of that stack, it can generate SP relative addresses.
> However this normally requires other restrictions on the generated
> code.

BTW, about 30 years ago, when I did that for the 8080, I thought I
was breaking new ground. Tweren't so.

wilco.d...@ntlworld.com

Jun 27, 2007, 12:31:26 AM
On 26 Jun, 23:10, CBFalconer <cbfalco...@yahoo.com> wrote:
> Paul Taylor wrote:
> > wilco.dijkstra wrote:
>
> >> and emitting frame pointers when few compilers do so today do not
> >> instill a professional image.
>
> > The embedded code I have written is mostly C code, and I very
> > rarely look at the assembler code. But that comment caught my
> > attention because I thought that almost all compilers, particularly
> > 32-bit ones, use stack frames and with frame pointers - or am I
> > interpreting that comment incorrectly?
>
> If the machine uses a stack, and the compiler keeps careful track
> of the state of that stack, it can generate SP relative addresses.
> However this normally requires other restrictions on the generated
> code.

Most compilers stopped using frame pointers a long time ago. They are
inefficient and don't actually provide any benefit. Rather than
changing SP repeatedly inside a function, SP is adjusted only on entry
and exit of the function, further improving efficiency. This also
makes it easier to track stack variables in debuggers (as offsets from
SP are fixed). The drawback is that stacksize can grow in some
circumstances. Functions containing alloca or C99 arrays could still
use a frame pointer.

Wilco

Paul Taylor

Jun 27, 2007, 1:14:52 AM

OK - you have lost me..... :-)

My understanding is that a frame pointer only gets set up at entry of a
function and is restored at exit? And variables are easy to track because
with a frame pointer offsets to the variables are fixed?

Regards,

Paul.

David Brown

Jun 27, 2007, 3:06:26 AM

Yes, that's exactly the point of the frame pointer. It gives you a
fixed base so that local variables have constant offsets from the FP,
while the SP may change during the function as things are pushed or
popped onto the stack (contrary to Wilco's wild generalisations, the SP
*is* adjusted during the function execution - if another function is
called which requires parameters on the stack, then it is very likely
that the SP will be changed, especially on processors with push and pop
primitives).

However, since the compiler knows (hopefully!) what code it has
produced, then at any given time the frame pointer is a fixed offset
from the stack pointer. Thus you can save a little of the function
prologue and epilogue, as well as freeing an extra register to play
with, if you access the stack as offsets from the stack pointer, rather
than having an explicit frame pointer (using "virtual frame pointer", if
you like).

There are several situations where frame pointers are still useful,
however. While compilers are generally now clever enough to keep track
of the "virtual frame pointer", debuggers are not necessarily so - some
find the frame pointer useful, especially if they don't have full
debugging information about the code in question. On some processors,
such as the AVR, there is little or no support for (SP + offset)
addressing modes - a frame pointer in a pointer register solves that
problem. And sometimes (as Wilco suggested) there is not quite such a
neat relationship between the stack pointer and the frame pointer, such
as after using alloca() or variable length local arrays. Finally, a
frame pointer can make the code shorter or faster for some types of code
- exceptions, gotos, or other jumps across large parts of the code may
be best implemented with a frame pointer, so that individual branches
can manipulate the stack pointer (for function calls) while the
exception code still knows where everything is. Even with simple
branches, a frame pointer may let the compiler pile things up on the
stack without bothering to clean up after function calls, leaving the
tidying to the epilogue (which uses the frame pointer to clear up the
stack).
That might or might not be smaller and faster - it depends on the
architecture and the code.

In most practical cases, however, the best code is generated without
using a frame pointer.

mvh.,

David

Colin Paul Gloster

Jun 27, 2007, 5:46:18 AM
In news:46813e27$0$8383$8404...@news.wineasy.se timestamped Tue, 26
Jun 2007 18:53:30 +0200, David Brown
<da...@westcontrol.removethisbit.com> posted:

"wilco.d...@ntlworld.com wrote:
> On 26 Jun, 14:23, "Michael N. Moran" <mnmo...@bellsouth.net> wrote:
>> wilco.dijks...@ntlworld.com wrote:
>>> On 25 Jun, 15:50, "Michael N. Moran" <mnmo...@bellsouth.net> wrote:
[..]

Less than five years ago, one of the main companies in the Symbian consortium
sent out a recruitment advertisement for a short-term contract which
interested me. I applied, but by the time someone with technical
knowledge spoke with me, he revealed that upon reconsideration it was
so difficult to find a suitable person that the notion of a short-term
contract had been replaced with a longer-term job. This did not suit
me, as I would (and did) go to a planned better position which could
not begin until a number of months later. So I never worked on the job
of the advertisement, nor the real job which replaced it. However, I
was told during the technical conversation that the job would entail
replacing parts of the Symbian consortium's own in-house C++
compiler with parts of the GNU C++ compiler (at least the
frontend). Recruitment advertisements of the Symbian consortium's
which I looked at tended to offer rates of pay of approximately a few
hundred pounds sterling (approximately a few hundred dollars) a day or
a week or a month (I do not remember which, but even at a few
hundred pounds sterling a month it is not a very low rate of pay). I
do not remember if the rate of pay for this job was supposed to be
comparable.


"I'm sure Altera, Xilinx, and Atmel, amongst others, appreciate you
referring to them as "startups" or implying they have gone for the
cheapo option"

They have gone for what would naively seem to be a cheap option.

"because they are unwilling or incapable of "digging deeper".

Of course, they may perhaps have actively chosen to work on gcc ports on
the basis of past successes,"

In fairness, all of the compilers for Atmel AVRs other than GCC
(except perhaps the Pascal compiler) are apparently significantly
better than GCC. Atmel actively showed favoritism to one compiler
vendor to provide a good compiler for Atmel AVRs, eventually became
supportive of GCC for Atmel AVRs (even when GCC was one of the worst
compilers for these targets), and went much further by actively porting
GCC to the AVR32 itself before the first AVR32 was released. People
in Atmel may realize that, at any price which is not gratis,
many people will prefer to spend money on an Internet connection and a
bigger chip to naively avoid paying for a cross compiler.

" expected future successes, value for their
investment in time and money, customer pressure, and supported source
code (such as a linux port to the architecture in question). In
particular, it is extremely unlikely that both Altera and Xilinx would
have made such total commitments to their gcc ports (there are, as far
as I know, no non-gcc compilers for their soft processors) if they
thought that a non-gcc compiler (in-house or external) would be
significantly better. Their competitiveness runs too deep to miss out
on such an opportunity - especially if, as you claim, it would be
cheaper overall."

I do not believe this. Altera and Xilinx could vastly improve items
essential to their businesses (e.g. their synthesis backends) in order
to be competitive but they do not. Third parties could write their own
compilers for NiosII and MicroBlaze if they wanted to.


"[..]

[..] No one who knows


what they are doing compiles with -O0 on any compiler, gcc or otherwise.

[..]"

Symbian runs on ARMs. The Symbian person I mentioned above seemed to
think that machine code is the lowest level to which anyone can go (in
fairness, in that job it probably would not have been possible to go
any lower). An out-of-date webpage written after I spoke to him is
WWW.Symbian.com/developer/techlib/v9.2docs/doc_source/faqsdk/faq_1026.html
which is clearly written by someone who did not (or who wrote for
people who do not) know how to type
info gcc "Invoking GCC" "Optimize Options"
as
WWW.Symbian.com/developer/techlib/v9.2docs/doc_source/faqsdk/faq_1026.html
contains:
"[..]
Created: 04/05/2004 Modified: 10/17/2005
[..]

[..]

Question:
I'm porting some code from standard C++ to Symbian OS and I'm using the newer GCC 3.x, which has some really good optimisation options. Can I use this for Symbian OS ?

Answer:
[..]

The short answer is 'No' - you can't use any GCC version beyond 2.9.
[..]

Now, the reasons why Symbian chose not to use anything other than -o0 is because that GCC 2.9x wouldn't handle some ARM optimisations very well, in many cases. Moreover, -o0 doesn't mean it is less optimised than -o2 for example; the 1-2-3 switches denote different kinds of optimisation (one is for speed, the other is for space, etc.)

Maybe if you really want to optimise some functions, you could compile them to assembler source first. In particular for number-crunching function that you may have written, you should consider compiling the source to assembler first and optimize by hand the few critical paths to gain the improvements you require."

Regards,
Colin Paul Gloster

David Brown

Jun 27, 2007, 6:33:42 AM
Paul Taylor wrote:
> On Mon, 25 Jun 2007 22:28:05 -0700, wilco.dijkstra wrote:
>
>> Other things like -O0 generating ridiculously inefficient code
>
> I'm possibly showing my naivity here - I'm certainly not on expert on
> these matters, but isn't there a step in the compilation process just
> before optimisations (after tokenizing/parsing) that gets an internal
> representation of the source code (RTL with the GCC?), and where that step
> in the process is essentially a "dumb" part of the process, with the
> really clever bit, the optimisations, occurring *after* this step?
>

There are optimisations done at all stages. The front-end can do some
optimisations (like turning "x = y + 2*4" into "x = y + 8"). Then the
code is turned into an intermediary representation, and some
optimisations are done during this "middle-end" part (like turning "x =
y + 2; z = y + 2;" into "x = y + 2; z = x;"). Then comes the back-end
which does the code generation, and some optimisations are applied after
that (like turning "jsr foo; ret" into "jmp foo").

> If so, then a compiler writer surely is going to need access to this
> unoptimised code at least for unit tests (or whatever gcc uses)? In which
> case -O0 is how you get at it. Unoptimised code is always going to be
> ridiculously inefficient, and you really wouldn't want to use it. Again
> I'm not an expert on the compilation process, but the above is my
> understanding as of now - please enlighten me :-)
>

There are a few reasons for using unoptimised code, that I can think of.
One is that it is easy for source code debugging (not assembly level
debugging - that is easier with a bit of optimisation). Another is, as
you say, during testing and development of the tools themselves.

Colin Paul Gloster

Jun 27, 2007, 6:23:48 AM
In news:V4SfCNGt...@phaedsys.demon.co.uk timestamped Sun, 24 Jun
2007 23:01:17 +0100, Chris Hills <ch...@phaedsys.org> posted:
"[..]

[..] much better compression [..]
I have seen several programs that the Keil 2/4/8k limited compilers
could get to run on a 51 that the unlimited SDCC could not.

The problem is when they want to add "just one more feature" without
changing the whole design. For example... smart cards and mobile SIMs
and many other things. Especially when by law you need to add something
to an old system or to change some IO because there are new sensors.

Ideally you would scrap the whole system and start again to add a small
change to an end-of-life product."

If you intend to have some spare memory then the smallest output from
the compilers is not necessarily the most important criterion.


"[..]

I do find it strange that people are arguing so strongly for using
second rate tools in their profession."

Who has argued for using second rate tools in their profession?
Provide exact references to justify that claim.

"What would you think of a doctor,
dentist, aeronautical engineer who argued the same?"

Microcontrollers running software are used by such people. Should they
use instead things such as FPGAs because FPGAs are better? FPGAs are
used by such people. Should they use ASICs instead because FPGAs are
inferior? Electronics can suffer from electromagnetic
interference. Should such people not use electronics?

Chris Hills said in
news:7s7UXLAV...@phaedsys.demon.co.uk
in the thread "Re: What's more important optimisations or debugging?" on 2007 June 4th:
"[..]

Note some safety critical systems do not permit optimisations."

Chris Hills said in
news:27nnSYEo...@phaedsys.demon.co.uk
in the thread "Re: What's more important optimisations or debugging?" on 2007 June 13th:
"[..]

[..] disable all compiler optimisations."

Does Chris Hills berate people for disabling all optimizations?

Regards,
Colin Paul Gloster

David Brown

Jun 27, 2007, 6:48:05 AM
Colin Paul Gloster wrote:

Would you *please* learn to use a newsreader? You have some interesting
things to say, some of which warrant response, but have such an absurd
quoting "style" that it is impossible to hold a proper thread of
conversation with you.

>
> Question: I'm porting some code from standard C++ to Symbian OS and
> I'm using the newer GCC 3.x, which has some really good optimisation
> options. Can I use this for Symbian OS ?
>
> Answer: [..]
>
> The short answer is 'No' - you can't use any GCC version beyond 2.9.
> [..]
>
> Now, the reasons why Symbian chose not to use anything other than -o0
> is because that GCC 2.9x wouldn't handle some ARM optimisations very

Gcc 2.9x for the ARM was very poor, as pretty much everyone knows. Many
proponents of closed source alternatives are so happy about this that
they make all sorts of claims of generating code that is several times
faster than gcc's code, when they really mean the early ARM gcc.

> well, in many cases. Moreover, -o0 doesn't mean it is less optimised
> than -o2 for example; the 1-2-3 switches denote different kinds of
> optimisation (one is for speed, the other is for space, etc.)
>

That's total and utter drivel - certainly regarding gcc, but also
regarding any other compiler I have ever used.

> Maybe if you really want to optimise some functions, you could
> compile them to assembler source first. In particular for
> number-crunching function that you may have written, you should
> consider compiling the source to assembler first and optimize by hand
> the few critical paths to gain the improvements you require."
>

With modern compilers (gcc or otherwise), there is seldom good reason
for hand-optimising your assembly unless you are taking advantage of
specific features that your compiler is unaware of. It is often more
useful to compile to assembly, study the assembly, and then modify your
source code to get better results.

Walter Banks

Jun 27, 2007, 7:24:17 AM
Stack frames are only needed for functions that are re-entrant. If
a function does not need to be re-entrant, quite a bit of execution
time and RAM bandwidth can be saved by allocating the auto/local
variables at compile time in some overlaid RAM space.

Regards


Walter Banks
--
Byte Craft Limited
Tel. (519) 888-6911
http://www.bytecraft.com
email wal...@bytecraft.com

Walter Banks

Jun 27, 2007, 7:39:15 AM

David Brown wrote:

>
> In most practical cases, however, the best code is generated without
> using a frame pointer.
>

I agree.

w..


CBFalconer

Jun 27, 2007, 7:00:54 AM
David Brown wrote: ** to Colin Paul Gloster, top-posted **

>
> Would you *please* learn to use a newsreader? You have some
> interesting things to say, some of which warrant response, but
> have such an absurd quoting "style" that it is impossible to hold
> a proper thread of conversation with you.

I second that motion.

CBFalconer

Jun 27, 2007, 6:52:05 AM
wilco.d...@ntlworld.com wrote:
>
... snip ...

>
> Most compilers stopped using frame pointers a long time ago. They
> are inefficient and don't actually provide any benefit. Rather
> than changing SP repeatedly inside a function, SP is adjusted only
> on entry and exit of the function, further improving efficiency.
> This also makes it easier to track stack variables in debuggers
> (as offsets from SP are fixed). The drawback is that stacksize can
> grow in some circumstances. Functions containing alloca or C99
> arrays could still use a frame pointer.

You are confused as to the use of an 'SP'. This is a stack
pointer, which is altered whenever a value is pushed or popped. It
can't be constant, so the compiler has to keep track of it. You
are probably thinking of a 'BP', or block pointer, which normally
holds specific SP values such as the value on function entry.

Walter Banks

Jun 27, 2007, 7:54:19 AM

David Brown wrote:

> There are a few reasons for using unoptimised code, that I can think of.
> One is that it is easy for source code debugging (not assembly level
> debugging - that is easier with a bit of optimisation).

The down side to debugging with optimization off is that the code that is
shipped is not the code that was actually debugged and has an experience
base behind it.

The optimizations that make a difference are not easily turned off
or on without a serious impact on overall tool implementations.

David Brown

Jun 27, 2007, 8:56:31 AM
Walter Banks wrote:
>
> David Brown wrote:
>
>> There are a few reasons for using unoptimised code, that I can think of.
>> One is that it is easy for source code debugging (not assembly level
>> debugging - that is easier with a bit of optimisation).
>
> The down side to debugging with optimization off is that the code that is
> shipped is not the code with an experience base that has been debugged.
>
> The optimizations that make a difference are not easily turned off
> or on without a serious impact on overall tool implementations.
>

You certainly don't want to just do your debugging and testing on some
low-optimisation test version. But source level debugging of
unoptimised code can be a useful aid when your code is not working as
expected. You can single-step through code reliably, and view each
variable with values that match their values at the source code level,
and thus can see what is going on in a different way. It's an aid, but
it is not an alternative to testing and debugging the code you will
actually ship.

Chris Hills

Jun 27, 2007, 8:28:43 AM
In article <46824366...@yahoo.com>, CBFalconer
<cbfal...@yahoo.com> writes

>David Brown wrote: ** to Colin Paul Gloster, top-posted **
>>
>> Would you *please* learn to use a newsreader? You have some
>> interesting things to say, some of which warrant response, but
>> have such an absurd quoting "style" that it is impossible to hold
>> a proper thread of conversation with you.
>
>I second that motion.

Strange for me to agree with both David and CBF at the same time but I
have been asking Paul to change his style of quoting to something that
is readable.

--
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
\/\/\/\/\ Chris Hills Staffs England /\/\/\/\/
/\/\/ ch...@phaedsys.org www.phaedsys.org \/\/\
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/

CBFalconer

Jun 27, 2007, 10:27:30 AM
Chris Hills wrote:
> <cbfal...@yahoo.com> writes
>> David Brown wrote: ** to Colin Paul Gloster, top-posted **
>>>
>>> Would you *please* learn to use a newsreader? You have some
>>> interesting things to say, some of which warrant response, but
>>> have such an absurd quoting "style" that it is impossible to hold
>>> a proper thread of conversation with you.
>>
>> I second that motion.
>
> Strange for me to agree with both David and CBF at the same time
> but I have been asking Paul to change his style of quoting to
> something that is readable.

Oh well. It is a strange world. :-)

wilco.d...@ntlworld.com

Jun 27, 2007, 11:24:17 AM
On 27 Jun, 11:52, CBFalconer <cbfalco...@yahoo.com> wrote:

> wilco.dijks...@ntlworld.com wrote:
>
> ... snip ...
>
> > Most compilers stopped using frame pointers a long time ago. They
> > are inefficient and don't actually provide any benefit. Rather
> > than changing SP repeatedly inside a function, SP is adjusted only
> > on entry and exit of the function, further improving efficiency.
> > This also makes it easier to track stack variables in debuggers
> > (as offsets from SP are fixed). The drawback is that stacksize can
> > grow in some circumstances. Functions containing alloca or C99
> > arrays could still use a frame pointer.
>
> You are confused as to the use of an 'SP'. This is a stack
> pointer, which is altered whenever a value is pushed or popped. It
> can't be constant, so the compiler has to keep track of it. You
> are probably thinking of a 'BP', or block pointer, which normally
> holds specific SP values such as the value on function entry.


No. If you remove the frame pointer and don't change anything else,
you're left with a constantly changing stack pointer, which makes it
harder to access variables on the stack. For example, if the SP+offset
addressing mode has a limited offset range, a variable may drift out of
reach of that mode as SP moves. Additionally, as David said, most
debuggers can't deal with these changing offsets (DWARF 3 can describe
them, but few debuggers support that format, let alone implement the
built-in interpreter needed to evaluate it).

The solution to this is to use a fixed-size stack frame across the whole
function - i.e. SP does not change. This is always possible, as you can
reserve space for outgoing function arguments at fixed offsets from SP
and store them rather than push them. The advantage is that you avoid
having to adjust SP to undo pushes, which further improves the saving.

Although the fixed-size frame optimization is orthogonal to removing
the frame pointer, the advantages (debugging is possible and code
usually more efficient) mean that they typically go together.

Wilco

wilco.d...@ntlworld.com

Jun 27, 2007, 12:43:22 PM
On 27 Jun, 03:29, "Michael N. Moran" <mnmo...@bellsouth.net> wrote:

> wilco.dijks...@ntlworld.com wrote:
> > On 26 Jun, 14:23, "Michael N. Moran" <mnmo...@bellsouth.net> wrote:
> >> Apparently you've had a bad experience. My experience
> >> has been much better. Are there bugs? Sure. Have I seen
> >> bugs in commercial compilers? Sure. I have received a
> >> *very* fast bug fix for GCC H8. Compiler's have bugs.
>
> > I wouldn't call it a bad experience. I am simply used to the best
> > compilers as I've worked on them and improved myself. What I'm saying
> > is that GCC looks bad if you compare it to commercial compilers. A
> > while ago I wrote some optimised C code for a client and found that
> > GCC produced 40% larger code...
>
> I'm not sure what "optimized C code" is. Source that
> is optimized for one compiler may be pessimized to another.
> BTW, 40% is a bad experience. ;-)

I meant code that doesn't contain any inefficiencies on the critical
paths and making full use of the target architecture. Such code is
usually compiler independent and will result in efficient code on most
compilers, but of course not all compilers get close to what one could
write in assembler. As an example, I often make use of ARM's
conditional execution by writing if (x != 0 && y != 0) z++; and expect
it to avoid branches:

CMP x,#0
CMPNE y,#0
ADDNE z,z,#1

The 40% extra code was due to not recognising cases like the above
(using 2 extra branches) and other ARM specific instructions. There
was also code that tested these optimised functions, and GCC generated
only 5% worse code here. Unfortunately the size or performance of the
test code didn't matter...

Wilco

Michael N. Moran

Jun 27, 2007, 1:46:40 PM
wilco.d...@ntlworld.com wrote:
> On 27 Jun, 03:29, "Michael N. Moran" <mnmo...@bellsouth.net> wrote:
>> I'm not sure what "optimized C code" is. Source that
>> is optimized for one compiler may be pessimized to another.
>> BTW, 40% is a bad experience. ;-)
>
> I meant code that doesn't contain any inefficiencies on the critical
> paths and making full use of the target architecture. Such code is
> usually compiler independent and will result in efficient code on most
> compilers, but of course not all compilers get close to what one could
> write in assembler. As an example, I often make use of ARM's
> conditional execution by writing if (x != 0 && y != 0) z++; and expect
> it to avoid branches:
>
> CMP x,#0
> CMPNE y,#0
> ADDNE z,z,#1

"Usually compiler independent"? On what standard are you
depending? Is this a documented optimization for a
particular ARM compiler, or was this discovered by
trial and error with a particular compiler?

BTW, using an xscale-elf-gcc --version 4.02 that I have
hanging around, I get the following:

---- arm.cpp ----
int f(int x, int y, int z)
{
    if (x != 0 && y != 0) z++;
    return z;
}
-----------------

# xscale-elf-gcc -c -O2 arm.cpp
# xscale-elf-objdump --disassemble arm.o
arm.o: file format elf32-littlearm

Disassembly of section .text:

00000000 <_Z1fiii>:
0: e3500000 cmp r0, #0 ; 0x0
4: 13510000 cmpne r1, #0 ; 0x0
8: e1a00002 mov r0, r2
c: 12820001 addne r0, r2, #1 ; 0x1
10: e12fff1e bx lr


Since your "if (x != 0 && y != 0) z++;" was out
of context, this was the closest I could come
in 2 minutes time ;-)

In the context of a larger function where the
compiler has more visibility into the context
ymmv.

Paul Taylor

Jun 27, 2007, 3:45:09 PM

Thanks for the clear description

:-)

Paul.

CBFalconer

Jun 27, 2007, 9:49:20 PM

IMO you are designing some ugly code generators. For example,
consider the addressing code needed for a push versus a dedicated
store. Something like one byte for the push, and three and up for the
store.

David Brown

Jun 28, 2007, 8:17:54 AM

That's very much dependent on the cpu architecture. On some
architectures, storing data at (SP + offset) takes exactly the same time
and space as a push - on others, it is significantly different. On
superscalar 32-bit RISC CPUs it is likely that the (SP + offset) store
will be faster, as a push which changes SP will lead to stalls. On an
AVR, the push is a single instruction which is much smaller and faster
as there is no (SP + offset) mode.

And for some types of function, trying to use a single fixed-size stack
frame will make a real pig's ear out of the code, and result in lots of
wasted stack space - a frame pointer really does give better code for
some functions.

As with anything in this field, unless you are going to give specific
and tight limits on the processors and the type of code, any
generalisations like this are going to be wrong.

Chris Hills

Jun 28, 2007, 2:57:12 PM
In article <468273D2...@yahoo.com>, CBFalconer
<cbfal...@yahoo.com> writes

>Chris Hills wrote:
>> <cbfal...@yahoo.com> writes
>>> David Brown wrote: ** to Colin Paul Gloster, top-posted **
>>>>
>>>> Would you *please* learn to use a newsreader? You have some
>>>> interesting things to say, some of which warrant response, but
>>>> have such an absurd quoting "style" that it is impossible to hold
>>>> a proper thread of conversation with you.
>>>
>>> I second that motion.
>>
>> Strange for me to agree with both David and CBF at the same time
>> but I have been asking Paul to change his style of quoting to
>> something that is readable.
>
>Oh well. It is a strange world. :-)

Oh bugger! I am going to have to agree with you again :-)

wilco.d...@ntlworld.com

Jun 30, 2007, 1:45:42 PM
On 28 Jun, 05:17, David Brown <d...@westcontrol.removethisbit.com>
wrote:
> CBFalconer wrote:
> > wilco.dijks...@ntlworld.com wrote:

> >> The solution to this is to use fixed size stackframe across the whole
> >> function - ie. SP does not change. This is always possible as you can
> >> reserve space for function arguments at fixed offsets from SP and
> >> store them rather than push. The advantage is that you avoid having to
> >> adjust SP to undo pushes, which further improves the saving.
>
> >> Although the fixed-size frame optimization is orthogonal to removing
> >> the frame pointer, the advantages (debugging is possible and code
> >> usually more efficient) mean that they typically go together.
>
> > IMO you are designing some ugly code generators. For example,
> > consider the addressing code need to push versus a dedicated
> > store. Something like one byte for the push, and three up for the
> > store.

It's possible to temporarily increment the stack pointer and then push
the arguments so you get the best of both worlds. This is never worse
than pushing first and later incrementing SP. Also note that on many
CPUs the first few arguments are passed by registers, so the cost of
storing stack arguments is less important in such cases.

> That's very much dependent on the cpu architecture. On some
> architectures, storing data at (SP + offset) takes exactly the same time
> and space as a push - on others, it is significantly different. On
> superscaler 32-bit RISC cpus it is likely that the (SP + offset) store
> will be faster, as a push which changes SP will lead to stalls.

Indeed, and many RISCs don't even support push/pop.

> On an AVR, the push is a single instruction which is much smaller and faster
> as there is no (SP + offset) mode.

Actually on AVR push has the same size and speed as the (base +
offset) addressing mode. Usually the Y or Z register is used as the
actual stack. SP only supports push/pop/call/return, so it is useless
for general parameter passing or local variables. Using a frame
pointer would be a very bad idea as it takes many instructions to set
up and uses up another 2 registers.

> And for some types of function, trying to use a single fixed-size stack
> frame will make a real pig's ear out of the code, and result in lots of
> wasted stack space - a frame pointer really does give better code for
> some functions.

You can always find functions where it is worse of course, but
omitting frame pointers and fixed-size frames are better on average on
many CPUs. Optimization is all about improving things on average.

Wilco

David Brown

Jul 3, 2007, 5:04:18 PM

Many RISC architectures *do* support push and pop, through register
indirect with automatic pre/post increment/decrement modes, though they
are not too fussed about which register is called "SP". But pipeline
stalls are likely if you have several "push" or "pop" instructions in a row.

>
>> On an AVR, the push is a single instruction which is much smaller and faster
>> as there is no (SP + offset) mode.
>
> Actually on AVR push has the same size and speed as the (base +
> offset) addressing mode. Usually the Y or Z register is used as the
> actual stack. SP only supports push/pop/call/return, so it is useless
> for general parameter passing or local variables. Using a frame
> pointer would be a very bad idea as it takes many instructions to set
> up and uses up another 2 registers.
>

That all depends on whether you want to have a separate return stack and
data stack, or a combined stack. There are advantages in each - a data
stack (indexed by Y) gives you (Y + index) addressing modes, but
effectively locks the Y register and complicates stack arrangements
since every task needs two individual stacks. A combined stack avoids
these issues, but requires Y to be set up as a frame pointer for
functions with data on the stack (not as much of a problem as it might
seem, as the AVR has plenty of registers for local data).

>> And for some types of function, trying to use a single fixed-size stack
>> frame will make a real pig's ear out of the code, and result in lots of
>> wasted stack space - a frame pointer really does give better code for
>> some functions.
>
> You can always find functions where it is worse of course, but
> omitting frame poiners and fixed size frames are better on average on
> many CPUs. Optimization is all about improving things on average.
>

No, "optimisation" is about getting the best possible code (faster,
smaller, or whatever other definition of "best" you are using). Look up
the word in a dictionary. Thus a good optimiser will use a frame
pointer or not, a fixed frame size or a variable frame size, according
to the combination that generates the best code for the task in hand.

> Wilco
>

sensei141

Aug 2, 2007, 8:29:13 AM
I have been using SDCC with LPC900 microcontrollers for a few months now
and it works great! You can quickly try SDCC and see if it works for you.
Just get the latest snapshot and don't use the include files of commercial
compilers (such as Keil), as their definitions of special function registers
(SFRs) are different. Fortunately, SDCC comes with an include file for the
LPC932!

>Hello,
>
>I am starting a small project on LPC932. There seems to be quite a few
>toolchain options out there:
>Keil
>Rigel
>Raisonance
>IAR
>SDCC (open-source)
>
>Which one would you recommend?
>In my case, it is a very small project (I might even fit into the 4K
>limited commercial kits), but if SDCC is a decent compiler, it will be
>useful if I have future projects on this platform.
>
>Thanks
>Eugene
>
>


Chris Hills

Aug 2, 2007, 9:25:35 AM
In article <PLWdncEYcv2EUyzb...@giganews.com>, sensei141
<sens...@hotmail.com> writes

>I have been using sdcc with lpc900 microcontrollers for a few months
>now
>and it works great!

Not compared to a commercial compiler.

>You can quickly try sdcc and see if it works for you.
> Just get the latest snapshot and don't use the include files of
>commercial
>compilers (such as keil) as their definition of special function
>registers
>(SFR) is different.

Why would you use their (copyrighted) files anyway?
Incidentally the Keil SFR names will be those used by the silicon
company as Keil works with the silicon companies when they design the
chips.

> Fortunatelly sdcc comes with an include file for the
>LPC932!

So it should..... how many of the other 600-odd 8051s does it support?
Does it also support the other memory models besides the standard
one?

BTW, does SDCC do DATA overlaying? If not, it is not really worth using
on the 8051s


>
>>Hello,
>>
>>I am starting a small project on LPC932. There seems to be quite a few
>>toolchain options out there:
>>Keil
>>Rigel
>>Raisonance
>>IAR
>>SDCC (open-source)
>>
>>Which one would you recommend?
>>In my case, it is a very small project (I might even fit into the 4K
>>limited commercial kits),

Then use the (FREE eval) Keil compiler

>>but if SDCC is a decent compiler,

It's not compared to the commercial ones

ChrisQuayle

Aug 2, 2007, 6:08:26 PM
Chris Hills wrote:

> OK then can some one PROVE how FOSS compilers are more advanced or *at
> least* as good as commercial ones?
>
>
>
>

Have been wary about getting into this, but it irritates me that the
commercial vendors are quite happy to steal new ideas from open source
at no cost to improve their product, but then have the chutzpah to
suggest that gcc is inferior. Neither are they willing to disclose their
own improvements that might contribute generally to the state of the
art. Of course, in the commercial world IP matters, but probably most of
the really good new ideas in software in the last decade or so have come
out of open source or academia, so what have the commercial vendors got
to offer that makes them so good ?. After seeing how Linux has bloated
over the past few years, have had a few things to say about open source
myself, but the gcc suite is one exception that disproves any general
criticism.

It's no good saying they use advanced techniques without at least giving
some idea of why I should pay $k's over something that works and that I
can download and build for free. For example, I still use (old) gcc
2.7.2 for 68k work. An unpretentious compiler that I'm familiar with,
which produces acceptable code; I have yet to catch it out with even the
most complex data structures and constructs etc. Have not found a show
stopper bug in several years use, which is more than I can say about
some commercial product. I think the commercial vendors are at a very
serious disadvantage now. Tool development is such a labour intensive
and skilled process, with shrinking market, that they have to charge
high prices just to stay in business, never mind keep up with the state
of the art. A highly unstable market, with thousands of devices and
variants to support and guessing which will be successful and sell
volume. Not a business I would want to be in at all.

It's not just the compiler either - The gcc suite has an excellent set
of binary utilities, linker, make and archiver / librarian etc, which
blend together seamlessly in a Unix environment to produce a really top
class development suite. Again, much more than can be said for some of
the commercial vendors. If you want a good laugh, have a look at the
utilities associated with earlier versions of the IAR H8 compiler and
the command line options. Even Keil C wasn't that brilliant in terms of
added value and that's a compiler that i've used extensively and respect
quite a lot, even if the code it produces looks horrendous at times.

So Chris, what's the unique selling point ?. Have failed to be convinced
thus far...

Chris

--

----------------------
Greenfield Designs Ltd
Electronic and Embedded System Design
Oxford, England
(44) 1865 750 681

Paul Burke

Aug 3, 2007, 3:19:48 AM
ChrisQuayle wrote:

>..what's the unique selling point ?.

FUD of course. If you buy a $xxxx product, when the project crashes, you
have your arse covered. If you recommended cutting costs, it will be
seized on by everyone else as an excuse for failure, and you will find
yourself climbing the pyramid steps to face the jade knife.

It's a problem endemic in British industry. Every project gets gold
plated, layer upon layer, so much so that a relatively simple
north-to-south cross- London rail line renewal, costed at £625 million
10 years ago, is now the flagship of the government's new rail "policy"
at £5.5 BILLION. In the end, everything is too expensive and nothing
gets done.

Paul Burke

David Brown

Aug 3, 2007, 3:53:26 AM
ChrisQuayle wrote:
> Chris Hills wrote:
>
>> OK then can some one PROVE how FOSS compilers are more advanced or *at
>> least* as good as commercial ones?
>>
>>
>>
>>
>
> Have been wary about getting into this, but it irritates me that the
> commercial vendors are quite happy to steal new ideas from open source
> at no cost to improve their product, but then have the chutzpah to
> suggest that gcc is inferior. Neither are they willing to disclose their
> own improvements that might contribute generally to the state of the
> art. Of course, in the commercial world ip matters, but probably most of
> the really good new ideas in software in the last decade or so have come
> out of open source or academia, so what have the commercial vendors got
> to offer that makes them so good ?. After seeing how Linux has bloated
> over the past few years, have had a few things to say about open source
> myself, but the gcc suite is one exception that disproves any general
> criticism.
>

Many Linux *distributions* have bloated - Linux itself has not bloated
(it's got bigger and better, but not bloated). There are plenty of open
source projects that have suffered from "bloat", and plenty of distros
that have grown huge, but the great thing about open source systems is
that you mix and match what you need, and can get big, fancy, powerful
software ("bloated" software) or small, lean and dedicated software.

> It's no good saying they use advanced techniques without at least giving
> some idea of why I should pay $k's over something that works and that I
> can download and build for free. For example, I still use (old) gcc
> 2.7.2 for 68k work. An unpretentious compiler that i'm familiar with,
> produces acceptable code and still have to catch it out with even the
> most complex data structures and constructs etc. Have not found a show
> stopper bug in several years use, which is more than I can say about
> some commercial product. I think the commercial vendors are at a very

(If you want to try a newer version, go to www.codesourcery.com. They
maintain the gcc m68k port, and there have been a fair number of
improvements over the years, especially for ColdFire parts.)

> serious disadvantage now. Tool development is such a labour intensive
> and skilled process, with shrinking market, that they have to charge
> high prices just to stay in business, never mind keep up with the state
> of the art. A highly unstable market, with thousands of devices and
> variants to support and guessing which will be successfull and sell
> volume. Not a business I would want to be in at all.
>

There are some situations where commercial developers have advantages
over open source developers - it is often easier to get restricted
information (advance information, or extra technical details) from the
microcontroller manufacturers. Even for those manufacturers which
directly support gcc ports, there can be restrictions with some PHB
wanting to keep details secret, which therefore cannot end up in open
source code.

There is also currently only one serious, powerful open source compiler
system - gcc. So for architectures that don't fit gcc's models, there
are currently no top-quality open source alternatives (sdcc is, as far
as I understand it, a perfectly reasonable compiler - but it is not a
top-ranking 8-bit compiler in the same way that gcc is for many 32-bit
targets).

The other thing that commercial developers can do that open source
developers typically do not, is spend time and money on everything
*around* the compiler - documentation, installation, IDEs, debuggers,
dedicated support teams to help you out when the software license
locking system has lost its keys, etc. These sorts of things take time
and effort, and are often not the most interesting parts of the job -
it's fine if you are paid to do it, but for many open source developers
and their employers, margins are smaller and it's harder to find the
time and money to spend on less essential parts.

As I see it, there is plenty of place for both commercial and open
source development tools (and both high-price and "cheap and cheerful"
price commercial tools) at the moment. What is interesting is the
changes for newer architectures - steadily more manufacturers are going
straight for a gcc port for newer 32-bit architectures, rather than the
more traditional approach of working closely with a commercial
developer. The development process for these ports is often not very
open (they work behind closed doors, and release binaries and
corresponding source files in lumps), making it somewhat of a middle road.

> It's not just the compiler either - The gcc suite has an excellent set
> of binary utilities, linker, make and archiver / librarian etc, which
> blend together seemlessly in a unix environment to produce a really top
> class development suite. Again, miuch more than can be said for some of
> the commercial vendors. If you want a good laugh, have a look at the
> utilities associated with earlier versions of the IAR H8 compiler and
> the command line options. Even Keil C wasn't that brilliant in terms of
> added value and that's a compiler that i've used extensively and respect
> quite a lot, even if the code it produces looks horrendous at times.
>

Many commercial toolsets are now using open source utilities, as well
they should - for most parts not directly related to code generation,
common open source versions of the tools are far better than anything
anyone else makes. A good example is the Quartus design suite for
Altera FPGAs - the compilation and routing parts are all closed source
from Altera, but the glue that holds it together is cygwin, tcl, and
perl - all open source versions. Their soft cpu is generated with
closed-source programs, using tcl and perl, with Eclipse for the IDE and
gcc for compilation.

Chris Hills
Aug 3, 2007, 6:15:22 AM
In article <46b2d8bc$0$3196$8404...@news.wineasy.se>, David Brown
<da...@westcontrol.removethisbit.com> writes

>ChrisQuayle wrote:
>> Chris Hills wrote:
>>
>>> OK then can some one PROVE how FOSS compilers are more advanced or
>>>*at least* as good as commercial ones?
>> serious disadvantage now. Tool development is such a labour intensive
>>and skilled process, with shrinking market, that they have to charge
>>high prices just to stay in business, never mind keep up with the
>>state of the art. A highly unstable market, with thousands of devices
>>and variants to support and guessing which will be successfull and
>>sell volume. Not a business I would want to be in at all.
>>
>
>There are some situations where commercial developers have advantages
>over open source developers - it is often easier to get restricted
>information (advance information, or extra technical details) from the
>microcontroller manufacturers. Even for those manufacturers which
>directly support gcc ports, there can be restrictions with some PHB
>wanting to keep details secret, which therefore cannot end up in open
>source code.

This is true... there is a lot of NDA-type work that goes on for
months before the release of parts. I have been involved in one or two.

The silicon company was working with the compiler company before even
the registers were fully named (or mapped out).

>There is also currently only one serious, powerful open source compiler
>system - gcc.

However it is not really suited to the smaller end of the market.

>So for architectures that don't fit gcc's models, there are currently
>no top-quality open source alternatives (sdcc is, as far as I
>understand it, a perfectly reasonable compiler - but it is not a
>top-ranking 8-bit compiler in the same way that gcc is for many 32-bit
>targets).

It is missing some important features, which makes it unsuitable for a
lot of development.

>The other thing that commercial developers can do that open source
>developers typically do not, is spend time and money on everything
>*around* the compiler - documentation, installation, IDEs, debuggers,
>dedicated support teams

Yes. Absolutely

> to help you out when the software license locking system has lost its
>keys, etc.

What license locking? You assume much. I sell several commercial
compilers that have no locking on them at all. Most SW these days has
activation/locking.

> These sorts of things take time and effort, and are often not the
>most interesting parts of the job - it's fine if you are paid to do it,
>but for many open source developers and their employers, margins are
>smaller and it's harder to find the time and money to spend on less
>essential parts.

So they only do the bits they like or make money on?


>As I see it, there is plenty of place for both commercial and open
>source development tools (and both high-price and "cheap and cheerful"
>price commercial tools) at the moment.

I agree.

> What is interesting is the changes for newer architectures - steadily
>more manufacturers are going straight for a gcc port for newer 32-bit
>architectures, rather than the more traditional approach of working
>closely with a commercial developer.

This is incorrect. They do work with the commercial compiler vendors.
However they usually do a port of gcc so there is a free tool so there
is a low risk/cost entry point to trying out the new silicon. It is
nothing to do with a belief in FOSS.

> The development process for these ports is often not very open (they
>work behind closed doors, and release binaries and corresponding source
>files in lumps), making it somewhat of a middle road.

Absolutely. Also, the gcc port is usually created but not supported. It
is just there, as is, so there is a free/low-risk way of evaluating the
new silicon.

>> It's not just the compiler either - The gcc suite has an excellent
>>set of binary utilities, linker, make and archiver / librarian etc,
>>which blend together seemlessly in a unix environment to produce a
>>really top class development suite. Again, miuch more than can be
>>said for some of the commercial vendors. If you want a good laugh,
>>have a look at the utilities associated with earlier versions of the
>>IAR H8 compiler and the command line options. Even Keil C wasn't that
>>brilliant in terms of added value and that's a compiler that i've used
>>extensively and respect quite a lot, even if the code it produces
>>looks horrendous at times.
>>
>
>Many commercial toolsets are now using open source utilities,

Some

>as well they should
Why? Many used to make a nice sideline income doing utilities.

>- for most parts not directly related to code generation, common open
>source versions of the tools are far better than anything anyone else

This is not true... I have a customer whose use of a FOSS utility cost
them over 10K GBP in time and effort sorting out the problem.

>makes. A good example is the Quartus design suite for Altera FPGAs -
>the compilation and routing parts are all closed source from Altera,
>but the glue that holds it together is cygwin, tcl, and perl - all open
>source versions. Their soft cpu is generated with closed-source
>programs, using tcl and perl, with Eclipse for the IDE and gcc for
>compilation.

Seems a sensible mix but note Altera is a HW company... to them software
is simply a marketing tool.

All the people you mention who have "embraced" open source are HW
companies who are selling HW and value SW as a free giveaway...

Open source values the programmer at zero, and I have seen nothing to
change this view. In fact, it is getting worse.

ChrisQuayle
Aug 3, 2007, 8:07:50 AM

Put the world to rights indeed :-). If you want to have a rather offbeat
take on all this and similar stuff, look at:

http://www.reciprocality.org/pdf/index.html

and start with the Programmers Stone section. Apologies if everyone else
has seen this already.

Other sections of the site are interesting as well - Brain dead,
dopamine addicted society courtesy of government social engineering etc.

Deserves a wider audience imho, despite the completely off the wall
parts and follows on from where Brooks ended...

ChrisQuayle
Aug 3, 2007, 9:45:11 AM
David Brown wrote:
>
> (If you want to try a newer version, go to www.codesourcery.com. They
> maintain the gcc m68k port, and there have been a fair number of
> improvements over the years, especially for ColdFire parts.)

The reason for 2.7.2 was originally that it was a late enough
revision for the 68k port to be stable, and the whole suite was less
cluttered by ports to other, later processors. Also, Tru64 Unix wasn't
supported as a target, and some of the very late gcc and library versions
wouldn't build at all, whereas earlier versions built fine. Really
can't fault it though, and have done some of my best ever work using
those tools. Apart from the fact that the tools just work, a key
advantage is that future projects using other processors will have a
head start in terms of knowledge and code base. Now have a whole set of
templates for make files, linker scripts, debugging, various home rolled
utilities etc, that will transfer with only minor change and expense to
a new target.
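For the curious, the kind of retargetable makefile template described
above might be sketched roughly like this (the CROSS prefix, target.ld
script, and file names are hypothetical placeholders, not from any real
project; moving to a new gcc target mostly means overriding CROSS and
supplying a new linker script):

```makefile
# Hypothetical retargetable template -- only the CROSS prefix, the CPU
# flags, and the linker script change when moving to a new gcc target.
# Note: make requires recipe lines to be indented with tabs.
CROSS   ?= m68k-elf-
CC      := $(CROSS)gcc
OBJCOPY := $(CROSS)objcopy

CFLAGS  := -Os -Wall -ffunction-sections -fdata-sections
LDFLAGS := -T target.ld -Wl,--gc-sections -nostartfiles

SRCS := $(wildcard *.c)
OBJS := $(SRCS:.c=.o)

# Raw binary image for the flash programmer.
firmware.bin: firmware.elf
	$(OBJCOPY) -O binary $< $@

firmware.elf: $(OBJS)
	$(CC) $(LDFLAGS) -o $@ $^

%.o: %.c
	$(CC) $(CFLAGS) -c -o $@ $<

clean:
	rm -f $(OBJS) firmware.elf firmware.bin

.PHONY: clean
```

Retargeting to, say, an ARM part would then be a matter of
`make CROSS=arm-elf-` plus a new linker script, which is exactly the
kind of low-cost transfer between targets being described.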

What gcc and open source in general have done is make it possible for
small shops, one-man companies, etc. to have access to some of the best
tools available, and I am grateful for that. The leverage effect gcc has
had on the ability to generate large, robust software systems, and on the
advancement of the computing art, is hard to overestimate imho. Without
a good compiler and infrastructure there is no software, and now
everyone can play :-).

> There are some situations where commercial developers have advantages
> over open source developers - it is often easier to get restricted
> information (advance information, or extra technical details) from the
> microcontroller manufacturers. Even for those manufacturers which
> directly support gcc ports, there can be restrictions with some PHB
> wanting to keep details secret, which therefore cannot end up in open
> source code.

There may be some advantages, but didn't someone else say that ARM put a
lot of cash into gcc development to make it work properly? If so, one
would assume that there was quite a lot of input from ARM so that key
processor attributes were properly catered for.

The way I see it, the current producer / consumer relationship for
commercial tools is one of mistrust, a pita. All the flexlm and dongle
crap is basically telling you that they don't trust you and they think
you will steal the software to run on several machines concurrently. No
mainstream pc software vendor would get away with that now, so why
should we put up with it for embedded tools? We bought Keil C some
years ago for a project and (the client) paid extra for an undongled
version, not so we could steal it, but so that it could be installed at
client and development sites. I was the only developer, so only a single
copy would ever be used at once. This I would consider fair use within
the license terms, though i'm sure some vendors would disagree.

I dunno; overall, commercial compilers will eventually die in the
16/32/64-bit arena without a revised business model. They may survive in
the 8-bit market, perhaps, but there's starting to be quite a bit of
open source activity there as well...

David Brown
Aug 3, 2007, 11:11:55 AM
Chris Hills wrote:
> In article <46b2d8bc$0$3196$8404...@news.wineasy.se>, David Brown
> <da...@westcontrol.removethisbit.com> writes
>> ChrisQuayle wrote:
>>> Chris Hills wrote:
>>>
>>>> OK then can some one PROVE how FOSS compilers are more advanced or
>>>> *at least* as good as commercial ones?
>>> serious disadvantage now. Tool development is such a labour intensive
>>> and skilled process, with shrinking market, that they have to charge
>>> high prices just to stay in business, never mind keep up with the
>>> state of the art. A highly unstable market, with thousands of devices
>>> and variants to support and guessing which will be successfull and
>>> sell volume. Not a business I would want to be in at all.
>>>
>>
>> There are some situations where commercial developers have advantages
>> over open source developers - it is often easier to get restricted
>> information (advance information, or extra technical details) from the
>> microcontroller manufacturers. Even for those manufacturers which
>> directly support gcc ports, there can be restrictions with some PHB
>> wanting to keep details secret, which therefore cannot end up in open
>> source code.
>
> This is true.... there is a lot if NDA type work that goes on for
> months before the release of parts. I have been involved in one or two.
>

Sometimes open source programmers can get access to this sort of
information, sometimes not (for example, I'm sure that the Atmel
employees involved in avr-gcc development have access to non-public
information, and I know that a couple of msp430-gcc developers have NDAs
for access to debugger details, which are implemented in a small
closed-source program). There are also situations where the open source
development is at least partly behind closed doors (with the source
released when it is finished) - development companies with this approach
have little problem getting advance information. But in general, it can
be easier to get secret information if you stick to closed source
development.

> The Silicon company was working with the compiler company before even
> the registers were fully named (or mapped out)
>
>> There is also currently only one serious, powerful open source
>> compiler system - gcc.
>
> However it is not really suited to the smaller end of the market.
>

Correct - it is a better match for bigger devices (although the avr-gcc
port is solid despite a few mis-matches with gcc's expectations of a
processor, and the msp430 port is fine).

>> So for architectures that don't fit gcc's models, there are currently
>> no top-quality open source alternatives (sdcc is, as far as I
>> understand it, a perfectly reasonable compiler - but it is not a
>> top-ranking 8-bit compiler in the same way that gcc is for many 32-bit
>> targets).
>
> It is missing some important features that make it unsuitable for a lot
> of development
>

I don't know enough details to comment here - I've never actually used
an 8051.

>> The other thing that commercial developers can do that open source
>> developers typically do not, is spend time and money on everything
>> *around* the compiler - documentation, installation, IDEs, debuggers,
>> dedicated support teams
>
> Yes. Absolutely
>
>> to help you out when the software license locking system has lost its
>> keys, etc.
>
> What license locking? ?You assume much. II sell several commercial
> compilers that have no locking on them at all. Most Sw these days has
> activation/locking.
>

Most software these days does not have locking or activation. Many
types of software have some sort of registration or require a license
key or number during installation, but few have any sort of locking (by
which I mean some sort of check or validation before each run). The
tools used in embedded development (both for software and hardware
development) are unusual in this respect, in that locking of some sort
is very common for commercial tools.

Any serious embedded developer can tell you horror stories of fights
with licenses, ranging from broken hardware dongles, lost licenses after
hard disk crashes or changing network cards, confusions over licensing
policies resulting in wasted time and money, long waits for license
codes, issues when transferring the software to another computer, and
other such problems. Some commercial software suppliers do a good job
of minimising these issues, others can be a real pain, and the
inconvenience is part of the price you have to pay for commercial software.

>> These sorts of things take time and effort, and are often not the
>> most interesting parts of the job - it's fine if you are paid to do
>> it, but for many open source developers and their employers, margins
>> are smaller and it's harder to find the time and money to spend on
>> less essential parts.
>
> So they only do the bits they like or make money on?
>

It depends on who they are, and how they work. A volunteer coder
working in his free time is likely to work on things that interest him.
Others might contribute changes that improve the software for their
own use while being unable to justify the time or money needed to make
changes that don't benefit them or their employer. But there are other
developers who are paid to work on open source projects - they will work
on whatever their employers tell them to work on, just like any other
paid programmer.

There are many different models for making money out of open source
development, with some similarities and some differences from models for
making money out of commercial software. That leads to different
priorities for what people spend their time and money on.

>
>> As I see it, there is plenty of place for both commercial and open
>> source development tools (and both high-price and "cheap and cheerful"
>> price commercial tools) at the moment.
>
> I agree.
>
>> What is interesting is the changes for newer architectures - steadily
>> more manufacturers are going straight for a gcc port for newer 32-bit
>> architectures, rather than the more traditional approach of working
>> closely with a commercial developer.
>
> This is incorrect. They do work with the commercial compiler vendors.
> However they usually do a port of gcc so there is a free tool so there
> is a low risk/cost entry point to trying out the new silicon. It is
> nothing to do with a belief in FOSS.
>

I did not say it was anything to do with "belief in FOSS". As an
example, the AVR32 is supported by a gcc port written by or for Atmel.
I would be very surprised if this is because Atmel has altruistic
beliefs about open source - it is a purely pragmatic economic decision.
I presume (I have no insider information - this is all guesswork) they
knew their chip was useless without a high quality compiler, so they
looked at how they could get such a compiler quickly and cheaply (for
them), providing a compiler that is solid and cheap for developers, and
which would be a good match for the sorts of software they expect to be
used on the devices. The answer is a gcc port.

That of course does not mean that Atmel did not work with commercial
vendors as well - IAR apparently makes an AVR32 compiler, although
Atmel's website does not mention it under "Tools and Software". Clearly
it is in Atmel's interest for there to be alternative toolsets. But
there is little doubt that gcc is the "official" toolset for the AVR32
in Atmel's eyes, not just a cheapo alternative for hobby users.


>> The development process for these ports is often not very open (they
>> work behind closed doors, and release binaries and corresponding
>> source files in lumps), making it somewhat of a middle road.
>
> Absolutely. Also the gcc port is usually created but not supported. IT
> is just there as is so there is a free/low risk way of evaluating the
> new silicon.
>

Ask Atmel (who employ some key avr-gcc developers, and who continually
develop the AVR32 compiler, including ready-to-install versions for
several Linux distributions), or Xilinx or Altera (who continually
develop their gcc ports for their soft processors) whether they view it
as a cheap way for customers to evaluate their processors.

>>> It's not just the compiler either - The gcc suite has an excellent
>>> set of binary utilities, linker, make and archiver / librarian etc,
>>> which blend together seemlessly in a unix environment to produce a
>>> really top class development suite. Again, miuch more than can be
>>> said for some of the commercial vendors. If you want a good laugh,
>>> have a look at the utilities associated with earlier versions of the
>>> IAR H8 compiler and the command line options. Even Keil C wasn't
>>> that brilliant in terms of added value and that's a compiler that
>>> i've used extensively and respect quite a lot, even if the code it
>>> produces looks horrendous at times.
>>>
>>
>> Many commercial toolsets are now using open source utilities,
>
> Some
>
>> as well they should
> Why? Many used to make a nice side line income doing utilities.
>

Maybe they used to, but why would anyone want to pay extra for a "make"
utility? Why would any commercial developer want to pay its staff to
write an editor, or a scripting language, when there are perfectly good
ones available freely, written by experts at that kind of programming
(which is very different from the area of expertise of the compiler
writers)?

>> - for most parts not directly related to code generation, common open
>> source versions of the tools are far better than anything anyone else
>
> This is not true... I have a customer who's use of a FOSS utility cost
> them over 10K GBP in time and effort sorting out the problem.
>

Perhaps I should have added a "generally" or "usually" in there
somewhere - there is always going to be someone somewhere that has
problems. I had a customer who wasted over 10K GBP as a result of
stupid licensing complications with a commercial embedded development
toolkit - I don't expect you to suddenly think open source tools are
always the better choice as a result of that single sample point.

>> makes. A good example is the Quartus design suite for Altera FPGAs -
>> the compilation and routing parts are all closed source from Altera,
>> but the glue that holds it together is cygwin, tcl, and perl - all
>> open source versions. Their soft cpu is generated with closed-source
>> programs, using tcl and perl, with Eclipse for the IDE and gcc for
>> compilation.
>
> Seems a sensible mix but note Altera is a HW company... to them software
> is simply a marketing tool.
>

Altera (and Xilinx, and others) spend more money on software development
than hardware development. Software is the key to FPGA development -
squeeze 10% more out of the chips by an improved placement algorithm,
and you've gained 10% over your competition for *all* your customers,
and *all* your devices, at zero cost per part. So Altera spends a huge
amount of time and money getting its programmers to do what they do best
- making software for FPGA development. It does not waste any time or
money on making "glue" software to hold it together - they use what is
freely available.

> All the people who you mention who have "embraced" open source are HW
> companies who are selling HW and value Sw as a free give away.....
>

I was specifically giving examples of hardware companies which develop
and support open source software.

There are some companies that make money directly from selling nothing
but open source software, but most see open source as a way to make
money in some other way - selling support, selling other software
alongside the open source software, selling hardware, selling
subscriptions for early-release versions of the software, selling
pre-packaged software, and that sort of thing.

> Open source values the programmer at Zero and I have seen nothing to
> change this view. In it act it is getting worse.
>

The programmers who develop open source software for a living, along
with their employers, would be surprised to hear that.

There are certainly some areas where commercial software is becoming
less competitive in the face of open source, and therefore some
programmers will be out of work or have to change the way they make
their living. But it's just like any other sort of competition - for
the most part, the end-users get more choices and lower prices, while
top-dog suppliers are no longer able to charge such huge margins. And
just like any other sort of competition, it has its risks and casualties
- too vicious competition can lead to a drop in quality due to price
wars, and the middle-men often suffer or are made obsolete - that may be
a good thing or a bad thing (obviously for you, as a reseller of
commercial software, it's a bad thing - for end users, it can be good
due to lower prices, or bad due to the loss of useful resources).

David Brown
Aug 3, 2007, 11:35:39 AM
ChrisQuayle wrote:
> David Brown wrote:
>>
>> (If you want to try a newer version, go to www.codesourcery.com. They
>> maintain the gcc m68k port, and there have been a fair number of
>> improvements over the years, especially for ColdFire parts.)
>
> The reason for 2.7.2 was originally because it was a late enough
> revision for the 68k port to be stable and the whole suite was less
> cluttered by ports to other, later processors. Also, Tru64 unix wasn't
> supported as a target and some of the very late gcc and library versions
> wouldn't build at all, whereas earlier versions built fine. Really
> can't fault it though and have done some of my best ever work using

There are certainly differences between versions - as the compiler (and
the C language) have developed, you'll always get situations where code
doesn't re-compile cleanly. That's why it's seldom a good idea to
change the toolset for working projects.

> those tools. Apart from the fact that the tools just work, a key
> advantage is that future projects using other processors will have a
> head start in terms of knowledge and code base. Now have a whole set of
> templates for make files, linker scripts, debugging, various home rolled
> utilities etc, that will transfer with only minor change and expense to
> a new target.
>

I too see that as a major benefit of using gcc - I use it on a half
dozen different architectures.

> What gcc and open source in general have done is to make it possible for
> small shops. one man companies etc to have access to some of the best
> tools available and am gratefull for that. The leverage effect gcc has
> had on the ability to generate large, robust software systems and on the
> advancement of the computing art is hard to underestimate imho. Without
> a good compiler and infrastucture, there is no software and now,
> everyone can play :-).
>

Cost of the tools can certainly be an issue for many users. Others will
tell you that the cost of even top range commercial tools is
insignificant compared to the cost of the programmer, but there are
certainly exceptions. In your case, being a small one-man shop, the
outlay for top price tools would be a large chunk of your budget. In my
case (we are small, but not that small), a big issue is that I use a
fair number of different architectures - if I only spend 10% of my time
working with a particular device, then the tools are effectively 10
times as expensive when amortized over time. The similarity between gcc
ports and thus lower learning curve is an additional benefit (I know
some of the large commercial developers support many architectures).

>> There are some situations where commercial developers have advantages
>> over open source developers - it is often easier to get restricted
>> information (advance information, or extra technical details) from the
>> microcontroller manufacturers. Even for those manufacturers which
>> directly support gcc ports, there can be restrictions with some PHB
>> wanting to keep details secret, which therefore cannot end up in open
>> source code.
>
> There may be some advantages, but did't someone else say that arm put a
> lot of cash into gcc development to make it work properly ?. If so, one
> would assume that there was quite a lot of input from arm so that key
> processor attributes were properly catered for.
>
> The way I see it, the current producer / consumer relationship for
> commercial tools is one of mistrust, a pita. All the flexlm and dongle
> crap is basically telling you that they don't trust you and they think
> you will steal the software to run on several machines concurrently. No
> mainstream pc software vendor would get away with that now, so why
> should we put up with it for embedded tools ?. We bought Keil C some
> years ago for a project and (the client) paid extra for an undongled
> version, not so we could steal it, but so that it could be installed at
> client and development sites. I was the only developer, so only a single
> copy would ever be used at once. This I would consider fair use within
> the license terms, though i'm sure some vendors would disagree.
>

Some commercial vendors I have come across do see things in this light -
I've had some that have happily emailed me an extra license so that I
could run the software on my home PC as well as the office one. These
sort of vendors assume you are honest - the software will work fine for
a certain time without any licenses, or after license problems, giving
you plenty of time to talk to them and buy a license or fix the problem.
But there are others that, as you say, start with the assumption that
all their customers are potential thieves - not a good way to start a
business relationship.

> I dunno, overall, commercial compilers will enventually die in the
> 16/32/64 bit arena without a revised business model. 8 bit market
> perhaps, but there's starting to be quite a bit of open source activity
> there as well...
>

I don't know that commercial 32-bit compiler development will die, but
it will certainly change (and has been changing). It will certainly get
harder to write competitive tools for new architectures - maintaining
tools for existing architectures is much less demanding.

> Chris
>

steve_s...@hotmail.com
Aug 3, 2007, 11:45:52 AM
On Aug 3, 6:15 am, Chris Hills <ch...@phaedsys.org> wrote:

> Open source values the programmer at Zero and I have seen nothing to
> change this view.

With countless programmers among us being paid handsomely to work with
open source software, this is simply an admission that you don't
understand the open source business models. The folks who write our pay
cheques clearly value open source programming well above zero.

Steve

John Devereux
Aug 3, 2007, 12:04:13 PM
steve_s...@hotmail.com writes:

I think that most gcc development is by paid staff. (And the linux
kernel too, probably). And many vendors give away limited versions of
tools. So I am not sure what Chris' point is here.

--

John Devereux

Anton Erasmus
Aug 3, 2007, 4:02:00 PM
On Fri, 03 Aug 2007 08:19:48 +0100, Paul Burke <pa...@scazon.com>
wrote:

>ChrisQuayle wrote:
>
>>..what's the unique selling point ?.
>
>FUD of course. If you buy a $xxxx product, when the project crashes, you
>have your arse covered. If you recommended cutting costs, it will be
>siezed on by everyone else as an excuse for failure, and you will find
>yourself climbing the pyramid steps to face the jade knife.

I think that is why companies like Codesourcery charge so much for a
commercial license for gcc. One gets support, and a binary that has
been checked, etc., but IMO the high price is to stop people
automatically pointing fingers at gcc if something goes wrong. If one
had paid big bucks, then obviously it must be good. If one had
downloaded the code for free, then obviously it is suspect.

[Snipped]

Anton Erasmus

Robert Adsett
Aug 3, 2007, 8:05:35 PM
In article <HjGsi.5021$vi3....@newsfe2-gui.ntli.net>, ChrisQuayle
says...

> The way I see it, the current producer / consumer relationship for
> commercial tools is one of mistrust, a pita. All the flexlm and dongle
> crap is basically telling you that they don't trust you and they think
> you will steal the software to run on several machines concurrently.

Bingo. It's certainly led me to strongly prefer architectures with
freely available open source tools.

Commercial tools have a real hurdle to overcome: being merely better
isn't good enough. The minimum requirement is being significantly
better. Generally, good enough is good enough.

> No
> mainstream pc software vendor would get away with that now,

MS often appears to be headed in that direction with their quality
assurance checks.

> so why
> should we put up with it for embedded tools ?. We bought Keil C some
> years ago for a project and (the client) paid extra for an undongled
> version, not so we could steal it, but so that it could be installed at
> client and development sites. I was the only developer, so only a single
> copy would ever be used at once. This I would consider fair use within
> the license terms, though i'm sure some vendors would disagree.

Even for SW they no longer actively sell.

Robert

Chris Hills
Aug 4, 2007, 6:55:05 AM
In article <MPG.211d8cdcf...@free.teranews.com>, Robert Adsett
<su...@aeolusdevelopment.com> writes


Why not move the dongle between the two computers? As you were the only
user, I can't see the problem.
