
Clearly, it is too late to fix c99 - C is dead


QuantumG

Sep 25, 2004, 3:26:10 AM
Back in 2002, Harry H. Cheng wrote:

> Agreed. gcc is a C compiler for different platforms.
> VC++ is a C/C++ compiler for Windows.
> SCC is a C compiler for Cray machine.
> Ch is an embeddable C interpreter for different platforms.
> They use different names and have different extensions to C.
> However, they all conform to the ISO C90 standard.
> I think they will conform to C99 eventually.


That was in reply to a rant I posted on comp.lang.c. The point of my rant was
that a large majority of existing compilers should have conformed with
a large percentage of C99 back in '99. The purpose of releasing a
standard is to codify existing practice. So when C99 was released it
should have taken a few weeks for the majority of existing compilers
to be tweaked to conform. That wasn't the case. The committee didn't
codify existing practice; they made up a new language and
released it as a standard.

It is now 2004, almost 5 years since the release of C99. One of the
most popular C compilers in the world, GCC, has yet to implement the
standard (see http://gcc.gnu.org/c99status.html) and they never will.
How can I say this with such certainty? In fact, I'll say more:
no-one intends to ever implement the standard. If you look at what
Fergus Henderson said when referring to a test program written by Tony
Finch which exercises GCC's implementation of variable length structs
you can appreciate the problem:

> I think this is a defect in C99. C99 does not match existing practice
> here. I note that Sun C, Compaq C, and GNU C all use the same layout
> for this struct. This layout is more efficient (uses less space) than
> the layout mandated by C99. I don't think the committee intended to
> force implementations to be inefficient in this way and I don't think
> the committee intended to force implementations to break binary
> compatibility with previous releases.

- http://gcc.gnu.org/ml/gcc/2002-05/msg02858.html

GCC still produces the output Tony Finch discovered back in 2002, as
do the other compilers Fergus Henderson mentioned. I don't doubt
there is a compiler somewhere that implements this part of the
standard correctly but the existing practice of a vast majority of C
compilers is to ignore the standard and, as this is the only standard
we have, I believe that to be the death knell of the language.
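
To make the problem concrete, here is a minimal sketch in the spirit of
the test case under discussion (my own illustration, not Tony Finch's
actual program; the struct and its field names are invented):

#include <stdio.h>
#include <stddef.h>

/* A struct whose last member is a C99 flexible array member. */
struct vls {
    short len;
    char  tag;
    char  data[];               /* flexible array member */
};

int main(void)
{
    /* The dispute is over these two numbers. Existing practice lets
       data[] begin in the trailing padding (offset 3, sizeof 4, on a
       typical target with 2-byte shorts), while a strict reading of
       the original C99 wording was argued to force it past the
       padding, wasting space and breaking binary compatibility. */
    printf("offsetof data: %lu\n", (unsigned long)offsetof(struct vls, data));
    printf("sizeof struct: %lu\n", (unsigned long)sizeof(struct vls));
    return 0;
}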

Trent Waddington

Merrill & Michele

Sep 25, 2004, 4:45:21 AM
In order to kill C, you would antecedently have to kill me, which is not so
easily done. If C99 falls flat, there will be a C2006 or something
similar. Lack of standards makes everyone pull their hair out. It borders
on language-bashing for me to ask: if not C, then what? MPJ


Spacen Jasset

Sep 25, 2004, 7:39:32 PM
There is nothing in the C99 standard that I find especially appealing
or, how shall I put it, that requires me to use C99. I'll be using C89
for the foreseeable future. C99 attempts to help with
internationalisation, but the fact is that to write portable,
internationalised programs one still has to make one's own way, and not
rely on the compiler system to support anything, since it doesn't have
to provide any UTF encodings or anything particularly, concretely, useful.

These are only my musings; they may change, but I doubt it.


Derrick Coetzee

Sep 25, 2004, 7:57:26 PM
QuantumG wrote:
> That was in reply to a rant I posted on comp.lang.c. The point of my rant was
> that a large majority of existing compilers should have conformed with
> a large percentage of C99 back in '99. The purpose of releasing a
> standard is to codify existing practice. So when C99 was released it
> should have taken a few weeks for the majority of existing compilers
> to be tweaked to conform. That wasn't the case. The committee didn't
> codify existing practice; they made up a new language and
> released it as a standard.

ANSI C did make some significant changes. I think part of it also has to
do with demand. In 1990, C was pretty much *the* main language in
industry, and so people cared about it a lot more. Now Java, C++, Visual
Basic, and others are (each!) more prevalent. If any of them changed
significantly, I think its changes would be picked up more quickly by
compilers and developers alike (this may have already happened with
VB6->VB.NET).
--
Derrick Coetzee
I grant this newsgroup posting into the public domain. I disclaim all
express or implied warranty and all liability. I am not a professional.

QuantumG

Sep 26, 2004, 7:18:26 AM
Derrick Coetzee <dcn...@moonflare.com> wrote in message news:<cj50m5$6r3$1...@news-int.gatech.edu>...

>
> ANSI C did make some significant changes. I think part of it also has to
> do with demand. In 1990, C was pretty much *the* main language in
> industry, and so people cared about it a lot more. Now Java, C++, Visual
> Basic, and others are (each!) more prevalent.

I believe each of these languages is now more prevalent because the
greatest strength of C - standardization - was deliberately sabotaged
by an uncaring, unthinking committee that exceeded its charter. The
results were easy to predict, and now, 5 years later, have come to
pass exactly as predicted.

Joe Wright

Sep 26, 2004, 12:58:25 PM
QuantumG wrote:

I don't think so. There are certain kinds of programs that ought to
be written in C. Neither Java nor Visual Basic is an alternative for
these programs. Some will claim that C++ is 'a better C' and can be
used in its place. There is not broad agreement on this point.

C lives.
--
Joe Wright mailto:joeww...@comcast.net
"Everything should be made as simple as possible, but not simpler."
--- Albert Einstein ---

Mike Wahler

Sep 27, 2004, 12:51:26 AM
"Derrick Coetzee" <dcn...@moonflare.com> wrote in message
news:cj50m5$6r3$1...@news-int.gatech.edu...

> ANSI C did make some significant changes. I think part of it also has to
> do with demand. In 1990, C was pretty much *the* main language in
> industry, and so people cared about it a lot more. Now Java, C++, Visual
> Basic, and others are (each!) more prevalent.

Your computing sphere is very limited if you believe that.

-Mike


Chris Barts

Sep 27, 2004, 1:51:28 AM
On Sat, 25 Sep 2004 19:57:26 -0400, Derrick Coetzee wrote:

> ANSI C did make some significant changes. I think part of it also has to
> do with demand. In 1990, C was pretty much *the* main language in
> industry, and so people cared about it a lot more. Now Java, C++, Visual
> Basic, and others are (each!) more prevalent. If any of them changed
> significantly, I think its changes would be picked up more quickly by
> compilers and developers alike (this may have already happened with
> VB6->VB.NET).

This is really, really narrow of you. C, Java, and Visual Basic all have
their own, usually mutually exclusive, problem domains. Using C when you
should be programming in VB is pretty damned stupid, and vice versa.
Anyone who would say the above has probably only programmed in one
environment, making one kind of program in one language. If, indeed,
you have programmed at all.

C is still the only real language for OS development and other standalone
environments. C has displaced assembly and a lot of other languages like
Forth in that domain, and it shows precisely no signs of going anyplace.

In the Unix world, C is the dominant programming language for userland
applications as well. Even though gcc (the dominant compiler in that
realm) offers full and efficient C++ support, object-oriented code simply
has not caught on there, outside some notable exceptions.

There is plenty of legacy code written in C. This is why compilers usually
support pre-Standard constructs, and it is why languages that look like C
(and, to some extent, act like C) thrive, while Fortran and Cobol and Lisp
descendants have failed. There's nothing to suggest that C is becoming any
less relevant with time.

beli...@aol.com

Sep 27, 2004, 8:57:28 AM
Chris Barts <chb...@gmail.com> wrote in message news:<pan.2004.09.27....@gmail.com>...


> There is plenty of legacy code written in C. This is why compilers usually
> support pre-Standard constructs, and it is why languages that look like C
> (and, to some extent, act like C) thrive, while Fortran and Cobol and Lisp
> descendants have failed. There's nothing to suggest that C is becoming any
> less relevant with time.

I think the statement "Fortran has failed" is wrong. Many scientists
and engineers still consider it the best tool for getting their work
done. Enough people are writing code in Fortran 95 to support about 10
compiler vendors (see
http://www.dmoz.org/Computers/Programming/Languages/Fortran/Compilers/
). There is a general trend towards using higher-level languages, and
Fortran is a higher-level language than either C or C++ for numerical
work, especially involving multidimensional arrays.

The Fortran 2003 standard was ratified a few weeks ago, and its new
features include OOP with inheritance, interoperability with C, IEEE
arithmetic, and stream I/O. Already some F95 compilers have added some
of these features.

Dan Pop

Sep 27, 2004, 10:28:45 AM
In <cj50m5$6r3$1...@news-int.gatech.edu> Derrick Coetzee <dcn...@moonflare.com> writes:

>Now Java, C++, Visual Basic, and others are (each!) more prevalent.

Do you have some hard data to back up this statement? Whenever I look at
some open source project, I see C source code. Exceptionally Fortran or
C++.

Dan
--
Dan Pop
DESY Zeuthen, RZ group
Email: Dan...@ifh.de
Currently looking for a job in the European Union

Kenny McCormack

Sep 27, 2004, 12:43:58 PM
In article <cj982t$qvh$5...@sunnews.cern.ch>, Dan Pop <Dan...@cern.ch> wrote:
>In <cj50m5$6r3$1...@news-int.gatech.edu> Derrick Coetzee <dcn...@moonflare.com> writes:
>
>>Now Java, C++, Visual Basic, and others are (each!) more prevalent.
>
>Do you have some hard data to back up this statement? Whenever I look at
>some open source project, I see C source code. Exceptionally Fortran or
>C++.

It just depends on where you look.

When I look at code written in the real world, for pay, to solve (badly)
real world problems, I see VB (and/or other MS abominations) code.

Michael Mair

Sep 27, 2004, 1:24:22 PM
>>>Now Java, C++, Visual Basic, and others are (each!) more prevalent.
>>
>>Do you have some hard data to back up this statement? Whenever I look at
>>some open source project, I see C source code. Exceptionally Fortran or
>>C++.
>
> It just depends on where you look.
>
> When I look at code written in the real world, for pay, to solve (badly)
> real world problems, I see VB (and/or other MS abominations) code.

And if you look at small-scale implementations of numerical or other
algorithms, everyone is well advised to do that in Matlab, as the
development time in my experience is shorter than with many other
languages.

<OT rant>
If the language serves the purpose, then it is okay to use it.
The thing I find objectionable is that too large a part of the
"programmers" using (for example) VB do not really want to think,
let alone learn to write programs in the Right Way for their
language/platform, thinking they are the king when they put together
a GUI and write some action routines called at mouseclick or the
like. I do not want to ascribe this to the language, even though
some languages, C among them, force you to learn to use the grey stuff
between your ears. I just get very annoyed when interviewing someone
for a job who is not even aware that he/she cannot do serious,
reliable programming but thinks the world of themselves...
</OT rant>

As long as I think that my students benefit more from learning C
and doing seemingly unnecessarily complicated things from scratch,
I will continue to teach them C (including C99 basics). I guess that
will hold for as long as I teach, for the same reason that learning
maths and proving some basic results does a world more good than
obtaining a top-of-the-line scientific calculator.


-- Michael

Dan Pop

Sep 27, 2004, 1:51:23 PM

When I do this kind of exercise, I usually see either badly written
C code or badly written C code using a few C++ features. In my segment
of the real world, Windows is seldom used as a programming platform: it's
far too inadequate for this kind of usage.

Alan Balmer

Sep 27, 2004, 2:21:11 PM
On 25 Sep 2004 00:26:10 -0700, q...@biodome.org (QuantumG) wrote:

>Back in 2002, Harry H. Cheng wrote:
>
>> Agreed. gcc is a C compiler for different platforms.
>> VC++ is a C/C++ compiler for Windows.
>> SCC is a C compiler for Cray machine.
>> Ch is an embeddable C interpreter for different platforms.
>> They use different names and have different extensions to C.
>> However, they all conform to the ISO C90 standard.
>> I think they will conform to C99 eventually.
>
>
>That was in reply to a rant I posted on comp.lang.c. The point of my rant was
>that a large majority of existing compilers should have conformed with
>a large percentage of C99 back in '99. The purpose of releasing a
>standard is to codify existing practice.

That's an incorrect assumption very early in your argument.

--
Al Balmer
Balmer Consulting
removebalmerc...@att.net

Michael Wojcik

Sep 27, 2004, 4:05:46 PM

In article <3064b51d.04092...@posting.google.com>, beli...@aol.com writes:
> Chris Barts <chb...@gmail.com> wrote in message news:<pan.2004.09.27....@gmail.com>...
>
> > There is plenty of legacy code written in C. This is why compilers usually
> > support pre-Standard constructs, and it is why languages that look like C
> > (and, to some extent, act like C) thrive, while Fortran and Cobol and Lisp
> > descendants have failed. There's nothing to suggest that C is becoming any
> > less relevant with time.
>
> I think the statement "Fortran has failed" is wrong.

COBOL and Lisp are still going strong, too. Chris is zero-for-three
on this one.

--
Michael Wojcik michael...@microfocus.com

Even 300 years later, you should plan it in detail, when it comes to your
summer vacation. -- Pizzicato Five

Mike Wahler

Sep 27, 2004, 5:49:28 PM

<beli...@aol.com> wrote in message
news:3064b51d.04092...@posting.google.com...

> Chris Barts <chb...@gmail.com> wrote in message
> news:<pan.2004.09.27....@gmail.com>...
>
> > There is plenty of legacy code written in C. This is why compilers
> > usually support pre-Standard constructs, and it is why languages that
> > look like C (and, to some extent, act like C) thrive, while Fortran
> > and Cobol and Lisp descendants have failed. There's nothing to suggest
> > that C is becoming any less relevant with time.
>
> I think the statement "Fortran has failed" is wrong.


Yes it's wrong. But Chris did not make that statement. He stated:


"Fortran and Cobol and Lisp descendants have failed."

Whether this is true or not I have no idea, especially when
'failed' is a subjective issue.

-Mike


Mike Wahler

Sep 27, 2004, 5:53:09 PM

"Kenny McCormack" <gaz...@yin.interaccess.com> wrote in message
news:cj9gde$4j5$1...@yin.interaccess.com...

> In article <cj982t$qvh$5...@sunnews.cern.ch>, Dan Pop <Dan...@cern.ch>
> wrote:
> >In <cj50m5$6r3$1...@news-int.gatech.edu> Derrick Coetzee
> ><dcn...@moonflare.com> writes:
> >
> >>Now Java, C++, Visual Basic, and others are (each!) more prevalent.
> >
> >Do you have some hard data to back up this statement? Whenever I look at
> >some open source project, I see C source code. Exceptionally Fortran or
> >C++.
>
> It just depends on where you look.

Yes. But it seems that you 'look' with a very narrow view.

> When I look at code written in the real world, for pay, to solve (badly)
> real world problems, I see VB (and/or other MS abominations) code.

Your 'real world' must be very small then. PC's running Microsoft
Windows (or any other OS) are a tiny minority of computer systems
in existence.

-Mike


Mark McIntyre

Sep 27, 2004, 5:53:09 PM
On Mon, 27 Sep 2004 16:43:58 GMT, in comp.lang.c ,
gaz...@yin.interaccess.com (Kenny McCormack) wrote:

>In article <cj982t$qvh$5...@sunnews.cern.ch>, Dan Pop <Dan...@cern.ch> wrote:
>>In <cj50m5$6r3$1...@news-int.gatech.edu> Derrick Coetzee <dcn...@moonflare.com> writes:
>>
>>>Now Java, C++, Visual Basic, and others are (each!) more prevalent.
>>
>>Do you have some hard data to back up this statement? Whenever I look at
>>some open source project, I see C source code. Exceptionally Fortran or
>>C++.
>
>It just depends on where you look.

This is true.

>When I look at code written in the real world, for pay, to solve (badly)
>real world problems, I see VB (and/or other MS abominations) code.

whereas, in perhaps a more real-world situation than you can imagine, I see
C, C++, C#, Java, VB, VB.Net, Python and a heck of a lot of scripting
languages, each being used where it's most appropriate.

Only an idiot uses VB to write processor-intensive libraries or code that
you need to have accessible on Solaris, Redhat and WinXP. Similarly, only a
masochist uses C to write a WinXP GUI app.
--
Mark McIntyre
CLC FAQ <http://www.eskimo.com/~scs/C-faq/top.html>
CLC readme: <http://www.ungerhu.com/jxh/clc.welcome.txt>



Kenny McCormack

Sep 27, 2004, 8:55:13 PM
In article <kn2hl0hlv76uh310l...@4ax.com>,
Mark McIntyre <markmc...@spamcop.net> wrote:
...

>whereas, in perhaps a more real-world situation than you can imagine, I see
>C, C++, C#, Java, VB, VB.Net, Python and a heck of a lot of scripting
>languages, each being used where it's most appropriate.

I stand by my original assertion - whatever it was - but it's not worth
arguing about. I think I made it clear that my ego is not in this.

>Only an idiot uses VB to write processor-intensive libraries or code that
>you need to have accessible on Solaris, Redhat and WinXP. Similarly, only a
>masochist uses C to write a WinXP GUI app.

So true. But there's a f*** lot of idiots in the world, and more than your
fair share of masochists.

Kenny McCormack

Sep 27, 2004, 8:58:50 PM
In article <9Z%5d.11541$gG4....@newsread1.news.pas.earthlink.net>,
Mike Wahler <mkwa...@mkwahler.net> wrote:
...

>Your 'real world' must be very small then. PC's running Microsoft
>Windows (or any other OS) are a tiny minority of computer systems
>in existence.

You sure about that? As I've tried to make clear, my ego's not in this, but
I'm pretty sure that, counting boxes or counting CPUs, IBM compatible PCs
(including all the server boxes which are really just overgrown PCs)
running on x86 chips make up a substantial percentage of the total number
of boxes in the world. I wouldn't be surprised if it was at least 60%.

Now, if you want to do it by computing power - megaflops or whatever - you
might have a defensible position.

Derrick Coetzee

Sep 27, 2004, 9:56:40 PM
Chris Barts wrote:
> This is really, really narrow of you. C, Java, and Visual Basic all have
> their own, usually mutually exclusive, problem domains. Using C when you
> should be programming in VB is pretty damned stupid, and vice versa.
> Anyone who would say the above has probably only programmed in one
> environment, making one kind of program in one language. If, indeed,
> you have programmed at all.

Please don't insult me. I'm aware that C is used extensively in many
standalone environments, and I have used it in such environments in
industry, and it is one of my favourite languages. Speaking of the
industry in general, however, a lot of code produced nowadays is
high-level web and database applications that run on desktop PCs, and
other similarly boring stuff. I am not asserting that C has been
supplanted universally, but only that it is no longer dominant across
most of the industry as it once was.

Chris Torek

Sep 27, 2004, 10:11:33 PM
>In article <9Z%5d.11541$gG4....@newsread1.news.pas.earthlink.net>,
>Mike Wahler <mkwa...@mkwahler.net> wrote:
>>Your 'real world' must be very small then. PC's running Microsoft
>>Windows (or any other OS) are a tiny minority of computer systems
>>in existence.

I think you need to define "any other OS" here. :-)

In article <cjadd9$bcg$1...@yin.interaccess.com>
Kenny McCormack <gaz...@interaccess.com> wrote:
>You sure about that? As I've tried to make clear, my ego's not in this, but
>I'm pretty sure that, counting boxes or counting CPUs, IBM compatible PCs
>(including all the server boxes which are really just overgrown PCs)
>running on x86 chips make up a substantial percentage of the total number
>of boxes in the world. I wouldn't be surprised if it was at least 60%.

And I think you need to define "box" here.

Your microwave has a microprocessor. If your refrigerator is new
and high-end, it has one. Your TV has one; your VCR or DVD player
has one; your car, if it was built within the last decade, has at
least one, and probably over a dozen, CPUs. Are these "boxes"?

For that matter, even your desktop PC has more than one CPU. In
particular, every disk drive has a microprocessor, and a modern
multisync monitor has one. Your keyboard and mouse have (more
limited) microprocessors in them. The CPU on your motherboard --
which is the only one running Windows, if it is indeed running
Windows -- is quite outnumbered.

>Now, if you want to do it by computing power - megaflops or whatever - you
>might have a defensible position.

Actually, here, the Pentium-clones may have the edge. Many of the
small microprocessors (e.g., even the PowerPC in the TiVo) are
running at lower clock frequencies to reduce power dissipation
(which leads to heat, which requires a fan, which makes the TiVo
too noisy).
--
In-Real-Life: Chris Torek, Wind River Systems
Salt Lake City, UT, USA (40°39.22'N, 111°50.29'W) +1 801 277 2603
email: forget about it http://web.torek.net/torek/index.html
Reading email is like searching for food in the garbage, thanks to spammers.

Kenny McCormack

Sep 27, 2004, 11:07:41 PM
In article <cjah8...@news1.newsguy.com>,
Chris Torek <nos...@torek.net> wrote:
>>In article <9Z%5d.11541$gG4....@newsread1.news.pas.earthlink.net>,
>>Mike Wahler <mkwa...@mkwahler.net> wrote:
>>>Your 'real world' must be very small then. PC's running Microsoft
>>>Windows (or any other OS) are a tiny minority of computer systems
>>>in existence.
>
>I think you need to define "any other OS" here. :-)

That's not me. That was the other poster's concept.

>>You sure about that? As I've tried to make clear, my ego's not in this, but
>>I'm pretty sure that, counting boxes or counting CPUs, IBM compatible PCs
>>(including all the server boxes which are really just overgrown PCs)
>>running on x86 chips make up a substantial percentage of the total number
>>of boxes in the world. I wouldn't be surprised if it was at least 60%.
>
>And I think you need to define "box" here.

As far as I'm concerned, a microwave is not a computer. It does have
a microprocessor, as you note. I think the point is that these days, just
about everything has a microprocessor, but that doesn't mean they are
computers.

And, the previous poster did say that:

>>>PC's running Microsoft Windows (or any other OS) are a tiny minority of
>>>computer systems in existence.

^^^^^^^^

Chris Torek

Sep 28, 2004, 12:53:45 AM
>>>In article <9Z%5d.11541$gG4....@newsread1.news.pas.earthlink.net>,
>>>Mike Wahler <mkwa...@mkwahler.net> wrote:
>>>>Your 'real world' must be very small then. PC's running Microsoft
>>>>Windows (or any other OS) are a tiny minority of computer systems
>>>>in existence.

>In article <cjah8...@news1.newsguy.com>,
>Chris Torek <nos...@torek.net> wrote:
>>I think you need to define "any other OS" here. :-)

In article <news:cjakut$f22$1...@yin.interaccess.com>
Kenny McCormack <gaz...@interaccess.com> wrote:
>That's not me. That was the other poster's concept.

Indeed -- and note which name is the only available referent for "you"
at that point. :-)

Anyway, my point was that it is difficult to count "computers" and
"systems" and "OSes" without first defining each. Is an embedded
system a "system"? It has one or more microprocessors, and these
days, many of them are programmed in C (and some are even being
done in both Java and C, including high-end car "infotainment"
systems).

>As far as I'm concerned, a microwave is not a computer. ...

This fits well with my personal definition of an "embedded system
computer", which is "any time you don't constantly think: there is
a computer in here" when you use it. :-)

Richard Bos

Sep 28, 2004, 2:57:03 AM
"Mike Wahler" <mkwa...@mkwahler.net> wrote:

> <beli...@aol.com> wrote in message


> > Chris Barts <chb...@gmail.com> wrote in message
> >

> > > (and, to some extent, act like C) thrive, while Fortran and Cobol and Lisp
> > > descendants have failed. There's nothing to suggest that C is becoming any
> > > less relevant with time.
> >
> > I think the statement "Fortran has failed" is wrong.
>
> Yes it's wrong. But Chris did not make that statement. He stated:
> "Fortran and Cobol and Lisp descendants have failed."
>
> Whether this is true or not I have no idea, especially when
> 'failed' is a subjective issue.

If he meant "Fortran and Cobol and (Lisp descendants)", it's clearly
untrue because of the first two languages.
However, I suspect he meant "(Fortran and Cobol and Lisp) descendants";
in which case, no, I don't know any successful Fortran- and Cobol-alikes,
either, but AFAIK Scheme is doing reasonably well in the same areas for
which Lisp was originally used.

Richard

Richard Bos

Sep 28, 2004, 11:29:13 AM
Derrick Coetzee <dcn...@moonflare.com> wrote:

> Chris Barts wrote:
> > This is really, really narrow of you. C, Java, and Visual Basic all have
> > their own, usually mutually exclusive, problem domains. Using C when you
> > should be programming in VB is pretty damned stupid, and vice versa.
> > Anyone who would say the above has probably only programmed in one
> > environment, making one kind of program in one language. If, indeed,
> > you have programmed at all.
>
> Please don't insult me. I'm aware that C is used extensively in many
> standalone environments, and I have used it in such environments in
> industry, and it is one of my favourite languages. Speaking of the
> industry in general, however, a lot of code produced nowadays is
> high-level web and database applications that run on desktop PCs,

Quite. And I don't want any database application written in VB or Java
on _my_ network, thank you very much. Java is for slow, broken Web code;
VB is for slow, broken amateurs' programs. For production code, one
either uses a domain-specific language (such as, for database programs,
dBase 2000, or even FoxPro), or a high quality general language such as
C.

Richard

Alan Balmer

Sep 28, 2004, 11:59:11 AM
On Tue, 28 Sep 2004 03:07:41 GMT, gaz...@yin.interaccess.com (Kenny
McCormack) wrote:

>As far as I'm concerned, a microwave is not a computer. It does have
>a microprocessor, as you note. I think the point is that these days, just
>about everything has a microprocessor, but that doesn't mean they are
>computers.

My office is not a computer, either, but it has one in it. (Three,
actually.)

Kenny McCormack

Sep 28, 2004, 12:15:10 PM
In article <lg2jl017jd5sck6tp...@4ax.com>,
Alan Balmer <alba...@spamcop.net> wrote:
>On Tue, 28 Sep 2004 03:07:41 GMT, gaz...@yin.interaccess.com (Kenny
>McCormack) wrote:
>
>>As far as I'm concerned, a microwave is not a computer. It does have
>>a microprocessor, as you note. I think the point is that these days,
>>just about everything has a microprocessor, but that doesn't mean they
>>are computers.
>
>My office is not a computer, either, but it has one in it. (Three,
>actually.)

My car is not a door, but it has one in it. (2, actually)

Neither is my house a door, though it has several.

Are we having fun yet?

Mike Wahler

Sep 28, 2004, 1:15:02 PM

"Richard Bos" <r...@hoekstra-uitgeverij.nl> wrote in message
news:415907da....@news.individual.net...

> "Mike Wahler" <mkwa...@mkwahler.net> wrote:
>
> > <beli...@aol.com> wrote in message
> > > Chris Barts <chb...@gmail.com> wrote in message
> > >
> > > > (and, to some extent, act like C) thrive, while Fortran and Cobol
> > > > and Lisp descendants have failed. There's nothing to suggest that
> > > > C is becoming any less relevant with time.
> > >
> > > I think the statement "Fortran has failed" is wrong.
> >
> > Yes it's wrong. But Chris did not make that statement. He stated:
> > "Fortran and Cobol and Lisp descendants have failed."
> >
> > Whether this is true or not I have no idea, especially when
> > 'failed' is a subjective issue.
>
> If he meant "Fortran and Cobol and (Lisp descendants)", it's clearly
> untrue because of the first two languages.

Yes.

> However, I suspect he meant "(Fortran and Cobol and Lisp) descendants";

Yes, that's how I interpreted it also. Perhaps Chris will clarify.

-Mike


Derrick Coetzee

Sep 28, 2004, 1:24:06 PM
Richard Bos wrote:
> Java is for slow, broken Web code; VB is for slow, broken amateurs' programs.

If you believe either of these languages is confined to such small
domains, then it's your sphere that is small. Java is used in many
domains where portability is important, or where the added safety or
security of Java is important, including domains traditionally belonging
to C such as compilers and raytracers. Moreover, Java *can* be compiled
directly to native code that runs as fast as any other native code, and
you're confusing the Java language with the Java environment if you
imagine it can't be. Speed comparisons between modern VMs like the Java
HotSpot VM and native code also show that Java VMs can no longer really
be considered unacceptably slow in most cases (although heavyweight,
certainly). Welcome to 2004.

As for VB, well... er... I'm not about to defend VB.

Alan Balmer

Sep 28, 2004, 2:02:55 PM
On Tue, 28 Sep 2004 16:15:10 GMT, gaz...@yin.interaccess.com (Kenny
McCormack) wrote:

>In article <lg2jl017jd5sck6tp...@4ax.com>,
>Alan Balmer <alba...@spamcop.net> wrote:
>>On Tue, 28 Sep 2004 03:07:41 GMT, gaz...@yin.interaccess.com (Kenny
>>McCormack) wrote:
>>
>>>As far as I'm concerned, a microwave is not a computer. It does have
>>>a microprocessor, as you note. I think the point is that these days,
>>>just about everything has a microprocessor, but that doesn't mean they
>>>are computers.
>>
>>My office is not a computer, either, but it has one in it. (Three,
>>actually.)
>
>My car is not a door, but it has one in it. (2, actually)
>
>Neither is my house a door, though it has several.
>
>Are we having fun yet?

If we were discussing how many doors exist, your remark would be
apropos. As it is, we are discussing how many computers there are, and
your microwave has one, whether the fact supports your argument or
not.

Michael Wojcik

Sep 28, 2004, 2:47:57 PM

In article <415907da....@news.individual.net>, r...@hoekstra-uitgeverij.nl (Richard Bos) writes:
> "Mike Wahler" <mkwa...@mkwahler.net> wrote:
> > <beli...@aol.com> wrote in message
> > > Chris Barts <chb...@gmail.com> wrote in message
> > >
> > > > ... Fortran and Cobol and Lisp descendants have failed ...

> > >
> > > I think the statement "Fortran has failed" is wrong.
> >
> > Yes it's wrong. But Chris did not make that statement. He stated:
> > "Fortran and Cobol and Lisp descendants have failed."
>
> If he meant "Fortran and Cobol and (Lisp descendants)", it's clearly
> untrue because of the first two languages.
> However, I suspect he meant "(Fortran and Cobol and Lisp) descendants";

On reflection, I believe you're right, though I made the same error
as beliavsky in my previous post. That said, however, I still think
Chris is wrong.

> in which case, no, I don't know any succesful Fortran- and Cobol-alikes,
> either,

Both Fortran and COBOL feature current standards which offer
significant new features beyond what those languages traditionally
provided, and both seem to have communities of developers who use
only the traditional features, as well as communities who use the
new ones.

COBOL, for example, now features OO. Few COBOL programmers use OO
COBOL, but enough do to make supporting it profitable.

So in a sense, Fortran and COBOL *are* descendants of Fortran and
COBOL. They just didn't bother renaming the language.

> but AFAIK Scheme is doing reasonably well in the same areas for
> which Lisp was originally used.

Yes, and again with Lisp we have Common Lisp with its OO support
(CLOS), which is substantially different from traditional Lisp.

What chiefly distinguishes C and its "descendants" from the other
cases is that there was a strong movement to preserve C with only
relatively unobtrusive changes; thus most of the various languages
that diverged from C had to present themselves as new languages to
gain wide acceptance.

--
Michael Wojcik michael...@microfocus.com

It's like being shot at in an airport with all those guys running
around throwing hand grenades. Certain people function better with
hand grenades coming from all sides than other people do when the
hand grenades are only coming from inside out.
-- Dick Selcer, coach of the Cinci Bengals

Michael Wojcik

Sep 28, 2004, 2:52:37 PM

In article <cjakut$f22$1...@yin.interaccess.com>, gaz...@yin.interaccess.com (Kenny McCormack) writes:
>
> As far as I'm concerned, a microwave is not a computer.

I don't believe Chris claimed that it was. He said there was a
computer in it, and he questioned the definition of "box". I believe
the implication was that "box" should not be defined as "computer".

> It does have
> a microprocessor, as you note. I think the point is that these days, just
> about everything has a microprocessor, but that doesn't mean they are
> computers.

The people who write software for them probably feel differently.

> And, the previous poster did say that:
>
> >>>PC's running Microsoft Windows (or any other OS) are a tiny minority of
> >>>computer systems in existence.
> ^^^^^^^^

And he's right.

You may choose to define "computer" as "general-purpose computer", but
don't be surprised if the rest of us continue to use a more sensible
definition.


--
Michael Wojcik michael...@microfocus.com

I would never understand our engineer. But is there anything in this world
that *isn't* made out of words? -- Tawada Yoko (trans. Margaret Mitsutani)

Kenny McCormack

Sep 28, 2004, 3:30:12 PM
In article <cjcbt...@news2.newsguy.com>,
Michael Wojcik <mwo...@newsguy.com> wrote:
>
>You may choose to define "computer" as "general-purpose computer", but
>don't be surprised if the rest of us continue to use a different
>definition.

No problem. And I won't lose any sleep over it, either.

Keep in mind that if you told the average man on the street that there was
a computer in his microwave, he'd rush home and open the door to remove his
PC from the microwave (and hope that no one turned the microwave on while
the PC was in there).

Dave Vandervies

Sep 28, 2004, 3:55:41 PM
In article <cjcbk...@news2.newsguy.com>,
Michael Wojcik <mwo...@newsguy.com> wrote:
>In article <415907da....@news.individual.net>,
>r...@hoekstra-uitgeverij.nl (Richard Bos) writes:


>COBOL, for example, now features OO.

That would be ADD ONE TO COBOL GIVING COBOL?


>So in a sense, Fortran and COBOL *are* descendants of Fortran and
>COBOL. They just didn't bother renaming the language.
>
>> but AFAIK Scheme is doing reasonably well in the same areas for
>> which Lisp was originally used.
>
>Yes, and again with Lisp we have Common Lisp with its OO support
>(CLOS), which is substantially different from traditional Lisp.


>What chiefly distinguishes C and its "descendants" from the other
>cases is that there was a strong movement to preserve C with only
>relatively unobtrusive changes; thus most of the various languages
>that diverged from C had to present themselves as new languages to
>gain wide acceptance.

This looks like a (slight) overstatement of the case to me.

C++ is really the only "true" descendant of C; the other C-like languages
(at least the ones I know about) tend to have entirely different lineage
with C-like syntax pasted on (because curly braces are so much k3wLer than
"begin" and "end").

Interestingly, most C implementations come packaged with C++
implementations, and it's usually not difficult (though seldom entirely
trivial) to convert a well-written C program to a program that does The
Right Thing when given to a C++ compiler. (Not that there's often a
good reason to do this.)
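
For instance (my own toy example, not taken from any real program), the
following is valid C that a C++ compiler will reject for two separate
reasons:

#include <stdlib.h>

int main(void)
{
    /* "class" is an ordinary identifier in C but a keyword in C++. */
    int *class;

    /* C converts the void * from malloc implicitly; C++ demands a cast. */
    class = malloc(10 * sizeof *class);
    if (class == NULL)
        return EXIT_FAILURE;
    free(class);
    return 0;
}

Rename the variable and cast the malloc result, and both compilers are
happy.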

So it seems that C and C++, taken as a pair, are not at all unlike F77
and F90 (if I've got those names right), or Lisp and Lisp-with-CLOS,
or old-COBOL and new-COBOL-with-OO, and languages like Java are just
hangers-on that add to the confusion.


dave

--
Dave Vandervies dj3v...@csclub.uwaterloo.ca
> The only rule is 'read everything Chris Torek writes'.
Have you seen his latest shopping list? Heavy stuff...
--CBFalconer and Richard Heathfield in comp.lang.c

Michael Wojcik

Sep 28, 2004, 4:56:50 PM

In article <cjceh5$2i2$1...@yin.interaccess.com>, gaz...@yin.interaccess.com (Kenny McCormack) writes:
>
> Keep in mind that if you told the average man on the street that there was
> a computer in his microwave, he'd rush home and open the door to remove his
> PC from the microwave (and hope that no one turned the microwave on while
> the PC was in there).

I give the average man on the street more credit than that - as long
as we're talking about anglophone men who know what a microwave
[oven] is, and have some idea of what a computer is. (I can hardly
expect someone who doesn't meet those conditions to understand the
mooted statement.)

I suspect most educated people in the industrialized world are
conversant with the idea of embedded computers. For example, I don't
believe I've run into an automobile owner in the past decade or so
who wasn't aware that there was some kind of "computer" controlling
their car's engine, even if they had little idea what it might
actually be doing.

--
Michael Wojcik michael...@microfocus.com

Push up the bottom with your finger, it will puffy and makes stand up.
-- instructions for "swan" from an origami kit

Mark McIntyre

Sep 28, 2004, 5:33:56 PM
On Tue, 28 Sep 2004 00:55:13 GMT, in comp.lang.c ,
gaz...@yin.interaccess.com (Kenny McCormack) wrote:

>In article <kn2hl0hlv76uh310l...@4ax.com>,
>Mark McIntyre <markmc...@spamcop.net> wrote:
>...

>>Only an idiot uses VB to write processor-intensive libraries or code that
>>you need to have accessible on Solaris, Redhat and WinXP. Similarly, only a
>>masochist uses C to write a WinXP GUI app.
>
>So true. But there's a f*** lot of idiots in the world, and more than your
>fair share of masochists.

True. However this doesn't make them any less idiotic, and I can report
that none of them work for me, or will do in the future.

Kenny McCormack

Sep 28, 2004, 5:35:03 PM
In article <cjcj6...@news3.newsguy.com>,
Michael Wojcik <mwo...@newsguy.com> wrote:
...

>I suspect most educated people in the industrialized world are
>conversant with the idea of embedded computers. For example, I don't
>believe I've run into an automobile owner in the past decade or so
>who wasn't aware that there was some kind of "computer" controlling
>their car's engine, even if they had little idea what it might
>actually be doing.

The fact that you put "computer" in quotes proves my point.

Anyone with any sense knows that a microprocessor isn't a computer any more
than a door is a house. Or that a CRT is a TV.

Now, to be fair, I have come across a fair number of uneducated people who
refer to PCs as CPUs - no doubt because they think it makes them sound cool.
You know - as in, "Hey Fred, could you go install Word on Joe's CPU?".

Mark McIntyre

Sep 28, 2004, 5:47:30 PM
On Tue, 28 Sep 2004 00:58:50 GMT, in comp.lang.c ,
gaz...@yin.interaccess.com (Kenny McCormack) wrote:

>In article <9Z%5d.11541$gG4....@newsread1.news.pas.earthlink.net>,
>Mike Wahler <mkwa...@mkwahler.net> wrote:
>...
>>Your 'real world' must be very small then. PC's running Microsoft
>>Windows (or any other OS) are a tiny minority of computer systems
>>in existence.
>
>You sure about that? As I've tried to make clear, my ego's not in this, but
>I'm pretty sure that, counting boxes or counting CPUs, IBM compatible PCs
>(including all the server boxes which are really just overgrown PCs)
>running on x86 chips make up a substantial percentage of the total number
>of boxes in the world. I wouldn't be surprised if it was at least 60%.

There are CONSIDERABLY more nonobvious computers in the world than there
are personal computers. How many people in the US have mobile phones?
Cars? Microwaves? Digital alarm clocks? Video recorders? DVD players? MP3
players? PDAs? And we've not even started to think about ATMs, PIN card
readers, cash registers, vote counters, etc etc etc....

Mind you, if they're using VB to write vote-counting software, I can
predict the Nov result now:

G Bush -0x80090317
J Kerry -0x8009030D
R Nader Out of Cheese error +++ Redo From Start +++++

Alan Balmer

Sep 28, 2004, 6:40:44 PM
On Tue, 28 Sep 2004 21:35:03 GMT, gaz...@yin.interaccess.com (Kenny
McCormack) wrote:

>In article <cjcj6...@news3.newsguy.com>,
>Michael Wojcik <mwo...@newsguy.com> wrote:
>...
>>I suspect most educated people in the industrialized world are
>>conversant with the idea of embedded computers. For example, I don't
>>believe I've run into an automobile owner in the past decade or so
>>who wasn't aware that there was some kind of "computer" controlling
>>their car's engine, even if they had little idea what it might
>>actually be doing.
>
>The fact that you put "computer" in quotes proves my point.
>
>Anyone with any sense knows that a microprocessor isn't a computer any more
>than a door is a house. Or that a CRT is a TV.
>

So anyone who disagrees with your personal definition has no sense.
You should probably be warned that there are a *lot* of people who
don't agree with you.

>Now, to be fair, I have come across a fair number of uneducated people who
>refer to PCs as CPUs - no doubt because they think it makes them sound cool.
>You know - as in, "Hey Fred, could you go install Word on Joe's CPU?".

I've come across a few who think that a microprocessor is not a
computer.

Kenny McCormack

Sep 28, 2004, 6:51:53 PM
In article <ippjl0pet7u62lufp...@4ax.com>,
Alan Balmer <alba...@spamcop.net> wrote:
...

>>Anyone with any sense knows that a microprocessor isn't a computer any more
>>than a door is a house. Or that a CRT is a TV.
>>
>So anyone who disagrees with your personal definition has no sense.

Pretty much, yeah.

>You should probably be warned that there are a *lot* of people who
>don't agree with you.

I've learned to live with it. I'm used to it by now.

>>Now, to be fair, I have come across a fair number of uneducated people who
>>refer to PCs as CPUs - no doubt because they think it makes them sound cool.
>>You know - as in, "Hey Fred, could you go install Word on Joe's CPU?".
>
>I've come across a few who think that a microprocessor is not a
>computer.

It's not. It is a component of a computer. Note that a computer may have
more than one microprocessor. (*)

(*) Or, it may have none (certain mainframes...)

Allin Cottrell

Sep 28, 2004, 11:42:15 PM
Dan Pop wrote:
> In <cj50m5$6r3$1...@news-int.gatech.edu> Derrick Coetzee <dcn...@moonflare.com> writes:
>
>
>>Now Java, C++, Visual Basic, and others are (each!) more prevalent.
>
>
> Do you have some hard data to back up this statement? Whenever I look at
> some open source project, I see C source code. Exceptionally Fortran or
> C++.

Yes, agreed. The only caveat is that this is in a sense a biased
sample, because you're talking about cases where the source code
is visible, and hence the programming language known. Sadly, a
great deal of code is still invisible and uncheckable, and
written in who-knows-what source language (though one may guess).

Allin Cottrell

Dan Pop

Sep 29, 2004, 9:09:04 AM
In <cjc6ot$dhu$1...@news-int2.gatech.edu> Derrick Coetzee <dcn...@moonflare.com> writes:

>to C such as compilers and raytracers. Moreover, Java *can* be compiled
>directly to native code that runs as fast as any other native code,

You're really naive if you believe this. Just because it's native code
it doesn't mean that it runs necessarily as fast as the native code
generated by a C compiler from a C program solving the same problem.

Java's portability comes at the cost of Java being an overspecified
programming language. If the native behaviour of the underlying
processor doesn't match the Java virtual machine specification, additional
code is required to provide the behaviour of the Java virtual machine.
Then, there are issues related to the bound checking and garbage
collection *required* by Java.
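
To put the bounds checking point in C terms (my own sketch, not taken
from any particular compiler or JIT):

#include <stddef.h>

/* What a C compiler may emit for an array sum: no per-access checks. */
long sum(const int *a, size_t n)
{
    long s = 0;
    size_t i;
    for (i = 0; i < n; i++)
        s += a[i];              /* access is unchecked */
    return s;
}

/* What Java semantics conceptually require: every element access must
   behave as if guarded like this, and the check can only be dropped
   where the compiler proves it cannot fire. */
long sum_checked(const int *a, size_t len, size_t n)
{
    long s = 0;
    size_t i;
    for (i = 0; i < n; i++) {
        if (i >= len)
            return -1;          /* Java would throw an exception here */
        s += a[i];
    }
    return s;
}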

It is highly nonrealistic to expect a language designed on the principle
"the programmer is incompetent and cannot be trusted" to be as efficient
as languages that trust the programmer to know what he's doing. There
are redeeming advantages for Java's approach, but it is sheer foolishness
to believe that it comes at no cost.

>and you're confusing the Java language with the Java environment if you
>imagine it can't be.

The Java language is defined in terms of the Java virtual machine.
No Java to native code translation device can ignore the specification
of the Java virtual machine.

Dan
--
Dan Pop
DESY Zeuthen, RZ group
Email: Dan...@ifh.de
Currently looking for a job in the European Union

Richard Bos

Sep 29, 2004, 10:35:08 AM
Derrick Coetzee <dcn...@moonflare.com> wrote:

> Speed comparisons between modern VMs like the Java
> HotSpot VM and native code also show

I am not impressed by IndustrhyStones. I _am_ impressed by my own
observations.

Richard

Richard Bos

Sep 29, 2004, 10:47:22 AM
Mark McIntyre <markmc...@spamcop.net> wrote:

> There are CONSIDERABLY more nonobvious computers in the world than there
> are personal computers. How many people in the US have mobile phones?
> Cars? Microwaves? Digital alarm clocks? Video recorders? DVD players? MP3
> players? PDAs? And we've not even started to think about ATMs,

Many ATMs _are_ normal PCs. I've seen a picture of one showing a BSOD.
Ditto, but in person, the information terminals at an airport. I've
forgotten which; statistics say it's probably Schiphol, but my memory
insists that it was an English airport, which means it has to be either
Heathrow or Stansted.

Yes, actually, I _was_ a little worried. If they run the terminals on
Win-BSOD-dows, who knows WTF they run their important systems on?

> Mind you, if they're using VB to write vote-counting software, I can
> predict the Nov result now:
>
> G Bush -0x80090317
> J Kerry -0x8009030D
> R Nader Out of Cheese error +++ Redo From Start +++++

Having read comp.risks for the last couple of years, I wouldn't put it
past Diebold.

Richard

Richard Bos

Sep 29, 2004, 10:48:51 AM
gaz...@yin.interaccess.com (Kenny McCormack) wrote:

Keep in mind that to the average man on the street, "my computer" is his
monitor, and his computer is "my disk drive".

Richard

Mike Wahler

Sep 29, 2004, 12:20:31 PM
"Kenny McCormack" <gaz...@yin.interaccess.com> wrote in message
news:cjadd9$bcg$1...@yin.interaccess.com...

> In article <9Z%5d.11541$gG4....@newsread1.news.pas.earthlink.net>,
> Mike Wahler <mkwa...@mkwahler.net> wrote:
> ...
> >Your 'real world' must be very small then. PC's running Microsoft
> >Windows (or any other OS) are a tiny minority of computer systems
> >in existence.
>
> You sure about that?

Absolutely positive.

> As I've tried to make clear, my ego's not in this,

Egos are not relevant to the facts.

> but
> I'm pretty sure that, counting boxes or counting CPUs,

Not every computer is enclosed in a 'box'.

> IBM compatible PCs
> (including all the server boxes which are really just overgrown PCs)

Every server is an 'overgrown PC'? Huh? Even a 390? A VAX?

> running on x86 chips make up a substantial percentage of the total number
> of boxes in the world.

Not every computer is enclosed in a 'box'. Also, I feel that
the use of the word 'box' to indicate a computer is an attempt
to sound 'kewl', which impresses me not.

>I wouldn't be surprised if it was at least 60%.

It's not.

>
> Now, if you want to do it by computing power - megaflops or whatever - you
> might have a defensible position.

The 'power' of various computers is moot to my assertion. I was
simply talking about the *number* of computers in existence.

I have a typical American house with mostly modern appliances,
and four automobiles. At any given time my 'office' (a converted
bedroom) houses from five to seven PC's, and three or four other
specialized computers which are *not* PC's. There are also several
dozen other computers scattered throughout my house and automobiles.
There's even a computer system that controls when and how long my
garden gets watered, based upon how much Mother Nature has already
done so.

-Mike


Michael Wojcik

Sep 29, 2004, 1:10:48 PM

In article <cjclr8$alm$1...@yin.interaccess.com>, gaz...@yin.interaccess.com (Kenny McCormack) writes:
> In article <cjcj6...@news3.newsguy.com>,
> Michael Wojcik <mwo...@newsguy.com> wrote:
> ...
> >I suspect most educated people in the industrialized world are
> >conversant with the idea of embedded computers. For example, I don't
> >believe I've run into an automobile owner in the past decade or so
> >who wasn't aware that there was some kind of "computer" controlling
> >their car's engine, even if they had little idea what it might
> >actually be doing.
>
> The fact that you put "computer" in quotes proves my point.

It does no such thing. Perhaps you are applying some idiosyncratic
restricted definition of the function of the quotation mark as well?

> Anyone with any sense knows that a microprocessor isn't a computer anymore
> than a door is a house. Or that a CRT is a TV.

This appears to be your own personal neurosis. You may believe it
applies to "anyone with any sense", of course, though you'd be verging
into outright psychosis.

Not that it matters. This entire discussion boils down to "most
computers are Windows boxes, provided we define 'computers' in a way
which makes that statement true, even though no one else here agrees
it should be defined that way". Since that is tautologically true,
pointless, and stupid, there's really no need to discuss it further.
It fails to support whatever argument you introduced it for - not, I
imagine, that anyone besides you cares.

--
Michael Wojcik michael...@microfocus.com

However, we maintain that our mission is more than creating high-tech
amusement--rather, we must endeavor to provide high-tech, high-touch
entertainment with an emphasis on enkindling human warmth.
-- "The Ultimate in Entertainment", from the president of video game
producer Namco

Michael Wojcik

Sep 29, 2004, 1:17:56 PM

In article <415ac9ee....@news.individual.net>, r...@hoekstra-uitgeverij.nl (Richard Bos) writes:
>
> Many ATMs _are_ normal PCs. I've seen a picture of one showing a BSOD.
> Ditto, but in person, the information terminals at an airport. I've
> forgotten which; statistics say it's probably Schiphol, but my memory
> insists that it was an English airport, which means in has to be either
> Heathrow or Stansted.

I don't remember whether I've seen a crashed airport information
display at Heathrow, but I've seen them from time to time at the
Lansing, MI airport, and they're definitely running Windows. I don't
think I've seen a BSOD, but I have seen various Windows system error
dialog boxes.

My cell phone, on the other hand, is not, though it has a complete
Java runtime on it, and a web browser, and all sorts of other
nonsense. (It has yet to crash.)

So there are two more data points which demonstrate pretty much
nothing. Some embedded systems run Windows. Some do not.

--
Michael Wojcik michael...@microfocus.com

I'm not particularly funny, but I wanted to do something outrageous.
And I'm Norwegian, so I wasn't going to go too far. -- Darlyne Erickson

Keith Thompson

Sep 29, 2004, 3:37:12 PM
"Mike Wahler" <mkwa...@mkwahler.net> writes:
> "Kenny McCormack" <gaz...@yin.interaccess.com> wrote in message
> news:cjadd9$bcg$1...@yin.interaccess.com...
[...]

>> IBM compatible PCs
>> (including all the server boxes which are really just overgrown PCs)
>
> Every server is an 'overgrown PC?' Huh? Even a 390? A VAX?

Note the lack of a comma after "boxes". I think he was referring to
the subset of server boxes which are "really just overgrown
PCs" (many of them are), not asserting that all server boxes are
overgrown PCs.

>> running on x86 chips make up a substantial percentage of the total number
>> of boxes in the world.
>
> Not every computer is enclosed in a 'box'. Also, I feel that
> the use of the word 'box' to indicate a computer is an attempt
> to sound 'kewl', which impresses me not.

I suspect we can all agree on the following statements:

1. Most non-embedded computer systems are more or less PC-compatible
systems running x86 processors. This includes most desktop and laptop
PCs and many (but by no means all) servers.

2. Most computer systems, embedded or not, are *not* PC-compatibles.
This includes the engine computer(s) in your car and the CPUs in your
keyboard, your mobile phone, your washing machine, and your DVD
player.

I think the only point of disagreement is whether the term "computer"
applies to embedded systems as well as to standalone computers. That
may be an interesting question, but it's off-topic here.

Another question, that's more nearly topical, is how much programming
(C or otherwise) is done for embedded systems vs. non-embedded
systems, where "programming" might be measured in lines of code or in
programmer hours. Certainly the vast majority of the programmers I've
known haven't worked on embedded systems, but my experience is almost
certainly not representative.

--
Keith Thompson (The_Other_Keith) ks...@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <*> <http://users.sdsc.edu/~kst>
We must do something. This is something. Therefore, we must do this.

Kenny McCormack

Sep 29, 2004, 4:12:08 PM
In article <lnacv8g...@nuthaus.mib.org>,
Keith Thompson <ks...@mib.org> wrote:
...

>>> (including all the server boxes which are really just overgrown PCs)
>>
>> Every server is an 'overgrown PC?' Huh? Even a 390? A VAX?
>
>Note the lack of a comma after "boxes". I think he was referring to
>the subset of server boxes which are "really just overgrown
>PCs" (many of them are), not asserting that all server boxes are
>overgrown PCs.

Indeed. I suppose that "that" would have been a better choice of word
than "which". But still, I thought my meaning was perfectly clear.

Interesting, though, how it scans differently if the comma is removed.

>>> running on x86 chips make up a substantial percentage of the total number
>>> of boxes in the world.
>>
>> Not every computer is enclosed in a 'box'. Also, I feel that
>> the use of the word 'box' to indicate a computer is an attempt
>> to sound 'kewl', which impresses me not.
>
>I suspect we can all agree on the following statements:
>
>1. Most non-embedded computer systems are more or less PC-compatible
>systems running x86 processors. This includes most desktop and laptop
>PCs and many (but by no means all) servers.

Exactly.

>2. Most computer systems, embedded or not, are *not* PC-compatibles.
>This includes the engine computer(s) in your car and the CPUs in your
>keyboard, your mobile phone, your washing machine, and your DVD
>player.
>
>I think the only point of disagreement is whether the term "computer"
>applies to embedded systems as well as to standalone computers. That
>may be an interesting question, but it's off-topic here.

Exactly. I think it is perfectly clear in man-on-the-street terms, but
I also understand why it is a point of contention in this NG (see below).

>Another question, that's more nearly topical, is how much programming
>(C or otherwise) is done for embedded systems vs. non-embedded
>systems, where "programming" might be measured in lines of code or in
>programmer hours. Certainly the vast majority of the programmers I've
>known haven't worked on embedded systems, but my experience is almost
>certainly not representative.

I'll bet (no, I don't have any statistics on this) that a large portion of
the new C programming that is happening today *is* in embedded systems
- hence the overwhelming urge here to consider such things "computers".

I can't see much point in doing new development on conventional computers
(aka, "non-embedded systems") in C or other low-level languages, except of
course for things like OSes and device drivers (and any other obvious
exceptions to my generalization). This is not, of course, to say that
learning C isn't a good thing - for any of a number of reasons.

CBFalconer

unread,
Sep 29, 2004, 4:50:57 PM9/29/04
to
Michael Wojcik wrote:
>
... snip ...

>
> So there are two more data points which demonstrate pretty much
> nothing. Some embedded systems run Windows. Some do not.

Another data point: No reliable embedded system runs Windows.

--
A: Because it fouls the order in which people normally read text.
Q: Why is top-posting such a bad thing?
A: Top-posting.
Q: What is the most annoying thing on usenet and in e-mail?


Alan Balmer

unread,
Sep 29, 2004, 5:16:24 PM9/29/04
to
On Wed, 29 Sep 2004 20:12:08 GMT, gaz...@yin.interaccess.com (Kenny
McCormack) wrote:

>
>I'll bet (no, I don't have any statistics on this) that a large portion of
>the new C programming that is happening today *is* in embedded systems
>- hence the overwhelming urge here to consider such things "computers".

Nah, it's simply because they *are* computers, and always have been,
even before Windows and the 386 were invented ;-)

What we don't understand is your overwhelming urge to claim they are
not computers.

Kenny McCormack

unread,
Sep 29, 2004, 5:27:03 PM9/29/04
to
In article <td9ml0p8l513gtqm9...@4ax.com>,

Because they are not. See Mark Twain story at bottom (one of my favorites).

(Or, to put it a little more succinctly: Back at ya!)

--- Begin Story ---
Q: If you call a tail a leg, how many legs does a dog have?
A: 4. Calling a tail a leg does not make it a leg.

Mark McIntyre

unread,
Sep 29, 2004, 5:26:49 PM9/29/04
to
On Tue, 28 Sep 2004 19:30:12 GMT, in comp.lang.c ,
gaz...@yin.interaccess.com (Kenny McCormack) wrote:

>Keep in mind that if you told the average man on the street that there was
>a computer in his microwave, he'd rush home and open the door to remove his
>PC from the microwave (and hope that no one turned the microwave on while
>the PC was in there).

No he wouldn't. Even the average man on the street knows that there are
computers in pretty much everything these days.

Kenny McCormack

unread,
Sep 29, 2004, 5:28:50 PM9/29/04
to
In article <g2aml05oi8gno3u11...@4ax.com>,

Mark McIntyre <markmc...@spamcop.net> wrote:
>On Tue, 28 Sep 2004 19:30:12 GMT, in comp.lang.c ,
>gaz...@yin.interaccess.com (Kenny McCormack) wrote:
>
>>Keep in mind that if you told the average man on the street that there
>>was a computer in his microwave, he'd rush home and open the door to
>>remove his PC from the microwave (and hope that no one turned the
>>microwave on while the PC was in there).
>
>Indeed he would. Even the average man on the street knows that there are
>microprocessors in pretty much everything these days.

HTH.

Mark McIntyre

unread,
Sep 29, 2004, 5:44:18 PM9/29/04
to
On Tue, 28 Sep 2004 21:35:03 GMT, in comp.lang.c ,
gaz...@yin.interaccess.com (Kenny McCormack) wrote:

>The fact that you put "computer" in quotes proves my point.

No, it merely proves that the meaning of "computer" is what we're
discussing.

>Anyone with any sense knows that a microprocessor isn't a computer

Well duh. Neither is an 8088 a computer, or a Pentium 4. But, connected to a
bunch of other stuff, it's what we call a computer.

>anymore than a door is a house. Or that a CRT is a TV.

Your illogic baffles me. My only possible response is "my nipples explode
with delight - would you please fondle my buttocks"

Mark McIntyre

unread,
Sep 29, 2004, 5:48:12 PM9/29/04
to
On Wed, 29 Sep 2004 14:47:22 GMT, in comp.lang.c ,
r...@hoekstra-uitgeverij.nl (Richard Bos) wrote:

>Mark McIntyre <markmc...@spamcop.net> wrote:
>
>> There are CONSIDERABLY more nonobvious computers in the world than there
>> are personal computers. How many people in the US have mobile phones?
>> Cars? Microwaves? Digital alarm clocks? Video recorders? DVD players? MP3
>> players? PDAs? And we've not even started to think about ATMs,
>
>Many ATMs _are_ normal PCs. I've seen a picture of one showing a BSOD.

I saw not one but three in the flesh, two weeks ago, in Liverpool Street
station in London.

>Yes, actually, I _was_ a little worried. If they run the terminals on
>Win-BSOD-dows, who knows WTF they run their important systems on?

I have it on good report that it's a little old lady with a really strong
cup of tea.

Dave Vandervies

unread,
Sep 29, 2004, 5:58:46 PM9/29/04
to
In article <cjf9rk$1je$1...@yin.interaccess.com>,

Kenny McCormack <gaz...@interaccess.com> wrote:
>In article <g2aml05oi8gno3u11...@4ax.com>,
>Mark McIntyre <markmc...@spamcop.net> wrote:

[Not quite what Kenny McCormack quoted him as writing]

You should know better than that.

Interesting that I got to this thread right after another one where ERT
was once again getting roasted for silently modifying his quotes...


dave

--
Dave Vandervies dj3v...@csclub.uwaterloo.ca
[I]f such an implementation existed, probably this entire newsgroup's
readership would hunt down, torture and kill the thoughtless bastard
from which it originated. --Micah Cowan in comp.lang.c

Alan Balmer

unread,
Sep 29, 2004, 6:05:39 PM9/29/04
to
On Wed, 29 Sep 2004 21:27:03 GMT, gaz...@yin.interaccess.com (Kenny
McCormack) wrote:

>In article <td9ml0p8l513gtqm9...@4ax.com>,
>Alan Balmer <alba...@spamcop.net> wrote:
>>On Wed, 29 Sep 2004 20:12:08 GMT, gaz...@yin.interaccess.com (Kenny
>>McCormack) wrote:
>>
>>>
>>>I'll bet (no, I don't have any statistics on this) that a large portion
>>>of the new C programming that is happening today *is* in embedded systems
>>>- hence the overwhelming urge here to consider such things "computers".
>>
>>Nah, it's simply because they *are* computers, and always have been,
>>even before Windows and the 386 were invented ;-)
>>
>>What we don't understand is your overwhelming urge to claim they are
>>not computers.
>
>Because they are not. See Mark Twain story at bottom (one of my favorites).
>
>(Or, to put it a little more succinctly: Back at ya!)
>
>--- Begin Story ---
>Q: If you call a tail a leg, how many legs does a dog have?
>A: 4. Calling a tail a leg does not make it a leg.

I think that one was old before Mark Twain wrote it down <G>.

I grew up near Elmira, NY, where Clemens spent a good deal of his
writing time. They've made the study where he wrote Huck Finn a
tourist attraction.

jacob navia

unread,
Sep 29, 2004, 9:40:28 AM9/29/04
to
Chris Torek wrote:

>>>>In article <9Z%5d.11541$gG4....@newsread1.news.pas.earthlink.net>,
>>>>Mike Wahler <mkwa...@mkwahler.net> wrote:
>>>>
>>>>>Your 'real world' must be very small then. PC's running Microsoft
>>>>>Windows (or any other OS) are a tiny minority of computer systems
>>>>>in existence.
>
>

>>In article <cjah8...@news1.newsguy.com>,
>>Chris Torek <nos...@torek.net> wrote:
>>
>>>I think you need to define "any other OS" here. :-)
>
>
> In article <news:cjakut$f22$1...@yin.interaccess.com>
> Kenny McCormack <gaz...@interaccess.com> wrote:
>
>>That's not me. That was the other poster's concept.
>
>
> Indeed -- and note which name is the only available referent for "you"
> at that point. :-)
>
> Anyway, my point was that it is difficult to count "computers" and
> "systems" and "OSes" without first defining each. Is an embedded
> system a "system"? It has one or more microprocessors, and these
> days, many of them are programmed in C (and some are even being
> done in both Java and C, including high-end car "infotainment"
> systems).
>
>
>>As far as I'm concerned, a microwave is not a computer. ...
>
>
> This fits well with my personal definition of an "embedded system
> computer", which is "any time you don't constantly think: there is
> a computer in here" when you use it. :-)

That is a special purpose system. Not a general purpose system that can
be programmed.

The crux of a computer is that it can be programmed to do any new
task you want by using the processor's instruction set.

This supposes access to the processor's instruction set, and a means
of directing its actions using a programming language.

You could convert the microwave into a computer by adding the
necessary software, but then... it wouldn't be a microwave
oven anymore...

Keith Thompson

unread,
Sep 29, 2004, 7:06:36 PM9/29/04
to

No, this doesn't help.

What Mark McIntyre actually wrote in the article to which you replied
was:

] No he wouldn't. Even the average man on the street knows that there are
] computers in pretty much everything these days.

If you want to correct what he wrote, do so. Changing someone else's
words and claiming that they're what he wrote is incredibly rude and
dishonest. Don't do it again.

(This is the second forged quotation I've seen today; what's going on
around here?)

Jarno A Wuolijoki

unread,
Sep 29, 2004, 7:12:17 PM9/29/04
to
On Wed, 29 Sep 2004, Kenny McCormack wrote:

> Q: If you call a tail a leg, how many legs does a dog have?

One.

Keith Thompson

unread,
Sep 29, 2004, 7:13:28 PM9/29/04
to
gaz...@yin.interaccess.com (Kenny McCormack) writes:
> In article <td9ml0p8l513gtqm9...@4ax.com>,
> Alan Balmer <alba...@spamcop.net> wrote:
>>On Wed, 29 Sep 2004 20:12:08 GMT, gaz...@yin.interaccess.com (Kenny
>>McCormack) wrote:
>>
>>>
>>>I'll bet (no, I don't have any statistics on this) that a large portion
>>>of the new C programming that is happening today *is* in embedded systems
>>>- hence the overwhelming urge here to consider such things "computers".
>>
>>Nah, it's simply because they *are* computers, and always have been,
>>even before Windows and the 386 were invented ;-)
>>
>>What we don't understand is your overwhelming urge to claim they are
>>not computers.
>
> Because they are not.
[snip]

Look up the word "computer" in a dictionary. (There's a free online
dictionary at www.m-w.com.) Think about whether an embedded system
meets the definition. Then, if you want to discuss it further, go to
alt.usage.english; it's off-topic here.

Dave Vandervies

unread,
Sep 29, 2004, 7:14:13 PM9/29/04
to
In article <cjf9o9$1hk$1...@yin.interaccess.com>,

If you call a leg a tail, how many legs does a dog have?

Kenny McCormack

unread,
Sep 29, 2004, 7:26:33 PM9/29/04
to
In article <lnk6uce...@nuthaus.mib.org>,

Keith Thompson <ks...@mib.org> wrote:
...
>(This is the second forged quotation I've seen today; what's going on
>around here?)

We do it because it is fun!

CBFalconer

unread,
Sep 30, 2004, 12:13:36 AM9/30/04
to
Kenny McCormack wrote:
> Keith Thompson <ks...@mib.org> wrote:
> ...
>> (This is the second forged quotation I've seen today; what's
>> going on around here?)
>
> We do it because it is fun!

PLONK. Never to be seen again on this system.

Derrick Coetzee

unread,
Sep 30, 2004, 1:18:14 AM9/30/04
to
Dan Pop wrote:
> In <cjc6ot$dhu$1...@news-int2.gatech.edu> Derrick Coetzee <dcn...@moonflare.com> writes:
>
>>to C such as compilers and raytracers. Moreover, Java *can* be compiled
>>directly to native code that runs as fast as any other native code,
>
> You're really naive if you believe this. Just because it's native code
> it doesn't mean that it runs necessarily as fast as the native code
> generated by a C compiler from a C program solving the same problem.
>
> Java's portability comes at the cost of Java being an overspecified
> programming language.

That's true. My statement that all native code runs just as fast is
totally wrong. At least you know when you write something in C that
it'll pretty much map right onto the machine (whereas a Java program
making extensive use of "int"s on, say, a 16-bit machine would be
really slow).
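
A minimal C sketch of the point (the width of int is
implementation-defined in C, so a 16-bit target can use its native
word; Java fixes int at 32 bits, which such a target must emulate):

#include <limits.h>
#include <stdio.h>

int main(void)
{
    /* C only requires int to cover -32767..32767, so a 16-bit
       machine can map int onto a single machine word.  Java
       mandates a 32-bit int everywhere, forcing multi-word
       arithmetic on a 16-bit CPU. */
    printf("int is %d bits on this implementation\n",
           (int)(sizeof(int) * CHAR_BIT));
    return 0;
}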
--
Derrick Coetzee
I grant this newsgroup posting into the public domain. I disclaim all
express or implied warranty and all liability. I am not a professional.

Richard Bos

unread,
Sep 30, 2004, 3:06:34 AM9/30/04
to
CBFalconer <cbfal...@yahoo.com> wrote:

> Michael Wojcik wrote:
> >
> > So there are two more data points which demonstrate pretty much
> > nothing. Some embedded systems run Windows. Some do not.
>
> Another data point: No reliable embedded system runs Windows.

s/embedded//, TYVM.

Richard

one20...@yahoo.com

unread,
Sep 30, 2004, 4:11:19 AM9/30/04
to
Dave Vandervies wrote:

>
>
>>What chiefly distinguishes C and its "descendants" from the other
>>cases is that there was a strong movement to preserve C with only
>>relatively unobtrusive changes; thus most of the various languages
>>that diverged from C had to present themselves as new languages to
>>gain wide acceptance.
>
>
> This looks like a (slight) overstatement of the case to me.
>
> C++ is really the only "true" descendant of C; the other C-like languages

Not sure how you define "true descendant".
Can the C/C++ interpreter Ch be called a true descendant of C? C++ extends
C with OO. Ch extends C with a built-in string type for easy scripting and
other extensions for numerical computing. Of course, C99 complex, VLA, and
IEEE floating point are well supported.

> (at least the ones I know about) tend to have entirely different lineage
> with C-like syntax pasted on (because curly braces are so much k3wLer than
> "begin" and "end").


> Interestingly, most C implementations come packaged with C++
> implementations, and it's usually not difficult (though seldom entirely
> trivial) to convert a well-written C program to a program that does The
> Right Thing when given to a C++ compiler. (Not that there's often a
> good reason to do this.)
>
> So it seems that C and C++, taken as a pair, are not at all unlike F77
> and F90 (if I've got those names right), or Lisp and Lisp-with-CLOS,
> or old-COBOL and new-COBOL-with-OO, and languages like Java are just
> hangers-on that add to the confusion.
>
>
> dave
>

Dan Pop

unread,
Sep 30, 2004, 9:03:19 AM9/30/04
to
In <415B0F31...@yahoo.com> CBFalconer <cbfal...@yahoo.com> writes:

>Michael Wojcik wrote:
>>
>... snip ...
>>
>> So there are two more data points which demonstrate pretty much
>> nothing. Some embedded systems run Windows. Some do not.
>
>Another data point: No reliable embedded system runs Windows.

How about the Microsoft Xbox?

Dan
--
Dan Pop
DESY Zeuthen, RZ group
Email: Dan...@ifh.de
Currently looking for a job in the European Union

Dan Pop

unread,
Sep 30, 2004, 9:27:27 AM9/30/04
to
In <415b39db$0$30687$8fcf...@news.wanadoo.fr> jacob navia <ja...@jacob.remcomp.fr> writes:

>Chris Torek wrote:
>
>> This fits well with my personal definition of an "embedded system
>> computer", which is "any time you don't constantly think: there is
>> a computer in here" when you use it. :-)
>
>That is a special purpose system. Not a general purpose system that can
>be programmed.

It may not be a general purpose system, but someone did program it,
nevertheless.

>The crux of a computer is that it can be programmed to do any new
>task you want by using the processor's instruction set.

The very same computer can be used in a general purpose computing system
and in an embedded control application. Once upon a time, 8086 chips
were quite popular on the logic board of hard disks. No idea what they're
using these days, but I have performed firmware upgrades on modern hard
disks. And I've both used general purpose computing systems based on Z80
chips and programmed Z80 chips used in embedded control applications.

>This supposes access to the processor's instruction set, and a means
>of directing its actions using a programming language.

This is entirely true for embedded control systems, too. How do you
think they get programmed in the first place?

>You could convert the microwave into a computer by adding the
>necessary software, but then... it wouldn't be a microwave
>oven anymore...

You don't have to: the microwave oven has a *complete* computer inside:
CPU, memory, I/O devices. It's not any less of a computer just because
it doesn't look like your PC and it doesn't run Windows. Computer and
general purpose computing system are two different things.

How many people know the actual number of computers inside their PCs?

Alan Balmer

unread,
Sep 30, 2004, 11:37:00 AM9/30/04
to
On 30 Sep 2004 13:03:19 GMT, Dan...@cern.ch (Dan Pop) wrote:

>In <415B0F31...@yahoo.com> CBFalconer <cbfal...@yahoo.com> writes:
>
>>Michael Wojcik wrote:
>>>
>>... snip ...
>>>
>>> So there are two more data points which demonstrate pretty much
>>> nothing. Some embedded systems run Windows. Some do not.
>>
>>Another data point: No reliable embedded system runs Windows.
>
>How about the Microsoft Xbox?

What makes you think it's reliable?

We did a prototype wireless card reader using Win CE. The MS
salesperson estimated development time as two weeks - it was closer to
six months. (I wasn't directly involved with the project, so won't
testify as to the reasons.) The final version worked well enough, but
it was hardly a critical application.

Thomas Stegen

unread,
Sep 30, 2004, 1:56:46 PM9/30/04
to
Richard Bos wrote:
> Quite. And I don't want any database application written in VB or Java
> on _my_ network, thank you very much. Java is for slow, broken Web code;

Haha, I wonder how this can be true when the company I work for (approx.
200 employees) uses Java, has no failed projects, and has better
references than any other company in this particular sector. We
don't really write web code, so this doesn't directly contradict your
statement, but I feel it's still a valid point.

Could our platform have been written better in a different language?
Maybe. The main benefit of Java is that it can talk to almost anything
between heaven and earth, because it has massive industry-wide support
for most things.

Yes, hundreds of customers have been served and they are all happy,
or at least that is what they are telling us. At least the customers
for whom our product generates hundreds of millions of pounds of
revenue each year.

Then again, we use Java to develop our development platform (which is
quite large). The people who develop specific solutions only use this
platform and do not write any Java, though some companies have bought
the platform and use it to develop their own products.

Personally, I don't find Java a particularly good language nor a
particularly bad one, but I do find that I write code faster in Java
than in C even though I've used C more. For code that is not too
complex and needs to be fast, C is the best language. The same goes
when lots of hardware access is needed. In most other cases I will
choose something else.

--
Thomas.

Dave Vandervies

unread,
Sep 30, 2004, 2:17:11 PM9/30/04
to
In article <HcP6d.21774$QJ3....@newssvr21.news.prodigy.com>,

one20...@yahoo.com <one20...@yahoo.com> wrote:
>Dave Vandervies wrote:
>
>>
>>
>>>What chiefly distinguishes C and its "descendants" from the other
>>>cases is that there was a strong movement to preserve C with only
>>>relatively unobtrusive changes; thus most of the various languages
>>>that diverged from C had to present themselves as new languages to
>>>gain wide acceptance.
>>
>>
>> This looks like a (slight) overstatement of the case to me.
>>
>> C++ is really the only "true" descendant of C; the other C-like languages
>
>not sure how you define "true descendant".
>Can C/C++ interpreter Ch be called the true descebdant of C? C++ extends
>C with OO. Ch extends C with built-in string for easy scripting and
>other extentions for numerical computing. Of course, C99 complex, VLA,
>IEEE floating are well supported.

That sounds to me like "C implementation with extensions", not "descendant
of C".


dave

--
Dave Vandervies dj3v...@csclub.uwaterloo.ca
Your apology is inappropriate, and is therefore rejected. In future,
only apologise when you've done something wrong, damn you.
--Richard Heathfield in comp.lang.c

Dave Vandervies

unread,
Sep 30, 2004, 2:21:15 PM9/30/04
to
In article <415b39db$0$30687$8fcf...@news.wanadoo.fr>,
jacob navia <ja...@jacob.remcomp.fr> wrote:

[Attributions for this one got lost]


>>>As far as I'm concerned, a microwave is not a computer. ...

[snip]

>The crux of a computer is that it can be programmed to do any new
>task you want by using the processor's instruction set.

By that definition, my parents' microwave is just as much a computer as
their PC running Windows is.

Both can be programmed to do any new task you want (within the limits of
computability and hardware resources) by someone with the appropriate
tools. Neither can be programmed by anybody in the house or by using
only tools found in the house.

Mark McIntyre

unread,
Sep 30, 2004, 4:53:02 PM9/30/04
to
On Wed, 29 Sep 2004 21:27:03 GMT, in comp.lang.c ,
gaz...@yin.interaccess.com (Kenny McCormack) wrote:

>In article <td9ml0p8l513gtqm9...@4ax.com>,
>Alan Balmer <alba...@spamcop.net> wrote:

>>What we don't understand is your overwhelming urge to claim they are
>>not computers.
>
>Because they are not.

Heck, even wikipedia gets this right.

"As currently defined by The Oxford English Dictionary, Second Edition
(OED2) a Computer is a device for making calculations or controlling
operations that are expressible in numercial or logical terms."

I see nothing there about whether it has an intel processor, keyboard,
display device etc.

>See Mark Twain story at bottom (one of my favorites).

And totally irrelevant. A tail is not a leg, so why call it one?

Mark McIntyre

unread,
Sep 30, 2004, 4:57:46 PM9/30/04
to

Mhm. When I was at school, I knew lots of people like you. They too thought
making complete tossers of themselves was fun. Most of them ended up
getting poor grades, pregnant or bain-damaged. Or all three.

*plonk*

QuantumG

unread,
Sep 30, 2004, 9:31:37 PM9/30/04
to
Alan Balmer <alba...@att.net> wrote in message news:<4cmgl0p63ptioklji...@4ax.com>...
>
> That's an incorrect assumption very early in your argument.

It's not an "incorrect assumption"; it's the declared charter of the C
standardization committee.

Chris Barts

unread,
Oct 1, 2004, 2:15:40 AM10/1/04
to
On Tue, 28 Sep 2004 17:15:02 +0000, Mike Wahler wrote:

>
> "Richard Bos" <r...@hoekstra-uitgeverij.nl> wrote in message
> news:415907da....@news.individual.net...
>> "Mike Wahler" <mkwa...@mkwahler.net> wrote:
>>
>> > <beli...@aol.com> wrote in message
>> > > Chris Barts <chb...@gmail.com> wrote in message
>> > >
>> > > > (and, to some extent, act like C) thrive, while Fortran and Cobol
> and Lisp
>> > > > descendants have failed. There's nothing to suggest that C is
> becoming any
>> > > > less relevant with time.
>> > >
>> > > I think the statement "Fortran has failed" is wrong.
>> >
>> > Yes it's wrong. But Chris did not make that statement. He stated:
>> > "Fortran and Cobol and Lisp descendants have failed."
>> >
>> > Whether this is true or not I have no idea, especially when
>> > 'failed' is a subjective issue.
>>
>> If he meant "Fortran and Cobol and (Lisp descendants)", it's clearly
>> untrue because of the first two languages.
>
> Yes.
>
>> However, I suspect he meant "(Fortran and Cobol and Lisp) descendants";
>
> Yes, that's how I interpreted it also. Perhaps Chris will clarify.

I could have been a lot clearer and much less wrong.

I meant that C not only had three successful lineal descendants
(Objective-C, C++, and C#), but that it inspired people to copy its syntax
(a freeform block structure, instead of line-oriented code, delimited with
curly-braces, instead of the Algolish begin-end blocks) and, to some
extent, semantics (using = for assignment and == for equality-test,
semicolon-as-separator instead of the Pascalish semicolon-as-terminator,
etc.).

In my second sense, Java and Perl can be considered spiritual descendants
of C to some extent. They, especially Perl, stripped out a lot of the
lower-level semantics of C, but they kept a lot of things and generally
made it as easy as possible for an experienced C programmer to get up to
speed in the language.

Fortran and Cobol are both, traditionally, line-structured languages with
strict formatting rules. They have syntax that didn't catch on much past
the 1970s (Fortran has whitespace rules that almost defy belief when a
younger programmer sees them, Cobol has an odd reliance on a strict
grouping of divisions and subdivisions within the code) and semantics that
are simply not seen in most modern languages (Fortran has great
array-handling primitives, and doing very precise decimal math in Cobol is
more natural than most of the control structures). To be short, most
languages (by number of languages, not by number of lines of code) simply
don't look or work like that anymore.

Lisp has had a more pervasive influence, but it's usually dressed up in
more appealing syntax. I don't know how successful Scheme has been, but
(and here's where I dropped the ball in my own eyes) functional
programming has become a lot more common over the past few decades. Lispy
ideas about high-level control structures (map, closures) and
garbage-collection have really caught on. You can even program essentially
functional (in the Lisp sense) code in Perl and other ultra-high-level
`scripting' languages. So while languages don't look like Lisp, they sure
as hell work like it.

I didn't say that Fortran, Cobol, and Lisp are dead. They might smell
funny, but they aren't gone forever. ;-) I did say that C inspired a whole
class of block-structured languages that look different from the
Algol-inspired and `other' languages of this kind.

Richard Bos

unread,
Oct 1, 2004, 2:23:09 AM10/1/04
to
gaz...@yin.interaccess.com (Kenny McCormack) wrote:

Be careful. Your fun can easily slide into misrepresentation, which can,
in some jurisdictions, leave you open to lawsuits.

Richard

Alan Balmer

unread,
Oct 1, 2004, 12:24:11 PM10/1/04
to
On 30 Sep 2004 18:31:37 -0700, q...@biodome.org (QuantumG) wrote:

>Alan Balmer <alba...@att.net> wrote in message news:<4cmgl0p63ptioklji...@4ax.com>...

<excessive snipping corrected>
> The point of my rant was
>>that a large majority of existing compilers should have conformed with
>>a large percentage of C99 back in '99. The purpose of releasing a
>>standard is to codify existing practice.
>>
<end of restored quote>


>> That's an incorrect assumption very early in your argument.
>
>It's not an "incorrect assumption" it's the declared charter of the C
>standization committee.

No, it's only one of 11 principles listed in the charter, and far from
the only reason to release a standard.

Julian V. Noble

unread,
Oct 1, 2004, 1:38:32 PM10/1/04
to
Kenny McCormack wrote:
>
[ snipped ]

> See Mark Twain story at bottom (one of my favorites).
>

> (Or, to put it a little more succinctly: Back at ya!)
>
> --- Begin Story ---
> Q: If you call a tail a leg, how many legs does a dog have?
> A: 4. Calling a tail a leg does not make it a leg.

It's an Abe Lincoln story, not a Mark Twain one. He said it in
a speech, possibly one of the debates with Douglas.

--
Julian V. Noble
Professor Emeritus of Physics
j...@lessspamformother.virginia.edu
^^^^^^^^^^^^^^^^^^
http://galileo.phys.virginia.edu/~jvn/

"For there was never yet philosopher that could endure the
toothache patiently."

-- Wm. Shakespeare, Much Ado about Nothing. Act v. Sc. 1.

Mark McIntyre

unread,
Oct 2, 2004, 4:23:02 PM10/2/04
to
On Thu, 30 Sep 2004 21:54:05 GMT, in comp.lang.c , CBFalconer
<cbfal...@yahoo.com> wrote:

>Mark McIntyre wrote:
>>
>... snip ...


>> Most of them ended up getting poor grades, pregnant or
>> bain-damaged. Or all three.
>

>Alexander Bain, 1818-1903, Scottish philosopher. To have been
>bain-damaged your schoolmates must be well into their second
>century. I didn't think you were that old :-)

From Wikipedia:
"Bain-damaged: forced to read the works of Alexander Bain until your brain
melts, or you become pregnant, or you begin to imagine you're a mattress.".

>(I thought it had to be a real word, but that's all I could find
>in my dictionary)

There was an Ed McBain in Once upon a Time in the West. Will he do?

Mark McIntyre

unread,
Oct 2, 2004, 4:23:28 PM10/2/04
to
On Thu, 30 Sep 2004 21:54:07 GMT, in comp.lang.c , CBFalconer
<cbfal...@yahoo.com> wrote:

>Mark McIntyre wrote:
>>
>... snip ...
>>

>> And totally irrelevant. A tail is not a leg, so why call it one?
>

>But a tale may well be a leg end :-)

whitespace is significant in C....

Michael Wojcik

unread,
Oct 2, 2004, 5:02:51 PM10/2/04
to

In article <cjf5bq$58$1...@yin.interaccess.com>, gaz...@yin.interaccess.com (Kenny McCormack) writes:
>
> I can't see much point in doing new development on conventional computers
> (aka, "non-embedded systems") in C or other low-level languages, except of
> course for things like OSes and device drivers (and any other obvious
> exceptions to my generalization).

Three reasons for new development in C on general-purpose systems:

1. Portability. There's a well-supported standard C implementation
for all of the many platforms we deliver software on. (We use the OS
vendor's commercial implementation, except on Linux.) Until
relatively recently, C was the only language for which that was true.

2. Interoperability with existing code base. Like many
organizations, we have a large C code base (thanks to #1). Those
products need to be enhanced and extended, and C is the best choice
for much of that work because it's the least disruptive.

3. Expertise. For most projects there's no compelling reason to
discard thousands of programmer-years of C experience.

--
Michael Wojcik michael...@microfocus.com

[After the lynching of George "Big Nose" Parrot, Dr. John] Osborne
had the skin tanned and made into a pair of shoes and a medical bag.
Osborne, who became governor, frequently wore the shoes.
-- _Lincoln [Nebraska] Journal Star_

jacob navia

unread,
Oct 2, 2004, 5:27:34 PM10/2/04
to
You mention very good reasons.

There are others.

What I like about C is that it is not object-oriented, nor
functional, nor a list-processing language, nor are arrays
really the "basic object" as in APL.

C is agnostic. It has no preferred data model; that is up to
the programmer.

This is quite a freedom. There are no preconceived notions, and
you can build anything.

This has its downsides. It lacks parameterized types and
polymorphism. But incorporating these and a few other changes
could make C an even nicer language to use. It is much simpler
than many others, like C++.
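
A minimal sketch of that agnosticism in practice: the standard
library's qsort is generic over any element type, at the price of
going through void pointers instead of parameterized types.

#include <stdio.h>
#include <stdlib.h>

/* qsort knows nothing about the element type: the caller supplies
   the element size and the ordering, so one routine sorts anything. */
static int cmp_int(const void *a, const void *b)
{
    int x = *(const int *)a;
    int y = *(const int *)b;
    return (x > y) - (x < y);
}

int main(void)
{
    int v[] = {3, 1, 2};

    qsort(v, sizeof v / sizeof v[0], sizeof v[0], cmp_int);
    printf("%d %d %d\n", v[0], v[1], v[2]);
    return 0;
}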

C is a general purpose language, and many programming languages
are written in C.

Contrary to the opinion of many in this list, I think that C
has a great potential.

Not only for embedded systems, or maintenance code but for
developing new software.

Randy Howard

unread,
Oct 2, 2004, 5:40:22 PM10/2/04
to
In article <415f1d47$0$3058$8fcf...@news.wanadoo.fr>, ja...@jacob.remcomp.fr
says...

> Contrary to the opinion of many in this list, I think that C
> has a great potential.

Do you really think there are many people who would bother subscribing
to a newsgroup (this is not a list, btw) who don't think it has
potential?

As soon as a better language comes along to write operating systems,
you be sure and let us know. :-)

--
Randy Howard (2reply remove FOOBAR)

Guillaume

unread,
Oct 2, 2004, 5:47:26 PM10/2/04
to
> Contrary to the opinion of many in this list, I think that C
> has a great potential.
>
> Not only for embedded systems, or maintenance code but for
> developing new software.

I second that.

Not to mention that most "higher level" languages invariably
mean more bloat. I hear people claim otherwise on a regular
basis (especially about C++), but no one has ever been able to
prove it. Never.

This bloat may make it possible to put extra bells and whistles all
over the place in less development time, but is that really useful?
No other language can beat the overall efficiency of C. That's
the bottom line, in my opinion.

And the fact (which other people have mentioned) that the C
standard is of much higher quality than any other language's
(despite what we may read here and there every once in a while)
is just as big a plus.

Mike Wahler

unread,
Oct 4, 2004, 2:42:35 PM10/4/04
to
"Keith Thompson" <ks...@mib.org> wrote in message
news:lnk6uce...@nuthaus.mib.org...

> gaz...@yin.interaccess.com (Kenny McCormack) writes:
>
> (This is the second forged quotation I've seen today; what's going on
> around here?)

It's coming up on election time in the U.S., of course. :-)

-Mike


Chris Barts

unread,
Oct 15, 2004, 4:38:36 AM10/15/04
to
Dave Thompson wrote:
> On Fri, 01 Oct 2004 00:15:40 -0600, Chris Barts <chb...@gmail.com>
> wrote:
> <snip>

>
>>I meant that C not only had three successful lineal descendants
>>(Objective-C, C++, and C#), but that it inspired people to copy its syntax
>>(a freeform block structure, instead of line-oriented code, delimited with
>>curly-braces, instead of the Algolish begin-end blocks) and, to some
>>extent, semantics (using = for assignment and == for equality-test,
>>semicolon-as-separator instead of the Pascalish semicolon-as-terminator,
>>etc.)).
>>
>
> You stated that last backwards -- C terminator, Pascal separator.

Very likely true. I never fully learned the semicolon rules under
Pascal, because I never fully learned Pascal.

> And
> I would count = and ; as syntax issues also; for C's semantic
> influence I would count pass-by-value at least as default, nonzero as
> boolean true (although LISP had nonNIL), and to an extent bit
> operations on integers (some other languages supported this, but not
> as conveniently). And maybe value-returning assignment -- I _think_
> algol68 did that, but no other "normal" (procedural) HLL.

Again, all very true. C has set a lot of standards, and it seems that
they are reflexively thought of as being How It's Done even though
plenty of languages do/did it differently.
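
The value-returning assignment mentioned above is probably most
familiar from the classic C input loop -- a minimal sketch:

#include <stdio.h>

int main(void)
{
    int c;

    /* The assignment (c = getchar()) is itself an expression;
       its value is what gets compared against EOF. */
    while ((c = getchar()) != EOF)
        putchar(c);
    return 0;
}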

>
> While your list was unique to C, which is what you actually say as I
> read it, the individual features aren't. Algol and PL/I were freeform,
> the latter with terminator.

Yes, I meant for the whole list to stand as a unit, as a list of things
that were all present in C and are also all present in a lot of other
languages designed since C.

> AIUI algol 68 was so lax on representation
> you could have { } "spell" do od, although you probably wouldn't have
> been very popular.

How styles change, neh? ;) And I was always fascinated by how Algol-68
could be so lax on the details of how the language was represented.

> FORTRAN used = and .EQ.; COBOL used = for both
> (although assignments are also done by "verb" forms that don't use =).

I have programmed FORTRAN and COBOL (note caps). These are both very true.

>
>
>>In my second sense, Java and Perl can be considered spiritual descendants
>>of C to some extent. They, especially Perl, stripped out a lot of the
>>lower-level semantics of C, but they kept a lot of things and generally
>>made it as easy as possible for an experienced C programmer to get up to
>>speed in the language.
>>
>

> I would say (executable) Java and the action part of awk were
> deliberate "cleanups" of C; csh and perl somewhat less so -- that is,
> they used some features of C, but also their own dissimilar ones.

Yes, precisely. (Although isn't Java a `better C++'? ;))

>
>
>>Fortran and Cobol are both, traditionally, line-structured languages with
>>strict formatting rules. They have syntax that didn't catch on much past
>>the 1970s (Fortran has whitespace rules that almost defy belief when a
>>younger programmer sees them, Cobol has an odd reliance on a strict
>>grouping of divisions and subdivisions within the code)
>
>

> FORTRAN did in the days it was all-caps; the versions known by Fortran
> (>= 1990) _also_ allow "modern" (or at least mainstream) syntax,
> except with explicit continuation.

Yes, I was lax in my spelling. (But, to be fair, FORTRAN is
case-insensitive, not necessarily all-caps.)

> COBOL did (at least through 1985, I
> haven't examined the most recent version) require certain things
> (divisions, sections, IIRC level numbers) to start at the beginning of
> a line, but otherwise spanned lines and needed explicit continuation
> only within a string (and FWIW standard C effectively does also -- it
> requires breaking the literal and letting the pieces be concatenated)
> and allowed multiple statements and IIRC even sentences on a line
> although they were mostly verbose enough it would often be an
> inconveniently long line.

COBOL is inconvenient? Since when? :)
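
For reference, the literal-splitting the quote describes is ordinary
C: adjacent string literals are concatenated at translation time. A
minimal sketch:

#include <stdio.h>

int main(void)
{
    /* The two adjacent literals become one string constant. */
    printf("This is one logical string "
           "split across two source lines.\n");
    return 0;
}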

> COBOL's required ordering is in my view not
> that different from declarations-before-code in algol, FORTRAN, and C
> within a block before C99; and Ada, and Pascal which in its classic
> form has grouping/ordering at least as strict.

True. But C and FORTRAN don't create explicit divisions within the code,
so they at least look free-form with respect to where declarations can go.

>
> Original BASIC was strictly line-oriented. With = for both BTW.
> APL also was line-oriented. And I believe MUMPS. (And RPG.)

APL is in a world of its own, and I don't think it's used much these
days. BASIC has gone through so many fundamental revisions and
nonstandard extensions it makes Pascal look like a single, unified
whole. I've never used MUMPS or RPG, nor have I ever seen sample code in
either of those languages.

>
>
>>and semantics that
>>are simply not seen in most modern languages (Fortran has great
>>array-handling primitives, and doing very precise decimal math in Cobol is
>>more natural than most of the control structures). To be short, most
>>languages (by number of languages, not by number of lines of code) simply
>>don't look or work like that anymore.
>>
>

> The Fortran array enhancements (aka HPF) are standard only since F90.

Huh. I thought FORTRAN had always had array-handling primitives.

> PL/I and Ada (already) incorporated pretty much both of these, and APL
> arguably did even better on arrays, but those aren't modern (at least
> in the sense of recent) either.

PL/I and Ada implemented pretty much everything from their respective
eras. ;) And, as I said, APL is Lisp translated to mirror-Greek for math
geeks.

>
> I wouldn't call even the original COBOL control structures unnatural
> -- admittedly a subjective judgement -- although they tend to be
> fragile and definitely aren't orthogonal (arguably elegant); -85 adds
> "structured" but (unsurprisingly) still rather verbose options.

COBOL has PERFORM THROUGH and (in some versions) the ALTER verb. Hardly
a threat to if ... else or while.

And then we have the old joke:

The newest language in the COBOL family has object-oriented semantics.
It's called ADD 1 TO COBOL GIVING COBOL.

Dave Thompson

unread,
Oct 25, 2004, 1:39:43 AM10/25/04
to
On Fri, 15 Oct 2004 02:38:36 -0600, Chris Barts <chb...@gmail.com>
wrote:

> Dave Thompson wrote:
<many points of agreement snipped>

> > I wouldn't call even the original COBOL control structures unnatural
> > -- admittedly a subjective judgement -- although they tend to be
> > fragile and definitely aren't orthogonal (arguably elegant); -85 adds
> > "structured" but (unsurprisingly) still rather verbose options.
>
> COBOL has PERFORM THROUGH and (in some versions) the ALTER verb. Hardly
> a threat to if ... else or while.
>

True, but at least the latter is a crock no one with sense would think
of as the natural solution to any but the weirdest problem. For me, the
original control structures, used in the "obvious" way, are clear
enough, although when you (need to) change the algorithm even modestly
you often must rearrange the code (e.g. put a sentence, or a PERFORM
body, out-of-line), which is what I meant by "fragile".

And as I said, COBOL-85 adds "properly structured" forms:
IF condition THEN body can contain nested forms [ELSE etc] END-IF
PERFORM VARYING var etc UNTIL cond body ditto END-PERFORM
(Plus a bunch of specialized ones like READ ... [NOT] AT END ... .)
These are isomorphic to, though often more verbose than, the algol/
Pascal/C etc. equivalents. On the PERFORM you can choose TEST BEFORE =
C while, Pascal while do or TEST AFTER = C do while, Pascal repeat
until. Plus EVALUATE which is like a case (C switch) statement on LSD.
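
In C terms, the two PERFORM test options map onto the two loop
forms -- a minimal sketch:

#include <stdio.h>

int main(void)
{
    int i = 0;

    /* TEST BEFORE: the condition is checked first, so the
       body may execute zero times. */
    while (i < 3) {
        printf("while: %d\n", i);
        i++;
    }

    i = 0;
    /* TEST AFTER: the body always executes at least once
       before the condition is checked. */
    do {
        printf("do while: %d\n", i);
        i++;
    } while (i < 3);

    return 0;
}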

And for that matter also nested procedures (aka program-units) a la
algol, PL/I, Pascal, Ada, but not C et seq; although with the clunky
syntax overhead (at least 10 lines or so) for each, and outer-unit
variables are NOT accessible unless you declare them GLOBAL -- which
actually isn't GLOBAL in the normal meaning, only for a subtree; what
other folks sometimes call global COBOL calls EXTERNAL!

> And then we have the old joke:
>
> The newest language in the COBOL family has object-oriented semantics.
> It's called ADD 1 TO COBOL GIVING COBOL.

It's more "OO" <G>, and more idiomatic, to just ADD 1 TO COBOL .

I believe this actually happened in 2002(?), along with other less
major changes, but I haven't looked seriously through the result yet.

- David.Thompson1 at worldnet.att.net

Paul Hsieh

unread,
Oct 25, 2004, 7:34:28 PM10/25/04
to
Dan...@cern.ch (Dan Pop) wrote:
> Derrick Coetzee <dcn...@moonflare.com> writes:
>
> >Now Java, C++, Visual Basic, and others are (each!) more prevalent.
>
> Do you have some hard data to back up this statement? Whenever I look at
> some open source project, I see C source code. Exceptionally Fortran or
> C++.

http://sourceforge.net/softwaremap/trove_list.php?form_cat=160
http://www.tiobe.com/tpci.htm

While neither is really "hard" data, they are at least independent
measures that show nearly the same thing: C, C++, and Java are all
neck and neck. I'm not sure why you are seeing any appreciable
amount of Fortran, but I'll take these figures over anecdotal claims
about what you have or haven't seen.

--
Paul Hsieh
http://www.pobox.com/~qed/
http://bstring.sf.net/

Rob Thorpe

unread,
Oct 26, 2004, 5:03:15 AM10/26/04
to
webs...@gmail.com (Paul Hsieh) wrote in message news:<d0a001b7.04102...@posting.google.com>...

This shows projects overall, but it is interesting to look at
distributions. They contain fairly well-used and stable software, not
necessarily new software, but they show what is used in practice.
David A. Wheeler did a source-line count on Red Hat 7.1. These are his
results:

C 71.18%
C++ 15.18%
Shell (Bourne-like) 2.63%
Lisp 2.40%
Assembly 1.88%

See http://www.dwheeler.com/sloc/redhat71-v1/redhat71sloc.html

The great masses of C code are the kernel, gcc, binutils, and X.
