>On 11 Oct 95, he then wrote:
> From what I have seen most major research universities have/are
> abandoning all operating systems and languages except Unix and C/C++
> (except maybe some useless theoretic masturbatorial crap).
>Coincidentally, I just stumbled over the "FIRST-COURSE LANGUAGE FOR
>COMPUTER SCIENCE MAJORS" list, which is maintained by Dick Reid and
>available at ftp.cps.msu.edu:pub/arch/CS1_Language_List.Z. This list
>surveys the languages used in the first course for Computer Science
>majors.
>The current (13th) edition lists 442 institutions. It turns out that
>only 39 of those institutions use C as the first language, and 34 C++.
>Pascal, on the other hand, is used by more than twice as many
>institutions as C and C++ combined!
>Here is a summary by language:
> Pascal: 157 institutions
> Ada: 73 institutions
> Scheme: 50 institutions
> C: 39 institutions
> Modula: 35 institutions
> C++: 34 institutions
> Modula-2: 13 institutions
> Fortran: 8 institutions
> Turing: 6 institutions
> SML: 6 institutions
> Miranda: 4 institutions
> Eiffel: 3 institutions
> Modula-3: 2 institutions
> Oberon: 2 institutions
> ISETL: 2 institutions
> ObjPascal: 1 institution
> ML: 1 institution
> Smalltalk: 1 institution
> Beta: 1 institution
> Prolog: 1 institution
> Simula: 1 institution
> Haskell: 1 institution
> Orwell: 1 institution
> Total: 442 institutions
What language is used for the first class is not as important as
which languages are used in the core classes and which languages
are used in research. If you take your list and restrict it to
the top 40 US CS departments, you get:
01 C Stanford University
02 Scheme Massachusetts Institute of Technology
03 Scheme University of California, Berkeley
04 C Carnegie Mellon University
05 Pascal Cornell
06 Scheme Princeton University
07 Pascal University of Texas, Austin
08 Scheme University of Illinois
09 C University of Washington
10 Pascal/C University of Wisconsin, Madison (2/3-term Pascal, 1/3-term C)
11 C Harvard (Intro I, CS50).
12 C Cal Tech (CS1)
13 ObjPascal Brown University
14 C++ University of California, Los Angeles
15 Scheme Yale University
16 Pascal University of Maryland, College Park
17 Pascal New York University
18 Pascal University of Massachusetts Amherst
19 Scheme Rice University (C last half)
20 C University of Southern California
21 C University of Michigan (1/4 Pascal, then C)
22 Scheme Columbia University
23 Pascal University of California, San Diego
24 Scheme University of Chicago
25 Pascal University of Pennsylvania
26 C++ Purdue University
27 Pascal Rutgers University
28 Pascal University of North Carolina, Chapel Hill
29 ????? University of Rochester
30 Modula-3 State University of New York, Stony Brook (CS1)
31 Pascal Georgia Tech
32 Pascal University of Arizona
33 Pascal University of California, Irvine
34 C++ University of Virginia
35 Scheme Indiana University
36 Scheme Johns Hopkins
37 Scheme Northwestern University
38 Modula Ohio State
39 Pascal University of Colorado, Colorado Springs
40 Scheme University of Utah
Look mom no Ada and little Modula*!
First, Pascal is an ancient joke; you can't really do anything real in
it today, and in my opinion those schools that teach it are too
bleeping lazy to change CS1 to a practical language. (My
undergraduate institution, UCI, said they were going to convert to Ada;
that was 10+ years ago.) Pascal's antiquated design makes wonderful
fodder for C programmers to ridicule Pascal-style languages ({} vs.
begin/end, etc.). It's a temporary language which quickly gets switched
to the real "core" language, which is most likely to be C/C++.
Scheme is also a theoretic joke. It's for those computer scientists
who think programming is non-scientific vomit and want to shoe-horn
all the pathetic programming "CONCEPTS OF COMPUTER SCIENCE" into a
single quarter so they won't have to teach it very often. (They don't
care that they have 4-5 years to teach programming/software
engineering...). Thus, Scheme is also a temporary concepts language
(which, like Pascal, through its worthlessness promotes the conversion
of students into C bigots). Depts like Berkeley and UCLA quickly
switch to C in the second class (CS2) and basically use C for the
rest of the way (UCLA has now killed Scheme for the first class and
teaches only C++).
So basically for most universities in the above list, you can just change
Pascal => Pascal/C and Scheme => Scheme/C. C is thus around 100%.
From what I have seen in research, Unix/C/C++ almost completely
dominates most software-oriented parts of computer science, like
operating systems, etc.
Jay.
Make that 158. All math and CS students at the University of Waterloo,
Ontario, Canada, learn Pascal as their first language here.
--
-------------------------------------------------------------
Carsten Whimster CS241 Tutor MC1021 x6657 cs241@cayley
-------------------------------------------------------------
>> Here is a summary by language:
>> Pascal: 157 institutions
>> Ada: 73 institutions
>> Scheme: 50 institutions
>> C: 39 institutions
>> Modula: 35 institutions
>> C++: 34 institutions
>> Modula-2: 13 institutions
>> Fortran: 8 institutions
>> Turing: 6 institutions
>It seems evident that the listing of Modula and Modula-2 ought to be
>combined as the first clearly should be the second, making a total of 48
>for this language.
>
This list is just quoting data from the Reid Report, in which Richard
Reid of Michigan State (re...@cps.msu.edu) tracks languages used in the
intro course (to the extent that people report such to him).
Dick reports only what people tell him. If a school says it's using
Modula, he reports Modula. In his shoes, I'd do the same. :-)
My guess is that he is preparing a new release of the report, which he
posts several times a year to comp.edu. He takes only first-hand
information, so if you have anything to report, you should do so now.
Mike Feldman
------------------------------------------------------------------------
Michael B. Feldman - chair, SIGAda Education Working Group
Professor, Dept. of Electrical Engineering and Computer Science
The George Washington University - Washington, DC 20052 USA
202-994-5919 (voice) - 202-994-0227 (fax) - mfel...@seas.gwu.edu (Internet)
------------------------------------------------------------------------
Fight FUD with Fact
------------------------------------------------------------------------
Ada on the WWW: http://lglwww.epfl.ch/Ada/ or http://info.acm.org/sigada/
------------------------------------------------------------------------
[...]
>What language is used for first class is not as important as
>which languages are used in the core classes and which languages
>are used in research. If you take your list and sort by
>the top 40 US CS departments, you get:
[...]
>37 Scheme Northwestern University
Not to be a nitpicker, but there are two equivalent intro-to-programming
sequences at NU, so this list is a little oversimplified. (I'm sure the same
is true for some of the other institutions on the list.) I am taking the
C/C++ sequence, A10 and B30. There is also A11/B11, which teaches using
Scheme and C++. To my knowledge, as far as operating systems go, everybody's
encouraged to use whatever they want. The labs have DOS and UNIX compilers,
and I actually did my programming on a Mac, under CodeWarrior version 5.
-Tony
______________
Anthony W. Becker "If we wish not to go backwards,
CAS Computer Science we must run."
sho...@nwu.edu -Pelagius
708.332.9179
What do you call UNIX? Sure looks awfully C-like to me, and it does multi-tasking
just fine... Also, I'm just starting to learn a parallel processing language
called MPL, which is almost 100% ANSI C with some neat extensions. The compiler
is even made by the FSF.
If you're doing research, you're probably less concerned with languages and
more concerned with algorithms for getting the job done. I could be wrong
on this, not being a researcher myself...
--
"If it wasn't for C, we would be using BASI, PASAL, and OBOL."
Jonas J. Schlein (sch...@gl.umbc.edu)
Naivety strikes again :-)
I am sorry to say that, at least in universities, researchers most definitely
*do* use a language simply because of its popularity. They may, to be fair,
have some reasons for doing so:
A popular language may have more tools
A popular language may have free compilers to play with
They may have learned this popular language in school
There are MANY MANY people doing research on concurrency who simply
don't know Ada (but are sure it is junk).
I find the last thought, that it is obvious to do an OS in C, to be
an entertaining self-referential statement along these lines. Why would
one only consider C to do an OS, could it possibly have something to
do with tradition and popularity? :-) :-)
It is of course perfectly reasonable and technically possible to write
an operating system in Ada 95, and indeed Ada seems a far better language
for such a project than C. Any of the low-level stuff you can do in C,
you can do in Ada 95, often more easily (e.g. record rep clauses to
lay out data explicitly), and certainly more clearly!
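For comparison, the usual C tool for explicit layout is a struct with bit-fields, roughly like the sketch below (a hypothetical device status register, invented for illustration). Note that C leaves the ordering and padding of bit-fields implementation-defined, which is exactly the sort of thing a record rep clause lets you pin down.

    /* Hypothetical device status register described with C bit-fields.
       The C standard leaves bit-field order and padding to the compiler,
       so this layout is only as portable as your toolchain. */
    struct status_reg {
        unsigned int ready    : 1;
        unsigned int error    : 1;
        unsigned int err_code : 4;
        unsigned int reserved : 10;
    };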
>I am sorry to say that, at least in universities, researchers most definitely
>*do* use a language simply because of its popularity. They may, to be fair,
>have some reasons for doing so:
>A popular language may have more tools
>A popular language may have free compilers to play with
>They may have learned this popular language in school
How true. They may also be blithely unconcerned with choosing the
"right" language, rather they'll just pick the one they know, thinking
it'll get the job done just as well.
>There are MANY MANY people doing research on concurrency who simply
>don't know Ada (but are sure it is junk).
One hopes this will change with the availability of GNAT. To be fair to
the researchers, they have built zillions of big and small C extensions
and dialects, because the C compilers, preprocessor, etc., were there
in source form for them to hack on. For this we can thank other nicely
funded efforts, such as those at AT&T and Berkeley (the latter funded
largely by ARPA).
An example of this is Concurrent C, which was done by Gehani and friends
at Bell Labs, in around 1986. This glued an extended and (IMHO) improved
Ada tasking model onto C, and was implemented as a preprocessor. Their
AT&T "religious" connections aside, all the tools were there to make
this development easy. Later they turned the preprocessor into a
compiler, presumably using some C compiler as a baseline on which to build.
Indeed, C++ started this way too, as a preprocessed "C with classes"
at Bell Labs, that later turned into a compiler. :-)
For reasons having (IMHO) mostly to do with DoD myopia, nothing
comparable was funded to completion for Ada in the early/mid 80's.
There was Ada/Ed, but this was not intended as a true compiler.
The true compilers were ALS and AIE, but these were contracted as
full-bore, industrial-strength things that turned out to be disasters
and were in any case overtaken by events by the time they were delivered.
I saw a _reliable_ estimate of $125 million that went into the ALS
sinkhole before it was finally cancelled.
The only thing remotely comparable to the C research compilers was
Arcturus, part of the Arcadia project at UC Irvine in the early 80's.
As I recall, this was ARPA-funded but (apparently) the funding for
it was cut off shortly before it would have been viable. I have never
been able to find out just why it was snuffed out, but have heard hints
that the decision was political (ARPA's priorities changed).
The sources for the early Pascal and Modula compilers were given away (or
sold for a pittance), and this helped stimulate research in those areas,
as well as fostering commercial implementations that could use the
freebies at least indirectly as "reference implementations".
I did not follow the early Pascal stuff, but knew of several Modula-2
houses who started their commercial compilers by purchasing Wirth's
original sources.
So now, with Ada 95, we have GNAT, and thank heavens for that.
Now researchers have an honest-to-goodness, nearly-validated,
multi-host, multi-target Ada 95 compiler to hack on. :-)
Mike Feldman
>More fun talking to a UC Berkeley CS graduate today. In my ...
I don't teach. I've been programming in the embedded real-time area
since, literally, before most current college students were born. So
I have no idea what is being taught in schools. And, IMHO, knowledge
of a particular language is usually a prerequisite for a programming
job.
The following are the languages that I would look for on a resume.
In my shop we use C and Visual Basic. There is still a need for Fortran
in places which are heavily math-intensive. Ada is used within the
defense-related community. C++ is used extensively, although it is
just beginning to be considered in the embedded arena. COBOL is not yet
dead within the DP industry. There is a limited need for ASM at the
hardware/driver level (usually an engineering application, not CS). And
that's really about it for commercially usable languages.
My most marketable (embedded real-time) language skill is C.
--
Roy_Wi...@Milacron.com
Any opinions in this posting are my own and not those of my present
or previous employers.
>This summary needs one slight correction. The language "Modula"
>is obsolete, and I doubt that anyone is using it. Almost
>certainly all the responses saying "Modula" were referring to
>Modula-2. In other words, the Modula and Modula-2 figures
>refer to the same language, and should be added together.
That is for Dick Reid to decide. He has just published a new
edition on comp.edu. I will not bother to cross-post it here,
as you can read it there. "Modula" is still given there. Dick
reports the information people send him, and probably has
neither time nor inclination to track down the "Modula"
places and ask them. I suggest you write to Dick.
>Somebody else pointed out that you get a different picture if
>you look at the "top universities". The catch here is that only
>US universities were mentioned. One very clear point that comes
>out of the list is that Modula-2 is very popular outside the
>US, but that US institutions seem to favour C/C++.
Some do. Maybe someone can take the new list and sort it by
country.
We do need to be careful not to lie with statistics. This list
may well be biased, as it lists those institutions that have faculty
or students who are active on the net and read comp.edu, or
hear about Dick's tracking from other sources. We have no way to
know how representative it is, so I'd be careful about bandying these
numbers around as though they were Revealed Truth.
Mike Feldman
>In my shop we use C and Visual Basic. There is still a need for Fortran
>in places which are heavily math-intensive. Ada is used within the
>defense-related community.
I agree with your statements, for the most part. Since this is a heavily
cross-posted note, and these groups are international, I thought I'd
put a few Ada facts on the record. It is true that Ada is used in
the "defense community", but you might be surprised by some things
on the attached list.
Mike Feldman
---
Interesting Projects (mostly non-defense)
in which Ada is used to at least a significant degree.
I am just getting started with this categorization by domain;
I know the list is incomplete. I am very interested in getting
additions, corrections, and additional domains; I want the data
to be current and verifiable.
Michael B. Feldman
chair, SIGAda Education Working Group
Professor, Dept. of Electrical Engineering and Computer Science
The George Washington University
Washington, DC 20052 USA
202-994-5919 (voice)
202-994-0227 (fax)
mfel...@seas.gwu.edu (Internet)
Air Traffic Control Systems, by country
Australia
Belgium
Brazil
Canada
China
Czech Republic
Denmark
Finland
France
Germany
Greece
Hong Kong
Hungary
India
Ireland
Kenya
Netherlands
New Zealand
Pakistan
Scotland
Singapore
South Africa
Spain
Sweden
United Kingdom
United States
Vietnam
Banking and Financial Networks
Reuters
Swiss Postbank Electronic Funds Transfer system
Commercial Aircraft
Airbus 330
Airbus 340
Beechjet 400A (US business jet)
Beech Starship I (US business turboprop)
Beriev BE-200 (Russian forest fire patrol)
Boeing 737-200, -400, -500, -600, -700, -800
Boeing 747-400
Boeing 757
Boeing 767
Boeing 777
Canadair Regional Jet
Embraer CBA-123 and CBA-145 (Brazilian-made regional airliners)
Fokker F-100 (Dutch DC-9-size airliner)
Ilyushin 96M (Russian jetliner)
Saab 2000
Tupolev TU-204 (Russian jetliner)
Communication and Navigational Satellites
Cassini
ENVISAT-1 - European Space Agency, Earth observation satellite
EOS - NASA's Earth Observing System
Goes
INMARSAT - voice and data communications to ships and mobile communications
Intelsat VII
NSTAR (Nippon Telephone and Telegraph)
PanAmSat (South American Intelsat-like consortium)
RadarSat (Canada)
United States Coast Guard Differential Global Positioning System
Railway Transportation
Cairo Metro
Calcutta Metro
Caracas Metro
Channel Tunnel
Conrail (major U.S. railway company)
French High-Speed Rail (TGV)
French National Railways
Hong Kong Suburban Rail
London Underground
Paris Metro
Paris Suburban Rail
Television Industry
Canal+ (French pay-per-view TV, remote cable box control software)
Sounds like sour grapes from the losers to me. Highly effective
languages like Perl have no trouble penetrating the Unix community,
displacing C for many systems programming applications. The reason
that Ada, Modula, Eiffel, and the rest haven't made inroads is because
they aren't good enough. End of story. Not good enough means one or
more of: the compilers cost too much, are too big and slow, produce
executables that are too big and slow, are pointlessly non-ergonomic
(upper CASE keyword disease), don't interface gracefully with the OS
or with C, don't live up to their own claims ("safe" languages don't
abort when you call NEW), and, most importantly, no cool programs ever
seem to be written in these languages. If Doom were written in Ada,
that would get people interested.
Hard to read, hard to type. Like I said, ergonomics counts,
especially given an entrenched champion. And that was just one factor
among many.
| IMO, there are two major reasons why you should choose a language:
|
| 1) speed.
| 2) portability.
Those aren't bad reasons. Quality of implementation, in general, is
very important.
| Also, your last comparison is meaningless. If Doom was produced by the DoD,
| then it would have been made in Ada.
No, my last point was valid. If people write interesting software in
a particular language and tell other people about it, other people
will make decisions based on that. Advertise your success stories.
: Hard to read, hard to type. Like I said, ergonomics counts,
: especially given an entrenched champion. And that was just one factor
: among many.
: | IMO, there are two major reasons why you should choose a language:
: |
: | 1) speed.
: | 2) portability.
The first point gives another reason: readability. Most programs are written
once and read many times. Using a language with obfuscated syntax generates
programs which tend to be write-only, i.e. non-maintainable.
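A contrived C illustration (not from any real project): both functions below copy a string, but only one of them is pleasant to come back to six months later.

    #include <stdio.h>

    /* Write-only style: */
    static void copy_terse(char *d, const char *s)
    {
        while (*d++ = *s++)
            ;
    }

    /* The same operation, written to be read: */
    static void copy_readable(char *dst, const char *src)
    {
        while (*src != '\0') {
            *dst = *src;
            dst++;
            src++;
        }
        *dst = '\0';
    }

    int main(void)
    {
        char a[16], b[16];

        copy_terse(a, "hello");
        copy_readable(b, "hello");
        printf("%s %s\n", a, b);
        return 0;
    }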
--
Colin Walls |
Colin...@barclays.co.uk (work)| The deed is everything,
Co...@murorum.demon.co.uk (home) | the glory nothing.
Tel: 01565-614531 |
Gee. I guess it must be impossible. Unix must have been written in
Modula 2.
How do you think it's done?
-s
--
Peter Seebach - se...@solon.com || se...@intran.xerox.com --
C/Unix proto-wizard -- C/Unix questions? Send mail for help. No, really!
Copyright 1995 Peter Seebach. -- High energy particle theology
The *other* C FAQ - ftp taniemarie.solon.com /pub/c/afq
(For one, the Bourne shell syntax isn't nearly obscure enough.)
Um, perhaps if you think that the Algol 68 syntax is (a) obscure and (b) unrelated
to the Bourne shell, you do NOT quite have the knack of extracting the
context-free grammar from the W-grammar yet!
There is nothing obscure about Algol 68 syntax; it is really a very straightforward
Pascal-like syntax (actually closer in most respects to Ada than to Pascal).
----snip----
> The sources for the early Pascal and Modula compilers were given away
Does anyone know whether these are still available? And if so, where?
Ned
In <SCHWARTZ.95...@galapagos.cse.psu.edu> Scott Schwartz wrote:
> | Also, your last comparison is meaningless. If Doom was produced by the DoD,
> | then it would have been made in Ada.
>
> No, my last point was valid. If people write interesting software in
> a particular language and tell other people about it, other people
> will make decisions based on that. Advertise your success stories.
Wasn't Doom actually first written in Objective C on a NeXT? Didn't
do much for that language, did it?
----------------------------------------------------------------
Bob Love, rl...@neosoft.com (local) MIME & NeXT Mail OK
rl...@raptor.rmnug.org (permanent) PGP key available
----------------------------------------------------------------
--
|Fidonet: Robert B. Love 4:901/215.99
|Internet: Robert.B..Love@p99.f215.n901.z4.fidonet.org
Or the platform :)
For the purposes of inheriting fame from the game Doom, though, I am
sure that C and C++ advocates will welcome Objective-C to the fold, at
least for a little while ;)
Don't forget too that all the snap and speed of Windows NT has
something to do with it being written in C.
> In article <SCHWARTZ.95...@galapagos.cse.psu.edu> schw...@galapagos.cse.psu.edu (Scott Schwartz) writes:
>
> >Sounds like sour grapes from the losers to me. Highly effective
> >languages like Perl have no trouble penetrating the Unix community,
> >displacing C for many systems programming applications. The reason
> >that Ada, Modula, Eiffel, and the rest haven't made inroads is because
> >they aren't good enough. End of story. Not good enough means one or
> >more of: the compilers cost too much, are too big and slow, produce
> >executables that are too big and slow, are pointlessly non-ergonomic
> >(upper CASE keyword disease), don't interface gracefully with the OS
> >or with C, don't live up to their own claims ("safe" languages don't
> >abort when you call NEW), and, most importantly, no cool programs ever
> >seem to be written in these languages. If Doom were written in Ada,
> >that would get people interested.
Good point. I don't think the FAA's air traffic control system or the
software used inside Boeing airplanes qualify as "cool" programs.
>
> What the hell does it matter what case a keyword is? It makes no sense to me
> why that would be a reason not to use a language. IMO, there are two
> major reasons why you should choose a language:
>
> 1) speed.
> 2) portability.
Interesting. I guess "readability" and "maintainability" count for
nothing; nor do features that help a programmer write correct
programs. In other words, all the features that help you make sure
your program works correctly and continues to work correctly when the
program's functionality is changed or enhanced.
But then again, if there are bugs in Doom, it's not going to cause
loss of life or injury or anything.
Just more stuff to add to my conviction that languages like C are
great if you don't particularly care how well your programs work.
-- Adam
Sources for the SRC Modula-3 compiler can be found at:
gatekeeper.dec.com:/pub/DEC/Modula-3
I think there are sources for other compilers somewhere on gatekeeper as
well.
-- Farshad
--
Farshad Nayeri
nay...@gte.com
"Improvement" is a debateable term. It's a *different* language. C is
smaller and simpler. They can do exactly the same things; just
differently. C is sften better for small code or embedded systems; C++ may
be better for large systems. Whatever you know is a good choice.
I prefer C, until C++ settles down more. I really like some of C++,
and hate some of it. YMMV.
Hughes' commercial communication satellites, which are based
on the HS-601 satellite bus, use flight control software written
in Ada. (This is their latest and most popular model; there are
>My most marketable (embedded real-time) language skill is C.
I hate to infect a good argument with a newbie question, but I seem to
be missing the point. I was under the impression that C++ was an
improvement over C. So why would C be considered more marketable?
Are there things that you can do in C that you cannot in C++?
Graham Forsythe
: WHY??? I am building an embedded systems OS in Modula-2. I can't see that
: there are ANY problems with that. If I tried to write it in C, I would run
: into big problems with multitasking. How do you create a multitasking system
: in a language that doesn't support tasks?
I don't know. Maybe it's impossible. Maybe the operating system I'm using
right now is a figment of my imagination. Maybe ALL Unixes are figments
of some collective imagination.
The fact is, if the language you want to use for your OS doesn't support
coroutines (or, if you prefer, activations like Scheme uses) then you need
to implement them in a combination of that language and perhaps some
assembly language. Note, however, that if you ARE using a language that
supports those features, then the language implementor has to implement
them using a combination of that language and perhaps some assembly
language. I'm not sure that there's anything to be gained there. What
happens if the guy who did the implementation for the language didn't do
exactly what you want?
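For what it's worth, here is a minimal sketch of that first approach in C, using the System V ucontext routines (getcontext/makecontext/swapcontext), which hide the assembly-level stack switching behind a library call. The toy task and the names are made up; a real kernel obviously does a great deal more.

    #include <stdio.h>
    #include <ucontext.h>

    static ucontext_t main_ctx, task_ctx;

    /* A toy cooperative task that yields back to main a few times. */
    static void task(void)
    {
        int i;

        for (i = 0; i < 3; i++) {
            printf("task: step %d\n", i);
            swapcontext(&task_ctx, &main_ctx);   /* yield to main */
        }
    }

    int main(void)
    {
        static char stack[64 * 1024];            /* stack for the task */
        int i;

        getcontext(&task_ctx);
        task_ctx.uc_stack.ss_sp = stack;
        task_ctx.uc_stack.ss_size = sizeof stack;
        task_ctx.uc_link = &main_ctx;            /* where to go if the task returns */
        makecontext(&task_ctx, task, 0);

        for (i = 0; i < 3; i++) {
            printf("main: resuming the task\n");
            swapcontext(&main_ctx, &task_ctx);   /* run the task until it yields */
        }
        puts("main: done");
        return 0;
    }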
From my perspective, the problem with using Modula-2 for a general-purpose
operating system is the level of abstraction inherent in the language.
I believe that the API of an OS should be defined with a minimum of
abstraction and that includes things like parameter passing conventions
and suchlike. If you write the OS in Modula-2, the natural way of defining
the system calls is in terms of Modula-2 functions, and I believe that that
is a mistake. That sort of thing makes it difficult to implement other
languages for that system and even may make it difficult to use different
implementations of the same language on that system.
In case anybody's wondering, I also think that sort of abstraction is a
problem with the way Unix defines its API in terms of C function calls.
"Wasn't Doom actually first written in Objective C on a NeXT? Didn't
do much for that language did it?"
Well certainly not many people know what language DOOM is written in. On the
other hand, Nextstep is written in Objective C, and that certainly does
provide significant visibility for Objective C (most people only know about
Objective C because of this use by Next).
"Look, I've written a fair number of well-defined modules in C that
I can reuse just fine. I've had no trouble producing bug-free code on
a smallish scale, and as it grows, I continue to produce well-defined
modules that I can rely on. Good coding vs. bad coding is orders of
magnitude more important than language."
And of course the crucial words in this paragraph are "smallish scale"
Writing large programs has almost nothing in common with writing small
programs. Of course you can write in any language, including assembly
language, or even absolute machine language, on a "smallish scale".
>Writing large programs has almost nothing in common with writing small
>programs. Of course you can write in any language, including assembly
>language, or even absolute machine language, on a "smallish scale".
If that's how you write large programs, you have missed the concept of
modular programming entirely. I can produce 1,000 line pieces of code I
can trust, and I can put hundreds of them together.
How many millions of lines do you need to have before it's big enough?
It'll still be made of small, bug-free chunks, or badly written.
Nice sarcasm. My point is that your criticism of C was ludicrous. The
things you were pointing at produce crappy unmaintainable code in
any language.
>If this is based on real experience, i.e. you really have put together
>hundreds of 1000 line modules with no trouble, you should share your
>method. However, to be convincing you need to be talking about programs
>of significant size, say at least 0.5 million Sloc (that is still not
>really big, but still big enough that the glib statement that modular
>programming solves all the problems is not easily credible!)
Sloc?
No, I haven't gone that big. I've only been doing this for about a year
in my spare time, and it takes a while. But it is the basis of the C
standard. What do you think the standard library *is*? It's a set
of pretested, reliable routines to do certain well defined things.
>Of course we are all familiar with the concept of modular programming (Parnas
>has after all been around for a while :-) But it is a long time since we
>bought this as some magic elixir that *automatically* solves the problem
>of writing large programs.
I didn't say "automatically". I merely said it was possible to produce
large programs in C that are reliable. I never said it would be easy.
I just object to the characterization of C as a buggy language. It
isn't. I can show you bug free pieces of C code. I can also show you
horribly buggy programs in Ada, M3, Icon, Eiffel, or any other language
you care to name.
Don't blame the language, blame the programmers. When a language is
one of the most commonly used, every programmer who lies about experience
to get a job is likely to end up writing in it. So, you have a lot of
incompetent programmers working on projects specced so as to be impossible,
at companies that would rather release something early than stable.
But this is not the fault of C; the same programmers would produce just
as many useless things in any other language.
You can argue that your favorite magic bullet (run time type checking,
array bounds, whatever) will catch "almost all" of those errors, but I
can laugh at the claim, because it's wrong. No language level feature
will prevent shoddy programs from being broken and useless.
I have produced at least some small reliable programs. I am at this time
unconvinced there is ever a sufficient *reason* to make them much larger;
I suspect that a vast program, like a vast function, is a result of not
knowing when to break things into modules. uSoft has encouraged this
by sticking a lot of people with the application concept. No word processor
should *ever* have the code for sorting; sorting should be a separate
application. I doubt the code for searching should be in the WP either;
I could justify it as a common library thing, but really, a tool should
be used. Why should any two programs anywhere have substantially the same
code for substantially the same action?
I have never seen a vast application that was well-designed. I have seen
a lot that look really nice on paper, but it always comes down to modules;
if they wrote the basic pieces of their programs as separate programs,
and had them talk, they would do a lot better.
Basically, my point is that my magic bullet, while obviously not able
to make things perfect without human intervention, is every bit as
powerful as yours.
You must be using a different version of Windows NT than I have; what
"snap and speed"? In fact, especially on fast RISC machines, there are
serious performance problems in NT. These have to do more with software
architecture issues (excessive layering) than with language issues.
: I have never seen a vast application that was well-designed. I have seen
: a lot that look really nice on paper, but it always comes down to modules;
: if they wrote the basic pieces of their programs as separate programs,
: and had them talk, they would do a lot better.
This only relegates the issue of large scale architecture to the
design of the interfacing between these smaller programs. Complexity
has only been shifted, not conquered.
In a large scale software engineering language, you can express many
complex and subtle relationships between the entities that go in one
place and go out another. And doing so is important to keeping
things working.
>| Wasn't Doom actually first written in Objective C on a NeXT? Didn't
>| do much for that language did it?
>
>Or the platform :)
>
>For the purposes of inheriting fame from the game Doom, though, I am
>sure that C and C++ advocates will welcome Objective-C to the fold, at
>least for a little while ;)
I understood Doom to be written in ANSI C, which is why it has been so easily
ported to many platforms.
>Don't forget too that all the snap and speed of Windows NT has
>something to do with it being written in C.
You can't blame NT's dismal performance on C. Consider OSes such as Linux
and FreeBSD, which are written in C and generally perform significantly
better on the same hardware platform.
--
-----------------------------------------
Lawrence Kirby | fr...@genesis.demon.co.uk
Wilts, England | 7073...@compuserve.com
-----------------------------------------
: Nice sarcasm. My point is that your criticism of C was ludicrous. The
: things you were pointing at produce crappy unmaintainable code in
: any language.
It is clear that Mr. Seebach is unfamiliar with the Ada model for
modularity. And it is also clear that he has the potential for
benefiting from it when he does learn how it works.
There are a lot of good programmers out there who intuitively have the
right idea about how to proceed, but do not have all of the information
necessary to harness that intuition. I suspect that Mr. Seebach, because
of his concern for excellence in modularity, may someday be a very
fine Ada programmer.
This is not intended as sarcasm. It is an observation drawn from
experience with other intelligent programmers who see the possibilities
of modularity but do not yet have the benefit of knowing about the
substantial literature on the subject.
: >Of course we are all familiar with the concept of modular programming (Parnas
: >has after all been around for a while :-) But it is a long time since we
: >bought this as some magic elixir that *automatically* solves the problem
: >of writing large programs.
Many programmers have never read Parnas. One of the best summaries of
modularity is in Bertrand Meyer's book, "Object-Oriented Software
Construction." the first four chapters of which are required reading for
anyone who wants to make a career in programming. (Oh, BTW, this is an
Eiffel book).
: I just object to the characterization of C as a buggy language. It
: isn't. I can show you bug free pieces of C code. I can also show you
: horribly buggy programs in Ada, M3, Icon, Eiffel, or any other language
: you care to name.
I agree that C is not inherently "buggy." However, it is a "one size
fits all" language that require substantial discipline to use correctly.
Ada, Modula 2/3, and Eiffel are designed so the compiler will give the
programmer more help in detecting potential defects early in the design
and development process. C programs tend to be more dependent on
"after the fact" auxiliary tools such as code analyzers and debuggers.
The use of C is not evil, but it does require considerable care. It is
a little like the difference between a safety razor and a straight
razor. I can still nick myself with a safety razor, but I will probably
not do enough damage to bleed to death.
: Basically, my point is that my magic bullet, while obviously not able
: to make things perfect without human intervention, is every bit as
: powerful as yours.
Alas, as has so often been observed, there is no magic bullet for software
any more than there is a royal road to mathematics.
Richard Riehle
adaw...@netcom.com
--
ric...@adaworks.com
AdaWorks Software Engineering
Suite 27
2555 Park Boulevard
Palo Alto, CA 94306
(415) 328-1815
FAX 328-1112
You can find a comparison between Windows NT, Windows, Windows for Workgroups
and NetBSD (a BSD-like free operating system) in:
"The Measured Performance of Personal Computer Operating Systems"
J. Bradley Chen, Yasuhiro Endo, Kee Chan, David Mazières,
Antonio Dias, Margo Seltzer, and Michael D. Smith
The Proceedings of the 15th ACM Symposium on Operating System Principles
Regards,
--
Pedro de las Heras
phe...@ordago.uc3m.es
--
Pedro de las Heras Quirós
mailto:phe...@ordago.uc3m.es // snail-mail:
Voice: +34-1-624 94 97 // Universidad Carlos III
Fax: +34-1-624 94 30 // C/ Butarque 15. E-28911
URL: http://www.uc3m.es/~pheras/ // Leganés SPAIN
> <assert.h> Assertion checking
> <ctype.h> Character classification and case shift
> <errno.h> Error numbers and errno quasi-variable
> <float.h> Floating-point parameters
> <limits.h> Environment parameters with integral values
> <locale.h> New in standard; localisation support
> <math.h> Transcendental functions, &c
> <setjmp.h> Non-local GOTO
> <signal.h> Interrupts & exceptions
> <stdarg.h> Support for variable length argument lists
> <stddef.h> Some types and constants
> <stdio.h> I/O library
> <stdlib.h> Miscellaneous, including exit
> <string.h> Lousy string functions
Lousy? I like 'em.
> <time.h> time of day, elapsed time, calendar
>For comparison, Fortran, COBOL, and Lisp have almost all of this stuff as
>part of the built in syntax or predefined environment and nobody is in the
>habit of calling e.g. the SORT verb in COBOL "a rich heritage of reusable
>code".
Does the SORT verb in COBOL let you provide your own arbitrary method for
sorting?
> Of course, in all these cases the
>predefined types and routines of a language _are_ reusable code and of very
>great value, but SINCE THEY ARE DESIGNED WITH THE STANDARD they don't count
>as a demonstration of whether the language supports reuse well.
They do when they're traditionally implemented *in* the language. The
built-ins of some languages are sometimes implemented in C or assembly.
95% of the C library can be written in *completely portable* C.
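As a small illustration (a sketch, not anybody's actual library source), here is one of the <string.h>-style routines written in nothing but portable C:

    #include <stddef.h>

    /* A portable sketch of the standard memcpy: nothing here depends on
       the machine, only on the C language itself. */
    void *my_memcpy(void *dst, const void *src, size_t n)
    {
        unsigned char *d = dst;
        const unsigned char *s = src;

        while (n-- > 0)
            *d++ = *s++;
        return dst;
    }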
>As for reuse of the C standard library, it isn't always advisable.
>I recently got a _very_ nasty shock when a program that was supposed to
>save time by fseek()ing over records it didn't want ran slower by a
>very large factor (3 seconds went to 30 seconds); I am informed that
>the typical System V implementation of "fseek" _always_ flushes the
>current buffer and does an lseek() even when the position you are
>seeking to is the position you are now at. At various times on various
>machines I have found it necessary to replace large chunks of the C library
>to avoid such performance problems.
I've rarely had to do anything of the sort. But you miss the point; it is
the *definition* of fseek to do appropriate flushing, although a good
implementation would have checked whether it was actually needed.
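One common workaround (a hedged sketch, not taken from any particular libc or project) is a thin wrapper that skips the call when you are already at the requested offset:

    #include <stdio.h>

    /* Seek only if we are not already at the target position, to sidestep
       implementations that flush the buffer on every fseek(). */
    static int seek_if_needed(FILE *fp, long offset)
    {
        long cur = ftell(fp);

        if (cur != -1L && cur == offset)
            return 0;                    /* already there; skip the slow path */
        return fseek(fp, offset, SEEK_SET);
    }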
But, consider; the majority of the Standard Library of C was designed, not
as a part of the language, but as a tool to be used with C. It was a good
enough set of tools to standardize.
I have written a fair number of functions which I can reuse consistently.
C has provided a good way for me to do this. C has done a lot to advance
the concept, partially by providing a portable way to do things efficiently
enough for people to use it back in the days of slow systems. It's
still pretty good at it.
-s
Well, most compilers can be gotten to warn for any use of = anyplace
where == would make sense. Almost any.
But I dunno; generally, if a bug takes me more than an hour to track down,
I consider myself sloppy. I have good debugging code in most of my work,
so I turn on debugging, and just see what I get...
However, I personally like the behavior of = and ==. I like being able
to put assignment anywhere. :)
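For anyone who hasn't been bitten yet, the classic case looks like this contrived sketch; it is exactly what those warnings, or the habit of writing the constant first, are meant to catch:

    #include <stdio.h>

    int main(void)
    {
        int x = 5;

        if (x = 0)          /* meant ==; this assigns 0, so the branch never runs */
            puts("x is zero");

        if (0 == x)         /* constant first: a stray = here is a hard error */
            puts("x is zero now");

        return 0;
    }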
Actually, I was being sarcastic! I am sorry, I didn't realize I had
been too subtle there. When I first evaluated NT for a company, we were
amazed at the LACK of speed it demonstrated, considering that
Microsoft had effectively made everyone believe that it was really the
OS/2-killer that they claimed it was. When we saw it, our jaws
dropped, and then we howled, and threw it out. They now run OS/2.
Since then, NT has gotten better, but my point was that if you need
speed, C is no guarantee.
I dearly love C, but I readily admit it has its flaws.
ob-modula-3: who knows if, when, and where the OS/2 port will be done,
and if it includes access to the full API? What is the speed like :)
As I mentioned to Peter in a private note, it kind of reminds me of Fortran
programmers when EWD's letter on gotos appeared. They had a hard time
understanding the idea of goto free programming, which is not surprising
if you are in a language that has no reasonable control structures (no
doubt there were Fortran programmers who thought that Fortran had
excellent control structures and was "excellent for structured programming").
Peter, I suggest that instead of "slogging through Knuth Volume I", you
spend some time instead reading about modern software engineering
methodology. Sure the algorithms in Knuth are important (indeed one
would assume that any programmer should be familiar with the algorithms
in Knuth Volume I, this is after all a book written 25 years ago). But
when it comes to writing really large scale programs, the algorithms are
the trees, and the issue is understanding the structure of the forest.
Another very typical notion in Peter's post is the "I don't want the
language enforcing methodology, I know how to program" viewpoint. One
might have hoped that this would have disappeared by now, but I fear
that schools these days are actually reinforcing this kind of attitude
to programming. Peter cannot see any value at all in enforced information
hiding, because he thinks programmers should be able to follow rules.
Of course as long as Peter sticks to what he is doing, which, from a
private note, is primarily writing little programs of a couple of hundred
lines long in C or Perl or shell scripts, then language is certainly
not much of an issue, and of course C is adequate. Bad programmers
can still make a hash of small programs, and well-designed languages
can help even for small programs, but a good programmer can do fine
in C of course.
The worrying thing is that programmers with these kinds of attitudes
all too often *do* end up working on large projects, and importing
these working views into large projects is a quick road to a big mess.
It's depressing to think that so much of a computer science undergraduate
curriculum at many universities is devoted to completely out-of-date
stuff (think: an undergraduate can be introduced to a 25-year-old language,
and use it to concentrate on programming simple algorithms that have been
around for a long time, and, at far too many schools, can graduate with
almost no idea of what has been done in software engineering and in
the understanding of program construction in the last two decades).
P.S. Peter, I don't think you should waste your time writing a MIX
assembler and simulator; there must be one around you can get your
hands on. At least there were 20 years ago. These days, Knuth's books
look a little flawed in their concentration on MIX. Of course DK would
not write those books that way today, or at least I don't think so. The
algorithms are important, and the discussion brilliant of course, but
the presentation is far too low level for my taste, with low level
efficiency concerns entering far too early. Precisely what is missing
from the code in these books is a good sense of abstraction. But I
guess if you think that C has excellent abstraction facilities, then
MIX will seem just fine to you :-)
By the way, Peter, if you want a full Ada 95 compiler, pick up GNAT 3.00
when it comes out in a week or so (earlier versions are in current use, but
if you are looking at Ada for the first time, you may as well wait a bit
to get version 3.00). If you seriously want to give this a try, I do NOT
recommend trying to learn Ada 95 from the reference manual (I would assume
you did not learn C from the ANSI C standard!) Instead I recommend getting
the Barnes book as an informal introduction. Then when you really understand
the basic structure, you can follow your (perfectly reasonable) desire to
program to "spec" and not to the "compiler", and study the reference
manual. A word of warning is appropriate here, though: the defining
documents for modern complex languages are very complex documents, and
reading them, especially without a background in language design, is not
an easy task. After all, by comparison, C is a trivially simple language,
and yet the ANSI C standard is still not an easy document for most people
to read and fully understand.
"Does the sort verb in cobol let you provide your own arbitrary method for
sorting?"
Actually SORT in COBOL is a nice example of layered abstraction. You can
define methods for inputting the data, comparing the data, and outputting
the data, including reformatting the data for the actual sort.
The sorting algorithm itself is certainly built in. No one is about to want
to write their own sort in this kind of environment. Remember we are not
talking about simple quick sorts or heap sorts here, we are talking about
very sophisticated methods for external sorting, and the whole point of
the SORT verb is to connect you with prewritten reusable code for the sort.
So the question is a bit like: does lseek allow you to substitute your
own method for randomly accessing part of a file, and the answer is
of course no, and that is the point.
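The nearest thing in the C library is qsort(), which likewise owns the algorithm (in-memory rather than external) and takes only your ordering; a small sketch:

    #include <stdio.h>
    #include <stdlib.h>

    /* The library supplies the sorting algorithm; we supply only the ordering. */
    static int by_value(const void *a, const void *b)
    {
        int x = *(const int *)a;
        int y = *(const int *)b;

        return (x > y) - (x < y);
    }

    int main(void)
    {
        int v[] = { 42, 7, 19, 3 };
        size_t i;

        qsort(v, sizeof v / sizeof v[0], sizeof v[0], by_value);
        for (i = 0; i < sizeof v / sizeof v[0]; i++)
            printf("%d\n", v[i]);
        return 0;
    }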
But as others have pointed out, the standard library of C is quite small
in any case, and large languages like COBOL typically have much richer
semantics built in. For example, the entire handling of fixed decimal
arithmetic with high precision in COBOL, very much appropriate of course
for fiscal applications, is completely missing from C and C++. Yes you
could write standard library routines to handle this, and people do, but
the integration with the language is poor (e.g. no high precision literals
are available, and no fixed-point literals are available). By contrast
Ada 95 does provide these high precision fixed-point decimal scaled
arithmetic capabilities, cleanly built into the language.
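To make "poor integration" concrete, here is the sort of hand-rolled scaled-integer workaround C programmers typically fall back on (a sketch; the two-decimal-place convention and the names are invented):

    #include <stdio.h>

    /* Money as an integer count of cents: the scale, the rounding, and the
       printing are all the programmer's problem, since the language has no
       fixed-point decimal types or literals to lean on. */
    typedef long cents;

    static cents add_tax(cents amount, long tax_per_mille)
    {
        /* Round to the nearest cent by hand. */
        return amount + (amount * tax_per_mille + 500) / 1000;
    }

    int main(void)
    {
        cents price = 1999;                 /* "19.99", spelled 1999 */
        cents total = add_tax(price, 65);   /* 6.5% tax, spelled 65 per mille */

        printf("total: %ld.%02ld\n", total / 100, total % 100);
        return 0;
    }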
The question of how much should be built into the language is an interesting
one that is somewhat orthogonal to the issue of abstraction and modularity,
though not entirely, because a language like Ada 95 that provides strong
capabilities for modularization and interfacing makes it easier to build
on new packages, compared to the rather primitive facilities in C. I'll give
two simple examples:
The single giant name space of C means that if you structure an addition
to the language library as a set of separate units (one visible to the
programmer, the others supposedly invisible), there is no way of limiting
the visibility, and name space pollution is inevitable. Ada 95 by
contrast provides package name spaces to solve this problem, and if you
like, you can use private child packages to enforce hiding.
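Concretely, the only hiding C offers is file-scope static, and it stops working the moment a helper is shared between two implementation files; the helper then has to be exported under a conventional prefix in the hope that nobody collides with it (a sketch, with invented names):

    /* list_impl.c -- one implementation file of a hypothetical "list" library. */

    /* Visible only inside this translation unit: this much C can hide. */
    static int grow_policy(int n)
    {
        return n * 2;
    }

    /* Also needed by list_node.c, so it must be given external linkage.
       The "list__" prefix is pure convention; nothing in the language stops
       somebody else's list__new_size from colliding with it at link time. */
    int list__new_size(int n)
    {
        return grow_policy(n);
    }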
In interfacing to other languages, C is very weak. For example, there is no
convenient way to interact with a multi-dimensional Fortran array. In C
you would end up having to index it backwards, whereas in Ada 95 you
can use pragma Convention Fortran to force a multi-dimensional array to
be mapped as Fortran would map it.
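The "indexing it backwards" looks like this in practice (a sketch; the 1-based Fortran subscripts have to be swapped and shifted by hand):

    /* Fetch element A(i,j) of a Fortran REAL A(NROWS,NCOLS) array from C.
       Fortran stores it column-major with 1-based subscripts, so the C side
       must swap the indices and subtract one itself. */
    float fortran_elem(const float *a, int nrows, int i, int j)
    {
        return a[(j - 1) * nrows + (i - 1)];
    }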
These kinds of facilities in Ada 95 indeed make it easier to extend the
functionality of the language by adding libraries (for the record C++
is somewhere between C and Ada 95 in this capability).
But there still remains the issue of how rich the built-in semantics should
be. A decade ago, the trend seemed to be to smaller languages (Oberon comes
to mind as representative of that trend). However, a surprising thing has
happened in the last ten years: the simple languages have lost, and it seems
like this war was won by the complex languages almost without a shot being
fired. Consider:
COMMON LISP with CLOS
COBOL 81 (and now OOP COBOL)
FORTRAN 90
ADA 95
C++ (with the ISO extensions)
SMALLTALK (with its full library)
These are all very complex languages/systems. So it seems like that's what
the world wants these days. There are of course some holdouts for simpler
languages, notably C, but the number of C (as opposed to C++) enthusiasts
seems to be dwindling (Richard Stallman is one obvious exception; he
certainly has not bought into C++ -- perhaps we can persuade him to
write GCC 3 in Ada 95 :-) :-)
>A decade ago, the trend seemed to be to smaller languages (Oberon comes
>to mind as representative of that trend). However, a surprising thing has
>happened in the last ten years: the simple languages have lost, and it seems
>like this war was won by the complex languages almost without a shot being
>fired. Consider:
> COMMON LISP with CLOS
> COBOL 81 (and now OOP COBOL)
> FORTRAN 90
> ADA 95
> C++ (with the ISO extensions)
> SMALLTALK (with its full library)
>These are all very complex languages/systems. So it seems like that's what
>the world wants these days. There are of course some holdouts for simpler
>languages, notably C, but the number of C (as opposed to C++) enthusiasts
>seems to be dwindling (Richard Stallman is one obvious exception; he
>certainly has not bought into C++ -- perhaps we can persuade him to
>write GCC 3 in Ada 95 :-) :-)
And the world is basically incompetent. The problem is that there are
basically no mechanisms to stop system/language bloat (whichever
system/language dies with the most features wins!). Commercial
products keep getting fatter as they keep adding features for
marketing reasons. Language/system committees have a tendency to keep
adding features to appease everyone. Academics are completely
apathetic about bloat and some encourage bloat because complex systems
have more "research potential". In fact, the only place where a call
for non-bloat could come from is from academia, but they really do not
give a fuck about software. Anyway, the dreaded NIH syndrome
precludes any popular languages coming from academia. Wirth has been
screaming about software bloat, but he has proven totally irresponsible
when it comes to his languages; when he came out with Oberon (after
letting Pascal and Modula* rot), I just rolled my eyes.
>COMMON LISP with CLOS
Widely recognized as bloated, and some say it is killing LISP. Or
maybe it's AI that's killing LISP, or possibly LISP is killing LISP.
>FORTRAN 90 and HPF
Bloated and where are the compilers?
>COBOL (Most real programmers will slit their wrists first).
>ADA 95
Ada was bloated and it got little commercial use and had
very expensive compilers. Ada 95 is more bloated
and it is questionable whether it will go anywhere.
>C++ (with the ISO extensions)
Extremely complex; widely popular due mostly to C.
But all these systems are bloated extensions to already bloated,
poorly designed languages (except C, which is not bloated but
incompetently designed).
Jay
Anyone see any reason not to RFD it? We obviously need it.
I got frustrated by the all-caps keywords in M3, the ugly template syntax in
C++, and many other "cosmetic" issues. (I won't even bother listing my
problems with traditional FORTRAN.)
These are *NOT* trivial. If a language makes it impossible to write decent
looking code, there is something wrong with it.
Consider:
English IS a nice language. BUT IF you WERE to capitalize certain words
BECAUSE they WERE deemed more important THAN others, IT would BE practically
illegible. IT WOULD certainly BE A very difficult READ. NO native speaker OR
reader of English I HAVE ever met HAD any trouble reading text IN English.
THE fact THAT verbs AND certain "keywords" ARE NOT IN all caps DOES NOT hurt
THE language AT ALL.
ANY experienced reader OR writer OF A language WILL spot keywords easily AND
quickly. A sloppy-looking, unnatural, hard-to-read layer DOES NOT help AT
ALL.
>For those of you who like Modula-3 semantics but hate the uppercase
>keywords, I suggest you read he paragraph below (and fetch m3su which
>allows you to use lowercase keywords).
Given that, I may track it down and try to build it for my current home box.
Yes, it's a "minor" point, but they really are *horribly* ugly and hard to
read. I could accept leading caps, but fundamentally, native usage of my
language does not have any all-caps words except for signs. Code should read
more like text, and less like a billboard.
> Some people prefer uppercase keywords others hate them. Another
> possibility is to accept both forms for keywords. This topic has been
> discussed at length and there is no solution that will completely
> satisfy everyone's tastes. Fortunately this is a very minor issue and
> you can easily have lowercase keywords automatically converted for you
> using an emacs macro or packages like
> <A HREF="file://pion.lcs.mit.edu/pub/m3su"> m3su </A>.
... But what I mostly don't see is why the language *requires* this. Why
not just go whole hog and require keywords to be in bold?
> http://www.research.digital.com/SRC/modula-3/html/welcome.html
(Left this in 'cuz it's a good pointer.)
Thanks for the response. If I don't get any good reasons not to, I'll track
down the RFD process and start the process for comp.lang.advocacy.
By about 220k of code, most of the issues are visible, if subtle.
I have taken the time to ask a fair number of programmers of widely varying
backgrounds about this; all of them have agreed that abstraction is a feature
of programs and their designs, not languages.
The C standard I/O library uses a type "FILE" to hold information about files.
It does not matter what it is; it works the same everywhere. I have asked
Robert several times to explain how this is not an abstraction, and he has
chosen to ignore the request, because the way in which it is not an
abstraction is "too obvious" t oexplain. Anyone? Bueller?
>As I mentioned to Peter in a private note, it kind of reminds me of Fortran
>programmers when EWD's letter on gotos appeared. They had a hard time
>understanding the idea of goto free programming, which is not surprising
>if you are in a language that has no reasonable control structures (no
>doubt there were Fortran programmers who thought that Fortran had
>excellent control structures and was "excellent for structured programming").
...And yet it is now pretty well known that a goto can be legitimately and
sanely used in a structured program. My, how time flies.
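The textbook example of a sane goto in C is centralized error cleanup, roughly like this sketch (the resources involved are invented):

    #include <stdio.h>
    #include <stdlib.h>

    /* On any failure, jump to a single cleanup point instead of duplicating
       the unwinding code at every early return. */
    int process(const char *path)
    {
        FILE *fp = NULL;
        char *buf = NULL;
        int rc = -1;

        fp = fopen(path, "r");
        if (fp == NULL)
            goto out;

        buf = malloc(4096);
        if (buf == NULL)
            goto out;

        if (fread(buf, 1, 4096, fp) == 0)
            goto out;

        rc = 0;                     /* success */
    out:
        free(buf);                  /* free(NULL) is a no-op */
        if (fp != NULL)
            fclose(fp);
        return rc;
    }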
>Peter, I suggest that instead of "slogging through Knuth Volume I", you
>spend some time instead reading about modern software engineering
>methodology. Sure the algorithms in Knuth are important (indeed one
>would assume that any programmer should be familiar with the algorithms
>in Knuth Volume I, this is after all a book written 25 years ago). But
>when it comes to writing really large scale programs, the algorithms are
>the trees, and the issue is understanding the structure of the forest.
I've read a fair number of books from the last few years. I've also gone out
and written programs, and looked for where my strategies started breaking
down. My early programs were morasses of code; my modern ones are
sufficiently modular that I can completely rewrite any part without even
recompiling the rest.
>Another very typical notion in Peter's post is the "I don't want the
>language enforcing methodology, I know how to program" viewpoint. One
>might have hoped that this would have disappeared by now, but I fear
>that schools these days are actually reinforcing this kind of attitude
>to programming. Peter cannot see any value at all in enforced information
>hiding, because he thinks programmers should be able to follow rules.
Yes, I do. Programmers who can't follow rules will produce shoddy code no
matter what you bind them with. C makes it quite possible to hide information
about the implementation of a structure; just
typedef struct foo *foo;
and foo is a perfectly valid type, which you know nothing of the
internals of.
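Spelled out a little further, the usual shape of that idiom (a sketch with an invented "counter" type): the header exposes only the incomplete type and the operations, and the .c file is the only place that knows the layout.

    /* counter.h -- callers see only an opaque handle plus the operations. */
    typedef struct counter *counter;
    counter counter_new(void);
    void    counter_bump(counter c);
    int     counter_value(counter c);

    /* counter.c -- the one file that knows what a counter really is. */
    #include <stdlib.h>

    struct counter {
        int n;
    };

    counter counter_new(void)
    {
        counter c = malloc(sizeof *c);

        if (c != NULL)
            c->n = 0;
        return c;
    }

    void counter_bump(counter c)  { c->n++; }
    int  counter_value(counter c) { return c->n; }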
>Of course as long as Peter sticks to what he is doing, which, from a
>private note, is primarily writing little programs of a couple of hundred
>lines long in C or Perl or shell scripts, then language is certainly
>not much of an issue, and of course C is adequate. Bad programmers
>can still make a hash of small programs, and well-designed languages
>can help even for small programs, but a good programmer can do fine
>in C of course.
Right... But everything I've heard about Ada is that it's wasting more time
jumping through ill-considered hoops than it saves preventing jumping off the
occasional cliff. Of course, many of these people were C or C++ programmers.
I do distrust anything designed by committee. Small groups can produce a
consistent design; committees never have. If Ada, designed by a committee, is
truly an elegant language, with a solid internal concept of how everything
should go together, then it is the first time in all of human history that a
committee with a significant number of members has produced one. My
skepticism here is pretty heavy.
>The worrying thing is that programmers with these kinds of attitudes
>all too often *do* end up working on large projects, and importing
>these working views into large projects is a quick road to a big mess.
I see no evidence here. I regularly use a medium-large system
which is written by hundreds of programmers, who drop out or join up with no
central authority at all, written almost entirely in C (one module of C++,
a few dozen of assembly), which seems to me to be fundamentally quite stable.
The problems it has are almost always a result of a change to a device driver
being made by someone who has no documentation for the device.
I refer, of course, to NetBSD, a system running reliably on about a dozen
unrelated machines.
This is, of course, the way C is *designed* to enforce information hiding and