> I am not someone who is by nature much interested in the details of
> computer languages---I'd much rather focus on CPU design or my field,
> digital video. However every so often I have been overcome by guilt pangs
> and attempted to read about ideas in computer languages, and I have
> repeatedly been disappointed at the extraordinarily poor quality of the
> books I find. The computer languages field seems to have no Dawkins, no
> Gould, no popularizers and not even any very good text books. Every one of
> the books I have tried has come across as boring, poorly structured,
> making no obvious attempt to justify things and completely unwilling to
> stand by itself. No I don't know lisp, snobol or prolog---that's why I'm
> reading the damn book---so why do we start getting examples using these
> languages in chapter 1?
I agree that more could be done to popularise certain languages.
It seems that the supporters of Lisp expect everyone to see how
superior it is to other languages, and just accept it. However,
whatever "superiority" it may or may not have, it's irrelevant
when you have to first know the language to appreciate it, and if
the language can alienate you as effectively as Lisp can, then
you're only likely to learn it once you can appreciate it. Catch-22.
> People write comprehensible text books in fields like math, physics and
> engineering that seem, a priori, substantially more difficult than
> computer languages. I can read one of those and learn the subject right
> off---I don't have to engage in a strange two-steps-forward, one-step-back
> dance involving reading four books in parallel. I'm a busy guy and if you
> can't be bothered to make your ideas accessible, I'll read other fields
> that do take that time.
Perhaps you just need one really good book? I don't know if
such a book exists. It could be that my own experience has only
been a positive one because I have an "aptitude" for learning
programming languages. It's not unusual for me to learn more
than one at the same time. (I'm currently learning two different
dialects of ML _and_ at least one dialect of Haskell.)
It could be that the people who get the best results from using Lisp
are people like myself - people who enjoy the linguistic mind games
that Lisp can make possible. That's not a great way to "sell" a
language to a programmer, as I bet that most people will be like
yourself, too busy and under pressure to _get results_. Even if
you could get the results better by using Lisp, you still need
to learn it, to want to learn it, and to appreciate why this is
what you want. This isn't easy! Instead, it's easy for someone
with an aptitude for a language, and years of experience using
it, to say "it's easy - go on, it'll change your life!"
That's also an argument for surfing, but not everyone is convinced.
I don't blame them. (My nickname is an in-joke, BTW. It's also a
wind-up for anyone who thinks a "name" on a computer screen means
anything.)
You've mentioned Dawkins, who in his book "The Blind Watchmaker"
spent a whole chapter listing theories that rival Darwin's. Many
of those "doomed rivals" are nothing more than Darwin's own ideas!
The reason for this, Dawkins claims, is a communication
error made by Darwin. Apparently, he gave answering his critics
a higher priority than explaining evolution.
Is this your criticism of tutorials for Lisp and other languages?
> If those people out there who care about computer languages and
> constantly complain that the world uses nothing but Fortran, C and Cobol,
> want to actually change that situation, they could do worse than to write
> a comprehensible book on the subject.
It's worth adding that I love Lisp, and I appear to understand
it well enough to have written a couple of interpreters & "toy"
compilers. For my own amusement, of course, and because when I
discovered Lisp, I was using a machine for which the only
development tools available to me were C and assembly language.
However, I can see why it alienates people unfamiliar with it.
It appears to me that hardcore Lisp hackers either can't see this,
or they just don't care. The best Lisp tutorials that I've read
give me no problems, but they do introduce new - and sometimes what
to most programmers will be radical - ideas at an alarming rate.
No doubt Lisp isn't unique in this respect. No language has virtues
that are obvious to the uninitiated. Perhaps even languages as
simple and easy to learn as Basic will baffle some people, or just
leave them unconvinced of any "superiority" that many may feel it
has. I suspect that Lisp and Basic are merely examples of two
extremes.
I'm not interested in "language wars", just as I don't care for
"OS wars", or any other daft allegiance some people may feel to
their chosen tools. I'm not going to try to convert anyone who
isn't at least curious about what I'm using, but if I were to try,
then I'm sure it could be done much better than many of the books
available. In other words, whether we're talking about Lisp or
Linux, the books tend to be "preaching to the converted".
While I don't feel there's a great threat to my favourite programming
languages, I don't feel complacent. That could be because of the
lack of support for them. It's there, but it's so small that you
could easily miss it - and most people seem to be unaware of it.
You couldn't easily say that about C...
Today, Fortran, C and Cobol don't need strong advocates, as they're
already "established". Perhaps too well established in some cases.
It's not hard to find critics of Cobol! Still, there are critics
of Fortran, Lisp, and many other languages, and yet we can still
use them and find damn good counter-arguments to most, if not _all_,
of the criticisms.
I prefer instead to criticise the would-be advocates of languages.
You've criticised the tutorials, and I agree that they have their
weaknesses. They fail to communicate the joy of using a particular
language, while writers such as Gould and Dawkins excel at it,
writing about science - another potentially very dry subject.
No wonder that so many people have such distorted ideas of what
Lisp is! A bad learning experience can leave a negative impression,
and there's plenty of that to be found. Perhaps there's a little
less evidence of it, now that Java is popularising many of the
ideas that Lisp exploits, but that merely supports the view that
Lisp is being promoted very badly - or not at all.
In short, "must try harder".
Martin Rodgers,
Enrapture Limited
--
<URL:http://www.enrapture.com/cybes/> You can never browse enough
Future generations are relying on us
It's a world we've made - Incubus
We're living on a knife edge, looking for the ground -- Hawkwind
> In article <handleym-031...@handma.apple.com>
> hand...@apple.com "Maynard Handley" writes:
>
> If those people out there who care about computer languages and
> constantly complain that the world uses nothing but Fortran, C and Cobol,
> want to actually change that situation, they could do worse than to write
> a comprehensible book on the subject.
I couldn't agree more. As regards Common Lisp, the most readable and
enjoyable introduction I've found yet is Paul Graham's ANSI Common Lisp,
published by Prentice Hall (ISBN: 0-13-370875-6). It is a fairly
comprehensive reference as well. I can't recommend it highly enough.
You might also try the Common Lisp HyperSpec at
<http://www.harlequin.com/books/HyperSpec/FrontMatter/Starting-Points.html>.
This is a great reference and learning tool available free on the web.
Essentially, it's a complete hypertext reference of Common Lisp.
Raf
--
We make the software that powers tomorrow!
What an interesting observation! You're right. When Dawkins came to Boston
to promote his new book, I took pains to see him talk, and was happy to sit
on the floor when there were no more chairs. I can't think of anybody in
computer science I'd do that for.
You said you didn't like books that immediately throw code samples at the
reader. I think the intent in such cases is to immerse the reader in the
language, like the idea of learning French by getting lost in Paris without
a guide book. But there's a question of impedance matching, or assumptions
about just how much immersion the reader is ready for. The immersion idea
has definite merits, but perhaps the art hasn't yet been perfected.
Cyber Surfer (cyber_...@wildcard.demon.co.uk) wrote:
: ...whether we're talking about Lisp or
: Linux, the books tend to be "preaching to the converted".
I think this might be the reason, that "the converted" exist in such numbers
and with such intensity of feeling in various areas of computer science
(OSes, languages). Joe Weizenbaum talked about "compulsive programmers"
and I think there was a little truth to that. There are lots of undergrads
who are powerfully drawn to computer science, and there's no pressure on
anybody to provide a greater incentive by writing really compelling books
on computer languages that you just can't put down. Maybe this isn't the
case for evolution; I don't recall ever meeting a "compulsive evolution
hacker" when I was in school.
As far as Dawkins, I love his stuff, but he harbors a powerful grudge against
creationists and at times, that grudge drags his writing in a particular
direction, leaving me wondering what it would have been like if he'd just
written out of a passion for explaining things without that kind of agenda.
"Climbing Mount Improbable" is a lovely book, but the whole thing is an
anti-creationist argument.
But you're right, really compelling computer science books would be a great
thing, particularly as computers continue to impact the lives of average
people. And even as somebody who is himself a slightly compulsive programmer,
I often find computer science books pretty inaccessible if they're about a
topic that's pretty new to me.
--
-------------------------------------------------------------
Will Ware <ww...@world.std.com> web <http://world.std.com/~wware/>
PGP fingerprint 45A8 722C D149 10CC F0CF 48FB 93BF 7289
Maynard Handley said:
The computer languages field seems to have no Dawkins, no Gould, no
popularizers and not even any very good text books. Every one of the
books I have tried has come across as boring, poorly structured, making
no obvious attempt to justify things and completely unwilling to stand by
itself.
I would suggest "The Structure and Interpretation of Computer Programs" by
Abelson and Sussman as a textbook counterexample. Elegant, clear and
useful. Anyone who has studied it will understand how to write good code;
among many others, a generation of MIT undergraduates has been trained with
it.
Larry
--
Lawrence Hunter, PhD.
National Library of Medicine phone: +1 (301) 496-9303
Bldg. 38A, 9th fl, MS-54 fax: +1 (301) 496-0673
Bethesda. MD 20894 USA email: hun...@nlm.nih.gov
I would also suggest "The Elements of Programming Style" by Kernighan &
Plauger of Bell Labs. Circa 1974, if you can still find a copy.
It was only a paperback, but it would be difficult to find more
sense on the subject in less physical volume.
--
Glen Clark
gl...@clarkcom.com
I think asserting that one language is "superior" to another is rather
like asserting that a hammer is "superior" to a screwdriver. ;-)
Steve
In article <849729...@wildcard.demon.co.uk>,
Cyber Surfer <cyber_...@wildcard.demon.co.uk> wrote:
>>I agree that more could be done to popularise certain languages.
We don't want merely popular languages, we NEED useful ones.
>>It seems that the supporters of Lisp expect everyone to see how
>>superior it is to other languages, and just accept it.
...
>>Catch-22.
You can divide-and-conquer the LISP people
(i.e. set flame wars off if you study them carefully;
remember, many great advances never seemed to get past
version 7, and LISP never reached 2.0).
In article <5876cf$3...@rainbow.rmii.com>,
Stephen O Gombosi <s...@rmi.net> wrote:
>I think asserting that one language is "superior" to another is rather
>like asserting that a hammer is "superior" to a screwdriver. ;-)
A useful document for understanding language wars, though almost
impossible to find these days, is The Rationale for the Design of the
Ada Language. It represents (1980) the still largely current state of
the art in language design (see the syntax and justification for UNSAFE
PROGRAMMING). One of the things which became clear was that languages
themselves were not enough; thus was born the APSE, the Ada Programming
Support Environment. Others might call it a pseudo operating system,
most based on Unix (most of Ada was based on Pascal). The LISP answer
at the time was InterLISP. Hence you needed a tool kit......
There, we just jumped from K&P Elements of Programming Style to
K&P Software Tools. ....."What's a software tool?".....
We could collect: great errors in programming language design
(e.g., GOTOs [or not], the original FORTRAN DO-loop, etc.).
We'll reinvent UNICOS(tm) one of these days....... Mark my word!
Wow, and I called Judy at the Usenix offices today for the first time in years.
Deja news.....
with a reference to K&P and this being c.s.s. I should also note:
Bentley: Writing Efficient Programs
Dowd: High Performance Computing
as always.....
GAM just called. I'm away from net.news for the next 3 days, I'm taking
him down to UCSB to see Glen Culler and others so he can chew the fat
and recall even earlier supercomputing history.
Let me know when you guys make it to the mid-1980s.
> I think this might be the reason, that "the converted" exist in such numbers
> and with such intensity of feeling in various areas of computer science
> (OSes, languages). Joe Weizenbaum talked about "compulsive programmers"
> and I think there was a little truth to that. There are lots of undergrads
> who are powerfully drawn to computer science, and there's no pressure on
> anybody to provide a greater incentive by writing really compelling books
> on computer languages that you just can't put down. Maybe this isn't the
> case for evolution; I don't recall ever meeting a "compulsive evolution
> hacker" when I was in school.
The impression I get from the hardcore Lisp people is that there
isn't a problem, and yet almost every programmer I meet thinks of
Lisp as a strange choice. As I have an interest in compilers and
interpreters, perhaps it's not so strange to me, but I can see
why it would seem that way to others.
I've only succeeded in reading CS books because I've had them
recommended to me by older and wiser programmers, and because when
I see a name crop up again and again, I get curious. My curiosity
has driven me to some pretty far flung corners, but I do sometimes
come back again. When I return, I bring what I've learned with me
and then try and apply it to problems with which I work.
It doesn't surprise me that so few programmers know as much CS as
I do, and I really don't know much! I've still got a lot to learn,
but in the areas that interest me, I seem to know vastly more than
most programmers appear to. This shocks me, but others tell me that
it isn't at all unusual.
Are we overrun with "9 to 5" programmers, or are we too busy to
learn anything more than what is absolutely necessary in order to do
our jobs vaguely competently? I sometimes wonder.
Somewhere in my homepage, I'm constructing a page listing the books
that I've found most useful as a programmer, and even a few that've
been highly recommended to me, but which I've not yet read. The
danger of collecting lists like this based on suggestions from people
on UseNet etc is that they get _very_ long. I'm not sure if anyone
would bother reading my "recommended" list, but it's there anyway.
> As far as Dawkins, I love his stuff, but he harbors a powerful grudge against
> creationists and at times, that grudge drags his writing in a particular
> direction, leaving me wondering what it would have been like if he'd just
> written out of a passion for explaining things without that kind of agenda.
> "Climbing Mount Improbable" is a lovely book, but the whole thing is an
> anti-creationist argument.
I've not read that one yet, but I'm aware of it. Someday...
> But you're right, really compelling computer science books would be a great
> thing, particularly as computers continue to impact the lives of average
> people. And even as somebody who is himself a slightly compulsive programmer,
> I often find computer science books pretty inaccessible if they're about a
> topic that's pretty new to me.
In the 70s, I used to watch Open University shows on TV (see
<URL:http://www.open.ac.uk/> for details), and it never worried
me that the subjects would go right over my head. I read CS books
the same way - if I don't understand it today, perhaps I will
tomorrow, or whenever it is that I'll need it. And I usually do.
There's always enough that I can understand soon after buying a
book to have made it worthwhile, and if I buy a book that I've chosen
myself, it tends to be something I'm pretty damn interested in!
But yes, few of these books are compelling reading. I like the
books with a little humour in them, but there are very few of them.
Knuth's Art of Programming stands out, not just as a CS book (or
books, to be pedantic), but as a fairly _readable_ example. Sadly,
that's not saying much...
I'm sure there are many better CS books that I've just not yet
read, and there are plenty of candidates in my lists of books that
others have recommended. Someday I'll get around to them. In this
respect, I'm the one who should "try harder". ;-)
That's largely because Dawkins and Gould are making observations.
They are not attempting to construct things which have never existed before.
I say that after reading Gould's textbook (O&P, very good),
not his popular books (like The Mismeasure of Man), which are enjoyable.
Wirth and Ichbiah are pretty bright guys, but they are not above
making mistakes. Niklaus himself wrote a letter/article when he forgot
an important piece of syntax in Pascal: a catch-all clause for the
multiway branch, aka "OTHERWISE" in some compilers, "ELSE" in other
compilers, and a slew of other keywords having identical semantics.
It was almost impossible to get this simple (ha!) fix added to the language.
Ritchie is pretty bright, too.
I recommend The History of Programming Languages II (HOPL-II) published
by ACM Press/Addison-Wesley. I can tell you there are no Silicon Valley
copies at Comp Lit Bookshop, as I cleaned them all out for friends (you
might find a copy at Stacey's in Palo Alto; the Sunnyvale library has a
copy, which impressed me).
Backus is also bright. Bill Wulf in conversation with me suggested that
the Griswolds are also bright. Oh, a LISP cross post: I occasionally
see John McCarthy at Stanford and Printers Inc. John is also quite bright.
I signed his petition against Computing the Future.
All bright guys, and they all learned (made mistakes along the way).
The brightest, most inspired language designers I can think of might be
Alan Kay and Adele Goldberg and their work on Smalltalk-80. If you are
using a windowing system, you are most likely using a system inspired by
them. A very impressive chapter in HOPL-II about them (see the
paragraphs referring to "management").
In article <rb4ti11...@work.nlm.nih.gov>,
Larry Hunter <hun...@work.nlm.nih.gov> wrote:
>I would suggest "The Structure and Interpretation of Computer Programs" by
>Abelson and Sussman as a textbook counterexample. Elegant, clear and
>useful. Anyone who has studied it will understand how to write good code;
>among many others, a generation of MIT undergraduates has been trained with
I do not believe it is sufficient.
I can suggest a shorter reference:
IEEE Computer Dec. 1990.
You merely need a decent library or a 6 year IEEE member to get the gist.
Two articles stand out (one comes from the MIT AI Lab [and Stanford]).
The two articles stand as an interesting contrast: one is a perfect example
of the problems cited by the other:
The order in which you read these articles might highly influence your
perception, so I will cite them in page order. Fair enough?
[The annotations are NOT all mine (collected over time).
In particular see the last sentence of the first annotation
to the first article.]
%A Cherri M. Pancake
%A Donna Bergmark
%T Do Parallel Languages Respond to the Needs of Scientific Programmers?
%J Computer
%I IEEE
%V 23
%N 12
%D December 1990
%P 13-23
%K fortran, shared memory, concurrency,
%X This article is a must read about the problems of designing, programming,
and "marketing" parallel programming languages.
It does not present definitive solutions but is a descriptive
"state-of-the-art" survey of the semantic problem. The paper reads like
the "war of the sexes." Computer scientist versus computational scientist,
some subtle topics (like shared memory models) are mentioned. An
excellent table summarizes the article, but I think there is one format error.
[e.g. of barriers versus subroutines.]
It is ironically followed by an article from computer scientists typifying
the authors' thesis.
%X Points out the hierarchical (4-level) model of "model-making",
very similar to Rodrigue's (LLNL) parallelism model (real world ->
math theory -> numerical algorithms -> code).
%X Table 1 (for the scientific researcher vs. for the computer scientist):
 Convenience:
  - Fortran 77 syntax vs. structured syntax and abstract data types
  - Minimal number of new constructs to learn vs. extensible constructs
  - Structures that provide low-overhead parallelism vs. less need
    for fine-grain parallelism
 Reliability:
  - Minimal number of changes to familiar constructs vs. changes that
    provide clarification
  - No conflict with Fortran models of data storage and use vs. support
    for nested scoping and packages
  - Provision of deterministic high-level constructs (like critical
    sections, barriers) vs. provision of non-deterministic high-level
    constructs (like parallel sections, subroutine invocations)
  - Syntax that clearly distinguishes parallel from serial constructs
    vs. syntax distinctions less critical
 Expressiveness:
  - Conceptual models that support common scientific programming
    strategies vs. conceptual models adaptable to a wide range of
    programming strategies
  - High-level features for distributing data across processors vs.
    high-level features for distributing work across processors
  - Parallel operators for array/vector operands vs. parallel operators
    for abstract data types
  - Operators for regular patterns of process interaction vs. operators
    for irregular patterns of process interaction
 Compatibility:
  - Portability across a range of vendors and product lines vs. vendor
    specificity or portability to related machine models
  - Conversion/upgrading of existing Fortran code vs. conversion less
    important (formal maintenance procedures available)
  - Reasonable efficiency on most machine models vs. tailorability to
    a variety of machine models
  - Interfacing with visualization support routines vs. minimal
    visualization support
  - Compatibility with parallel subroutine libraries vs. little need
    for "canned" routines
%A Andrew Berlin
%A Daniel Weise
%T Compiling Scientific Code Using Partial Evaluation
%J Computer
%I IEEE
%V 23
%N 12
%D December 1990
%P 25-37
%r AIM 1145
%i MIT
%d July 1989
%O pages 21 $3.25
%Z Computer Systems Lab, Stanford, University, Stanford, CA
%d March 1990
%O 31 pages......$5.20
%K partial evaluation, scientific computation, parallel architectures,
parallelizing compilers,
%K scheme, LISP,
%X Scientists are faced with a dilemma: Either they can write abstract
programs that express their understanding of a problem, but which do
not execute efficiently; or they can write programs that computers can
execute efficiently, but which are difficult to write and difficult to
understand. We have developed a compiler that uses partial evaluation
and scheduling techniques to provide a solution to this dilemma.
%X Partial evaluation converts a high-level program into a low-level program
that is specialized for a particular application. We describe a compiler that
uses partial evaluation to dramatically speed up programs. We have measured
speedups over conventionally compiled code that range from seven times faster
to ninety one times faster. Further experiments have also shown that by
eliminating inherently sequential data structure references and their
associated conditional branches, partial evaluation exposes the
low-level parallelism inherent in a computation. By coupling partial evaluation
with parallel scheduling techniques, this parallelism can be exploited for
use on heavily pipelined or parallel architectures. We have demonstrated this
approach by applying a parallel scheduler to a partially evaluated
program that simulates the motion of a nine body solar system.
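To give a flavour of what this buys you, here is a hand-worked sketch in
C of the idea only - it has nothing to do with the Berlin/Weise compiler
itself, which (per the keywords) works on Scheme programs. The idea:
specialize a general routine for an input that is known ahead of time.

    #include <stdio.h>

    /* General version: pays for the loop, the counter, and the branch
       on every call. */
    double power(double x, int n)
    {
        double r = 1.0;
        while (n-- > 0)
            r *= x;
        return r;
    }

    /* "Partially evaluated" for the known exponent n = 3: the loop and
       counter have been evaluated away ahead of time, leaving
       straight-line code that is easy to pipeline or schedule. */
    double power3(double x)
    {
        return x * x * x;
    }

    int main(void)
    {
        printf("%g %g\n", power(2.0, 3), power3(2.0));   /* prints 8 8 */
        return 0;
    }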
While I am cutting and pasting....
42. You can measure a programmer's perspective by noting his
attitude on the continuing vitality of FORTRAN.
--Alan Perlis (Epigrams)
55. A LISP programmer knows the value of everything, but the
cost of nothing.
68. If we believe in data structures, we must believe in
independent (hence simultaneous) processing.
For why else would we collect items within a structure?
Why do we tolerate languages that give us the one without the other?
--Alan Perlis (Epigrams)
Alan Perlis in his Epigrams also said:
"Just think with VLSI we can have 100 Eniacs on a chip."
Have I covered enough ground on this one?
: I think asserting that one language is "superior" to another is rather
: like asserting that a hammer is "superior" to a screwdriver. ;-)
: Steve
or in the case of perl, a Swiss Army knife...
-eric
Larry Hunter (hun...@work.nlm.nih.gov) wrote:
: Maynard Handley said:
: The computer languages field seems to have no Dawkins, no Gould, no
: popularizers and not even any very good text books. Every one of the
: books I have tried has come across as boring, poorly structured, making
: no obvious attempt to justify things and completely unwilling to stand by
: itself.
: I would suggest "The Structure and Interpretation of Computer Programs" by
: Abelson and Sussman as a textbook counterexample. Elegant, clear and
: useful. Anyone who has studied it will understand how to write good code;
: among many others, a generation of MIT undergraduates has been trained with
: it.
It is certainly the first book that came to my mind when I read Maynard's
post. However I wouldn't claim it will automatically produce great
programmers...
There are many other programming language texts which I found compelling
reads. From when I was in high school, the SNOBOL IV book by Griswold et
al, LISP by Winston and Horn, and an ancient PDP-10 macro assembler
manual the local DEC sales office was gracious enough to send to me. One
that really stands out is "The C Programming Language" by Ritchie and
Kernighan, though that was a few years after the others. This book was
clearly written and did a pretty good job of selling the C language to lots
of programmers. C instantly made sense to me as a much more useful tool
than Pascal and other alternatives. It perhaps wasn't as totally cool as
LISP, but it was much more practical for writing the kinds of programs I
was interested in. (And the only other language design that has struck me
as better than C in that regard is Modula 3.)
The PERL Camel book is another good example of a book that has done much to
popularize a language.
If you are looking for an overview of all programming languages, you're
going to be left unsatisfied. This is like asking for a summary of many
spoken languages. Yeah, you can enumerate the syntax and grammar and
perhaps even some of the semantics. But without elements of the "culture"
behind these languages, they aren't going to make much sense. What is a
language without its idioms, its historical references, and its literature?
I suppose one could ask for a book introducing one to programming language
design. This supposes that such a discipline is evolved enough to have set
rules and principles. I don't think this is really true. And in fact, any
significant language is the product of evolution and owes a great debt to
its predecessors. So to design a great language, you will probably have to
have a good bit of experience with many existing languages.
If you are going to design a general-purpose programming language, it's a
good idea to know what a language specification should be as well. Two of
my favorite examples are the Modula-3 spec (from DEC SRC and Olivetti) and
the ANSI-C specification. The Common LISP spec is pretty good too, though
it boggles the mind to understand how that much complexity is
necessary. (As opposed to Dylan where some really big names in the LISP
world totally failed to specify a language at all in the necessary level of
detail.) Note that for both ANSI C and Modula 3, one should be able to
write useful portable programs according to the specification.
Of course the ANSI C++ standard will make the Common LISP standard look
like light bedtime reading. And probably still won't tell you what many
programs should do. If you want proof that language design is not a hard
and fast science, look no further than C++.
-Z-
: > But you're right, really compelling computer science books would be a great
: > thing, particularly as computers continue to impact the lives of average
: > people.
Computer programming, OTOH, is far from the lives of even most unaverage people.
This is the boring, prosaic 'Why do I need to learn what a distributor is to
drive my car' argument.
And a book to explain why computers crash, for example (this being about the
only encounter you're forced to have with internals) would be a horrible mass
of special cases about bugs in the VM system of WinNT : I wouldn't read it, and
I read compulsively.
: But yes, few of these books are compelling reading. I like the
: books with a little humour in them, but there are very few of them.
: Knuth's Art of Programming stands out, not just as a CS book (or
: books, to be pedantic), but as a fairly _readable_ example. Sadly,
: that's not saying much...
For mathematics, I'd recommend *extremely* highly Knuth's 'Concrete
Mathematics'. It's a book with humour in it about binomial expansions,
recurrence relations, and the sort of mathematics that Applied Mathmos
think too pure, and Pure Mathmos think is contemptibly applied.
I'd recommend 'Computer Architecture - a Quantitative Approach' from
someone-and-Pedderson; it explains very well not only what superscalarity
is, but exactly why you'd want it and exactly how much it helps.
--
Tom
The Eternal Union of Soviet Republics lasted seven times longer than
the Thousand Year Reich
John L. Hennessy and David A. Patterson, Morgan Kaufmann Publishers.
http://Literary.com/mkp/new/hp2e/hp2e_index.shtml
Kai
--
Kai Harrekilde-Petersen <k...@dolphinics.no> http://www.dolphinics.no/~khp/
Yamaha XT600Z `88 TƩnƩrƩ DoD#3110 DoDK#2 UiOMC MCEK NMCU MCTC
"Argue for your limitations, and sure enough - they're yours". Richard Bach
> I think asserting that one language is "superior" to another is rather
> like asserting that a hammer is "superior" to a screwdriver. ;-)
I agree. That's why I prefer to avoid arguments about such things.
Perhaps it's possible to disagree over whether a language is a
hammer or a screwdriver, but that's a little different! We can
often make the error of trying to count the number of angels that
can stand on the head of a pin. Some questions are just a matter
of terminology, like whether a language is interpreted or compiled.
It's better to say, "this one is interactive, while that one is
batch", as that's less easily misunderstood. You'd only be talking
about implementations, however, as many of them come in a variety
of forms.
So, first state your preferences (interactive, native code, etc),
and then your choices, and finally how well they work for you.
That way, you can acknowledge just how subjective such issues are.
It seems to me that language (and OS) "wars" are merely signs that
people don't always share the same values. That should be obvious!
So why do so many of us forget it?
> You can divide-and-conquer the LISP people
> (i.e. set flames wars off if you study them carefully,
> remember many great advances never seemed to get past
> version 7, LISP never got reached 2.0).
I used to see hordes of C++ programmers demonstrating how clueless they
were about compiler theory. Lisp programmers would patiently explain
what the C++ folk had misunderstood, sometimes referring them to
CS papers, implementations, and even applications.
The attacks seemed to stop last year. It may just be a coincidence
that the Java frenzy started around then. Hmm. I wonder...
All I know is that the "pro-Lisp" posts I've seen have been from
well-informed programmers and computer scientists, while the anti-Lisp
posts (not much doubt there, so no quotes) have been based on special cases,
myths, and plain misunderstandings of what a compiler can do.
The only valid criticism I've seen made is that you don't see
many apps written in Lisp, to which the answer has been for
pro-Lisp advocates to list (no pun intended) various AI apps.
I don't think that many people count AI software as "apps".
It's also a little pointless claiming that operating systems
have been written in Lisp, as most people won't recognise
Lisp Machine software, like Genera, as an OS.
A design principle for Smalltalk-80 was that it didn't use an OS.
I remember when I first learned of Smalltalk, and thinking that
delivery would be a problem. I think that Lisp often has the same
problem, but only because few people seem to take it seriously.
On the other hand, C programmers have tended to take delivery
_very_ seriously - despite the current complaints about code
bloat in, yes, apps written in C++.
I don't think that anyone has "got it right" yet, whether you're
using Lisp or C++. There are, however, a lot of solutions that
work for most people. What creates a lot of confusion are the
differences in values.
Strangely enough, I've heard this same complaint from all sorts
of people in CS, in many different areas. "I know so much more
than most people about [OS design/Prog Lang design/weird data
structures/exotic searching and sorting/complex system design/
distributed algorithms/logic design/cpu architecture/computer
design/numerical analysis/combinatorics/discrete mathematics/
real analysis/compiler design/computer graphics/software
complexity/software engineering/parallel algorithms/telecommunications/
network protocols/computability theory/computer simulation
and queueing theory] that it surprises me every day!"
Maybe the problem is partly that the whole field identified with
the label "CS" is much bigger than you give it credit for being
and ranges a lot more widely than perhaps you realize.
I'm sure there are many 9-to5'ers but at the same time CS seems
to me to span a huge range of expertise, making it impossible for
anyone to be an expert everywhere in the field.
Pat
But you could write a different book about why computers crash, not
the technical details, but the *real* reason. Talk about what
software engineering is and why it's so difficult, especially on large
projects like operating systems. That could be reasonably accessible
(a la Soul of a New Machine).
--
== Seth Tisue <s-t...@nwu.edu> http://www.cs.nwu.edu/~tisue/
Yes, I've had several arguments with a programmer where I work. She
does not believe
a) You need to know any Computer Science to program well.
b) You need to read manuals
Her idea was that you just take similar working code and copy it!
She reckons you very very rarely need to write new code and if you
do "it's simple"!!.
I had to explain that within a FOR loop you do not need to increment
the control variable!! She has been programming for >5 years and did
not know what a FOR loop was even though she had supported code which
used for loops!!
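To be concrete, a minimal C-style sketch of the point (the loop header
does the stepping by itself):

    #include <stdio.h>

    int main(void)
    {
        int i;

        for (i = 0; i < 5; i++) {   /* the i++ here is the increment */
            printf("%d\n", i);
            /* no extra "i = i + 1;" needed in the body - adding one
               would make the loop skip every other value */
        }
        return 0;
    }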
Also she said that "I never understood logic, it's all very hard".
Even simple truth tables did not play a factor in her low level
design of code.
I sometimes wonder too....
--
David Williams
>On the other hand, there are languages which are clearly inferior.
True; sometimes people come up with inspired languages. I used to
program in something almost worse than Intercal. Let me put it this
way: Intercal has control structures (or a form thereof), and local
variables. Intercal allows code on more than one line. Intercal is
actually more readable.
I'm still not really sure how this pinnacle of misery was achieved, but
I can testify to programming it for about two years, off and on.
There's a reason why people use C...
--
Graham Hughes (graham...@resnet.ucsb.edu)
alt.PGPlike-key."gra...@A-abe.resnet.ucsb.edu".finger.look.examine
alt.homelike-page."http://A-abe.resnet.ucsb.edu/~graham/".search.browse.view
alt.silliness."http://www.astro.su.se/~robert/aanvvv.html".look.go.laugh
> Computer programming, OTOH, is far from the lives of even most unaverage people.
> This is the boring, prosaic 'Why do I need to learn what a distributor is to
> drive my car' argument.
How true. Few of us need to look inside our TV sets, and we're even
warned _not_ to do this. The attitude with computers is sometimes
different. We're often told that installing some free software is
"easy", only to discover that, no, it ain't easy. Cheap, yes, if
your time is free, but not necessarily easy.
> And a book to explain why computers crash, for example (this being about the
> only encounter you're forced to have with internals) would be a horrible mass
> of special cases about bugs in the VM system of WinNT : I wouldn't read it, and
> I read compulsively.
A lot of software will crash because the programmer was optimistic,
and assumed that something (e.g. a function call) couldn't fail.
I shudder whenever I see the source for a program that includes
a call to fopen and doesn't check that the result isn't NULL. Ok,
most of this code is in tutorials, but that doesn't make it any
less serious. After all, where do new programmers learn these bad
habits from? And then there's the gets function...
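Just to be concrete, here's a minimal sketch in C of the kind of checks
I mean (the file name is made up for illustration):

    #include <stdio.h>

    int main(void)
    {
        char line[256];
        FILE *fp = fopen("input.txt", "r");

        if (fp == NULL) {               /* fopen can and does fail */
            perror("input.txt");
            return 1;
        }
        while (fgets(line, sizeof line, fp) != NULL)
            fputs(line, stdout);        /* fgets is bounded, unlike gets */
        fclose(fp);
        return 0;
    }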
I'm sure the problem isn't unique to C. There's just a hell of
a lot of C source available. We all know how, err, expensive the
consequences can be. I guess either some programmers committed
mistakes like these _before_ seeing a worldwide software disaster,
or some of us think that it'll never happen to us, to _our_ code.
> For mathematics, I'd recommend *extremely* highly Knuth's 'Concrete
> Mathematics'. It's a book with humour in it about binomial expansions,
> recurrence relations, and the sort of mathematics that Applied Mathmos
> think too pure, and Pure Mathmos think is contemptibly applied.
I know someone who might like such a book...thanks.
> I'd recommend 'Computer Architecture - a Quantitative Approach' from
> someone-and-Pedderson; it explains very well not only what superscalarity
> is, but exactly why you'd want it and exactly how much it helps.
That's more like my kind of book. ;-)
Thanks.
[stuff deleted to cut to the interest bits ;-)]
> The brightest most inspired language designers I can think of might be
> Alan Kay and Adele Goldberg and their world on Smalltalk-80. If you are
> using a windowing system, you are most likely using a system inspired by
> them. A very impressive chapter in HOPL-II about them (see the
> paragraphs refering to "management").
Before I began to learn _anything_ significant about computers, and
the people in computing, these two became my "heroes". It's tempting
to say "the rest is history", but it's not over yet! Smalltalk hasn't
changed much since leaving the ivory tower, but it _has_ changed,
and fortunately, it hasn't stopped yet.
As for windowing systems, well. Most people feel very strongly about
these things, which is a good way of saying that they've arrived and
won't be going away. Everything else we might say is just an opinion,
and we can be sure that somebody will disagree. ;-)
> In article <rb4ti11...@work.nlm.nih.gov>,
> Larry Hunter <hun...@work.nlm.nih.gov> wrote:
> >I would suggest "The Structure and Interpretation of Computer Programs" by
> >Abelson and Sussman as a textbook counterexample. Elegant, clear and
> >useful. Anyone who has studied it will understand how to write good code;
> >among many others, a generation of MIT undergraduates has been trained with
>
> I do not believe it is sufficient.
I'd say it's a little too much! ;-) The intro warns you that there's
enough material to cover several programming courses. Somebody new
to Lisp might think that they're expected to digest the whole book
as a single learning experience. I wouldn't recommend that!
[references cut to save space]
I've yet to read The Little Schemer, but it sounds like a more gentle
intro to Lisp. Your own recommendations look good, too.
> While I am cutting and pasting....
>
>
> 42. You can measure a programmer's perspective by noting his
> attitude on the continuing vitality of FORTRAN.
> --Alan Perlis (Epigrams)
<grin>
> 55. A LISP programmer knows the value of everything, but the
> cost of nothing.
I sometimes wonder if we could say something similar about C/C++,
perhaps: A C++ programmer knows the cost of everything and the
value of nothing.
Still, I think that we're quoting Alan Perlis out of context...
> 68. If we believe in data structures, we must believe in
> independent (hence simultaneous) processing.
> For why else would we collect items within a structure?
> Why do we tolerate languages that give us the one without the other?
> --Alan Perlis (Epigrams)
A damn good question.
> Alan Perlis in his Epigrams did:
> "Just think with VLSI we can have 100 Eniacs on a chip.")
Aha.
> Have I covered enough ground on this one?
Very probably. ;-)
> Maybe the problem is partly that the whole field identified with
> the label "CS" is much bigger than you give it credit for being
> and ranges a lot more widely than perhaps you realize.
> I'm sure there are many 9-to5'ers but at the same time CS seems
> to me to span a huge range of expertise, making it impossible for
> anyone to be an expert everywhere in the field.
Another term for 9-5'ers might be "non-CS literate". It's not quite
the same thing, but very close.
My definition of non-CS is anyone who doesn't know or care about CS,
and thinks that they can manage ok without it. And they do: people
still pay them to write code. ;-)
} In regards to computer languages, I agree with Cyber Surfer. One
}should take the view that each language has inherent strengths and
}weaknesses. Like tools, one language may be better suited to a set of tasks
}than another, but one is not inherently superior.
On the other hand, there are languages which are clearly inferior.
--
Mike Gebis ge...@ecn.purdue.edu
Given this level of cluelessness, she must have been promoted to
be your boss in the meantime... or is there sexual discrimination in
your company ;-).
The strange thing is that you can get some work done with these clueless
recipes. When you start to work in an environment, it's hard to do a
program from scratch. Take an existing small program and change it so
that it does what you want: it works. Most of the time. In fact, mere
application programming often consists of problems, where "cut'n paste"
programming is appropriate, given that you have enough to copy from.
That's why Visual Basic is a successful programming language.
Another problem is the "clean paper" paralysis. Many people just can't
start writing on a clean sheet of paper. That's why E-mail and Usenet
are so advantageous over paper communication. There's always something
to start with.
--
Bernd Paysan
"Late answers are wrong answers!"
http://www.informatik.tu-muenchen.de/~paysan/
I find it shocking too, but not surprising. One of the problems is that
colleges and universities aren't really teaching Computer Science anymore -
they're teaching C programming. In college I often got into heated arguments
with professors because they were teaching things which had no intellectual
value. I took a course in compiler design at Smith College which devoted 90%
of the course to parsing! We spent most of the semester on Lex and Yacc, and
barely even touched code generation and optimization. Everything I currently
know about compiler design and general language implementation came the hard
way - by doing it myself.
The rationale the professor gave me was that it was of very little use to most
people (I had some harsh words for him at that point).
This is not to say that what's being taught in school is useless - I've been
as successful as I've been, in large part due to several exceptional faculty
members, but those exceptional faculty members are vastly overrun by
run-of-the-mill types who don't instill any sense of wonder or even *interest*
in their students.
>Are we overrun with "9 to 5" programmers, or are we too busy to
>learn anything more what is absulately necessary in order to do
>our jobs vaguely competently? I sometimes wonder.
I often wonder too.
>But yes, few of these books are compelling reading. I like the
>books with a little humour in them, but there are very few of them.
>Knuth's Art of Programming stands out, not just as a CS book (or
>books, to be pedantic), but as a fairly _readable_ example. Sadly,
>that's not saying much...
The entire world of computers needs a better sense of humour. I had two
professors in college who made the dry topics of data structures and theory of
computation *fun*. As a consequence, the knowledge stuck with me, and was more
accessible while I was still clueless.
Adam
I think maybe your problem is Boston.
While I like the really neat work done by friends at the Media Lab,
and some of the work done in surrounding companies, for a particular
conference on whose steering committee I sit, we hear "Don't move the
conference to Boston. There's nothing happening here." And that says
nothing of the many other schools around the Boston area.
I'll keep the lisp, because several of those making that comment
are early lispers.
Who would I sit on the floor for in CS?
Well, I am biased. I have a PARC-style bean bag chair in my office
(made by the same person who makes PARC bean bags).
I think I have slouched on the floor for Knuth, Minsky, Norman, Alan Kay,
[I am too respectful of Adele Goldberg to sit on the floor]. Never met
Admiral Grace. I had a chair when I met Seymour. I think there are CS
people around. The guy who wrote the Infoseek search engine had an overflow
crowd in the William Gates Building at Stanford. I didn't get a chair
(sat on a window sill), and people were out in the hall and stayed there.
On Dawkins specifically (since his books are in my to-be-read pile):
I phoned Gould about his comments about evolution and particle accelerators
(with E.O. Wilson) in Ontogeny and Phylogeny (he was reaching, but still
made interesting comments).
>I think this might be the reason, that "the converted" exist in such numbers
>and with such intensity of feeling in various areas of computer science
>(OSes, languages). Joe Weizenbaum talked about "compulsive programmers"
>and I think there was a little truth to that. There are lots of undergrads
>who are powerfully drawn to computer science, and there's no pressure on
>anybody to provide a greater incentive by writing really compelling books
>on computer languages that you just can't put down. Maybe this isn't the
>case for evolution; I don't recall ever meeting a "compulsive evolution
>hacker" when I was in school.
I think the reason for the converted is the feedback speed
of machines (over people), especially when you are doing code development
and you get an error. The imperative nature of programming excites many
people who are able to realise the creative vision.
>But you're right, really compelling computer science books would be a great
>thing, particularly as computers continue to impact the lives of average
>people. And even as somebody who is himself a slightly compulsive programmer,
>I often find computer science books pretty inaccessible if they're about a
>topic that's pretty new to me.
Knuth suggested that we ask ourselves: have you read a good program lately?
Not for work, but just for reading pleasure. Not many people do that.
It's funny you mention Weizenbaum, I think that Eliza is one such program
for people to read.
John Lions' original Unix books were recently reprinted. That's Version 6 Unix.
That's an OS I doubt anyone is running. Why? It's almost insane; well,
there are some interesting ideas inside.
But I think the field is still too young to have something like
The Origin of Species (a good book).
> Yes, I've had several arguments with a programmer where I work. She
> does not believe
> a) You need to know any Computer Science to program well.
> b) You need to read manuals
> Her idea was that you just take similar working code and copy it!
> She reckons you very very rarely need to write new code and if you
> do "it's simple"!!.
> I had to explain that within a FOR loop you do not need to increment
> the control variable!! She has been programming for >5 years and did
> not know what a FOR loop was even though she had supported code which
> used for loops!!
> Also she said that "I never understood logic, it's all very hard".
> Even simple truth tables did not play a factor in her low level
> design of code.
> I sometimes wonder too....
>
> --
> David Williams
You work for Microsoft? :) MIKE...
> While I like the really neat work done by friends at the Media Lab,
> and some of the work done in surrounded companies, for a particular
> conference which I sit on a steering committee, we hear "Don't move the
> conference to Boston. There's nothing happening here." And that says
> nothing of the many other schools around the Boston area.
Is there anywhere else in this world? I thought civilization as we know
it ends at the Berkshires, and a few miles south of I90. :-)
Alberto.
In article <849960...@wildcard.demon.co.uk>,
Cyber Surfer <cyber_...@wildcard.demon.co.uk> wrote:
>> Alan Kay and Adele Goldberg and their world on Smalltalk-80.
>
>Before I began to learn _anything_ significant about computers, and
>the people in computing, these two became my "heroes". It's tempting
>to say "the rest is history", but it's not over yet! Smalltalk hasn't
>changed much since leaving the ivory tower, but it _has_ changed,
>and fortunately, it hasn't stopped yet.
I didn't finish the story, fortunately. I spun a nice-sounding but
incomplete yarn, deliberately. I went to Comp.Lit shortly after that
plane flight returning from Redmond. There were all these books on
Smalltalk for IBM systems. Now, not to dump too much on IBM, these
books were all completely uninspiring.
>As for windowing systems, well. Most people feel very strongly about
>these things, which is a good way of saying that they've arrived and
>won't be going away. Everything else we might say is just an opinion,
>and we can be sure that somebody will disagree. ;-)
Ah, if you only get a chance to work with Cedar (which I've seen)........
Or maybe Xanadu (which I've not seen, but monthly see the principals).....
>> 55. A LISP programmer knows the value of everything, but the
>> cost of nothing.
--Alan Perlis
>I sometimes wonder if we could say something similar about C/C++,
>perhaps: A C++ programmer knows the cost of everything and the
>value of nothing.
>
>Still, I think that we're quoting Alan Perlis out of context...
The Epigrams were supposed to be self-contained.
So don't feel too bad about it.
>> 68. If we believe in data structures, we must believe in
>> independent (hence simultaneous) processing.
>> For why else would we collect items within a structure?
>> Why do we tolerate languages that give us the one without the other?
>> --Alan Perlis (Epigrams)
>
>A damn good question.
The reason, I believe, is that we did not know better.
We thought too much of the general and not enough about the special
cases like exception handling, processing conditionals (IF statements),
etc. We were lulled into a false sense of knowing because we mistook
the problem for a geometric one (parallel lines) when the reality was
that we needed costly marginally scalable synchronizations.
Dave Kuck in one of his ICPP (or could have been Sagamore) addresses
did a nice summary of the problems. You could not limit generality well
enough (I really should go back and find that paper for the FAQ).
>> Have I covered enough ground on this one?
>
>Very probably. ;-)
Good! Then we should progress!
"Beware the Turing tarpit."
--AP
>Maybe the problem is partly that the whole field identified with
>the label "CS" is much bigger than you give it credit for being
>Pat
It's true that CS is a very big umbrella, and covers a wide, and sometimes
exotic range of topics.
However, out of your list, it's *still* surprising how little many programmers
know about: software complexity, theory of computation, algorithms, and
important concepts like data abstraction.
Complexity theory and theory of computation are absolutely basic for an
understanding of what computers do and why you can't do certain things, and,
much more importantly, why people are complaining about the performance of your
just-shipped product.
I've seen highly paid programmers use an algorithm that had exponential
complexity where they could have substituted one with logarithmic complexity
if they had a deeper understanding of algorithms.
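To make that concrete, here's a generic sketch in C (not the code from
either anecdote): the same Fibonacci number computed with exponential
work and with logarithmic work, using the fast-doubling identities
F(2k) = F(k)*(2*F(k+1) - F(k)) and F(2k+1) = F(k)^2 + F(k+1)^2.

    #include <stdio.h>

    /* naive recursion: the call tree grows exponentially with n */
    unsigned long fib_slow(unsigned n)
    {
        return n < 2 ? n : fib_slow(n - 1) + fib_slow(n - 2);
    }

    /* fast doubling: n is halved at every step, so O(log n) calls */
    void fib_fast(unsigned n, unsigned long *f, unsigned long *f1)
    {
        unsigned long a, b, c, d;
        if (n == 0) { *f = 0; *f1 = 1; return; }
        fib_fast(n / 2, &a, &b);
        c = a * (2 * b - a);          /* F(2k)   */
        d = a * a + b * b;            /* F(2k+1) */
        if (n % 2 == 0) { *f = c; *f1 = d; }
        else            { *f = d; *f1 = c + d; }
    }

    int main(void)
    {
        unsigned long f, f1;
        fib_fast(40, &f, &f1);
        /* both print 102334155; the slow call takes noticeably longer */
        printf("%lu %lu\n", f, fib_slow(40));
        return 0;
    }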
I get paid to work on a product that implements a flavor of lisp, and uses a
more primitive, less efficient object representation than an interpreter I
wrote as an undergrad.
My (slightly unfocused) point is that there's a core of CS knowledge that's
common across all sub-fields, and it's disturbing how many programmers don't
know the theory behind what they do.
Adam
Been there.
Wheeler and Capek and others come, as well as NYC and Bell Labs people
(Ken Thompson showed his MiG-29 video, but that's not parallel computing).
But we are still told to meet in California (as the omni-present George
Michael says, "The land of the lotus eaters").
Our meeting has a reasonable pulse of what is happening in Japan from the high
end (we invite David Kahaner, and some day he will come) as well as the low
end in Yakhabara (I am butchering my honorable ancestors' tongue).
I think our coverage of India might be a bit weak, but we see talent
from Europe and Russia.
It's not lisp except to say that I didn't get much time to talk with
Dick Greenblatt this year.
I think that in some ways California is overrated. But I am a native.
I guess a lot of people just don't like to shovel snow.
It was once believed that Illinois was the home of parallel machines and
computing. Or Rice/Cornell if you worked for Big Blue. The real
question will be the influence of Washington.
Some of the components of parallel computing need to be placed in the
hands of certain bright and knowledgeable people for a start, and maybe,
that will never be enough.
Tony Griffiths <to...@OntheNet.com.au> writes:
>I doubt that any "inspired" language is of much use in the real world!
>Most languages are the product of a lot of hard work...
The one I'm talking about was the product of a lot of hard work. :-)
Just the wrong type of work... Unfortunately, it was originally
conceived as a very simple addition to the program in question, and
capabilities (like list iteration and if/then statements) were grafted
onto it over a period of time. It is now Turing complete, but you
wouldn't want to do much in it. Shows what can happen to a language if
you let enough incompetents hack at it for enough time.
>The reason people use C is the same reason Basic became so popular in
>the PC world... there is a zero learning curve needed to write your
>first 'working' program. Eg.
I don't know too much about that; the APL `first working program' is
just
__
|| <- "Hello World"
--
(compensated for lack of character set), but that doesn't mean people
use it anymore. I think lots of people started using C because it was
much more structured than the assembler and yet was fast enough to write
a Unix kernel, and the language grew from there. C has one major
benefit to an experienced programmer in that it will very happily stay
out of your way, even when you make mistakes. Some of us occasionally
like that trait.
>Most people can come to grips with the above!!! It is your second
>working program where a degree of difficulty sets in. ;-))
Certainly true, though.
>As for programming languages, I would switch back to Ada from C any time
>given the opportunity. As a language it is "far" superior to C/C++
>simply because more grey matter was applied before the language was
>defined. I know it is not perfect, but what language is. I doubt that
>you could define such a language! Modula 3, and now possibly Java, fall
>into the same category as Ada although there is more "hope" for Java.
I actually looked at Ada for a while; it's nice in a number of ways
(like it's actually LALR(1)...), but I do most of my grunt work in C++
these days because I'm more comfortable with the object model.
Incidentally, C++ is a language where the features have been grafted on
after the fact, but Bjarne has done a much nicer job than I would of
making the language usable. Most of the cruftiness people associate
with C++ is inherited from the C subset, IMHO.
>The ONLY reason I stick with C is its universality (is that a real
>word?), but the fact remains that it is not a high quality programming
>language.
<shrug> I take exception to the type declaration syntax, the implicit
ints all over the place are a pain, and the switch statement really has
no business falling through by default. Other than that, I like the
language a lot; it's very good at what it does.
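For anyone who hasn't been bitten by that yet, a tiny made-up C example
of the default-fallthrough trap:

#include <stdio.h>

/* Toy example: classify a command character.  The missing break after
   case 'q' is the classic mistake: 'q' silently falls through to 'h'. */
static void dispatch(int c)
{
    switch (c) {
    case 'q':
        printf("quit\n");
        /* break;   <- perfectly legal to omit, and the compiler is silent */
    case 'h':
        printf("help\n");
        break;
    default:
        printf("unknown command '%c'\n", c);
        break;
    }
}

int main(void)
{
    dispatch('q');    /* prints "quit" and then, surprisingly, "help" */
    return 0;
}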
- --
Graham Hughes (graham...@resnet.ucsb.edu)
alt.PGPlike-key."gra...@A-abe.resnet.ucsb.edu".finger.look.examine
alt.homelike-page."http://A-abe.resnet.ucsb.edu/~graham/".search.browse.view
alt.silliness."http://www.astro.su.se/~robert/aanvvv.html".look.go.laugh
>
>Another problem is the "clean paper" paralysis. Many people just can't
>start writing on a clean sheet of paper. That's why E-mail and Usenet
>are so advantageous over paper communication. There's always something
>to start with.
>
--
David Williams
I doubt that any "inspired" language is of much use in the real world!
Most languages are the product of a lot of hard work...
>
> I'm still not really sure how this pinnacle of misery was achieved, but
> I can testify to programming it for about two years, off and on.
> There's a reason why people use C...
The reason people use C is the same reason Basic became so popular in
the PC world... there is a zero learning curve needed to write your
first 'working' program. Eg.
Basic: 10 print "Hello world"
20 end
C: #include <stdio.h>
main() {
printf("Hello world\n");
}
Most people can come to grips with the above!!! It is your second
working program where a degree of difficulty sets in. ;-))
As for programming languages, I would switch back to Ada from C any time
given the opportunity. As a language it is "far" superior to C/C++
simply because more grey matter was applied before the language was
defined. I know it is not perfect, but what language is? I doubt that
you could define such a language! Modula 3, and now possibly Java, fall
into the same category as Ada although there is more "hope" for Java.
The ONLY reason I stick with C is its universality (is that a real
word?), but the fact remains that it is not a high quality programming
language.
> --
> Graham Hughes (graham...@resnet.ucsb.edu)
> alt.PGPlike-key."gra...@A-abe.resnet.ucsb.edu".finger.look.examine
> alt.homelike-page."http://A-abe.resnet.ucsb.edu/~graham/".search.browse.view
> alt.silliness."http://www.astro.su.se/~robert/aanvvv.html".look.go.laugh
Tony
The Berkshires?? You're generous! I think civilization is long gone by
the time you hit Worcester. Doesn't start up again til past Bridgeport, but
by then it's, as you say, not civilization as we know it. They have some
kind of strange red soup they call "clam chowder". Go figure.
--
-------------------------------------------------------------
Will Ware <ww...@world.std.com> web <http://world.std.com/~wware/>
PGP fingerprint 45A8 722C D149 10CC F0CF 48FB 93BF 7289
Scheme:
(display "Hello world")
(newline)
Common Lisp:
(format t "Hello world~%")
So... Explain to me again how that "#include <stdio.h>" and "main(){}" stuff
represents a *ZERO* learning curve???
-Rob
-----
Rob Warnock, 7L-551 rp...@sgi.com
Silicon Graphics, Inc. http://reality.sgi.com/rpw3/
2011 N. Shoreline Blvd. Phone: 415-933-1673 FAX: 415-933-0979
Mountain View, CA 94043 PP-ASEL-IA
> I didn't finish the story fortunately. I spun a nice sounding but
> incomplete yarn, deliberately. I went to Comp.Lit shortly after that
> plane flight returning from Redmond. There were all these books on
> Smalltalk for IBM systems. Now, not to dump too much on IBM, these
> books were all completely uninspiring.
My one IBM book is their "SAA CUA Advanced Interface Design Guide",
which I keep at one end of my shelf of programming language books.
Next to it are the "green", "red", and "blue" books, and on the
other side of those mighty tomes are the two P4 books. Most of the
other books are about Lisp. I hardly ever touch the IBM book.
So, I agree with you. ;-)
> >As for windowing systems, well. Most people feel very strongely about
> >these things, which is a good way of saying that they've arrived and
> >won't be going away. Everything else we might say is just an opinion,
> >and we can be sure that somebody will disagree. ;-)
>
> Ah, if you only get a chance to work with Cedar (which I've seen)........
> Or maybe Xanadu (which I've not seen, but monthly see the principals).....
Stop! You're making me envious. ;)
> The Epigrams were supposed to be self-contained.
That's what I guessed.
> So don't feel too bad about it.
I don't. ;)
> The reason, I believe, is that we did not know better.
> We thought too much of the general and not enough about the special
> cases like exception handling, processing conditionals (IF statements),
> etc. We were lulled into a false sense of knowing because we mistook
> the problem for a geometric one (parallel lines) when the reality was
> that we needed costly marginally scalable synchronizations.
Special cases have a bad habit of jumping up and biting your bum.
> Dave Kuck in one of his ICPP (or could have been Sagamore) addresses
> did a nice summary of the problems. You could not limit generality well
> enough (I really should go back and find that paper for the FAQ).
That would be most welcome!
> Good! Then we should progress!
>
> "Beware the Turing tarpit."
> --AP
"What Turing tarpit - oh, _that_ one..."
We could all use some big rubber boots.
> But I think the field is still too young to have something like
> The Origin of Species (a good book).
Is that the 1st edition or the 9th edition? ;-)
I think the point that some of us here are making is that a lot
of CS books are like the 9th ed, when they should be more like
the 1st ed. I think that you may also be saying this!
Or have I completely misunderstood everyone?
Tony Griffiths <to...@OntheNet.com.au> wrote in article
<32ACC7...@OntheNet.com.au>...
> Graham Hughes wrote:
> >
> > ge...@purcell.ecn.purdue.edu (Michael J Gebis) writes:
> >
> > >On the other hand, there are languages which are clearly inferior.
Really? I will get back to this later...
> >
> > True; sometimes people come up with inspired languages. I used to
> > program in something almost worse than Intercal. Let me put it this
> > way: Intercal has control structures (or a form thereof), and local
> > variables. Intercal allows code on more than one line. Intercal is
> > actually more readable.
>
When people say that a programming language is inferior they usually mean
that they don't like it or it doesn't suit a certain task well. As an
example, I took a two semester programming class; during the first semester
a language called ISetL was used (Interactive Set Language). As the name
suggests, it is a set language and crunches numbers quite well. However,
in the class it was used for everything from number crunching to string
manipulation (you know there is a flaw in your reasoning when you are
packing characters into a tuple in order to make a string :-). I like
ISetL, I didn't like ISetL as it was used in that class. There is also the
case of compiler/interpreter/platform dependencies. On MS-DOS ISetL is
very restricted as to memory usage and therefore was slooooooooow. It
would take several minutes to search a file containing Poe's "Raven" and
print out a word count. This implementation of ISetL is obviously inferior
to a 32-bit assembler version of "wc" on NT which can search the same file
instantly.
> I doubt that any "inspired" language is of much use in the real world!
> Most languages are the product of a lot of hard work...
I agree. There are tons of "neat" new (and old) languages out there that
no one uses because a) they attempt to introduce an entirely new paradigm
which makes them highly undesirable when writing mission-critical code, or
b) they do something that another language already does. Are they
inferior? Nope, just different. Are they inferior for a particular purpose?
Yep.
>
> >
> > I'm still not really sure how this pinnacle of misery was achieved, but
> > I can testify to programming it for about two years, off and on.
> > There's a reason why people use C...
>
> The reason people use C is the same reason Basic became so popular in
> the PC world... there is a zero learning curve needed to write your
> first 'working' program. Eg.
>
> Basic: 10 print "Hello world"
> 20 end
>
> C: #include <stdio.h>
> main() {
> printf("Hello world\n");
> }
Hmmm...why is the "main" part outside of the parenthesis? And what do
those {} mean? Why aren't there any line numbers? :-) I think it is a
long shot to say that C has a zero learning curve for anything.
>
> Most people can come to grips with the above!!! It is your second
> working program where a degree of difficulty sets in. ;-))
>
True, but your argument is a non sequitur. What you're saying is that
because the hello world program in C is easy, C became one of the most
popular programming languages.
> As for programming languages, I would switch back to Ada from C any time
> given the opportunity. As a language it is "far" superior to C/C++
> simply because more grey matter was applied before the language was
> defined. I know it is not perfect, but what language is? I doubt that
> you could define such a language! Modula 3, and now possibly Java, fall
> into the same category as Ada although there is more "hope" for Java.
>
How so? I see java as a hacked up C++...
> The ONLY reason I stick with C is its universality (is that a real
> word?), but the fact remains that it is not a high quality programming
> language.
Can you elaborate on what you mean by a high quality programming language?
Weak vs. strong typing? High level constructs?
>So... Explain to me again how that "#include <stdio.h>" and "main(){}" stuff
>represents a *ZERO* learning curve???
You are forgetting the "System" perspective.
For Basic and C, if the compiler/interpreter is not already on a
machine (any machine), any idiot can buy it at the corner computer
store and install it.
For Scheme and Lisp, finding/buying/downloading and then compiling/installing
is distinctly not for people still learning to write "Hello World".
--
Stanley Chow; sc...@bnr.ca, stanley....@nt.com; (613) 763-2831
Bell Northern Research Ltd., PO Box 3511 Station C, Ottawa, Ontario
Me? Represent other people? Don't make them laugh so hard.
with text_io; use text_io;
procedure main is
begin
put("hello, world");
new_line;
end;
Paul
Alberto is right. Civilization does end south of I-90.
What is south of I-90? Iowa.
Rob
--
Rob Peglar StorageTek, Network Systems Group
ro...@network.com 7600 Boone Ave N. Mpls. MN 55428
612.391.1028 612.391.1358 (fax)
Well... Amherst is nice; ever had pizza at Antonio's ?
Alberto
I cited this document not to say either that Ada is the wave of the future
or that real-time embedded applications are "it,"
but more as a document which gives insights into the thinking and
motivations for why languages are the way they are.
>We could collect: great errors in programming language design
>(e.g., GOTOs [or not], the original FORTRAN DO-loop, etc.).
P.S. A whole generation or two of programmers never saw the problems
with GOTOs. Whole generations of programmers never saw what the problems
with the original DO-loops were (you could never NOT execute a loop without
resorting to another branch: always executing it at least once was considered
a feature rather than a bug), and other empirically determined lessons.
Some truth in this.
Available libraries tend to be a big part of the usability of a language.
If you have to write basic I/O for things like floating point you are
almost in trouble. See next comment
>The reason people use C is the same reason Basic became so popular in
>the PC world... there is a zero learning curve needed to write your
>first 'working' program. Eg.
>
>Basic: 10 print "Hello world"
> 20 end
>
>C: #include <stdio.h>
> main() {
> printf("Hello world\n");
> }
I did a simple but surprisingly important CUG'95 paper using hello world
(Calibrating the Cray HPM, I need to html it and make it a web page).
In a way you should not have to #include stdio.
C became popular 1) because it came with Unix, and 2) because it developed
momentum from universities, which initially got $150 Unix licenses in a
world where vendors kept OSes complex and proprietary. An economic thing.
It lacks some useful support and protection for parallelism (it relies
heavily on its underlying OS and library of system calls).
What's great about C is that it embodies a KISS principle of which we
frequently need to be reminded. It survived a fairly harsh environment
of skepticism and change (The Labs and the phone company). It has a
run-time environment generally better than BASIC or FORTRAN.
It is important not to overblow its significance. I'm not a C++ user.
Maybe I should be, maybe I should not. The problem is that it sometimes
takes years to embody and test a language, and evaluation suffers from biases
and other effects.
>Most people can come to grips with the above!!! It is your second
>working program where a degree of difficulty sets in. ;-))
>
>As for programming languages, I would switch back to Ada from C any time
>given the opportunity.
>
>The ONLY reason I stick with C is its universality (is that a real
>word?), but the fact remains that it is not a high quality programming
>language.
It helps to have portable compilers (pcc, P4, whatever).
I would like to work more with Icon and I'm impressed with some of the
features of CLU. I'm far from a decent LISP programmer, but the problem
with languages is that THEY AREN'T solving a lot of problems. If they
are a tool, they are a distraction. No wonder why we have few parallel
languages.
I think what made BASIC key for computing was the fact that it was
interpreted and interactive. Supercomputers and parallel systems need
more interpreters, to allow people to prototype ideas (George Michael
would disagree with me). APL was also interpreted. But we have to try
many things...... And they will all be distractions, but if we expect
to advance, we will have to re-invent many wheels to get to steel belted
radials.
In Alberto's defense, you do not have to have civilization, you merely
have to have a critical-mass hacker culture. I should be able to pose a
toy problem and watch a tinkering brain come up with a non-ordinary solution.
We need more of that kind of thinking away from the lawyers and managers
in NYC.
No longer architecture. F-Us combined. Lisp kept for historical reasons.
Supers.....Coin flip.
In article <01bbe683$61b72f40$5f51e0c7@aqualung>,
Phillip N Toland <tol...@epix.net> wrote:
>This is my favorite topic to hate, so I think I will add my $.02 worth
>:-)...
>I agree.
I will disagree. The problem is that every language is "inspired" to its
designer. The problems come over time when we ask languages to do
things they were never intended to do. Hence "environments."
Tony's statement, while well intended, is just slightly off the mark.
My big problem with a lot of computing is that it lacks an empirical
basis as a science. It's too young. We only learn about problems after
a period of time, not merely through hard work.
>There are tons of "neat" new (and old) languages out there that
>no one uses because a) they attempt to introduce an entirely new paradigm
>which makes them highly undesirable when writing mission-critical code, or
>b) they do something that another language already does. Are they
>inferior? Nope, just different. Are they inferior for a particular purpose?
>Yep.
We must accomplish a task, work, etc.
The problem is that we must balance that against featuritis.
That's why extensibility is important. People do use COMPLEX variables
in FORTRAN, but fewer do work in quaternions; should users have to mangle
variable names because of the lack of user-defined types?
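To make the point concrete, here is a rough sketch (in C, since it at
least has structs; the names and numbers are invented) of what a
user-defined quaternion type buys you over four mangled variable names
per value:

#include <stdio.h>

/* Sketch only: a quaternion as a user-defined type.  Without it, callers
   end up juggling qw1, qx1, qy1, qz1, qw2, ... by naming convention. */
typedef struct { double w, x, y, z; } quat;

/* Hamilton product of two quaternions. */
static quat quat_mul(quat a, quat b)
{
    quat r;
    r.w = a.w*b.w - a.x*b.x - a.y*b.y - a.z*b.z;
    r.x = a.w*b.x + a.x*b.w + a.y*b.z - a.z*b.y;
    r.y = a.w*b.y - a.x*b.z + a.y*b.w + a.z*b.x;
    r.z = a.w*b.z + a.x*b.y - a.y*b.x + a.z*b.w;
    return r;
}

int main(void)
{
    quat i = {0, 1, 0, 0}, j = {0, 0, 1, 0};
    quat k = quat_mul(i, j);                      /* i*j = k */
    printf("%g %g %g %g\n", k.w, k.x, k.y, k.z);  /* 0 0 0 1 */
    return 0;
}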
The Little Languages philosophy is an important one. The second
language I attempted to learn was PL/1. Lots of lessons there.
>> C: #include <stdio.h>
>> main() {
>> printf("Hello world\n");
>> }
I will collect the Hellos and add them to the cft77, Pascal, Ada, C,
Scheme, and CL versions. Other versions can be mailed to me..... 8^)
and if a compiler runs on a Cray, I will HPM them.
>Can you elaborate on what you mean by a high quality programming language?
>Weak vs. strong typing? High level constructs?
"Strong typing is for weak minds...." famous quote.
First, a high quality programming language prevents me from
unintentionally shooting myself in the foot. Typing is merely one
attempt at a compile time solution.
Constructs, sure! User defined data types and structures: can be helpful.
But what about the compile and run-time environments?......
Big subjects. No time.
What about Rexx?
say "hello, world";
Or EXEC 2, where you don't even need the quotes:
&print hello, world
Or S/390 assembler, for that matter, given that most run-time environments
have some kind of 'print' function. In the system I use, the program is:
USING *,12 Local addressability (one quickly gets used to this)
LA 1,ST Addr of string
LA 2,L'ST Length thereof
B $WRITE Write to terminal; ret via R14 with R15=0 (success)
ST DC C'hello, world'
COPY $LOW Run-time environment, includes definition of $WRITE
END
This was a complete program. In practice one would probably use a macro,
and I have a choice between one that requires a quoted string argument, and
one that eats raw source lines for better readability:
USING *,12 (The calling convention sets R12)
TYPE 'hello, world' This does roughly what was shown above
TELL -END This macro generates one call to $WRITE per line of text
Here is some free-form text.
I don't even have to worry about embedded quotes or other funny chars (&).
-END
SR 15,15 Claim success
B $RET Return to caller
COPY $LOW
END
(Flame bait? Perhaps, but that is not my intent. Some 15 years ago I used
this example in a talk that addressed the size of object files for small or
simple source programs. That talk *was* flame-bait: "The case against
High-Level Languages" -- the point being that many Hello_world object files
were huge (tens or hundreds of Kbytes), whereas my first example above had
a size dominated by the object file header; the code&text occupied 24 bytes
including the 12-byte string.)
Michel.
He's got a point though: the whole infrastructure is in place for C, in a
way that it isn't for Lisp or Scheme. When you're a clueless newbie and you
first walk up to a computer, chances are it's set up for C (or maybe Basic)
and the whole environment around the computer is designed to support work
in C. The result is that you become another C programmer, and perpetuate
the whole thing.
People sometimes rage against the apparent conspiracy as if there were really
other people working somewhere to systematically wipe out Lisp, at whom the
first bunch ought to be angry. But of course, this isn't the case. Like
Adam Smith's invisible hand, the "conspiracy" that keeps C on top needs no
active sentient planners, just a lot of loosely coupled individuals each
pursuing their own ends.
In the KISS department, JAVA out-"C"s both C and C++,
although its performance currently sucks.
I am encouraged by how much smaller my codes are in JAVA
for equivalent functionality in C or C++.
Hope to see an efficient interpreter on a super soon.
> >We could collect: great errors in programming language design
> >(e.g., GOTOs [or not], the original FORTRAN DO-loop, etc.).
>
> P.S. A whole generation or two of programmers never saw the problems
> with GOTOs. Whole generations of programmers never saw what the problems
> with the original DO-loops were (you could never NOT execute a loop without
> resorting to another branch: always executing it at least once was considered
> a feature rather than a bug), and other empirically determined lessons.
FORTRAN (the standard) *never* required always executing the loop
once. The '66 standard allowed it, by virtue of not defining what
should happen (it was before my time, and I don't recall whether the
old timers did it to accommodate some existing practice on some
platform or whether it was a simple oversight).
Failure to specify (even with the weasel words "processor dependent")
makes me somewhat inclined to the latter. Certainly it's hard to say
there is evidence the committee ever thought of it as a feature.
All that was from memory. Actually looking at the '77 standard,
the notes in appendix B say:
If J1 > J2, ANSI X3.9-1966 does not allow execution of the
following do statement:
DO 100 J=J1,J2
some processors that allowed such a case executed the range of
the DO-loop once, whereas other processors did not execute the
range of the DO-loop. This standard allows such a case and
requires that the processor execute the range of the DO-loop
zero times.
Which I interpret as the later incarnation of the committee pointing
out that *writing* code that relied on the "one trip do loop" was not
standard-conforming to begin with.
Hardly evidence of conscious design. It's possible that some
implementors thought it was a "feature" to allow it; but I suspect
that what generally happened is that the "one trip" do fell out of a
particular implementation decision(s) (essentially whether to test at
the top or the bottom of the loop; with some processors favoring the
test at the bottom only for performance reasons), and was eventually
documented when someone (either in support or "techpubs") noticed that
it worked, and needed to explain it to users.
--
Keith H. Bierman keith....@Sun.COM| k...@chiba.Eng.Sun.COM
SunSoft Developer Products | k...@netcom.com
2550 Garcia UMPK16-304 415 786-9296 | (415 7869296) fax
Mountain View, CA 94043 <speaking for myself, not Sun*> Copyright 1996
>
> FORTRAN (the standard) *never* required always executing the loop
> once. The '66 standard allowed it, by virtue of not defining what
> should happen (it was before my time, and I don't recall whether the
> old timers did it to accommodate some existing practice on some
> platform or whether it was a simple oversight).
The original FORTRAN (which, by virtue of being first, comprised
a standard) had 1-trip DO-loops (i.e. code in the loop was always
executed at least once). This was, indeed, a feature - the old
IBM machines had an instruction that fit the execution of a
DO-loop test and branch quite nicely *if* the test was done at
the end of the loop. A 'standard' FORTRAN (one defined by a
consortium) came along quite a ways afterwards. As far as I can
remember, it wasn't until FORTRAN IV that some compilers started
allowing 0-trip Do-loops, provided you remember to drop the
correct option card into the deck.
The reason for the 1-trip DO-loop was the same as the reason
for the three-way branch: there was a particularly efficient
way to implement them on the original target machine, so they
were put into the language.
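In C terms (a throwaway sketch, nothing historical about it), the whole
issue is just where the test sits:

#include <stdio.h>

int main(void)
{
    int j1 = 5, j2 = 3, j;

    /* Test at the top: zero-trip when j1 > j2, which is what the
       '77 standard ended up requiring for DO loops. */
    for (j = j1; j <= j2; j++)
        printf("for body, j=%d\n", j);        /* never printed here */

    /* Test at the bottom: the body always runs at least once --
       exactly the old "one-trip DO loop" behaviour. */
    j = j1;
    do {
        printf("do-while body, j=%d\n", j);   /* printed once, j=5 */
        j++;
    } while (j <= j2);

    return 0;
}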
--
Steve Wampler - swam...@gemini.edu [Gemini 8m Telescopes Project (under
AURA)]
The Gods that smiled upon your birth are laughing now. -- fortune cookie
> But what about the compile and run-time environments?......
Makefiles should not be required; the compiler should be able to figure
out dependencies on its own.
Language should be fully definable using EBNF.
I leave it to the reader to draw his own conclusions about the languages
popular today.
--------------------------------------------------------------------
Pascal Dornier pdor...@best.com http://www.best.com/~pdornier
Your Spec + PC Engines = Custom Embedded PC Hardware
--------------------------------------------------------------------
Including the semantics?
Tony.
--
Strc prst skrz krk
[major snip]
>I think what made BASIC key for computing was the fact that it was
>interpreted and interactive.
It's a mistake to think that BASIC was intended as an interpreted
language, however. Oh, I know it's often been implemented that way,
but that was not the original intent of Dartmouth BASIC, which was
always compiled, as far as I know.
-- David Wright :: wri...@hi.com :: Not an Official Spokesman for Anyone
These are my opinions only, but they're almost always correct.
"The difference between a printing press and a modern digital long-
distance network is that the press produces money much more slowly."
-- Neil Kirby, Lucent Technologies
Compared to what? I would say C is in direct opposition to the
Keep It Sweet and Simple principle, unless you are comparing
it to C++ or JAVA or something even more complex. I will agree
with you when you show me a C that can compile itself in a couple
of K of memory.
>In the KISS department, JAVA out-"C"s both C and C++,
>although its performance currently sucks.
>I am encouraged how much smaller my codes are in JAVA
>for equivalent functionally in C or C++.
It seems you judge complexity of the language by the length
of application source, while I would use the size of the
compiler source. When I see that many megabytes of source I don't
think of it as an example of simplicity.
Certainly C++ is even further away from KISS than C, but I would
say JAVA is in between. If you think these things are examples of
KISS what are you comparing them to?
Jeff Fox
My experience is that languages become popular when reasonably efficient
implementations become available in an environment where other alternatives
are, for whatever reason, less attractive. Given that learning a new
language is always an effort (even though pleasurable on occasion), they
will always run into some resistance as long as you can get away with using
the established languages in any given environment. Plus, there is always
the problem of "but we have so much invested in ...", "everybody knows ..."
Hartmann Schaffer
That's a pretty meaningless language. EBNF specifies syntax but not
semantics. Describable using just EBNF means that all you can do is
parse the code and determine whether the code is syntactically correct.
--
William B. Clodius Phone: (505)-665-9370
Los Alamos Nat. Lab., NIS-2 FAX: (505)-667-3815
PO Box 1663, MS-C323 Group office: (505)-667-5776
Los Alamos, NM 87545 Email: wclo...@lanl.gov
> Scheme:
>
> (display "Hello world")
> (newline)
>
> Common Lisp:
>
> (format t "Hello world~%")
>
> So... Explain to me again how that "#include <stdio.h>" and "main(){}" stuff
> represents a *ZERO* learning curve???
That's easy. Scheme and CL both use symbolic expressions, and the
learning curve you have to climb before you can
1) understand what they are,
2) see why they're there, and
3) see why you'd want to use them
is way too steep for most people.
Perhaps those of us who use Lisp are only those who're fortunate
enough to
1) take a look at Lisp
2) persist long enough to discover 1) thru to 3)
[see above paragraph], and
3) can then find a job in which it's possible to write code in Lisp
I'm in the group of Lisp programmers who would choose Lisp,
but can't find anyone willing to employ us to actually code
in Lisp. So, I'm coding in C++, instead.
Meanwhile, for fun and in my spare time, I'm writing a simple
Lisp to C++ compiler in O'Caml. Don't ask why - my answer will
only be something annoying like, "because I can". Since I prefer
to do other things in my spare time besides programming, I'm
making _very_ slow progress, thus ensuring that a compiler with
absolutely no commercial value will also be of no practical value,
either, as it'll probably be perpetually unfinished.
I do _not_ code in C++ for fun. I did that 10 years ago, when a
C compiler was the only commercially available "high level" language
available to me. The alternative was assembly language. I used
both to write crude Forth and Lisp systems...which were _real_ fun.
On the other hand, it's fun to write multimedia software in C++
and know that, for example, hundreds of thousands of doctors are
using it.
In article <58ivka$9...@tokyo.engr.sgi.com>, rp...@rigden.engr.sgi.com (Rob Warnock) writes:
> Tony Griffiths <to...@OntheNet.com.au> wrote:
> +---------------
> | The reason people use C is the same reason Basic became so popular in
> | the PC world... there is a zero learning curve needed to write your
> | first 'working' program. Eg.
> |
> | Basic: 10 print "Hello world"
> | 20 end
> |
> | C: #include <stdio.h>
> | main() {
> | printf("Hello world\n");
> | }
> +---------------
>
> Scheme:
>
> (display "Hello world")
> (newline)
>
> Common Lisp:
>
> (format t "Hello world~%")
>
> So... Explain to me again how that "#include <stdio.h>" and "main(){}" stuff
> represents a *ZERO* learning curve???
Well **I** want to know about that "\n" thingy! What the heck is it
there for? AND WHY DIDN'T IT PRINT OUT?! Everything I printed in my
Basic program came out. See? What's wrong with this "C" sh*t?
---------------- Cray Research ---------------- *** Roger Glover ***
XXXX\ \ / \ /XXX \ / \ X \ /\\\ X///X /\\\ *** Software Instructor ***
/ \ / \/ /\ / \ / \X /\ X \ / \ X\ \ X DISCLAIMER HAIKU:
//X/ X\\\X //X/ X \ X X\\ / \ X/X \ X \\\ C R^H^H^H^H S G I may not
/ \ X///X / \/ X//XX X \ / \ X \\ X \ Share these opinions with me
/ \ X X /\\\/ X X X///X /XXX/ X///X /XXX/ This is not my fault
---------- A Silicon Graphics Company --------- http://home.cray.com/~glover
Veteran soldier^H^H^H^H^H^H^H^H instructor of many a C, C++, UNIX
Usage, Shell Programming, and (more recently) Java class
"Repeat after me: there is joy beyond the pain, there is joy beyond
the pain, there is ..."
USING *,12 *(The calling convention sets R12)
Hmmm.... Now that wouldn't be from something on entry like:
STM 13,12,12(13) *(IBM BAL Standard Linkage for Subroutine Calls)
Geeze, I ain't had so much fun since I wrote my last BXLE!
--
David Ecale
ec...@cray.com Work = 612-683-3844 // 800-BUG-CRAY x33844
http://wwwsdiv.cray.com/~ecale Beep = 612-637-0873
Hmmm... Well, I can't attest to Autocoder on the IBM 1401, but, if we are
talking about FORTRAN66, then I am going to point out that the instruction
in mind on IBM OS360/370 was explicitly designed to handle DO loops. It was
called Branch on indeX Low or Equal, BXLE. Its compatriot was Branch on
indeX High, BXH. I passed a job interview for EDS (I ended up not taking
the job) by being able to explain the instruction execution. Here's how
BXLE works:
* T&B here to bail before 1st pass (very efficient)
LA R1,START * Starting Address of Array
LA R2,DISP * Stride through the Array
LA R3,END-DISP * Ending Address of Array
LOOP = *
.
. * Pass through the array ...
.
* T&B here to bail after 1st pass (very inefficient)
BXLE R1,R3,LOOP * R1 gets R1 + R2
* If new R1 =< R3 then branch to LOOP
* Else fall through, we're done ...
Note that the BXLE & BXH were designed to be placed at the bottom of the
loop. A test and branch sequence would need to be set up directly before
the LOOP tag to bail out on a no-pass option. Once in this loop, you were
going for the long haul. The (really bad BAL) code example above basically
defines a DO loop in FORTRAN. It is also what a FOR loop uses in other
languages.
Note also that a test & branch inside the loop was a common way (in the bad
old days) of bailing out midway through a DO loop. Needless to say,
putting a test-and-branch dependency in the loop caused the code to fail to
vectorize. But, hey, this was the 60s, after all!
And, for the real aficionados, Test and Set, TS, would set a trap door so
that a piece of code would only be used once! The way I originally learned
to do this was (NOP = No Operation, OI = Or Immediate, & B = Branch):
TAG NOP SKIP * This is a "47 00 A(SKIP)"
OI X'F0',TAG+1 * This makes "47 F0 A(SKIP)"
* which is "B SKIP" !!!
Yup, my first piece of self-modifying code, ever. !!!
<<God! It's good to get that out of my system! Can I go home now, Mr.
Watson?>>
Actually, it cannot do so instantly, but I know you know that. :-)
The fastest possible version of wc still needs to look at every
character, in some way, to determine word and line separators.
If we want a really general version, where the set of word and/or line
separators can be user-defined, what is the fastest possible
implementation?
I suspect a vector machine might be capable of filling a vector, doing a
parallel table lookup on each element, and then combining pairs of results
to determine word boundaries.
How fast could this be done?
On the x86 platform I have written code to do this that uses less than 1.5
cycles/char (Pentium or PPro), in both 16 and 32-bit mode, but I suspect
it might be possible to go even faster:
Convert 32 characters into a word/non-word bitmask, and use logic
operations to determine/count any 0->1 transitions.
The best C code I have been able to come up with seems to require about
5 RISC-style operations/pair of input characters, it is probably
load-bandwidth limited on most cpus.
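For illustration only, here is a plain scalar C sketch of the
transition-counting idea (invented code, nowhere near 1.5 cycles/char;
the fast versions build 32-character bitmasks and count the 0->1
transitions with logic ops):

#include <ctype.h>
#include <stdio.h>
#include <string.h>

/* Count words by counting non-word -> word transitions. */
static unsigned long count_words(const char *buf, unsigned long len)
{
    unsigned long i, words = 0;
    int prev = 0;                      /* was the previous char in a word? */

    for (i = 0; i < len; i++) {
        int cur = !isspace((unsigned char)buf[i]);
        if (cur && !prev)              /* a 0 -> 1 transition starts a word */
            words++;
        prev = cur;
    }
    return words;
}

int main(void)
{
    const char *s = "Once upon a midnight dreary";
    printf("%lu words\n", count_words(s, (unsigned long)strlen(s)));  /* 5 */
    return 0;
}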
Terje
PS. This code is definitely in the 'programming for fun' category:
Having a wc that is more than 10 times faster than a disk array is fast
enough. :-)
--
- <Terje.M...@hda.hydro.com>
Using self-discipline, see http://www.eiffel.com/discipline
"almost all programming can be viewed as an exercise in caching"
>The entire world of computers needs a better sense of humour. I had two
>professors in college who made the dry topics of data structures and theory of
>computation *fun*. As a consequence, the knowledge stuck with me, and was more
>accessible while I was still clueless.
The best book about basic architecture I have ever seen (and learned a
lot from) is "the Cartoon Guide to Computer Architecture" by Larry Gonick.
You'd have to look quite hard for it though -- I found it by accident
among the cartoon books in the Humor section of a book shop, and had
never even seen it mentioned in any list of CS books.
--
Amos Shapir Net: am...@nsof.co.il
Paper: nSOF Parallel Software, Ltd.
Givat-Hashlosha 48800, Israel
Tel: +972 3 9388551 Fax: +972 3 9388552 GEO: 34 55 15 E / 32 05 52 N
I believe the keyword counts in JAVA and C are about the same.
Some are replaced by more powerful analogies, e.g. class replaces struct
and final replaces #define; some are dropped entirely, like short and
explicit dereferencing; and a few are added, such as catch-throw error
handling and threads.
C++ adds lots of keywords to C.
The other measure of complexity is source code, which is significantly shorter
in JAVA than C and slightly shorter in JAVA than C++.
*But most of the Java interpreters still are inefficient.*
That's a standard OS/360 (and derivatives all the way to OS/390) linkage
convention. The assembler language (BAL) does not prescribe this. Some
environments (e.g. CMS) have a much simpler convention, and my programming
environment is simpler yet (an old experimental operating system).
Somebody mentioned Visual Basic a bit earlier in this thread (which evolved
from language support for parallelism to the ease of writing "hello world"
programs). I have never used VB, but was stunned once when I browsed a VB
book which took 20 pages to explain how to write a "hello world" program!
I'm sure COBOL can do better than that, even with all DIVISION declarations
included.
Michel.
How little is enough? It could compile itself on PDP-11s in 64 KB,
cohabiting them with Unix (but it didn't have function prototypes then).
> ...
Hartmann Schaffer
> My big problem with a lot of computing is that it lacks an empirical
> basis as a science. It's too young. We only learn about problems after
> a period of time, not merely through hard work.
Agreed. This applies to individual programmers, as well as to
programmers as a whole (and what a hole...)
> We must accomplish a task, work, etc.
> The problem is that we must balance that against featuritis.
> That's why extensibility is important. People do use COMPLEX variables
> in FORTRAN, but fewer do work in quaternions; should users have to mangle
> variable names because of the lack of user-defined types?
I'm not familiar with F90, but from what I've read about it, I'd
expect it to be able to cope with quaternions as easily as C++
can - once you've defined a Quaternion class, that is.
Neat things, quaternions. I'd love to get a video tape of the
4D (3D plus time) animation somebody did of quaternion fractals.
That's a great way to show how much fun these numbers can be!
> The Little Languages philosophy is an important one. The second
> language I attempted to learn was PL/1. Lots of lessons there.
There's another language that I'm unfamiliar with. No doubt I'm
better off not knowing more about it! If there's a version of
Colossal Cave written in PL/1, then I'd still dig out the Pascal
(p_system) version lurking on one of my old 5 1/4" floppies, and
see if I can port to something that I'm still using (Delphi?).
On the other hand, perhaps by avoiding PL/1, I've missed a class?
Hmmm.
> >> C: #include <stdio.h>
> >> main() {
> >> printf("Hello world\n");
> >> }
>
> I will collect the Hellos and add them to the cft77, Pascal, Ada, C,
> Scheme, and CL versions. Other versions can be mailed to me..... 8^)
> and if a compiler runs on a Cray, I will HPM them.
In Forth:
: hello ." Hello, World!" cr ;
Alternately, here's a Nepalese version:
: namaste! ." Namaste!" cr ;
> >Can you elaborate on what you mean by a high quality programming language?
> >Weak vs. strong typing? High level constructs?
>
> "Strong typing is for weak minds...." famous quote.
>
> First, a high quality programming language prevents me from
> unintentionally shooting myself in the foot. Typing is merely one
> attempt at a compile time solution.
Here's one of my favourite programming quotes:
"C makes it easy to shoot yourself in the foot. C++ makes it
harder, but when you do, it blows away your whole leg."
-- Bjarne Stroustrup on C++
BTW, I _like_ strong type checking. Just not all the time, ;)
In other words, I'm happy if the compiler can infer the type
of an object, and then _tell_ me what it thinks the type is.
This is why I like ML, and the O'Caml compiler in particular.
I also like compilers that report the first error and then
stop. Reporting 100 errors caused by a typo on a single line
is just plain silly. Perhaps this would be less of a problem
if "batch" (i.e. C/C++, etc) compilers could recover better.
> Constructs, sure! User defined data types and structures: can be helpful.
User defined abstractions, meta-programming: weird but wonderful.
> But what about the compile and run-time environments?......
I like the distinction. Not every language environment makes it.
Once upon a time, I couldn't find a Basic that made it. Now
it's hard to find one that _doesn't_.
Meanwhile, the Scheme to C compiler that seems closest to
working on my machine includes EVAL in the runtime. I think
of EVAL as a development tool, not a runtime tool.
See if you can guess why I don't want EVAL:
1,953 test.obj
76,357 _errors.obj
373,860 _eval.obj
4,314 c_intf.obj
1,777 main.obj
10,722 setup.obj
12,409 mem.obj
3,293 os.obj
Not that Lisp is unique in this respect...But on this particular
platform, the VB and VC++ runtime support can be shared between
apps. The solution is obvious - put the Lisp runtime in a DLL.
> Big subjects. No time.
There never is enough time, alas.
}Alberto is right. Civilization does end south of I-90.
}What is south of I-90? Iowa.
You are welcome to keep your `civilization', thank you.
John
--
John Hascall, Software Engr. Shut up, be happy. The conveniences you
ISU Computation Center demanded are now mandatory. -Jello Biafra
jo...@iastate.edu
http://www.cc.iastate.edu/staff/systems/john/welcome.html <-- the usual crud
The only really important measure of complexity is how long it takes
a person(s) to update a program to add some new functionality,
where the person(s) have not seen or understood the source code before
making the changes. This is where 90% of programmers time is spent
in the real world --- learning about an existing program and/or changing it.
-Kelly Murray k...@franz.com
Keyword count isn't the same as complexity. Better to ask which
Chomsky level you need to parse the language. Forth can be parsed with a
state machine (search for blank). Lisp's parser doesn't need any
lookahead; just count parentheses and search for blanks. That's
simplicity. Same thing with "compiling": both do a symbol lookup
(string->pointer) and add this pointer to an array (Forth) or a linked
list (Lisp). Certainly there are more sophisticated implementations, but
you can't take this straightforward approach with C, even if you want to.
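A toy C sketch of that "search for blank" outer loop (the dictionary and
words below are invented, and none of Forth's compiling semantics are
here): split the input on whitespace, look the word up, execute it.

#include <stdio.h>
#include <string.h>

static void hello(void) { printf("Hello world\n"); }
static void bye(void)   { printf("bye\n"); }

/* A two-entry "dictionary": name -> code. */
static const struct { const char *name; void (*code)(void); } dict[] = {
    { "hello", hello },
    { "bye",   bye   },
};

int main(void)
{
    char line[] = "hello hello bye";
    char *word;
    size_t i;

    for (word = strtok(line, " \t\n"); word; word = strtok(NULL, " \t\n")) {
        int found = 0;
        for (i = 0; i < sizeof dict / sizeof dict[0]; i++) {
            if (strcmp(word, dict[i].name) == 0) {
                dict[i].code();
                found = 1;
                break;
            }
        }
        if (!found)
            printf("%s ?\n", word);    /* Forth-style "undefined word" */
    }
    return 0;
}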
> The other measure of complexity is source code, which is significantly shorter
> in JAVA than C and slightly shorter in JAVA than C++.
But still a factor of three or so times larger than if you do the same
thing in Forth or Lisp. And with Forth, you even can't complain about
performance, or binary compatibility (OpenBoot was cross-platform binary
compatible before Java).
> *But most of the Java interpreters still are inefficient.*
That's because the people at Sun had only one professional at writing
interpreted stack machines, Mitch Bradley (the author of the OpenBoot
interpreter), and AFAIK he left. On the other hand, the Java compiler is
clueless about how to use the stack (all local variables reside in the
stack frame instead of around the top of stack when needed), and run-time
tests are a performance killer.
--
Bernd Paysan
"Late answers are wrong answers!"
http://www.informatik.tu-muenchen.de/~paysan/
It doesn't really matter whether software engineering is hard or easy, the
market in the last ten years has shown that once you get the machine
stable enough to run for a couple of hours for the average user, it's
better to spend any additional money/resources on more features and
marketing than on getting out the remaining bugs.
-- Bruce
--
...in 1996, software marketers wore out a record 31,296 copies of Roget's
Thesaurus searching for synonyms to the word "coffee" ...
We should not forget that the quality and properties of the code
are not only due to the language used but also due to who is developing
the code.
A compilation of *Hello World programs* designed by various
categories of *developer* follows.
--------------------------------------
High School/Jr.High
===================
10 PRINT "HELLO WORLD"
20 END
First year in College
=====================
program Hello(input, output)
begin
writeln('Hello World')
end.
Senior year in College
======================
(defun hello
(print
(cons 'Hello (list 'World))))
New professional
================
#include <stdio.h>
void main(void)
{
char *message[] = {"Hello ", "World"};
int i;
for(i = 0; i < 2; ++i)
printf("%s", message[i]);
printf("\n");
}
Seasoned professional
=====================
#include <iostream.h>
#include <string.h>
class string
{
private:
int size;
char *ptr;
public:
string() : size(0), ptr(new char('\0')) {}
string(const string &s) : size(s.size)
{
ptr = new char[size + 1];
strcpy(ptr, s.ptr);
}
~string()
{
delete [] ptr;
}
friend ostream &operator <<(ostream &, const string &);
string &operator=(const char *);
};
ostream &operator<<(ostream &stream, const string &s)
{
return(stream << s.ptr);
}
string &string::operator=(const char *chrs)
{
if (chrs != ptr)
{
delete [] ptr;
size = strlen(chrs);
ptr = new char[size + 1];
strcpy(ptr, chrs);
}
return(*this);
}
int main()
{
string str;
str = "Hello World";
cout << str << endl;
return(0);
}
Master Programmer
=================
[
uuid(2573F8F4-CFEE-101A-9A9F-00AA00342820)
]
library LHello
{
// bring in the master library
importlib("actimp.tlb");
importlib("actexp.tlb");
// bring in my interfaces
#include "pshlo.idl"
[
uuid(2573F8F5-CFEE-101A-9A9F-00AA00342820)
]
cotype THello
{
interface IHello;
interface IPersistFile;
};
};
[
exe,
uuid(2573F890-CFEE-101A-9A9F-00AA00342820)
]
module CHelloLib
{
// some code related header files
importheader(<windows.h>);
importheader(<ole2.h>);
importheader(<except.hxx>);
importheader("pshlo.h");
importheader("shlo.hxx");
importheader("mycls.hxx");
// needed typelibs
importlib("actimp.tlb");
importlib("actexp.tlb");
importlib("thlo.tlb");
[
uuid(2573F891-CFEE-101A-9A9F-00AA00342820),
aggregatable
]
coclass CHello
{
cotype THello;
};
};
#include "ipfix.hxx"
extern HANDLE hEvent;
class CHello : public CHelloBase
{
public:
IPFIX(CLSID_CHello);
CHello(IUnknown *pUnk);
~CHello();
HRESULT __stdcall PrintSz(LPWSTR pwszString);
private:
static int cObjRef;
};
#include <windows.h>
#include <ole2.h>
#include <stdio.h>
#include <stdlib.h>
#include "thlo.h"
#include "pshlo.h"
#include "shlo.hxx"
#include "mycls.hxx"
int CHello::cObjRef = 0;
CHello::CHello(IUnknown *pUnk) : CHelloBase(pUnk)
{
cObjRef++;
return;
}
HRESULT __stdcall CHello::PrintSz(LPWSTR pwszString)
{
printf("%ws\n", pwszString);
return(ResultFromScode(S_OK));
}
CHello::~CHello(void)
{
// when the object count goes to zero, stop the server
cObjRef--;
if( cObjRef == 0 )
PulseEvent(hEvent);
return;
}
#include <windows.h>
#include <ole2.h>
#include "pshlo.h"
#include "shlo.hxx"
#include "mycls.hxx"
HANDLE hEvent;
int _cdecl main(
int argc,
char * argv[]
) {
ULONG ulRef;
DWORD dwRegistration;
CHelloCF *pCF = new CHelloCF();
hEvent = CreateEvent(NULL, FALSE, FALSE, NULL);
// Initialize the OLE libraries
CoInitializeEx(NULL, COINIT_MULTITHREADED);
CoRegisterClassObject(CLSID_CHello, pCF, CLSCTX_LOCAL_SERVER,
REGCLS_MULTIPLEUSE, &dwRegistration);
// wait on an event to stop
WaitForSingleObject(hEvent, INFINITE);
// revoke and release the class object
CoRevokeClassObject(dwRegistration);
ulRef = pCF->Release();
// Tell OLE we are going away.
CoUninitialize();
return(0); }
extern CLSID CLSID_CHello;
extern UUID LIBID_CHelloLib;
CLSID CLSID_CHello = { /* 2573F891-CFEE-101A-9A9F-00AA00342820 */
0x2573F891,
0xCFEE,
0x101A,
{ 0x9A, 0x9F, 0x00, 0xAA, 0x00, 0x34, 0x28, 0x20 }
};
UUID LIBID_CHelloLib = { /* 2573F890-CFEE-101A-9A9F-00AA00342820 */
0x2573F890,
0xCFEE,
0x101A,
{ 0x9A, 0x9F, 0x00, 0xAA, 0x00, 0x34, 0x28, 0x20 }
};
#include <windows.h>
#include <ole2.h>
#include <stdlib.h>
#include <string.h>
#include <stdio.h>
#include "pshlo.h"
#include "shlo.hxx"
#include "clsid.h"
int _cdecl main(
int argc,
char * argv[]
) {
HRESULT hRslt;
IHello *pHello;
ULONG ulCnt;
IMoniker * pmk;
WCHAR wcsT[_MAX_PATH];
WCHAR wcsPath[2 * _MAX_PATH];
// get object path
wcsPath[0] = '\0';
wcsT[0] = '\0';
if( argc > 1) {
mbstowcs(wcsPath, argv[1], strlen(argv[1]) + 1);
wcsupr(wcsPath);
}
else {
fprintf(stderr, "Object path must be specified\n");
return(1);
}
// get print string
if(argc > 2)
mbstowcs(wcsT, argv[2], strlen(argv[2]) + 1);
else
wcscpy(wcsT, L"Hello World");
printf("Linking to object %ws\n", wcsPath);
printf("Text String %ws\n", wcsT);
// Initialize the OLE libraries
hRslt = CoInitializeEx(NULL, COINIT_MULTITHREADED);
if(SUCCEEDED(hRslt)) {
hRslt = CreateFileMoniker(wcsPath, &pmk);
if(SUCCEEDED(hRslt))
hRslt = BindMoniker(pmk, 0, IID_IHello, (void **)&pHello);
if(SUCCEEDED(hRslt)) {
// print a string out
pHello->PrintSz(wcsT);
Sleep(2000);
ulCnt = pHello->Release();
}
else
printf("Failure to connect, status: %lx", hRslt);
// Tell OLE we are going away.
CoUninitialize();
}
return(0);
}
Apprentice Hacker
===================
#!/usr/local/bin/perl
$msg="Hello, world.\n";
if ($#ARGV >= 0) {
while(defined($arg=shift(@ARGV))) {
$outfilename = $arg;
open(FILE, ">" . $outfilename) || die "Can't write $arg: $!\n";
print (FILE $msg);
close(FILE) || die "Can't close $arg: $!\n";
}
} else {
print ($msg);
}
1;
Experienced Hacker
===================
#include <stdio.h>
#define S "Hello, World\n"
main(){exit(printf(S) == strlen(S) ? 0 : 1);}
Seasoned Hacker
===================
% cc -o a.out ~/src/misc/hw/hw.c
% a.out
Guru Hacker
===================
% cat
Hello, world.
^D
New Manager
===================
10 PRINT "HELLO WORLD"
20 END
Middle Manager
===================
mail -s "Hello, world." bob@b12
Bob, could you please write me a program that prints "Hello,
world."?
I need it by tomorrow.
^D
Senior Manager
===================
% zmail jim
I need a "Hello, world." program by this afternoon.
Chief Executive
===================
% letter
letter: Command not found.
% mail
To: ^X ^F ^C
% help mail
help: Command not found.
% damn!
!: Event unrecognized
% logout
There's "The Cartoon Guide to the Computer", ISBN: 0-06-273097-5, but I
can't find one with "architecture" in the title.
--
Chris Perrott
| From: cyber_...@wildcard.demon.co.uk (Cyber Surfer)
| Newsgroups: comp.arch,comp.sys.super,comp.lang.lisp
| Date: Wed, 11 Dec 96 08:09:19 GMT
| Meanwhile, the Scheme to C compiler that seems closest to
| working on my machine includes EVAL in the runtime. I think
| of EVAL as a development tool, not a runtime tool.
|
| See if you can guess why I don't want EVAL:
|
| 1,953 test.obj
| 76,357 _errors.obj
| 373,860 _eval.obj
| 4,314 c_intf.obj
| 1,777 main.obj
| 10,722 setup.obj
| 12,409 mem.obj
| 3,293 os.obj
|
| Not that Lisp is unique in this respect...But on this particular
| platform, the VB and VC++ runtime support can be shared between
| apps. The solution is obvious - put the Lisp runtime in a DLL.
Interesting argument, since I view EVAL as the source-level dynamic
loader of Lisp.
Or do you only mean shared libraries by DLLs?
The semantic "problems" presented by EVAL are no worse or different
than those introduced by any of the dynamic loaders on current
systems. The rest is just an issue of whether the compiler has an
API, and whether your components can be shipped as source.
> Bowing down to that false god of computers, BAL, Michel Hack wrote:
>
> USING *,12 *(The calling convention sets R12)
>
> Hmmm.... Now that wouldn't be from something on entry like:
>
> STM 13,12,12(13) *(IBM BAL Standard Linkage for Subroutine Calls)
>
> Geeze, I ain't had so much fun since I wrote my last BXLE!
It must have been a while, since USING is an assembler directive, not
a runtime instruction; it says "use R12 as my default base register,
and assume that it has this location in it." The instruction to set this
up might have been a subroutine call or a
BALR 12,0
that told the 360 to "branch to register 0 and link using register 12".
Now, branching to register 0 is a special case that says don't branch at
all, so the only effect is to load R12 with the current PC (so it will work
as the base register for all those tiny 12 bit offsets.)
Next week's quiz will be on the translate instruction as used to count bits....
--
Michael
Michael Lyle (ml...@wco.com)
So you are saying that one of the qualities of a language is where you
can buy it? I'll remember that next time I'm creating a computer
language...get contracts with Egghead and K-Mart before writing the
technical spec.
Regards,
Steve
--
stephen...@citicorp.com, piala...@aol.com Steve Miklos @ home
Not speaking for the Big Bank
http://members.aol.com/pialamodem
1. It has all the control structures and support for data structures
that I need.
2. It lets me get as close to the machine as I want to get.
3. Pointer arithmetic is the single greatest language feature I've ever
seen for a performance-oriented programmer (Java and Fortran90
notwithstanding).
4. It is extremely portable, if I decide that I want to write portable
code.
5. It doesn't take the stance that I'm an idiot and have to be protected
from my own mistakes. It lets me shoot myself in the foot if I want to.
6. At the same time, it allows me to use the compiler to detect many
programming errors, assuming I have the good sense to use function
prototyping.
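(A throwaway sketch of that last point, with an invented function: given
a prototype, the bad call below is a compile-time diagnostic instead of a
run-time surprise.)

#include <stdio.h>

/* With this prototype in scope the compiler checks argument counts
   and types at every call site. */
static double scale(double x, double factor)
{
    return x * factor;
}

int main(void)
{
    double y = scale(2.0, 0.5);   /* fine */

    /* y = scale(2.0);               with the prototype above, this is
                                     rejected at compile time; in old
                                     K&R C it would sail through and
                                     misbehave at run time */
    printf("y = %g\n", y);
    return 0;
}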
I say this as a former Lisp hacker of many years (in my previous AI
life). The only problem I have with Lisp is that it hides the machine,
which is bad if your God is performance.
(I did like *lisp on the CMs.)
Steve Barnard
During my time at University a group of us undergraduates endeavoured
to create a collection of hello world programs, in over 30 languages.
I've put the collection on the web - though only on a part-time
web site (http://nsict.demon.co.uk/somewhere-or-other) - if I remember
I'll upload it to http://www.nsict.demon.co.uk/hello/ this evening.
(Contributions also welcomed. ;-)
Mike.
Have I wandered into comp.geography.bigot by mistake?
---
chris
I'll qualify my use of the word "inspired" by comparing it to a 'road to
Damascus revelation' as opposed to a 'let's sit down for a year or so
and think about things'. Alternatively, "inspired" in a '60s sense
would mean under the influence of chemical compounds not normally
associated with clear thinking!!! ;-))
> >> Most languages are the product of a lot of hard work...
>
[stuff removed]
>
> >There are tons of "neat" new (and old) languages out there that
> >no one uses because a) they attempt to introduce an entirely new paradigm
> >which makes them highly undesirable when writing mission-critical code, or
> >b) they do something that another language already does. Are they
> >inferior? Nope, just different. Are they inferior for a particular purpose?
This is true, but it is often the square peg, round hole problem. Once a
language has come into existence, the proponents of that language try to
force it down everyone's throat, saying it can be used to solve every
problem!
A perfect example is C. A great language for knocking up little (< a
couple of thousand lines of code) utilities for Unix and other systems,
but God help you if you want to write a million-line mission-critical
system in it. The testing/QA team had better be well staffed, and the
engineers writing the code had better be prepared to spend a lot of
their time in code reviews!!!
> >Can you elaborate on what you mean by a high quality programming
language?
> >Weak vs. strong typing? High level constructs?
>
> "Strong typing is for weak minds...." famous quote.
>
> First, a high quality programming language prevents me from
> unintentionally shooting myself in the foot. Typing is merely one
> attempt at a compile time solution.
Couldn't agree more... Again, C is the perfect example of a language
that actually seems to encourage bad programming practices. All of
these obscure operators that constantly trip you up. As an example,
what would you prefer to see in some code you have just come across-
(a) mask &= ~some_random_bits; /* C syntax */
(b) mask := XOR(mask, some_random_bits); -- Ada syntax
In the second case, I would probably even rename XOR() to be BIT_CLEAR()
or some other name that clearly defined the function being performed.
Ie.
(b') mask := BIT_CLEAR(mask, some_random_bits);
Once you get into the area of objects/pointers C becomes even more
dangerous...
I hate to pick on only C, but it is a commonly used language and so it
deserves more attention as to its inadequacies. A similar problem exists
with Fortran, where-
C Zero out array A
do 10 i=1.1000
a(i)=0.0
10 continue
is perfectly acceptable code (or at least it was back when I wrote
Fortran) but doesn't do what the programmer expects: because blanks are
insignificant and a period has been typed where the comma belongs, it is
parsed as an assignment to a variable named DO10I rather than as a loop.
Try finding this bug in tens of thousands of lines of code in some
complex simulation.
I spent about 3 years writing Ada and yes, I did need a 3 week course
before writing any "useful" code, and even the code I wrote in the first
3-4 months I eventually tossed away because it was basically shit. This
is a bit hard to swallow for management when you tell them that a quite
competent C programmer is not going to be productive in Ada for almost 6
months but the end result is (potentially) a far happier outcome.
Is Ada a perfect language? No, but I can see why DoD-type people insist
on using it. If I was asked to work on a system where people's lives
are at stake, then I would simply refuse if C (possibly even C++) were
being used.
I've lost enough sleep working on systems where only the factory stops
when a critical part of the system crashes! Hearing the scream of high
pressure steam being vented through pressure release valves while the
general manager stands behind you while you try to find the bug and fix
it is bad enough.
Is any language perfect? Definitely NO!
Are some languages superior? Definitely YES!
Tony
Roger Moore (Hopper Award winner for APL\360) did this back in the
early 70's to perform the 3 common Boolean reductions in SHARP APL:
+/ x NB. Count of 1 bits in each row of array x
^/ x NB. Predicate for all bits in row of x being 1 (AND reduce)
or/x NB. Predicate for any bit in row of x being 1 (OR reduce)
He did this in two steps:
a. Copy a hunk of the array to a work area, then apply TR
with a translate table initialized to the number of 1 bits
in the corresponding byte.
b. (Here's where we part company from the norm) Perform
a loop over the translated data, A WORD AT A TIME, summing
the words into some GP register.
At the end of the loop perform a few operations to sum the
bytes in the register.
Exercise for extra credit: How long could the TR argument be,
maximum? Why? Hint: It's NOT 256.
Bob
ps: There was a bit of AND mask prolog when the array row did
not start/end on a byte boundary, of course.
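For readers without 360 assembler, here is a rough C analogue of the
scheme Bob describes: the table stands in for the TR translate table, and
the inner loop does the word-at-a-time summing of step b. The names, the
32-bit word size, and the chunk length are illustrative assumptions, not
his code; the chunking exists for the same reason the extra-credit
question does (no byte lane of the accumulator may overflow).
    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    /* C stand-in for the 256-byte translate table: bit_count[b] holds the
       number of 1 bits in byte value b (step a). */
    static uint8_t bit_count[256];

    static void init_table(void)
    {
        for (int b = 0; b < 256; b++) {
            int n = 0;
            for (int v = b; v != 0; v >>= 1)
                n += v & 1;
            bit_count[b] = (uint8_t) n;
        }
    }

    /* Count the 1 bits in len bytes of row.  Translated counts are summed
       a word at a time into acc (step b); each byte lane of acc collects
       every 4th count, and the lanes are folded together at the end.
       Chunking keeps every lane under 256. */
    static unsigned long popcount_row(const uint8_t *row, size_t len)
    {
        enum { CHUNK = 124 };            /* illustrative: 31 words of 4 bytes */
        uint8_t xlate[CHUNK];
        unsigned long total = 0;

        while (len > 0) {
            size_t n = len < CHUNK ? len : CHUNK;

            memset(xlate, 0, sizeof xlate);
            for (size_t i = 0; i < n; i++)        /* the "TR" step */
                xlate[i] = bit_count[row[i]];

            uint32_t acc = 0;
            for (size_t i = 0; i < n; i += 4) {   /* word-at-a-time sum */
                uint32_t word;
                memcpy(&word, &xlate[i], 4);
                acc += word;
            }
            total += (acc & 0xFF) + ((acc >> 8) & 0xFF)
                   + ((acc >> 16) & 0xFF) + ((acc >> 24) & 0xFF);

            row += n;
            len -= n;
        }
        return total;
    }

    int main(void)
    {
        init_table();
        uint8_t row[] = { 0x00, 0x01, 0x03, 0xFF, 0xF0 };
        printf("%lu one bits\n", popcount_row(row, sizeof row));   /* 15 */
        return 0;
    }
On the 360, of course, the translate step is a single TR instruction
rather than a byte loop, which is where the speed comes from.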
Piece of cake: just define a translation table (these are restricted to
2^8 characters in OS/360 BAL) holding the correct bit count for each
character.
So, for example (represented here in binary):
Orig Char (TR table index):    Bit Count (output byte after index fetch):
0000 0000 0000 0000
0000 0001 0000 0001
0000 0010 0000 0001
0000 0011 0000 0010
0000 0100 0000 0001
The text of the table would be (in hex):
00 01 01 02 01 02 02 03 01 02 02 03 02 03 03 04 ... 06 07 07 08
The really tough use of "tr" in BAL was going from Fieldata (6-bit code)
to EBCDIC (8-bit code) to ASCII (7-bit code) and not losing special
character representations. Especially when the next computer in the
network invariably translated ASCII back to Fieldata! This nightmare was
(partially) resolved by FIPS-14 in 1975 (if memory serves me right).
<<Rule of translation: All foreign codes will be translated into the local
computer's native code for text processing. All text will be re-translated
back to the foreign code, or next outgoing target computer's foreign code
as a last step before transmission.>> Religious wars were fought for years
over the implementation and maintenance of this rule!
Please, don't tell me that Ada uses XOR to mask away bits instead of
toggling them, I refuse to believe that!
Terje
> 3. Pointer arithmetic is the single greatest language feature I've ever
> seen for a performance-oriented programmer (Java and Fortran90
> notwithstanding).
Jesus! Btw., pointers on a Lisp machine are called "Locatives".
> 5. It doesn't take the stance that I'm an idiot and have to be protected
> from my own mistakes. It lets me shoot myself in the foot if I want to.
This makes me *really* confident in your software.
> 6. At the same time, it allows me to use the compiler to detect many
> programming errors, assuming I have the good sense to use function
> prototyping.
Basic type errors - maybe.
> I say this as a former Lisp hacker of many years (in my previous AI
> life). The only problem I have with Lisp is that it hides the machine,
> which is bad if your God is performance.
Great! Lisp hides the machine!
Everything else is implementation dependent.
Rainer Joswig
You appear to be the type of person who likes religious wars. I say why
I like C and you seem to draw the conclusions that:
- I despise all other programming languages, and especially Lisp,
- I write bug-filled code, as of course all C programmers must.
It's as though I said why I'm a Muslim (I'm not), and you implied that I
must be anti-Christian and evil.
I really like Lisp, too. It is far more pleasant to use Lisp than C,
but you'd have to be a fool to use Lisp for performance-oriented
supercomputer codes, unless there have been some developments in the
last few years that I'm unaware of.
Steve Barnard
This is likely true of *any* language. You can write bad assembly
code and good assembly code. If you take the time, C code can be
just as easily supported as any other language. If you go the other
way, it can be just as difficult to understand as assembly language.
But "C" doesn't *make* the code bad, that's done by a *programmer*.
I've seen ugly APL, ugly ADA, ugly PASCAL, ugly SMALLTALK, ugly C++,
ugly VISUAL BASIC, even saw an ugly truck the other day... :)
:
: > >Can you elaborate on what you mean by a high quality programming
: language?
: > >Weak vs. strong typing? High level constructs?
: >
: > "Strong typing is for weak minds...." famous quote.
: >
: > First, a high quality programming language prevents me from
: > unintentionally shooting myself in the foot. Typing is merely one
: > attempt at a compile time solution.
:
: Couldn't agree more... Again C is the perfect example of a language
: that actually seems to encourage bad programming practices. All of
: these obscure operators that constantly trip you up. As an example,
: which would you prefer to see in some code you have just come across-
:
: (a) mask &= ~some_random_bits; /* C syntax */
:
: (b) mask := XOR(mask, some_random_bits); -- Ada syntax
Macros can fix it. If you look at my C code, you'd see
mask=And(mask, some_random_bits);
and you'd see that everywhere. The preprocessor can clean up a lot of
coding screwups given half a chance.. Note that the above is not a function
but a simple macro.
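Presumably something along these lines; the macro names below are guesses
for illustration, not the poster's actual header:
    /* Hypothetical wrappers in the spirit of the post above. */
    #define And(x, bits)      ((x) & (bits))
    #define Or(x, bits)       ((x) | (bits))
    #define Xor(x, bits)      ((x) ^ (bits))
    #define BitClear(x, bits) ((x) & ~(bits))

    /* usage:  mask = BitClear(mask, some_random_bits);  */
The usual macro caveats apply (parenthesize everything), but for simple
bit operations they cost nothing at run time.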
:
: In the second case, I would probably even rename XOR() to be BIT_CLEAR()
: or some other name that clearly defined the function being performed.
: Ie.
: (b') mask := BIT_CLEAR(mask, some_random_bits);
:
: Once you get into the area of objects/pointers C becomes even more
: dangerous...
Again it doesn't have to, just like handguns don't *have* to be dangerous
either. Of course they usually are, because they usually have loose nuts
attached to them. :)
:
: I hate to pick on only C but it is a commonly used language and so it
: deserves more attention as to its inadequacies. A similar problem exists
: in Fortran, where-
:
: C Zero out array A
: do 10 i=1.1000
: a(i)=0.0
: 10 continue
:
: is perfectly acceptable code (or at least it was back when I wrote
: Fortran) but doesn't do what the programmer expects. Try finding this
: bug in tens of thousands of lines of code in some complex simulation.
Most good compilers catch that. Even though do10i is a valid variable
name, it can still be used to produce a warning, just like this does in
most C compilers:
if (a=b) { };
Most compilers make you write this as
if ((a=b))
if you want to stop the warning about a possible unintended assignment
being used as a truth value...
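For instance, a compiler with the warning enabled (gcc invoked with -Wall
is one example) typically flags the first form below and accepts the
other two; the variables are purely illustrative:
    #include <stdio.h>

    static void demo(int a, int b)
    {
        if (a = b)        /* usually warns: assignment used as a truth value */
            puts("assigned, nonzero");

        if ((a = b))      /* extra parentheses say "yes, I really meant = here" */
            puts("assigned, nonzero, warning silenced");

        if (a == b)       /* what was probably intended in the first place */
            puts("equal");
    }

    int main(void) { demo(1, 2); return 0; }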
:
: I spent about 3 years writing Ada and yes, I did need a 3 week course
: before writing any "useful" code, and even the code I wrote in the first
: 3-4 months I eventually tossed away because it was basically shit. This
: is a bit hard to swallow for management when you tell them that a quite
: competent C programmer is not going to be productive in Ada for almost 6
: months but the end result is (potentially) a far happier outcome.
:
: Is Ada a perfect language? No, but I can see why DoD-type people insist
: on using it. If I was asked to work on a system where people's lives
: are at stake, then I would simply refuse if C (possibly even C++) were
: being used.
I'd be willing to bet that many programmers are "writing C programs in ADA
syntax". I've seen it hundreds of times. Practices learned years ago, and
then reinforced by years of programming, are going to be difficult to
overcome. I've not found a language that can solve this.
:
: I've lost enough sleep working on systems where only the factory stops
In article <32B0C7...@megafauna.com>, Steve Barnard <st...@megafauna.com> writes:
> I like C because:
<snip>
> 3. Pointer arithmetic is the single greatest language feature I've ever
> seen for a performance-oriented programmer (Java and Fortran90
> notwithstanding).
Conversely, most Fortran HPC programming advocates consider C's
pointer *aliasing* constraints to be one of the more important
performance killers in C.
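A small illustration of the aliasing point, with invented names: because
the pointers below are allowed to overlap, a C compiler must generally
assume that a store through y can change *s or x[i], so it reloads values
and hesitates to vectorize, where a Fortran compiler may treat its dummy
arrays as distinct.
    #include <stdio.h>

    /* *s and x[i] may alias y[i], so the compiler generally cannot keep
       *s in a register across the loop or vectorize it freely. */
    void axpy_like(double *y, const double *x, const double *s, int n)
    {
        for (int i = 0; i < n; i++)
            y[i] = y[i] + (*s) * x[i];
    }

    int main(void)
    {
        double y[4] = { 1, 2, 3, 4 }, x[4] = { 4, 3, 2, 1 }, s = 2.0;
        axpy_like(y, x, &s, 4);
        printf("%g %g %g %g\n", y[0], y[1], y[2], y[3]);   /* 9 8 7 6 */
        return 0;
    }
(Later revisions of C added a "restrict" qualifier so the programmer can
promise there is no such overlap, but that came after this discussion.)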
---------------- Cray Research ---------------- *** Roger Glover ***
---------- A Silicon Graphics Company --------- http://home.cray.com/~glover
In article <32B0BE...@OntheNet.com.au>, Tony Griffiths
<to...@OntheNet.com.au> wrote to comp.arch, comp.sys.super, and
comp.lang.lisp:
<big snip>
> I hate to pick on only C but it is a commonly used language and so it
> deserves more attention as to its inadequacies. A similar problem exists
> in Fortran, where-
>
> C Zero out array A
> do 10 i=1.1000
> a(i)=0.0
> 10 continue
>
> is perfectly acceptable code (or at least it was back when I wrote
> Fortran) but doesn't do what the programmer expects. Try finding this
> bug in tens of thousands of lines of code in some complex simulation.
The example above passes for "correct" in Fortran 77 and in the
obsolete "fixed source form" in Fortran 90/95. However, it would be
rejected by the preferred "free source form" in Fortran 90 due to:
- the space after the word "do", or
- the lack of an "endpoint" number after the "startpoint" number 1.1000
(depending on whether the processor sees a flawed candlestick or two
flawed faces).
This is no longer even vaguely related to computer architectures,
supercomputers, or LISP; please note the Followup setting.
---------------- Cray Research ---------------- *** Roger Glover ***
XXXX\ \ / \ /XXX \ / \ X \ /\\\ X///X /\\\ *** Software Instructor ***
/ \ / \/ /\ / \ / \X /\ X \ / \ X\ \ X DISCLAIMER HAIKU:
//X/ X\\\X //X/ X \ X X\\ / \ X/X \ X \\\ C R^H^H^H^H S G I may not
/ \ X///X / \/ X//XX X \ / \ X \\ X \ Share these opinions with me
/ \ X X /\\\/ X X X///X /XXX/ X///X /XXX/ This is not my fault
Years ago when I was working at JPL, I learned to play backgammon
at lunch time by looking over the shoulders of colleagues. I got pretty
good (except for the doubling cube) after I worked out some distributions
(the wrong ones to think about are the simple distributions of standard
rolls).
In the case of programming languages, George Michael likes to relate the
story of a fellow who learned Fortran in a similar manner. At that time
(punch-card era, I presume) the guy wrote up his first code (whether as
card images or real cards is not important), but, being clever, he
decided to label each of his statements in advance (like BASIC).
But he knew about people modifying codes (he was smarter, after all), so
he came up with a scheme in which every label's first character was a "1".
Needless to say, he hadn't looked closely enough during lunch to learn
about carriage control, where a "1" in the first print column means
"start a new page". The LTSS printed many boxes of printer pages, bare
except for one line at the top of each page. The boxes, of course, were
delivered to the man's office.
Unfortunately "God" is found in those kinds of details.
>A compilation of *Hello World programs* designed by various
>categories of *developer* follows.
Yes, I forgot about this joke post.
Programming will remain an art for some time to come.
No, what Stanley is saying was best said by Fred Brooks (now at UNC) in
The Mythical Man-Month:
A system reflects the bureaucracy which created/uses/etc. it.
I would amend that, Bob, with:
I'd be willing to bet that many programmers are "writing C/Ada programs in
Ada/FORTRAN syntax".
I checked the Ada newsgroup before writing my paper. It's not simply
that they write the code in the scheme (pun) of another language;
in some applications they intentionally gut the compiler (e.g., go from
a dynamic memory model to a static memory model). Better?
Depends on your view. At the time they were discussing the Boeing 777's
use of Ada. That's the aero community's view of programming languages
(i.e., you don't want to get on a plane run by a dynamic language like LISP).
I learned a language called HAL/S, by Intermetrics. It's not a language
that I would recommend learning (very PL/I-ish, only worse; it was said
to be "FORTRAN-like", Ha!).
My point is that few computer people are willing to die or kill people
for their code. That's the beauty of rebooting and simulation.
"How many drops is this for you, Lt.?
"Thirty-eight ... simulated."
"How many *combat* drops?"
"Two ... including this one."
It's funny and tragic at the same time.
More in a moment on this thread.
I remind readers that we are diverging from supercomputers
and a little from architecture, and certainly away from LISP.
If you really want to read about programming languages, get a book
(more than one of us has recommended the HOPL-II book)
or go over to comp.lang.misc or your favorite group.
"Insisting on perfect safety is for people who don't have the balls to
live in the real world." -- Mary Shafer, NASA Ames Dryden
My company has had a safety program for 150 years. The program was instituted
as a result of a French law requiring an explosives manufacturer to live
on the premises with his family.
--Crawford Greenwalt, Former President of Dupont.
I had one other quote from Hackers about Gosper in the database, but
I'll skip it.
P.S. Just a reminder: Ada is named after a person, FORTRAN is an acronym.
In article <58s1s2$3...@walter.cray.com>,
Roger Glover <glo...@hikimi.cray.com> wrote:
>
>In article <32B0C7...@megafauna.com>, Steve Barnard <st...@megafauna.com> writes:
>> I like C because:
>
><snip>
>
>> 3. Pointer arithmetic is the single greatest language feature I've ever
>> seen for a performance-oriented programmer (Java and Fortran90
>> notwithstanding).
>
>Conversely, most Fortran HPC programming advocates consider C's
>pointer *aliasing* constraints to be one of the more important
>performance killers in C.
My pet theory:
if the compiler has trouble with analysis of a program,
then the typical programmer will not understand that program.
Please note:
1) I am not saying the compiler is smarter. I am saying the compiler
"knows" more of the program than the typical maintainence guy
trying to make a little change.
2) I am not saying "ALL" programmers, just the typical. As in someone
who is not given enough time, has impossible deadlines, etc.
The C language is particularly rich with ways of writing a program
that totally hides the original design intent. And C++ has raised the
obscurity ante to the syntactic level - a real achievement.
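A stock illustration of what Stanley means: both functions below copy a C
string correctly, but only one of them tells the maintenance programmer
so at a glance. Textbook code, not anyone's production source:
    #include <stddef.h>
    #include <stdio.h>

    /* Terse, correct, and silent about its intent.  (Note the doubled
       parentheses: the same warning dance as the if (a=b) example earlier.) */
    void copy1(char *d, const char *s)
    {
        while ((*d++ = *s++))
            ;
    }

    /* The same operation, spelled out for the next reader. */
    void copy2(char *dst, const char *src)
    {
        size_t i = 0;
        do {
            dst[i] = src[i];
        } while (src[i++] != '\0');
    }

    int main(void)
    {
        char buf[16];
        copy2(buf, "intent");
        puts(buf);
        return 0;
    }
Neither is wrong; the question is what the next maintainer (or the
optimizer) can recover from each.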
--
Stanley Chow; sc...@bnr.ca, stanley....@nt.com; (613) 763-2831
Bell Northern Research Ltd., PO Box 3511 Station C, Ottawa, Ontario
Me? Represent other people? Don't make them laugh so hard.
> Interesting argument, since I view EVAL as the source-level dynamic
> loader of Lisp.
EVAL is useful for creating an interactive language, but it's
not something I want to deliver to a client.
> Or do you only mean shared libraries by DLLs?
Yes, I'm referring to code sharing, via DLLs or similar means.
Since I'm developing for Windows, I'm interested in DLLs.
Programmers using other platforms might have a different name
for such things. I can't remember what the P-system called
them, but I vaguely recall that the code was placed in a
"library" with 16 slots (?).
> The semantic "problems" presented by EVAL are no worse or different
> than those introduced by any of the dynamic loaders on current
> systems. The rest is just an issue of whether the compiler has an
> API, and whether your components can be shipped as source.
I don't wish to deliver a compiler to clients. They'd probably
object if I did, even if the extra code size wasn't a problem,
which it could be, easily.
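For readers who have not met them, here is a minimal sketch of the kind of
run-time code sharing being discussed, using the standard Windows DLL
calls. The plugin.dll name and its plugin_init export are hypothetical,
purely for illustration:
    #include <stdio.h>
    #include <windows.h>

    typedef int (*plugin_init_fn)(void);

    int main(void)
    {
        HMODULE lib = LoadLibraryA("plugin.dll");       /* map the shared code */
        if (lib == NULL) {
            fprintf(stderr, "could not load plugin.dll\n");
            return 1;
        }

        plugin_init_fn init =
            (plugin_init_fn) GetProcAddress(lib, "plugin_init");
        if (init != NULL)
            printf("plugin_init returned %d\n", init());

        FreeLibrary(lib);                               /* done with the DLL */
        return 0;
    }
The contrast with EVAL is that only compiled code crosses to the client,
at the price of a fixed, compiled-in interface.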
--
<URL:http://www.enrapture.com/cybes/> You can never browse enough
Future generations are relying on us
It's a world we've made - Incubus
We're living on a knife edge, looking for the ground -- Hawkwind
--
Glen Clark
gl...@clarkcom.com
State College, PA
In the beginning... there was Philadelphia. But hardly anybody
alive remembers. There are no known survivors of that tribe who
live on their fathers' land. There is a marker to show where
the tribe once lived.
Then came Poughkeepsie. But few people remember or care.
Then there was a company in Minneapolis that made thermostats
that saw computers as a logical extension of their product line.
A few members of the original tribe set out riding donkeys (or
were they burros? I always forget.) and headed west. They were
said to have been last seen near Blue Bell, Pennsylvania. No
one ever heard from them again.
Then there was Route 128. This is where many people think man
first learned to stand erect and is the beginning of modern
history. Tracy Kidder wrote a tear-jerking book about goings-on
there. Then there were more words which only The Old Ones remember,
like Eclipse and TOPS-10.
At the same time, a splinter group of a splinter group of the
thermostat company decided that there was something about the
cold that allowed them to make computers that ran faster. Deciding
that Minneapolis was not near cold enough, they moved to Chippewa
Falls.
Then Scott McNealy and the surfer boys discovered that you could
build computers *and* have nice weather. An oracle named Terman
had told Scott there wasn't anything indigenous to the water in
Boston that was essential to making computers, or that made it
worth putting up with the winters there.
The people on the Hudson, seeing the recruiting advantage that warmer
climates hold for new grads, sent a scout party to Austin to
see if one could use the sand in Texas to make silicon wafers.
The scout party never returned. Their parents get Christmas
cards from them occasionally.
Shortly afterward, Jim Clark and George Lucas discovered that
under the right conditions of temperature and humidity, you
could make dinosaurs out of silicon in southern California.
Not long afterward, the housing market went in the tank in much
of the US. Unable to sell his house in Boston without taking a
bath, therefore unable to go join the fun and frolic in the sun,
and in a fit of anger, Ken Olsen locked his entire fab team in
the lab and refused to let them see their wives until they
had developed something called "Alpha".
Most recently, archaeologists have become aware of a completely
new tribe in the Seattle area. Very little is known at this
time about the Seattle tribe. So far, this tribe appears to have
no genetic roots in any other previously-known tribe. The current
thinking is that this tribe traces its roots to Asia rather than
North America and that the early members of the tribe walked
across the land bridge which used to exist over the Bering Sea
between Russia and Alaska. Scholars theorize that this tribe has
eluded notice until this time due to the foul weather in Seattle.
Whether the gene pool evolved by this tribe in isolation is as
robust as the genes of the other tribes which have intermarried
is not yet known.
Now do you understand why we cherish our geographic roots?
--
Glen Clark
gl...@clarkcom.com
It takes more than a little gall to send a pro-C diatribe to a Lisp-specific
newsgroup, and then to complain of religious wars when you are scolded.
Quit pulling chains if you wish to live peacefully. I'm finishing a project
which is a Scheme/C hybrid, and I see quite definite uses for both.
Languages are tools; do you repair all things with a mallet?
: - I write bug-filled code, as of course all C programmers must.
Many do. I've read and fixed more than a few disgusting things written
in C.
: but you'd have to be a fool to use Lisp for performance-oriented
: supercomputer codes, unless there have been some developments in the
: last few years that I'm unaware of.
But being absolutely miserly with cycles is not a universal need in all
problem domains. Not all software is an expensive numeric thingy or drives
a realtime embedded controller on low-power hardware; in fact, on many
workstations the CPU spends far more than a small percentage of its
capacity patiently waiting for keystrokes and mouse actions. (See the work
on CONDOR for a review of this.) Kelly Murray said it best: speed is
usually a vastly over-rated criterion. Unfortunately, in my opinion it
often comes before correctness in many minds.
I think most people in comp.lang.LISP are paid to write software, and not
to engage in protracted arguments over their choice of tools. I suspect
the typical members of the other groups in which this flame fodder was
cross-posted share similar concerns. I come to comp.lang.lisp as do
others to read about Lisp, and I instead see your needless chain yanking.
With politeness, please!
(Note new followup!)
--
Christopher Oliver Traverse Communications
Systems Coordinator 223 Grandview Pkwy, Suite 108
oli...@traverse.com Traverse City, Michigan, 49684
The loop macro: because no language is complete without a little COBOL.