
[9fans] nice quote


ron minnich

Sep 2, 2009, 10:34:34 AM
Q: "Will C continue to be important into the future?"
(Dave Kirk, Nvidia) A: "No, I think C will die like Fortran has"

ron

Rodolfo kix

Sep 2, 2009, 10:55:15 AM
I believe OS/2 is destined to be the most important operating system,
and possibly program, of all time.
(Bill Gates, OS/2 Programmers Guide, November 1987)

... we are all human ...

:-)

--
Rodolfo García "kix"
EA4ERH - IN80ER

Enrique Soriano

Sep 2, 2009, 11:42:17 AM
> (Dave Kirk, Nvidia) A: "No, I think C will die like Fortran has"

http://developer.nvidia.com/page/cg_main.html

erik quanstrom

Sep 2, 2009, 12:42:07 PM
On Wed Sep 2 10:33:07 EDT 2009, rmin...@gmail.com wrote:
> Q: "Will C continue to be important into the future?"
> (Dave Kirk, Nvidia)A: "No, I think C will die like Fortran has"

isn't this the same company that claims that the cpu is dead?
it may be true, but nvidia has a propensity to make claims
that stretch credulity a wee bit, and they all just so happen
to lead one to the conclusion that nvidia will dominate the
computer world in the near future with massive gpus, directx,
and a tiny cpu.

- erik

David Leimbach

Sep 2, 2009, 12:59:41 PM
I know people claiming the GPU is dead.  (The folks who make the Unreal gaming engine, to start.)


Robert Raschke

Sep 2, 2009, 1:02:21 PM
Gamers have a lot to answer for. Not just social decline ... ;-)

Robby

David Leimbach

Sep 2, 2009, 1:06:17 PM

Eric Van Hensbergen

Sep 2, 2009, 1:34:52 PM
Clarifying context: this was at an hpc clusters conference -- their
view of fortran is not your view of fortran.

Sent from my iPhone

erik quanstrom

Sep 2, 2009, 1:39:39 PM

on p. 43/44 i believe it is claimed that one
cannot do CSP without pure functional
programming.

the thread library is clearly better than i thought.
it can turn ordinary c into a functional programming
language! ☺

- erik
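
For a concrete picture of the point being joked about here: channel-based
message passing in plain C is what Plan 9's thread(2) library provides. A
minimal, illustrative sketch in Plan 9 C, using the libthread primitives
(chancreate, sendul, recvul, threadcreate); error checking omitted:

#include <u.h>
#include <libc.h>
#include <thread.h>

/* producer: send 1..5 over the channel, then 0 as a sentinel */
static void
producer(void *arg)
{
	Channel *c;
	ulong i;

	c = arg;
	for(i = 1; i <= 5; i++)
		sendul(c, i);	/* blocks until the receiver takes the value */
	sendul(c, 0);
}

void
threadmain(int argc, char *argv[])
{
	Channel *c;
	ulong n;

	USED(argc);
	USED(argv);

	c = chancreate(sizeof(ulong), 0);	/* unbuffered: a rendezvous, as in CSP */
	threadcreate(producer, c, 8192);
	while((n = recvul(c)) != 0)
		print("%lud\n", n);
	threadexitsall(nil);
}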

Richard Miller

Sep 2, 2009, 2:11:58 PM
>> http://graphics.cs.williams.edu/archive/SweeneyHPG2009/TimHPG2009.pdf
>>
> on p. 43/44 i believe it is claimed that one
> cannot do CSP without pure functional
> programming.

(p ⇒ q) ⇏ (¬p ⇒ ¬q)


David Leimbach

Sep 2, 2009, 2:30:17 PM
That's interesting because pure functional programming doesn't exist at all in the strictest sense on a computer.  One MUST be able to cause side effects during computation or your CPU will just get hot... if even that.

Dave 

David Leimbach

Sep 2, 2009, 2:28:58 PM
On Wed, Sep 2, 2009 at 10:31 AM, Eric Van Hensbergen <eri...@gmail.com> wrote:
Clarifying context: this was at a hpc clusters conference -- their view of fortran is not your view of fortran.

Having supported Fortran for MPI implementations before, I know what you mean :-)

erik quanstrom

Sep 2, 2009, 2:38:52 PM
> > > on p. 43/44 i believe it is claimed that one
> > > cannot do CSP without pure functional
> > > programming.
> >
> > (p ⇒ q) ⇏ (¬p ⇒ ¬q)
> >
> >
> That's interesting because pure functional programming doesn't exist at all
> in the strictest sense on a computer. One MUST be able to cause side
> effects during computation or your CPU will just get hot... if even that.

i read the slides as contrasts, not as
logical conjunctions.

i still don't understand the claim that message passing
requires "thousands of message protocols"
and can't do synchronization.

- erik

ron minnich

Sep 2, 2009, 2:49:51 PM
On Wed, Sep 2, 2009 at 7:29 AM, ron minnich<rmin...@gmail.com> wrote:
> Q: "Will C continue to be important into the future?"
> (Dave Kirk, Nvidia)A: "No, I think C will die like Fortran has"

let me explain the joke. In HPC circles, people have been predicting
the death of fortran for 30 years. Fortran has continued to grow and
thrive. The predictions continue, but the latest fortran standard
includes objects.

So, what Dave is saying, tongue in cheek, is that C will die in the
way fortran has -- i.e., not at all.

ron

David Leimbach

Sep 2, 2009, 2:49:36 PM
I also don't get that. What was meant by his usage of "protocol"?  Erlang uses only a handful of patterns that work really well for interaction in each subsystem.  If they think of messaging and protocols in a smalltalky way, then each class has a protocol of messages (methods) that must be implemented, but I don't get why that's bad.  It's called an API.

I mean HTTP has a small protocol, but if you count all the things you can do with REST, then it looks like a lot more.

Dave
 


Brian L. Stuart

Sep 2, 2009, 3:15:04 PM
> > Q: "Will C continue to be important into the future?"
> > (Dave Kirk, Nvidia)A: "No, I think C will die like
> Fortran has"
>
> let me explain the joke. In HPC circles, people have been
> predicting
> the death of fortran for 30 years. Fortran has continued to
> grow and
> thrive. The predictions continue, but the latest fortran
> standard
> includes objects.
>
> So, what Dave is saying, tongue in cheek, is that C will
> die in the
> way fortran has -- i.e., not at all.

I just hope standards committees don't "enhance" C into
Frankenstein's monster.

That reminds me of another amusing story. It seems that
back in the 70s or 80s someone asked some big name in CS
what people would be programming with in the year 2000.
His response: "I don't know, but it'll be called FORTRAN."

This isn't your father's FORTRAN...

BLS


David Leimbach

Sep 2, 2009, 3:34:52 PM
On Wed, Sep 2, 2009 at 12:11 PM, Brian L. Stuart <blst...@bellsouth.net> wrote:
> > Q: "Will C continue to be important into the future?"
> > (Dave Kirk, Nvidia)A: "No, I think C will die like
> Fortran has"
>
> let me explain the joke. In HPC circles, people have been
> predicting
> the death of fortran for 30 years. Fortran has continued to
> grow and
> thrive. The predictions continue, but the latest fortran
> standard
> includes objects.
>
> So, what Dave is saying, tongue in cheek, is that C will
> die in the
> way fortran has -- i.e., not at all.

I just hope standards committees don't "enhance" C into
Frankenstein's monster.


I actually think they might enhance C in this way in the ISO standard one day.  The only nice bit is that this is like C + a taped-on block "thingy".  You don't have to use it, and your other C is not affected by the change. (I think)

It's not like they're changing the semantics of the ; or anything. (or did they?)
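
For reference, the Apple extension in question (clang's "blocks") looks like ordinary C with one extra piece of syntax bolted on; code that never uses the caret is untouched. A minimal sketch, assuming clang with -fblocks (and, off Apple platforms, a Blocks runtime library):

/* build: clang -fblocks blocks.c   (add -lBlocksRuntime on non-Apple systems) */
#include <stdio.h>

int
main(void)
{
	int base = 40;

	/* a block literal: like a function pointer, except it may also
	   capture variables such as 'base' from the enclosing scope */
	int (^add)(int) = ^(int n){ return base + n; };

	printf("%d\n", add(2));		/* prints 42 */
	return 0;
}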
 

Jonathan Cast

Sep 2, 2009, 3:41:36 PM

*delurk*

That's an excessively strict view. You need *output* for a program to
be useful, but producing that output doesn't need to be intermixed with
the program's algorithm to be useful; you can compute first, then output
the results.

Furthermore, I don't think it's sophistry to say that you don't need
side effects to do output. ALGOL-derived languages use side effects for
output, because (to take C as an example) the type of an expression like

print("Hello, world\n")

is taken to be `int', and thus the `value' of that expression must be
some integer --- say, 13. Then you need to add a concept of `side
effects' to express the fact that there's more going on here than just
calculating the number 13.
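
Concretely, the same thing in standard C with printf standing in for print:
the expression's value really is just the character count (13 for this
string), and the text appearing on the screen is the extra "going on" that
the value by itself does not capture.

#include <stdio.h>

int
main(void)
{
	/* printf's return value is the number of characters written: 13 here */
	int n = printf("Hello, world\n");

	printf("value of the expression: %d\n", n);
	return 0;
}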

Purely functional programming doesn't eschew I/O (although it encourages
a style that separates I/O from algorithms --- as does good programming
style in any language); rather, it re-works the types of the I/O
operations so that, if you have a function

foo :: String -> Int

the value of

foo "Hello, world!\n"

really is just an integer (and there's nothing else going on to
introduce side-effects to talk about). Whereas if you have a function

bar :: String -> IO Int

then the value (as expressed in the language and understood by the
programmer) of

bar "Hello, world!\n"

is a combination of I/O behavior, concurrency, etc. So you don't need
to introduce a concept of `side effects' to talk about those things.

If you were building a denotational semantics for C, this is how you would
deal with I/O. The value (denotation) of a C expression of type int
would be a combination of I/O behavior, assignment behavior, etc., as
well as (possibly, due to the possibility of non-termination) an
integer. Purely functional programming just reduces the degree of
difference between the denotational semantics of the language and the
programmer's mental model of it.

Which is very likely all I have to say on the subject.

jcc

David Leimbach

Sep 2, 2009, 4:05:29 PM
On Wed, Sep 2, 2009 at 12:10 PM, Jonathan Cast <jonath...@fastmail.fm> wrote:
On Wed, 2009-09-02 at 11:27 -0700, David Leimbach wrote:
> On Wed, Sep 2, 2009 at 11:08 AM, Richard Miller <9f...@hamnavoe.com>
> wrote:
>         >>
>         http://graphics.cs.williams.edu/archive/SweeneyHPG2009/TimHPG2009.pdf
>         >>
>         > on p. 43/44 i believe it is claimed that one
>         > cannot do CSP without pure functional
>         > programming.

>         (p ⇒ q) ⇏ (¬p ⇒ ¬q)


> That's interesting because pure functional programming doesn't exist
> at all in the strictest sense on a computer.  One MUST be able to
> cause side effects during computation or your CPU will just get hot...
> if even that.

*delurk*

That's an excessively strict view.  You need *output* for a program to
be useful, but producing that output doesn't need to be intermixed with
the program's algorithm to be useful; you can compute first, then output
the results.

Compute what first?  You compute on input to produce output.  You have no choice, really.  In Haskell the entry point is

main :: IO ().

I rest my case.  

Note that I didn't say "some code can't be pure"; that's for the most part false (some would argue that even floating-point math must be done in an impure way, because one can set up the representation of floats differently, and that changes the purity of what would have been a pure function).  Some code certainly can be executed in a pure sense, but at some point those values came in via a very dirty input process.

The best part about Haskell is you can know by a function's type that no impure actions took place in a subset of your code.  This does not falsify my claim that even pure functional programming languages require impure code.

And if you prefer an appeal to authority over logic, I haven't said anything that Simon Peyton Jones hasn't himself said about Haskell.

Dave

Jonathan Cast

Sep 2, 2009, 4:28:21 PM
On Wed, 2009-09-02 at 13:02 -0700, David Leimbach wrote:


> And if you prefer a plea to authority over logic, I haven't said
> anything that Simon Peyton Jones hasn't himself said about Haskell.

Well, I disagree quite strongly with Simon Peyton Jones about a number
of things. Which I think I indicated by contradicting his stated
positions on several of those points in my original post.

My original message still stands as a reply to the rest of your post, so
I won't repeat it.

jcc

David Leimbach

Sep 2, 2009, 4:49:25 PM
Fair enough! :-)
 




Roman V Shaposhnik

Sep 2, 2009, 7:01:52 PM
On Wed, 2009-09-02 at 12:11 -0700, Brian L. Stuart wrote:
> > > Q: "Will C continue to be important into the future?"
> > > (Dave Kirk, Nvidia)A: "No, I think C will die like
> > Fortran has"
> >
> > let me explain the joke. In HPC circles, people have been
> > predicting
> > the death of fortran for 30 years. Fortran has continued to
> > grow and
> > thrive. The predictions continue, but the latest fortran
> > standard
> > includes objects.
> >
> > So, what Dave is saying, tongue in cheek, is that C will
> > die in the
> > way fortran has -- i.e., not at all.
>
> I just hope standards committees don't "enhance" C into
> Frankenstein's monster.

A friend of mine, who is still serving on the C committee, once
mentioned how lucky they were to have C++ around as a perfect
dumping ground for all the "cool" enhancements that got proposed
along the way.

Thanks,
Roman.

P.S. Another friend of mine still feels sad that Fortress didn't
become that same sort of dumping ground for Fortran ;-)


Greg Comeau

Sep 3, 2009, 5:52:46 AM
In article <3096bd910909020751o120...@mail.gmail.com>,

Rodolfo kix <k...@kix.es> wrote:
>On Wed, Sep 2, 2009 at 4:29 PM, ron minnich<rmin...@gmail.com> wrote:
>> Q: "Will C continue to be important into the future?"
>> (Dave Kirk, Nvidia)A: "No, I think C will die like Fortran has"
>
>I believe OS/2 is destined to be the most important operating system,
>and possibly program, of all time.
>(Bill Gates, OS/2 Programmers Guide, November 1987)
>
>... we are all human ...
>
>:-)

When push comes to shove, these are probably both said in the
same spirit (I doubt Kirk feels C will die, nor Gates that
OS/2 was such (nor that MS products have no bugs))....
--
Greg Comeau / 4.3.10.1 with C++0xisms now in beta!
Comeau C/C++ ONLINE ==> http://www.comeaucomputing.com/tryitout
World Class Compilers: Breathtaking C++, Amazing C99, Fabulous C90.
Comeau C/C++ with Dinkumware's Libraries... Have you tried it?

Greg Comeau

Sep 3, 2009, 5:53:21 AM
In article <1251932394.16...@work.SFBay.Sun.COM>,

Well, this is probably not a good time to mention that lambdas
and closures have been well discussed by the C++ committee, with
lots of draft wording for them in a forthcoming C++ standard.
That may or may not mean the "dump" will make its way back to C.
Coming full circle, if it does, it means Apple's block stuff
will not be compatible, at least not syntactically (at least
not from what I recall of it -- I have not looked at it for a while).

Skip Tavakkolian

Sep 3, 2009, 7:18:53 AM
> When push comes the shove, these are probably both said in the
> same spirit (I doubt Kirk feels C will die, nor Gates that
> OS/2 was such (nor that MS products have no bugs))....

what spirit is that? the one that says "i'm a rational person but
will say irrational things if it helps me sell my wares"?


tlar...@polynum.com

Sep 3, 2009, 8:05:22 AM
On Thu, Sep 03, 2009 at 04:24:50AM -0700, Skip Tavakkolian wrote:
>
> i think by now most of us expect new ornamentation added to C++
> periodically. it is surprising that this is being considered seriously
> for C.
>

I'd like to say that my distaste for C++ is purely technical, but to be
honest, I'm not quite sure.

I have the principle that, since a programming language aims to express
clearly what you want done, if the author doesn't explain his
language clearly, there is a problem.

K&R is beautiful in this respect. In contrast, I never managed to
bite into Stroustrup's description.

But the whole story is that, during my childhood, there were the
Muppets. And one character was a Swedish cook whose name was almost
impossible to pronounce, whose recipes were not understandable, and
whose results were not engaging.

And I fear that, beneath the conscious, this has played a role...
--
Thierry Laronde (Alceste) <tlaronde +AT+ polynum +dot+ com>
http://www.kergis.com/
Key fingerprint = 0FF7 E906 FBAF FE95 FD89 250D 52B1 AE95 6006 F40C

Skip Tavakkolian

Sep 3, 2009, 8:07:48 AM
> Well, this is probably not a good time to mention that lambdas
> and closures have been well discussed by the C++ committee, with
> lots of draft wording for them in a forthcoming C++ standard.

i think by now most of us expect new ornamentation added to C++
periodically. it is surprising that this is being considered seriously
for C.

Brantley Coile

Sep 3, 2009, 8:10:30 AM
If the language can't be explained in 50 pages, it's no good.

Greg Comeau

Sep 3, 2009, 9:59:17 AM
In article <ab7093364284b1ab...@9netics.com>,

That seems to be one valid interpretation :)

Greg Comeau

Sep 3, 2009, 10:03:02 AM
In article <A879919A-6E6F-4425...@coraid.com>,

Brantley Coile <bran...@coraid.com> wrote:
>If the language can't be explained in 50 pages, it's no good.

Well, that rules out C too then! :) (not even considering the library parts)

Greg Comeau

Sep 3, 2009, 10:02:53 AM
In article <2009090312...@polynum.com>, <tlar...@polynum.com> wrote:
>On Thu, Sep 03, 2009 at 04:24:50AM -0700, Skip Tavakkolian wrote:
>> i think by now most of us expect new ornamentation added to C++
>> periodically. it is surprising that this is being considered seriously
>> for C.
>
>I'd like to say that my distate for C++ is purely technical, but to be
>honest, I'm not quite sure.
>
>I have the principle that, since a programming language aims to express
>clearly what you want to be done, if the author doesn't explane clearly
>his language, there is a problem.
>
>K&R is beautiful in this respect. In contrast, I never managed to
>bite in Stroustrup's description.

Ok, now I'll get provocative:
Then why do so many people have a problem understanding C?
Please don't seriously say they don't. In fact, these same
arguments are used against C by those who don't care for C.
Go figure? I think not.

>But the whole story is that, during my childhood, there was the
>Muppet's. And a character was a swedish cook, whose name was almost
>impossible to pronounce, whose recipes were not understandable, and
>whose results were not engaging.
>
>And I fear that, behind the conscious, this has played a role...

That's good, because he's from Denmark :) Let's drop this part of things...

Robert Raschke

Sep 3, 2009, 11:00:06 AM
On Thu, Sep 3, 2009 at 3:02 PM, Greg Comeau <com...@panix.com> wrote:
Ok, now I'll get provocative:
Then why do so many people have a problem understanding C?
Please don't seriously say they don't.  In fact, these same
arguments are used against C by those who don't care for C.
Go figure?  I think not.


I guess you gotta actually say what particular group of the population you are taking your "many people" out of. Programmers? People who work with computers? Artists? Europeans?

I'll assume you mean active programmers. Many of those will have a problem understanding assembly code. Funnily enough many more than those not understanding C will have a problem understanding high level programming languages like R, Haskell, or even Smalltalk.

What were we talking about again ...

Robby

Uriel

Sep 3, 2009, 11:05:18 AM
On Wed, Sep 2, 2009 at 8:46 PM, David Leimbach<lei...@gmail.com> wrote:
> I mean HTTP has a small protocol, but if you count all the things you can do
> with REST, then it looks like a lot more.

HTTP might be many things, small is not one of them. That said, your
overall point is correct.

Peace

uriel

David Leimbach

Sep 3, 2009, 11:06:06 AM
On Thu, Sep 3, 2009 at 2:52 AM, Greg Comeau <com...@panix.com> wrote:
In article <3096bd910909020751o120...@mail.gmail.com>,
Rodolfo kix <k...@kix.es> wrote:
>On Wed, Sep 2, 2009 at 4:29 PM, ron minnich<rmin...@gmail.com> wrote:
>> Q: "Will C continue to be important into the future?"
>> (Dave Kirk, Nvidia)A: "No, I think C will die like Fortran has"
>
>I believe OS/2 is destined to be the most important operating system,
>and possibly program, of all time.
>(Bill Gates, OS/2 Programmers Guide, November 1987)
>
>... we are all human ...
>
>:-)

When push comes the shove, these are probably both said in the
same spirit (I doubt Kirk feels C will die, nor Gates that
OS/2 was such (nor that MS products have no bugs))....

If I recall correctly, conspiracy theorists might even claim that Microsoft was singing the praises of OS/2 while simultaneously putting more effort into NT.  Note that Microsoft was working on OS/2 for IBM at the time, and probably consciously chose to make NT win that battle.  

I'm not saying that's what happened, I'm saying others *have* said so.

Jason Catena

Sep 3, 2009, 11:25:46 AM
> If the language can't be explained in 50 pages, it's no good.

If it's not possible to clearly describe the core of a computer
programming language in fifty pages, then it has probably been
embellished with features, unnecessary to the language proper, to help
it compete in the lame one-size-fits-all strand of programming
language debate. In this respect Perl is a cautionary example, having
no coherent core that I could tell, just a cobbled-together collection
of features intended to try to replace single purpose programs. (But
then again, "those days are dead and gone and the eulogy was delivered
by Perl.")

Jason Catena

David Leimbach

Sep 3, 2009, 11:26:01 AM
Well I meant small compared to all the APIs you can call with REST through it :-)
 



Brian L. Stuart

Sep 3, 2009, 1:44:48 PM
> >K&R is beautiful in this respect. In contrast, I never managed to
> >bite into Stroustrup's description.
>
> Ok, now I'll get provocative:
> Then why do so many people have a problem understanding C?

Are you saying that there is a significant number of
people who understand C++ but not C? The reason I
ask is that it's exactly the other way around for me.
C is a simple enough language that I can understand
it in the sense of knowing what the compiler is doing
with my code. With C++, I have a much harder time
keeping my head around what's being done with the
code, and that makes it much harder for me to understand
the code, much less the language.

BLS


tlar...@polynum.com

Sep 3, 2009, 3:40:30 PM
On Thu, Sep 03, 2009 at 02:02:53PM +0000, Greg Comeau wrote:
> In article <2009090312...@polynum.com>, <tlar...@polynum.com> wrote:
> >I have the principle that, since a programming language aims to express
> >clearly what you want done, if the author doesn't explain his
> >language clearly, there is a problem.
> >
> >K&R is beautiful in this respect. In contrast, I never managed to
> >bite into Stroustrup's description.
>
> Ok, now I'll get provocative:
> Then why do so many people have a problem understanding C?

Either because there are too many people doing programming when they
should not (including me), or because they are trying to learn C from
a book other than K&R's.

C shall be the test. If you don't even understand C, explained by K&R,
then do something else.

Daniel Lyons

Sep 3, 2009, 5:57:32 PM

On Sep 3, 2009, at 1:38 PM, tlar...@polynum.com wrote:

> C shall be the test. If you don't even understand C, explained by K&R,
> then do something else.


I'm glad this attitude exists, particularly here in the Plan 9
community, where it belongs. But I don't agree. There are many
languages because there are many ways of thinking about programming.
Most people don't get introduced to the right one on the first try.
Obviously C programmers have unique talents, but I prefer to think the
world has a shortage of good programmers, not a glut of bad ones. Few
got to excellence by birth alone.


Daniel Lyons


Tharaneedharan Vilwanathan

Sep 3, 2009, 6:37:26 PM
> C shall be the test. If you don't even understand C, explained by K&R,
> then do something else.
i agree with this completely. after taking a formal course in computers,
if someone cannot read/follow the K&R C book and cannot write C code, i
would think that the candidate is not good enough.

thanks
dharani

Greg Comeau

Sep 4, 2009, 5:03:42 AM
In article <561059....@web83913.mail.sp1.yahoo.com>,

Brian L. Stuart <blst...@bellsouth.net> wrote:

I wasn't saying anything, I was asking a question. :)
But if I were to say something, it would include all you
just said and more. Focusing slightly, most people do have
a problem understanding/using/whatevering C, from both a
high-level and a low-level perspective. Focusing even more,
most people don't know what the compiler is doing with their C
code, even though, say, C++ usually gets the short end of the
stick on this one (deservedly so, but C ain't 0&, far from
it IMO).

Greg Comeau

Sep 4, 2009, 5:04:17 AM
In article <d50d7d460909030813u703...@mail.gmail.com>,

Jason Catena <jason....@gmail.com> wrote:
>> If the language can't be explained in 50 pages, it's no good.
>
>If it's not possible to clearly describe the core of a computer
>programming language in fifty pages, then it has probably been
>embellished with features, unnecessary to the language proper, to help
>it compete in the lame one-size-fits-all strand of programming
>language debate.

As mentioned, then, that includes C too. For that matter,
a whole pack of stuff. So, I can't imagine that's really
the point being brought forth.

Greg Comeau

Sep 4, 2009, 5:04:02 AM
In article <6a3ae47e0909030757n31a...@mail.gmail.com>,

Robert Raschke <rtrl...@googlemail.com> wrote:
>On Thu, Sep 3, 2009 at 3:02 PM, Greg Comeau <com...@panix.com> wrote:
>> Ok, now I'll get provocative:
>> Then why do so many people have a problem understanding C?
>> Please don't seriously say they don't. In fact, these same
>> arguments are used against C by those who don't care for C.
>> Go figure? I think not.
>I guess you gotta actually say what particular group of the population you
>are taking your "many people" out of. Programmers? People who work with
>computers? Artists? Europeans?
>
>I'll assume you mean active programmers.

Right, not the general population, but programmers, or, at least
those claiming to be such.

>Many of those will have a problem
>understanding assembly code. Funnily enough many more than those not
>understanding C will have a problem understanding high level programming
>languages like R, Haskell, or even Smalltalk.

Probably so.

>What were we talking about again ...

Something about C being beautifully explained, or something like that.

Greg Comeau

Sep 4, 2009, 5:15:46 AM
In article <2009090319...@polynum.com>, <tlar...@polynum.com> wrote:
>On Thu, Sep 03, 2009 at 02:02:53PM +0000, Greg Comeau wrote:
>> In article <2009090312...@polynum.com>, <tlar...@polynum.com> wrote:
> >I have the principle that, since a programming language aims to express
> >clearly what you want done, if the author doesn't explain his
> >language clearly, there is a problem.
> >
> >K&R is beautiful in this respect. In contrast, I never managed to
> >bite into Stroustrup's description.
>>
>> Ok, now I'll get provocative:
>> Then why do so many people have a problem understanding C?
>
>Either because there are too many people doing programming when they
>should not (including me), or because they are trying to learn C from
>a book other than K&R's.
>
>C shall be the test. If you don't even understand C, explained by K&R,
>then do something else.

Personally, so IMO, from a few perspectives, I have found that to
be categorically false.

Brian L. Stuart

Sep 4, 2009, 1:51:54 PM
> >> >K&R is beautiful in this respect. In contrast, I never managed to
> >> >bite into Stroustrup's description.
> >>
> >> Ok, now I'll get provocative:
> >> Then why do so many people have a problem understanding C?
> >
> >Are you saying that there is a significant number of
> >people who understand C++ but not C?  The reason
>
> I wasn't saying anything, I was asking a question. :)

Ah, I misunderstood. The question about why people don't
understand C on the heels of a reference to Stroustrup
led me to think it was a suggestion that C++ was easier to
understand than C. Of course, I may be a little too
sensitive to such a claim, because of what I've been
hearing in the academic community for a while. Some
keep saying that we should use more complex languages
in the introductory course because they're in some way
easier. But I've yet to understand their definition
of easier.*

BLS

*Well, actually I do kind of realize they are suggesting
that a tinkertoy approach makes it easier for a beginner
to see something happen. The problem I have is that's
not the point of teaching that material. Just getting
something to happen might be training, but it sure isn't
education.


Jack Norton

Sep 4, 2009, 2:05:45 PM
Brian L. Stuart wrote:
> Just getting something to happen might be training, but it sure isn't
> education.
>
That's the best one-liner I have ever heard on the subject.

-Jack

Eris Discordia

Sep 4, 2009, 4:24:04 PM
Caveat: please add IMH(UI)O in front of any assertion that comes below.

Since education was brought up: I remember I found it seriously twisted
when I was told mathematics freshmen in a top-notch university not
(geographically) far from me are taught not one but two courses in computer
programming... in Java.

Being the hobbyist (as contrasted to the professional) here, and the one
who's got the smaller cut out of the intelligence cake, I am fairly sure C
was a lot easier to learn and comprehend than either Pascal--all the kids
were into "Pascal or C? That's the problem" back then--or C++ or even the
mess of a language called GW-BASIC (which I learnt as a kid, before I knew
C, so it too could be learnt by kids). Even if Pascal got all the buzz
about being a "teaching language."

What seems to distinguish--pedagogically, at least--C is, as I noted on
that other thread, its closeness to how the small computer, not the actual
small computer but the mental model of a small computer, works. Pointers?
They're just references to "pigeonholes" in a row of such holes. Scope?
It's just how long your variables are remembered. Invocation? Just a way to
regurgitate your own cooking. If one has to solve a problem, implement an
algorithm, on a small computer one needs to be able to explain it in terms
of the primitives available on that computer. That's where C shines.
There's a close connection between language primitives and the primitives
of the underlying computer. I'm not saying this is something magically
featuring in C--it's a property that _had_ to feature in some language some
time, C became that. In a different time and place, on different machines,
another language would/will be that (and it shall be called C ;-))

I whined about LISP on yet another thread. Above says precisely why I did.
LISP is twofold hurtful for me as a naive, below average hobbyist. For one
thing the language constructs do not reflect the small computer primitives
I was taught somewhere around the beginning of my education. For another,
most (simple) problems I have had to deal with are far better expressible
in terms of those very primitives. In other words, for a person of my (low)
caliber, LISP is neither suited to the family of problems I encounter nor
suited to the machines I solve them on. Its claim to fame as the language
for "wizards" remains. Although, mind you, the AI paradigm LISP used to
represent is long deprecated (Rodney Brooks gives a good overview of this
deprecation, although not specifically targeting LISP, in "Cambrian
Intelligence: The Early History of the New AI"). One serious question today
would be: what's LISP _really_ good for? That it represents a specific
programming paradigm is not enough justification. Ideally, a language
should represent a specific application area, as does C, i.e.
general-purpose system and (low-level) application programming.

A favorite quote out of my favorite physics textbook:

> Further progress lies in the direction of making our equations invariant
> under wider and still wider transformations. This state of affairs is
> very satisfactory from a philosophical point of view, as implying an
> increasing recognition of the part played by the observer in himself
> introducing the regularities that appear in his observations, and a lack
> of arbitrariness in the ways of nature, but it makes things less easy for
> the learner of physics.

-- P. A. M. Dirac, The Principles of Quantum Mechanics

Unlike physical phenomena languages (natural or artificial) are subject to
constraints that act (in comparison) very slowly and very leniently.
There's a great deal of arbitrariness in how a computer language might
look. It is epistemologically, aesthetically, and pragmatically
advantageous to "remove arbitrariness" by fitting a language to either its
target platform or its target problem, preferably both. C did and continues
to do so, LISP doesn't (not anymore, to say the least).


P.S. UI stands for "uninformed."


Daniel Lyons

Sep 4, 2009, 5:41:14 PM
Let me be a little pedantic.

On Sep 4, 2009, at 2:18 PM, Eris Discordia wrote:
> Above says precisely why I did. LISP is twofold hurtful for me as a
> naive, below average hobbyist.

FYI, it's been Lisp for a while.

> For one thing the language constructs do not reflect the small
> computer primitives I was taught somewhere around the beginning of
> my education.

Like what? The if statement, which was invented by Lisp? The loop
statement, for expressing loops? It sounds like you got a dose of
Scheme rather than Lisp to me.

> For another, most (simple) problems I have had to deal with are far
> better expressible in terms of those very primitives. In other
> words, for a person of my (low) caliber, LISP is neither suited to
> the family of problems I encounter nor suited to the machines I
> solve them on.

This hasn't been true for a while. Common Lisp is a general purpose
language like any other. The only thing I have ever found obnoxious
about CL was the filesystem API. Most CL implementations are compilers
these days and they produce surprisingly efficient machine code. The
Scheme situation is more diverse but you can definitely find
performance, if that's what you're alluding to.

> Its claim to fame as the language for "wizards" remains.

I think this has more to do with Lisp users being assholes than
anything intrinsic about Lisp. This is one of the nice things about
Clojure. It's a break from tradition in this regard, as well as many
others.

> Although, mind you, the AI paradigm LISP used to represent is long
> deprecated (Rodney Brooks gives a good overview of this deprecation,
> although not specifically targeting LISP, in "Cambrian Intelligence:
> The Early History of the New AI"). One serious question today would
> be: what's LISP _really_ good for? That it represents a specific
> programming paradigm is not enough justification. Ideally, a
> language should represent a specific application area, as does C,
> i.e. general-purpose system and (low-level) application programming.


It's as though you have the up-to-date negative propaganda, but not
the up-to-date facts. Lisp is "really good for" the same kinds of
things other general purpose languages are good for. The main benefits
it had in AI were features that came from garbage collection and
interactive development. You get those benefits today with lots of
systems, but that doesn't mean they aren't still there in Lisp. An
advantage it has these days is that it produces code that performs
better than, say, Python or Perl. I definitely would not call being a
"general purpose system" and suitability for "application programming"
a "specific application area." This is like saying agglutinative
languages are worse for conquering the world with than isolating
languages because the Ottoman empire fell before the English empire.

Please don't interpret this as "Lisp kicks C's ass." I'm saying,
you're only seeing the negative half of the situation, and seeing too
much causality. I think it's mostly happenstance. Lots of languages
succeed despite having a killer app or app area. Python's a good
example. Isolating the exact ingredients for the success of any
language is probably impossible. I'd say only with C is it really
clear what led to success, and it wasn't exclusively features of the
language itself (though it was a part of it), but also that it came
with Unix along with the source code. If the quacks had chosen C
instead of Lisp for their "AI research" perhaps C would have taken a
big hit during the so-called AI winter instead of Lisp. Perhaps if the
Lisp machine vendors hadn't misunderstood basic economics so
thoroughly, their machines would have become more common and taken
Lisp with them the way Unix brought C. There are simply too many
variables to lay the blame at Lisp's alleged functional basis.
Especially today when languages like Haskell exist that take
functional so much further they make Lisp look like a procedural
language by comparison.


Daniel Lyons


Jason Catena

Sep 4, 2009, 6:41:24 PM
Hailed Eris:

> One serious question today would be: what's LISP _really_ good for?

It's not LISP, but I've found Haskell good for writing terse code that
works. Once you get your code past the type checker, it's likely to
just work for the foreseeable future if it's pure. Most tricky code
ends up pure, since the transforms are usually the more extensive,
interesting, and clever (ie difficult to debug) part of a (especially
pipeline-based) program.

I don't really care that a language is or is not close to the machine,
if the compiler (ie GHC) gets it in the same order of magnitude
runtime as C. In fact, I'd rather manipulate lists with higher-order
functions, and just get the job done, than hack around with this
year's new idioms to make C all things to all people. Best tool for
the job and all that: C has a great niche as an OS language, but
sometimes it's better just to write less, more stable code (eg xmonad
vs any C-based window manager).

Jason Catena

andrey mirtchovski

Sep 4, 2009, 6:54:30 PM
> This is like saying
> agglutinative languages are worse for conquering the world with than
> isolating languages because the Ottoman empire fell before the English
> empire.

I wish there was a way to record this for the next generation. Perhaps
in a list of worthy sayings and fortune cookies we could store
together with the rest of the system?

I must now find a way to somehow apply this simile in casual conversation.

Richard Miller

Sep 5, 2009, 7:08:37 AM
>> One serious question today would be: what's LISP _really_ good for?

http://www.paulgraham.com/avg.html


Akshat Kumar

Sep 5, 2009, 7:26:15 AM
> http://www.paulgraham.com/avg.html
"Programming languages are just tools, after all."

Considering that Plan 9 has only two inherent languages,
and its users often push for work to be done in only those,
what is the Plan 9 perspective of languages and tools in
relation to each other?
Is it in agreement with this statement?


ak

tlar...@polynum.com

Sep 5, 2009, 8:14:46 AM
On Sat, Sep 05, 2009 at 07:22:37AM -0400, Akshat Kumar wrote:
> "Programming languages are just tools, after all."
>
> Considering that Plan 9 has only two inherent languages,
> and its users often push for work to be done in only those,
> what is the Plan 9 perspective of languages and tools in
> relation to each other?

I don't know for "the Plan 9 perspective" and have no authority to talk
"for Plan 9", but since almost all interpreters or compilers are written
in C, whether completely or the bootstapping procedure (a C core that is
able to interpret a subset of the language to generate a binary for the
non optimized version of a complete compiler etc.), there are all the
tools as long as there is a C compiler for the machine.

The rest is, IMHO, user stuff: one has all the tools needed to
customize.

The only "lack" in C is perhaps fully defined control over
arithmetic/calculus. That's probably why FORTRAN is still here and
still has its strength in this area.

Just my 2 centimes,

Anthony Sorace

Sep 5, 2009, 9:42:07 AM
Akshat said:

// Considering that Plan 9 has only two inherent languages...

I'm curious which two you meant. Most of the code running on my Plan 9
installations is written in either C or rc. For code I've written running on it,
Limbo is about as high. And of course there's a little assembly down deep.
And a bunch of awk and mk, obviously. And acid is invaluable for the set
of tasks for which it was designed.

I also don't really know what "inherent" means. "Thing which generates
machine code directly"? Or maybe "compiler/interpreter included in the
distribution"? That's closest, I guess.

// ...and its users often push for work to be done in only those...

Simply disagree. Good Unix (and, by extension here, Plan 9) folks tend to
be fond of "little languages" - they coined the term, after all. I think, in
that sense, I'd be very surprised to find many Plan 9 folks argue against
using the right tool (language) for the job.

What I think you might be thinking of is that Plan 9 folks are a little more
conservative in their selection of languages. You're not likely to see much
perl here, because overall people aren't really convinced it offers anything
over awk, maybe awk+rc. You're not likely to see much sh because we've
got rc. Just because a tool exists doesn't mean it's the right tool for
anything.

This has its costs, mainly in application support. We might not like C++
because we don't see much advantage over C, and we might be right, but
that doesn't change the fact that we've now got a higher barrier between
us and application authors that made a different decision. That's often a
good thing (less crap), but it does hurt us in places, too.

Eris Discordia

Sep 5, 2009, 10:19:49 AM
> Let me be a little pedantic.

The 9fans know given the haphazard nature of a hobbyist's knowledge I am
extremely bad at this, but then let me give it a try.

> FYI, it's been Lisp for a while.

As long as Britannica and Merriam-Webster call it LISP I don't think
calling it LISP would be strictly wrong. Has LISt Processing become
stigmatic in the Lisp/LISP community?

> Like what? The if statement, which was invented by Lisp? The loop
> statement, for expressing loops? It sounds like you got a dose of Scheme
> rather than Lisp to me.

I just read in Wikipedia that, "Lisp's original conditional operator, cond,
is the precursor to later if-then-else structures," without any citations.
Assuming that to be true conditional branching is a fundamental element of
control flow and it has existed in machine languages ever since early days.
There's really very little to brag about it.

Regardless, I offer the following comparison:

> 19.2. How to Use Defstruct
<http://www.cs.cmu.edu/Groups/AI/html/cltl/clm/node170.html>

> Struct (C programming language)
<http://en.wikipedia.org/wiki/Struct_(C_programming_language)>

In the (small mind?) mental model of a small computer there's the row of
pigeonholes and the stencil you may slide along the row for "structured"
access to its contents. I leave it to you to decide which of the above
better corresponds to that. My opinion you already know.
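
A minimal C sketch of that mental model (illustrative only): the struct
declaration is the stencil, and offsetof shows where each pigeonhole sits
within it.

#include <stdio.h>
#include <stddef.h>

/* the stencil: a fixed layout laid over a span of memory */
struct point {
	int x;
	int y;
};

int
main(void)
{
	struct point p = { 3, 4 };

	printf("x lives at offset %zu, y at offset %zu\n",
	    offsetof(struct point, x), offsetof(struct point, y));
	printf("p = (%d, %d)\n", p.x, p.y);
	return 0;
}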

Indeed, my only encounter with LISP has been with Scheme, through a failed
attempt to read SICP.

> This hasn't been true for a while. Common Lisp is a general purpose
> language like any other. The only thing I have ever found obnoxious about
> CL was the filesystem API. Most CL implementations are compilers these
> days and they produce surprisingly efficient machine code. The Scheme
> situation is more diverse but you can definitely find performance if
> that's what you're eluding to.

I was alluding to the expressive power of C versus LISP considered with
respect to the primitives available on one's computing platform and
primitives in which solutions to one's problems are best expressed. It
isn't a matter of whether the language you use is supplemented by good
libraries or how fast the binary image you produce can run as I have little
doubt out there exist lightning fast implementations of complex algorithms
in LISP. I was trying to give my personal example for why I managed to
learn C and failed to learn LISP.

If you have a scrawny x86 on your desktop and are trying to implement, say,
a bubble sort--yes, the notorious bubble sort, it's still the first thing
that comes to a learner's mind--it seems C is quite apt for expressing your
(embarrassing) solution in terms of what is available on your platform.
Loops, arrays, swapping, with _minimal_ syntactic distraction. Simple,
naive algorithms should end up in simple, immediately readable (and
refutable) code. Compare two implementations and decide for yourself:

<http://en.literateprograms.org/Bubble_sort_(Lisp)>
<http://en.literateprograms.org/Bubble_sort_(C)>
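
For the C side of that comparison, a minimal sketch in the spirit described
above: loops, array indexing, a swap, and nothing else.

#include <stdio.h>

/* bubble sort: repeatedly swap adjacent out-of-order elements */
void
bubblesort(int a[], int n)
{
	int i, j, t;

	for(i = 0; i < n - 1; i++)
		for(j = 0; j < n - 1 - i; j++)
			if(a[j] > a[j + 1]){
				t = a[j];
				a[j] = a[j + 1];
				a[j + 1] = t;
			}
}

int
main(void)
{
	int a[] = { 5, 1, 4, 2, 8 };
	int i, n = sizeof a / sizeof a[0];

	bubblesort(a, n);
	for(i = 0; i < n; i++)
		printf("%d ", a[i]);
	printf("\n");
	return 0;
}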

>> Its claim to fame as the language for "wizards" remains.
>
> I think this has more to do with Lisp users being assholes than anything
> intrinsic about Lisp. This is one of the nice things about Clojure. It's
> a break from tradition in this regard, as well as many others.

I really did mean "wizards" by "wizards." I intended no insult--merely sort
of an awed jealousy.

> It's as though you have the up-to-date negative propaganda, but not the
> up-to-date facts.

Of course. Propaganda has a wider outreach than facts, particularly when
for every textbook on a subject there are, I don't know, ten (or more?) on
the competing subject.

> The main benefits it had in AI were features that came from garbage
> collection and interactive development.

More importantly, LISt Processing used to be an element of the expert
systems approach to AI, an approach which is now defunct (as a way of making
machines intelligent, whatever that means). While "expert systems" continue
to exist, the term causes enough reverb of failure to be replaced by other
buzzwords: knowledge-based systems, automated knowledge bases, and whatnot.

I think, and may be dead wrong, that LISP's ominous appearance came from
adhering to an AI paradigm. Now that the paradigm is no longer viable, why
should the appearance persist?

> An advantage it has these days is that it produces code that performs
> better than, say, Python or Perl.

I cannot comment on this. Have no knowledge of Python and beg to disagree
about Perl. The entry barrier for learning Perl was low enough for me to
learn and use it, unlike LISP.

> I definitely would not call being a "general purpose system" and
> suitability for "application programming" a "specific application area."

Well, for one thing I believe you have misread me. I said C was a
general-purpose language good for "system programming"--you seem to call
that "being a good OS language"-- and low-level application programming. I
probably should have taken more care and written the precise term: systems
programming.

> This is like saying agglutinative languages are worse for conquering the
> world with than isolating languages because the Ottoman empire fell
> before the English empire.

Correlation doesn't imply causation--that's true. But there _are_ ways to
ascertain a correlation is due to a causal relationship. One such way is to
identify known causes of success or failure. _If_ one claims a language
costs more to learn and rewards similarly or even less than another
language one already has identified a known cause of failure. If failure
does occur, causation by the language itself, rather than its surrounding
elements (marketers, users, designers, climate, serendipity), cannot be
ruled out.

> I think it's mostly happenstance. Lots of languages succeed despite
> having a killer app or app area. Python's a good example.

Despite _not_ having those, you mean, right? I think it's too early to talk
about Python's success. It has barely lived half as long as C and one-third
as long as LISP. If you're really going to call Python successful I don't
know how you're going to describe Java.

> Please don't interpret this as "Lisp kicks C's ass."

I don't, and I certainly wasn't implying "C kicks LISP's ass." I don't
qualify for that sort of assertion.

> There are simply too many variables to lay the blame at Lisp's alleged
> functional basis.

That's a very good point. I did say "LISP represents a programming
paradigm" but I don't think its (perceived?) failure has to do with the
paradigm itself, rather with whether mere mortals can find application
areas where the cost of assimilating that paradigm (and therefore learning
the language) is justified by measurable gains.



Eris Discordia

Sep 5, 2009, 10:32:27 AM
>>> One serious question today would be: what's LISP _really_ good for?
>
> http://www.paulgraham.com/avg.html

I could do a similar thing:

<http://www.schnada.de/quotes/contempt.html#struetics>

... and leave you wondering (or not). I won't.

Paul Graham's essay/article consists of a success story, _his_ success
story (which, in minor part, depends on continued sales of his two LISP
books), and a variety of claims I am unqualified to verify or refute. What
is there for me to learn? That there exists/existed one successful LISP
application? Is that really what I had tried to negate?

Besides, if quoting ESR were a measure of credibility I'd be given some
when I appeared to 9fans out of the blue and quoted him saying something to
the effect that Plan 9 is dead and buried because it wasn't up to replacing
UNIX (at the moment, that is _not_ my opinion).


Eris Discordia

Sep 5, 2009, 10:38:43 AM
I forgot this: Graham basically accuses programmers who don't find LISP as
attractive (or powerful, as he puts it) as he does of living on lower
planes of existence from which the "heavens above" of functional (or only
LISP) programming seem incomprehensible. He writes/speaks persuasively,
he's a successful businessman, but is he also an honest debater?


John Floren

Sep 5, 2009, 10:40:24 AM
On Sat, Sep 5, 2009 at 7:27 AM, Eris Discordia<eris.di...@gmail.com> wrote:
>>>> One serious question today would be: what's LISP _really_ good for?
>>
>> http://www.paulgraham.com/avg.html
>
> I could do a similar thing:
>
> <http://www.schnada.de/quotes/contempt.html#struetics>
>
> ... and leave you wondering (or not). I won't.
>

Oh, yay, a Xah Lee quote, he's surely a trusted source on all things
Lisp. Didja read his page about hiring a prostitute in Las Vegas? Or
the one about how he lives in a car in the Bay Area because he's too
crazy to get hired?


John
--
"Object-oriented design is the roman numerals of computing" -- Rob Pike

Eris Discordia

Sep 5, 2009, 10:41:32 AM
> general-purpose language good for "system programming"--you seem to call
> that "being a good OS language"--

I take this part back. I mixed your post with Jason Catena's for a moment.


Eris Discordia

Sep 5, 2009, 10:57:24 AM
> Oh, yay, a Xah Lee quote, he's surely a trusted source on all things
> Lisp. Didja read his page about hiring a prostitute in Las Vegas? Or
> the one about how he lives in a car in the Bay Area because he's too
> crazy to get hired?

Patience, brother. Search "Paul Graham" on that page and let your mind do
the free association. And I did say it was about wondering, didn't I?


erik quanstrom

Sep 5, 2009, 2:34:30 PM
i'm not a lisp fan. but it's discouraging to see
such lack of substance as the following (collected
from a few posts):

> Oh, yay, a Xah Lee quote, he's surely a trusted source on all things
> Lisp. Didja read his page about hiring a prostitute in Las Vegas? Or
> the one about how he lives in a car in the Bay Area because he's too
> crazy to get hired?

surely an ad hominem attack like this neither furthers an
argument nor informs anyone.

> I forgot this: Graham basically accuses programmers who don't find LISP as
> attractive (or powerful, as he puts it) as he does of living on lower
> planes of existence from which the "heavens above" of functional (or only
> LISP) programming seem incomprehensible. He writes/speaks persuasively,
> he's a successful businessman, but is he also an honest debater?

and here i don't see an argument at all.

> I just read in Wikipedia that, "Lisp's original conditional operator, cond,
> is the precursor to later if-then-else structures," without any citations.
> Assuming that to be true conditional branching is a fundamental element of
> control flow and it has existed in machine languages ever since early days.
> There's really very little to brag about it.

i'd love to argue this factually, but my knowledge isn't
that extensive. i think you'll find in the wiki entry for
Computer that much of what we take for granted today
was not obvious at the time. stored program computers
with branching didn't come along until about 1948
(eniac). i hope someone will fill in the gaps here.
i think it's worth appreciating how great these early
discoveries were.

in the same vein, i don't know anything much about file
systems that i didn't steal from ken thompson.

- erik

Daniel Lyons

Sep 5, 2009, 3:34:40 PM
Eris,

Using your theories, please explain why Lisp and Plan 9 both hover
around the same level of popularity (i.e., not very, but not dead
either).


Daniel Lyons


Eris Discordia

Sep 5, 2009, 7:54:54 PM
> Using your theories, please explain why Lisp and Plan 9 both hover around
> the same level of popularity (i.e., not very, but not dead either).

I don't think I can say anything in that respect that cannot either be
easily refuted or greatly improved upon by someone already reading this
list and just too busy with their own stuff to post. Some of them
explicitly avoid feeding the troll (that I be, supposedly).

Anyway, here's what I think: Plan 9 and LISP are different, evolutionarily.
LISP seems to me like a downsized reptile that has survived and been forced
to exist in the shadow of mammals after the Mesozoic while Plan 9 looks
more like a lemur. A rather recently developed mammal driven into a small
area by its close kin from a common ancestor.

And one primary note: I have come to understand, in part thanks to this
very list, that popularity isn't really a good measure of merit for
computer stuff but you asked about popularity so I'll try to focus on that.
(Case in point, there's a lot I read about on this list that I don't think
I'd hear about in a lifetime, and this isn't a popular list.)

**********

LISP evolved on a path parallel to the line of languages that descended
from ALGOL. It represented/represents a programming paradigm--whose
significance is beyond me but visible to CS people--and it used to also
embody an application area. That application area, at the time, overlapped
with the ambitions of some of the best experts in computation. LISP gained
momentum, became an academic staple, was the pride and joy of the world's
best CS/CE departments. The application area got hit, but the programming
paradigm remained as strong as before.

The paradigm has scientific value--which is again beyond me but I trust CS
people on that--so it continues to be taught at the world's best CS/CE
departments and to up-and-coming programmers and future computer
scientists. SICP is witness to that. In the academy, LISP will live on as
long as the paradigm it's attached to lives on and is deemed significant.
Those same people who are educated in some dialect of LISP, as well as
other languages, found businesses and apply their knowledge; occasionally,
by way of their training in LISP. For whatever reason they see merit in it
that many self-educated programmers or those trained at lesser institutions
don't. Obviously, there aren't that many top CS/CE departments and those
with founder status or strongly influenced by founder institutions are
still fewer. Hence, LISP's living dead state: "popularity" among the elite.
Mind you, the natural divide between the two groups can sometimes be a
cause of resentment and get non-LISP people badmouthing it.

**********

Plan 9, on the other hand, was supposed to be a drop-in successor to
UNIX--a natural step forward. It was supposed to satisfy long-time UNIX
users by deceiving them with a similar-looking toolset while implementing a
large change of philosophy whose impact would only become clear after
(previous) UNIX users had already settled in. The factors that kept it from
actually replacing UNIX everywhere are many.

One factor was timing. It reached various tiers of "ignorant masses" when
not one but multiple possible continuations of UNIX, all of them FOSS, had
already gained foothold (GNU/Linux and *BSD).

The other factor was its overly complex arrangement compared to the mundane
purposes of lowly creatures more or less like me. I have tried arguing why
Plan 9 as it is is a hassle on desktop systems and have been met with
criticism that mostly targeted my lack of computer aptitude in general
rather than my argument. I stressed what I termed "conceptual complexity"
of Plan 9's model of how things should be and the lack of _any_ user
friendly, albeit sane, abstraction on top of that complexity.

A third, more important, factor is that it was advocated to people who
probably couldn't understand how Plan 9 would serve them better than things
they heard of more regularly, or where this new thing's edge lay that
justified the cost of its adoption. I for one am still at a loss on that
matter. As a hobbyist, I lurk, and occasionally--they say--troll, around
here but I'm not keeping my huge media collection on a Plan 9 installation
or using Acme for entering multi-lingual (up to three languages until a
while ago, four recently) text. Either task would be extremely cumbersome
to do on Plan 9 (and this really has little to do with the OS itself). In
short, I won't be doing Plan 9 because it's Plan 9. I, and most of the
lowly ones, need further justification that either hasn't been presented or
is way above my, or our, head.

The fourth factor I can think of is Plan 9's owners' attitude towards it. I
once dared go as far as saying it was actually "jettisoned." For reasons
that are beyond me Plan 9 isn't seeing much attention from Bell Labs or its
creators. It currently seems to lack the Benevolent Dictator for Life
figure many FOSS projects have. The overall air around it is one of
dereliction even if it is in fact being actively worked on behind the
scenes. Whether this is desired is again beyond me.

As a final note I think I should draw your attention to Linux's and *BSD's
path of ascendancy. All of these OSs seem to have consecutively attracted
distinct groups of users: serious programmers/contributors/researchers,
startups, the pleb (that's my kind), and corporate users--in that specific
order. Plan 9 seems to have stopped at stage 2 (startups), or maybe it's
just progressing and I am unaware of the progress. Regardless, attracting
the pleb seems to be the key to entering the corporate user market and
widespread popularity, i.e. the stage where corporations, hoping to win the
pleb or higher pleb (industries and businesses), are willing to sponsor
(read: bribe) universities, students, and their own R&D departments in
teaching, learning, and furthering the new thing's cause.

**********

I must stress again these are all my impressions; hallucinations, if you
will.


--On Saturday, September 05, 2009 13:30 -0600 Daniel Lyons

Eris Discordia

unread,
Sep 5, 2009, 8:10:57 PM9/5/09
to
>> I forgot this: Graham basically accuses programmers who don't find LISP
>> as attractive (or powerful, as he puts it) as he does of living on
>> lower planes of existence from which the "heavens above" of functional
>> (or only LISP) programming seem incomprehensible. He writes/speaks
>> persuasively, he's a successful businessman, but is he also an honest
>> debater?
>
> and here i don't see an argument at all.

I was trying to say the same thing about Paul Graham's view of people who
don't like, or "grok," LISP. That he doesn't argue the point--he presents
it as a fact.

> i'd love to argue this factually, but my knowledge isn't
> that extensive. i think you'll find in the wiki entry for
> Computer that much of what we take for granted today
> was not obvious at the time. stored program computers
> with branching didn't come along until about 1948
> (einiac). i hope someone will fill in the gaps here.
> i think it's worth appreciating how great these early
> discoveries were.

I agree with your point about the non-triviality of much about computers
that's considered trivial today. However, I happened to consult this book
a couple of years ago:

<http://books.google.com/books?id=nDWPW9uwZPAC&dq=%22the+first+computers%22&printsec=frontcover&source=bl&ots=Z_Kegt6Rwn&sig=zrlQEVkK8z7fAmBtXsW2lx754Zo&hl=en&ei=uPqiSsDVKY-GmwPkncjAAw&sa=X&oi=book_result&ct=result&resnum=7#v=onepage&q=branching&f=false>

(This is a Google Books search inside the book for the term "conditional
branching.")

I wasn't, in this case at least, implying something not backed by firm
evidence. Conditional branching embodied in actual computers goes back to
Plankalkuel on Z3. The idea is as old as Babbage. It comes as natural
even to first-timers, once the much more difficult notion of control
flow has been conceived, that there must be a manner of conditionally
passing it around.

--On Saturday, September 05, 2009 14:26 -0400 erik quanstrom

erik quanstrom

unread,
Sep 5, 2009, 8:22:04 PM9/5/09
to
> I wasn't, in this case at least, implying something not backed by firm
> evidence. Conditional branching embodied in actual computers goes back to
> Plankalkuel on Z3. The idea is as old as Babbage. It comes as natural
> even to first-timers, once the much more difficult notion of control
> flow has been conceived, that there must be a manner of conditionally
> passing it around.

so you're saying that the table in this section is wrong?

http://en.wikipedia.org/wiki/Computer#History_of_computing

if it is and you can back it up, i suggest you fix wikipedia.

- erik

Eris Discordia

unread,
Sep 5, 2009, 8:42:44 PM9/5/09
to
> so you're saying that the table in this section is wrong?
>
> http://en.wikipedia.org/wiki/Computer#History_of_computing
>
> if it is and you can back it up, i sugeest you fix wikipedia.

It isn't wrong.

The exact wording from "The First Computers: History and Architectures"
goes:

> The instruction most conspicuously absent from the instruction set of the
> Z3 is conditional branching. [...] but there is no straightforward way to
> implement conditional sequences of instructions. However, we will show
> later that conditional branching can be simulated on this machine.

On the other hand, Wikipedia's article on Plankalkuel says:

> Plankalkül drew comparisons to APL and relational algebra. It includes
> assignment statements, subroutines, conditional statements, iteration,
> floating point arithmetic, arrays, hierarchical record structures,
> assertions, exception handling, and other advanced features such as
> goal-directed execution.

-- <http://en.wikipedia.org/wiki/Plankalkuel>

In other words, both statements are correct. Z3 did not have conditional
branching (given the type of store it used, it would have been too hard);
however, Plankalkuel did provide for conditionals, invocation, and
subroutines--all that is necessary to implement conditional branching.

--On Saturday, September 05, 2009 20:17 -0400 erik quanstrom

erik quanstrom

unread,
Sep 5, 2009, 9:00:59 PM9/5/09
to
> > The instruction most conspicuously absent from the instruction set of the
> > Z3 is conditional branching. [...] but there is no straightforward way to
> > implement conditional sequences of instructions. However, we will show
> > later that conditional branching can be simulated on this machine.

i think your reasoning is going backwards in time. the fact that
a historian later can note that they *could* have had conditional
branching, if they'd thought of it, further bolsters my position
that it is not immediately obvious that conditional branching
is what you want.

in fact, none of the things we take for granted --- e.g., binary,
digital, stack-based, etc. --- were immediately obvious. and it
might be that we've got these things that we "know" wrong yet.

i would imagine that in 30 years there will be several "obvious"
things about quantum computers that nobody's thought of
yet.

- erik

Jason Catena

unread,
Sep 5, 2009, 10:03:20 PM9/5/09
to
Hailed Eris:

> I was alluding to the expressive power of C versus LISP considered with
> respect to the primitives available on one's computing platform and
> primitives in which solutions to one's problems are best expressed.

I think one of the reasons there exist little languages, and cliches
such as "the right tool for the job", is that the "primitives
available on one's computing platform" are very often not the
"primitives in which solutions to one's problems are best expressed."

In this respect rating the "expressive power of C versus LISP" depends
very much on the problem domain under discussion.

For "systems programming", C has the advantage in both practice (use
in running systems) and theory: processes which take a lot of system
resources to execute also tend to take a lot of C code, whereas in
most higher-order languages, things which represent high-level
runtime-consuming abstractions tend to look little different from simple
bit-level operations. The difference is one of approach, I guess:
whether you want to write optimal code yourself, and see what the
machine is doing, or trust the compiler to find a good way to
translate to machine language and run (in real-time) your
efficient-to-code higher-order functions. The better the translation
from the higher-level language, the more this difference becomes a
matter of taste, programming style, availability of programmers, and
the body of domain knowledge already represented in language
libraries.

I would like to see Haskell fill C's niche: it's close to C's
execution speed now, and pure functions and a terse style give real
advantages in coding speed (higher-order functions abstract common
"patterns" without tedious framework implementations), maintainability
(typeclasses of parameters in utility functions means you don't write
different implementations of the same function for different types,
yet preserve type compatibility and checking), and reliability (pure
functions don't depend on state, so have fewer moving parts to go
wrong).

Jason Catena
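
A minimal sketch of the kind of thing the preceding post claims, assuming
GHC's standard Prelude plus Data.List; the function and example names are
made up for illustration, not taken from the thread. One pure,
typeclass-polymorphic definition covers every orderable key type, with the
looping "pattern" supplied by a higher-order library function rather than
reimplemented per type:

import Data.List (sortBy)
import Data.Ord (comparing)

-- One pure definition, reusable for any key type with an Ord instance;
-- its behaviour depends only on its arguments.
sortOn' :: Ord b => (a -> b) -> [a] -> [a]
sortOn' key = sortBy (comparing key)

-- The same function sorts numbers by absolute value and strings by length.
example1 :: [Int]
example1 = sortOn' abs [-3, 1, -2]            -- [1,-2,-3]

example2 :: [String]
example2 = sortOn' length ["ccc", "a", "bb"]  -- ["a","bb","ccc"]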

David Leimbach

unread,
Sep 5, 2009, 11:45:47 PM9/5/09
to
Well I can think of 3 operating systems written in Haskell now.  One was an executable specification for validating a secure L4 implementation.  One is hOp, and then there's also House, based on hOp.

There's also Kinetic, written primarily in Haskell. http://www.ninj4.net/kinetic/

The newest fork of House has TCP/IP networking uhm working.  http://web.cecs.pdx.edu/~kennyg/house/

There is a haskell file system in development too now http://www.haskell.org/halfs/, but a lot of links don't work :-).

I've been writing a good bit of Haskell these days at work as well, mainly due to the fact that it's possible to write some fairly sophisticated code quickly, and even get pretty darned good performance out of it.

 
Jason Catena


J.R. Mauro

unread,
Sep 6, 2009, 12:27:03 AM9/6/09
to

There's a talk Doug McIlroy gave where he joked about how he
basically invented (or rather, discovered) recursion because someone
said ``Hey, what would happen if we made a FORTRAN routine call
itself?'' IIRC he had to tinker with the compiler to get it to accept
the idea, and at first, no one realized what it would be good for.

Eris Discordia

unread,
Sep 6, 2009, 12:57:30 PM9/6/09
to
> in fact, none of the things we take for granted --- e.g., binary,
> digital, stack-based, etc. --- were immediately obvious. and it
> might be that we've got these thing that we "know" wrong yet.

I don't think we are actually in disagreement here. I have no objections to
your assertion. However, the particular case at hand indicates a different
thing than historians (of computer technology) "backporting" today's
trivial matters. I believe that a concept's existing in a language
(Plankalkuel) but not in the machine it was supposed to control (Z3) by
all means indicates that the designer of the machine and the language
was aware of the concept but faced the technical limitations of his
time. Stored-program
computers weren't only consequences of a person's (von Neumann's)
genius--they also were consequences of the culmination, and return point,
of delay line technology (EDSAC's memory components).

A parallel can be drawn with the emergence of quantum mechanics. Many
students of physics who aren't taught or don't teach themselves history of
physics tend to think quantum mechanics emerged at a particular time
because physical thinkers shortly before that time just weren't up to the
mental challenge, and that it would take visionaries/revolutionaries to
institute the new understanding. Historians of physics, however, can tell
you with quite some confidence that improvements in experimental
instrumentation, and the fact that certain experiments that hadn't been
feasible before became technically feasible around the end of the 19th
century, were very probably a more influential agent.


--On Saturday, September 05, 2009 20:56 -0400 erik quanstrom

Eris Discordia

unread,
Sep 6, 2009, 1:13:01 PM9/6/09
to
> In this respect rating the "expressive power of C versus LISP" depends
> very much on the problem domain under discussion.

Of course. I pointed out in my first post on the thread that "[...] for a
person of my (low) caliber, LISP is neither suited to the family of
problems I encounter nor suited to the machines I solve them on." I cannot
exclude other machines and other problems but can talk from what little I
have personally experienced.

> I would like to see Haskell fill C's niche [...]

Is it as readily comprehensible to newcomers as C? Are there texts out
there that can welcome a real beginner in programming and help him become
productive, on a personal level at least, as rapidly as good C
textbooks--you know the classic example--do? Is there a coherent mental
model of small computers--not necessarily what you or I deem to be a small
computer--that Haskell fits well and can be taught to learners? I imagine
those will be indispensable for any language to replace existing languages,
much more so in case of C.


--On Saturday, September 05, 2009 20:58 -0500 Jason Catena

Eris Discordia

unread,
Sep 6, 2009, 1:30:19 PM9/6/09
to
> There's a talk Doug McIlroy gave where he joked about how he
> basically invented (or rather, discovered) recursion because someone
> said ``Hey, what would happen if we made a FORTRAN routine call
> itself?'' IIRC he had to tinker with the compiler to get it to accept
> the idea, and at first, no one realized what it would be good for.

Are you implying Doug McIlroy hadn't been taught about (and inevitably
occupied by) Church-Turing Thesis or even before that Ackermann function
and had to wait to be inspired by a comment in passing about FORTRAN to
realize the importance of recursion?! This was a rhetorical question, of
course.

--On Sunday, September 06, 2009 00:23 -0400 "J.R. Mauro"

tlar...@polynum.com

unread,
Sep 6, 2009, 1:34:45 PM9/6/09
to
On Sun, Sep 06, 2009 at 05:51:33PM +0100, Eris Discordia wrote:
>
> I don't think we are actually in disagreement here. I have no objections to
> your assertion. However, the particular case at hand indicates a different
> thing than historians (of computer technology) "backporting" today's
> trivial matters. I believe that a concept existed in a language
> (Plankalkuel) but not the machine it was supposed to control (Z3) by all
> [...]

There is a rather extensive review of

"The Early Development of Programming Languages"
by Donald E. Knuth and Luis Trabb Pardo,
reproduced and completed in "Selected Papers on Computer Languages",
Donald E. Knuth, CSLI ISBN 1-57586-382-0

presenting the state and achievement, among many others, of Zuss
Plankalkül (followed by Goldstine/von Neumann, Curry, Mauchly etc.).

Rob Pike

unread,
Sep 6, 2009, 2:07:40 PM9/6/09
to
> Are you implying Doug McIlroy hadn't been taught about (and inevitably
> occupied by) Church-Turing Thesis or even before that Ackermann function and
> had to wait to be inspired by a comment in passing about FORTRAN to realize
> the importance of recursion?! This was a rhetorical question, of course.

Doug loves that story. In the version he told me, he was a (math) grad
student at MIT in 1956 (before FORTRAN) and the discussion in the lab
was about computer subroutines - in assembly or machine language of
course. Someone mused about what might happen if a subroutine called
itself. Everyone looked bemused. The next day they all returned and
declared that they knew how to implement a subroutine that could call
itself although they had no idea what use it would be. "Recursion"
was not a word in computing. Hell, "computing" wasn't even much of a
word in math.

Don't be Whiggish in your understanding of history. Its participants
did not know their way.

-rob

David Leimbach

unread,
Sep 6, 2009, 2:10:00 PM9/6/09
to
On Sun, Sep 6, 2009 at 10:08 AM, Eris Discordia <eris.di...@gmail.com> wrote:
In this respect rating the "expressive power of C versus LISP" depends
very much on the problem domain under discussion.

Of course. I pointed out in my first post on the thread that "[...] for a person of my (low) caliber, LISP is neither suited to the family of problems I encounter nor suited to the machines I solve them on." I cannot exclude other machines and other problems but can talk from what little I have personally experienced.

I would like to see Haskell fill C's niche [...]

Is it as readily comprehensible to newcomers as C? Are there texts out there that can welcome a real beginner in programming and help him become productive, on a personal level at least, as rapidly as good C textbooks--you know the classic example--do? Is there a coherent mental model of small computers--not necessarily what you or I deem to be a small computer--that Haskell fits well and can be taught to learners? I imagine those will be indispensable for any language to replace existing languages, much more so in case of C.

According to the designer of F# (another functional programming language that takes its syntax from O'Caml as well as Haskell and even Python), one of the best experiences he'd had was working with a high school student who was able to modify a solar system simulation written in F# with no prior programming experience.  (from http://www.computerworld.com.au/article/271034/)

There are books on F# out there, including F# for Scientists.


There are books on multimedia programming in Haskell out there that also attempt to teach programming to newcomers, but I'm not sure any of them really assume no prior programming experience. 

I think people learning C get one view of the computer that folks learning assembly really learn to appreciate :-).  Folks learning Haskell learn another mental model of programming as well.  

My personal belief is that learning new languages makes one think about the languages they are used to in a new light, and can make them better programmers overall.

Tim Newsham

unread,
Sep 6, 2009, 2:30:20 PM9/6/09
to
> I would like to see Haskell fill C's niche: it's close to C's
> execution speed now, and pure functions and a terse style give real
> advantages in coding speed (higher-order functions abstract common
> "patterns" without tedious framework implementations), maintainability
> (typeclasses of parameters in utility functions means you don't write
> different implementations of the same function for different types,
> yet preserve type compatibility and checking), and reliability (pure
> functions don't depend on state, so have fewer moving parts to go
> wrong).

Do you know of any garbage collectors written in Haskell? Do
you know of any thread/process schedulers written in Haskell
that can schedule arbitrary code (ie. not just code that is
written in a continuation monad)?

I would like to see a language that lets you write low level code
(like memcpy) efficiently, in a style that makes reasoning about
the code easy, and which doesn't require (but can coexist and support)
garbage collection.

"while(n--) *p++ = *q++;"
is still quite elegant compared to many other expressive languages.
setjmp and longjmp are still quite powerful.

> Jason Catena

Tim Newsham
http://www.thenewsh.com/~newsham/
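
For comparison with the C one-liner quoted above, here is a rough sketch
of the same byte-by-byte copy written against Haskell's standard Foreign
interfaces; the name copyBytes' is invented for the example, and in
practice one would just call Foreign.Marshal.Utils.copyBytes, which wraps
memcpy. It is noticeably wordier, which is rather the point being made:

import Foreign.Ptr (Ptr)
import Foreign.Storable (peekElemOff, pokeElemOff)
import Data.Word (Word8)

-- Copy n bytes from q to p, one element at a time: roughly
-- "while(n--) *p++ = *q++;" spelled out with explicit offsets.
copyBytes' :: Ptr Word8 -> Ptr Word8 -> Int -> IO ()
copyBytes' p q n = go 0
  where
    go i
      | i >= n    = return ()
      | otherwise = do
          b <- peekElemOff q i   -- read *q at offset i
          pokeElemOff p i b      -- write *p at offset i
          go (i + 1)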

Tim Newsham

unread,
Sep 6, 2009, 2:32:57 PM9/6/09
to
> Well I can think of 3 operating systems written in Haskell now. One was an
> executable specification for validating a secure L4 implementation. One is
> hOp, and then there's also House, based on hOp.

Keep in mind that House and hOp both used the ghc runtime (written in C)
as a base. I would argue that this is most of the "OS". The seL4 spec is
more like an operating system simulation than an operating system (or more
accurately it is a spec that can be executed).

I'm not familiar with the other projects you mention. Thank you,
I'll check em out...

> I've been writing a good bit of Haskell these days at work as well, mainly
> due to the fact that it's possible to write some fairly sophisticated code
> quickly, and even get pretty darned good performance out of it.

I'm a big fan. Just want to make sure the hype isn't overblown.

Tim Newsham
http://www.thenewsh.com/~newsham/

James Chapman

unread,
Sep 6, 2009, 2:37:41 PM9/6/09
to
As you mentioned beginners' books for Haskell, I couldn't resist plugging Graham Hutton's excellent beginners' book "Programming in Haskell":


It is based on 10 years of teaching a first-year undergraduate course and is certainly accessible, I believe. I've taught an undergraduate course myself using it.

There is also this book, which complements Graham's quite well:


I agree with David that asking whether there is a model of a computer that fits Haskell is the wrong question. Haskell is based on a different model of computation: conceptually, Haskell programs are executed by rewriting expressions, not by manipulating memory in a machine. 

A trivial example:

Here's a function to append a list onto a list:

append :: [a] -> [a] -> [a]
append [] ys = ys
append (x:xs) ys = x:append xs ys

and here we run it (on paper, no machine required :) ) on some lists by applying the above equations where they match:

Note: [1,2] is syntactic sugar for (1:(2:[]))

append [1,2] [3,4]
= { apply the second equation (the x:xs case) }
1 : append [2] [3,4]
= { apply the second equation (the x:xs case) }
1 : 2 : append [] [3,4]
= { apply the first equation (the [] case) }
1 : 2 : [3,4]
= { just syntactic sugar }
[1,2,3,4]

I wouldn't be so bold as to suggest that Haskell should replace C, but it is certainly a nice language to use, in my opinion. Does it explain how a computer works? No. Does it explain 'computation'? Yes.





David Leimbach

unread,
Sep 6, 2009, 2:43:40 PM9/6/09
to
On Sun, Sep 6, 2009 at 11:26 AM, Tim Newsham <new...@lava.net> wrote:
I would like to see Haskell fill C's niche: it's close to C's
execution speed now, and pure functions and a terse style give real
advantages in coding speed (higher-order functions abstract common
"patterns" without tedious framework implementations), maintainability
(typeclasses of parameters in utility functions means you don't write
different implementations of the same function for different types,
yet preserve type compatibility and checking), and reliability (pure
functions don't depend on state, so have fewer moving parts to go
wrong).

Do you know of any garbage collectors written in Haskell?  Do
you know of any thread/process schedulers written in Haskell
that can schedule arbitrary code (ie. not just code that is
written in a continuation monad)?

I would like to see a language that lets you write low level code
(like memcpy) efficiently, in a style that makes reasoning about
the code easy, and which doesn't require (but can coexist and support)
garbage collection.


Hmmm, pre-scheme perhaps?  http://en.wikipedia.org/wiki/PreScheme

It doesn't do garbage collection, and is meant for low level code, but provides scheme's macros.  Scheme48 is written in it.

David Leimbach

unread,
Sep 6, 2009, 2:47:37 PM9/6/09
to
On Sun, Sep 6, 2009 at 11:29 AM, Tim Newsham <new...@lava.net> wrote:
Well I can think of 3 operating systems written in Haskell now.  One was an
executable specification for validating a secure L4 implementation.  One is
hOp, and then there's also House, based on hOp.

Keep in mind that House and hOp both used the ghc runtime (written in C) as a base.  I would argue that this is most of the "OS". The seL4 spec is more like an operating system simulation than an operating system (or more accurately it is a spec that can be executed).

I suppose this is true, though I thought GHC's runtime was still mostly Haskell. (haven't looked, but one would think porting GHC would be a lot simpler if it were all in C).
 

I'm not familiar with the other projects you mention.  Thank you,
I'll check em out...

I've been writing a good bit of Haskell these days at work as well, mainly
due to the fact that it's possible to write some fairly sophisticated code
quickly, and even get pretty darned good performance out of it.

I'm a big fan.  Just want to make sure the hype isn't overblown.

Oh I agree with your point of view.  I even write some code in C, and make Haskell bindings for it still today when Haskell seems like too much of a pain to use (like a ring buffer implementation I did).

I'm a big fan of multi-paradigm programming.  I've got Erlang calling Haskell and C++ in a system we actually deploy at work.  Pick the weapon that's easiest to express the algorithms you need correctly in, and *then* measure performance to make sure everything is still ok.  

I do this for the same reasons people say C makes assembly mostly obsolete.  Why work the low level stuff if the heavy lifting can be done for you in advance.

Dave
 

Eris Discordia

unread,
Sep 6, 2009, 3:34:04 PM9/6/09
to
Thanks for the first-hand account :-)

> Don't be Whiggish in your understanding of history. Its participants
> did not know their way.

Given your original narrative I really can't argue. Maybe, as you note, I'm
wrongly assuming everyone knew a significant part of that which had come
before them without accounting for natural propagation delays and barriers
between thought pools. Nonetheless, it can't be denied a lot of ideas, and
words used to denote them, in computation were conceived at earlier times
than one might expect, sometimes even more comprehensively than today. For
instance, von Foerster was consistently using "computing" in an
astonishingly wide sense, e.g. bio-computing, by the 1950s. Even today most
people don't immediately generalize that notion the way he did while such
generalization is more than warranted.


--On Sunday, September 06, 2009 11:03 -0700 Rob Pike <rob...@gmail.com>
wrote:

Rudolf Sykora

unread,
Sep 6, 2009, 4:08:37 PM9/6/09
to
>> Considering that Plan 9 has only two inherent languages,
>> and its users often push for work to be done in only those,
>> what is the Plan 9 perspective of languages and tools in
>> relation to each other?

I guess rc & C are meant.
True, I feel to be pushed to these. On the other hand I really like
rc. Compared to bash/sh/ksh/zsh... I like its simplicity as well as
that it is the only shell in plan9. I use it in linux too (although I
miss some abilities it really should have, like ability to break from
a loop).
With C, I confess I do not use it often. In my life I have found C a good
tool for programming microcontrollers. But otherwise I prefer the
python/ruby way unless speed is important---which either really is
(computation; physics), in which case I use Fortran (see below), or is
not at all.
Fortunately, there are some ports of python and ruby to plan9. But it
always looked so difficult in my eyes that I backed out of trying to
use them (do you also have a feeling that the simplest installation is
often on windows, even for open-source projects?).
There is also limbo, but for that I guess inferno must be installed...
(am I right?)

> I don't know for "the Plan 9 perspective" and have no authority to talk
> "for Plan 9", but since almost all interpreters or compilers are written
> in C, whether completely or the bootstapping procedure (a C core that is
> able to interpret a subset of the language to generate a binary for the
> non optimized version of a complete compiler etc.), there are all the
> tools as long as there is a C compiler for the machine.

Well, maybe. But it can probably be rather difficult to get some
software to work in plan9 even though it is written in C, when it was
written 'for another system'... E.g. give me python+numpy+matplotlib...

> The only "lack" in C is perhaps defined full control for
> arithmetic/calculus. That's probably why FORTRAN is still here and has
> still its strength in this area.

Here, I must agree. Though I first hated Fortran for what it carries
with it from the times of FORTRAN, and for all its inability to
work with strings, I must truly confess that I do not know of a better
language for doing calculations. There is no way to compare C to
Fortran, the latter being without question superior. E.g. (in no
particular order of importance):
Fortran can be (and usually is) quicker (its stricter pointer aliasing
rules allow better optimization).
Fortran can have optional parameters to functions.
Fortran can easily define/overload operators (not so nice yet, but
improving, e.g. the priority of new operators cannot be set) --- this
is nice for, e.g., multiplying matrices like this: C = A .x. B, doing
inversions like this: .I.A, or transposing: .T.A, among others.
Fortran has elemental functions.
Fortran can slice arrays, like a(2:8), similarly to matlab or numpy.
Some people claim it is better suited for parallelism, but I can't say
much about this point.
....
It's difficult to find anything where C would be better. Fortran still
has some very ugly places, but it has become really powerful.

But I guess there is nobody who would plan to put Fortran in plan9.

erik quanstrom

unread,
Sep 6, 2009, 4:53:54 PM9/6/09
to
> True, I feel to be pushed to these. On the other hand I really like
> rc. Compared to bash/sh/ksh/zsh... I like its simplicity as well as
> that it is the only shell in plan9. I use it in linux too (although I
> miss some abilities it really should have, like ability to break from
> a loop).

i've added it as an experiment:
/n/sources/contrib/quanstro/src/futharc

- erik

Vinu Rajashekhar

unread,
Sep 7, 2009, 3:55:27 AM9/7/09
to
Concrete Abstractions
An Introduction to Computer Science Using Scheme

http://gustavus.edu/+max/concrete-abstractions.html

Chapter 11: Computers with Memory
We first address the questions of what a computer is and how it comes
to compute by presenting a simplified RISC architecture and showing
how to program in its assembly language. We call attention to memory
as a numbered collection of storage locations, and use this as
motivation for introducing Scheme's vectors (one-dimensional arrays)
as a way to access storage from within the high-level language. In the
application section, we use vectors to program a simulator for our
RISC architecture in Scheme.

--
Vinu Rajashekhar,
5th Year Dual Degree Student,
Deptt of Computer Science & Engg,
IIT Kharagpur,
India.

Paul Donnelly

unread,
Sep 7, 2009, 4:54:32 AM9/7/09
to
eris.di...@gmail.com (Eris Discordia) writes:

> I whined about LISP on yet another thread. Above says precisely why I
> did. LISP is twofold hurtful for me as a naive, below average
> hobbyist. For one thing the language constructs do not reflect the
> small computer primitives I was taught somewhere around the beginning
> of my education. For another, most (simple) problems I have had to
> deal with are far better expressible in terms of those very
> primitives. In other words, for a person of my (low) caliber, LISP is


> neither suited to the family of problems I encounter nor suited to the

> machines I solve them on. Its claim to fame as the language for
> "wizards" remains. Although, mind you, the AI paradigm LISP used to
> represent is long deprecated (Rodney Brooks gives a good overview of
> this deprecation, although not specifically targeting LISP, in
> "Cambrian Intelligence: The Early History of the New AI").

Consider that your introduction to Lisp may have been very poor. You're
right that the mapping from Lisp primitives to machine primitives isn't
as direct as that in C, but Lisp doesn't represent any AI paradigm at all,
nor a particular programming paradigm, and its name hasn't been written
in caps for perhaps 30 years. I'm not trying to nitpick; I'm only saying
that there are a lot of weird ideas about Lisp floating around which a
person can hardly be blamed for picking up on, and those sound to me
like the ones you have picked up.

> One serious question today would be: what's LISP _really_ good for?
> That it represents a specific programming paradigm is not enough
> justification.

I think most Lispers would say it's _really_ good for anything but the
most demanding number crunching, or perhaps A-list games
programming. Probably you'd run into trouble in some parallel
programming situations, for reasons more related to implementation
support and libraries than reasons intrinsic to the language. And the
justification would be that Lisp is an embarrassingly multiparadigm
language, as general-purpose as they come.

Greg Comeau

unread,
Sep 7, 2009, 4:54:52 AM9/7/09
to
In article <dac0a5820909031501q676...@mail.gmail.com>,
Tharaneedharan Vilwanathan <vdha...@gmail.com> wrote:
>> C shall be the test. If you don't even understand C, explained by K&R,
>> then do something else.
>i agree to this completely. after taking a formal course in computers,
>if someone cannot read/follow K&R C book and cannot write C code, i
>would think that the candidate is not good enough.

Comes full circle, as if that is the metric, then that rules out
most candidates.
--
Greg Comeau / 4.3.10.1 with C++0xisms now in beta!
Comeau C/C++ ONLINE ==> http://www.comeaucomputing.com/tryitout
World Class Compilers: Breathtaking C++, Amazing C99, Fabulous C90.
Comeau C/C++ with Dinkumware's Libraries... Have you tried it?

Paul Donnelly

unread,
Sep 7, 2009, 4:54:40 AM9/7/09
to
eris.di...@gmail.com (Eris Discordia) writes:

>> Let me be a little pedantic.
>
> The 9fans know given the haphazard nature of a hobbyist's knowledge I
> am extremely bad at this, but then let me give it a try.
>
>> FYI, it's been Lisp for a while.
>
> As long as Britannica and Merriam-Webster call it LISP I don't think
> calling it LISP would be strictly wrong. Has LISt Processing become
> stigmatic in Lisp/LISP community?

Just the orthography.

> Indeed, my only encounter with LISP has been Scheme and through a
> failed attempt to read SICP.

Next time you get a hankering to see what all the fuss is about, you
could try a book like Practical Common Lisp (which can be read online at
http://gigamonkeys.com/book/ ). SICP is a good book, but it's geared
toward introducing fundamental programming concepts like abstraction
with a minimum of language features, which is necessarily at odds with
getting stuff done in a straightforward way.

> If you have a scrawny x86 on your desktop and are trying to implement,
> say, a bubble sort--yes, the notorious bubble sort, it's still the
> first thing that comes to a learner's mind--it seems C is quite apt
> for expressing your (embarrassing) solution in terms of what is
> available on your platform. Loops, arrays, swapping, with _minimal_
> syntactic distraction. Simple, naive algorithms should end up in
> simple, immediately readable (and refutable) code. Compare two
> implementations and decide for yourself:
>
> <http://en.literateprograms.org/Bubble_sort_(Lisp)>
> <http://en.literateprograms.org/Bubble_sort_(C)>

I must say that the Lisp version is much simpler and clearer to me,
while the C version is mildly baffling. Does that make me a wizard who
can hardly read simple C code, or is it just a matter of what you and I
are respectively more comfortable with?
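
For a third point of comparison, here is roughly what the same naive
algorithm looks like in Haskell; this is a generic sketch, not taken from
either of the linked pages. Whether it reads more like the Lisp version
or the C version probably says as much about the reader as about the
languages:

-- One pass: swap adjacent out-of-order elements.
bubble :: Ord a => [a] -> [a]
bubble (x:y:rest)
  | x > y     = y : bubble (x : rest)
  | otherwise = x : bubble (y : rest)
bubble xs = xs

-- Repeat passes until the list stops changing.
bubbleSort :: Ord a => [a] -> [a]
bubbleSort xs
  | xs == ys  = xs
  | otherwise = bubbleSort ys
  where ys = bubble xs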

>> The main benefits it had in AI were features that came from garbage
>> collection and interactive development.
>
> More importantly, LISt Processing which used to be an element of the
> expert systems approach to AI and which is now defunct (as a way of
> making machines intelligent, whatever that means). While "expert
> systems" continue to exist the word causes enough reverb of failure to
> be replaced by other buzzwords: knowledge-based systems, automated
> knowledge bases, and whatnot.

Don't assume that just because Lisp is useful for list processing that
it's not useful for a wide variety of problem-solving approaches. I've
seen many people get hung up on lists (and recursion), thinking that
they are somehow the essence of Lisp programming.

Greg Comeau

unread,
Sep 7, 2009, 5:05:20 AM9/7/09
to
In article <542783....@web83904.mail.sp1.yahoo.com>,
Brian L. Stuart <blst...@bellsouth.net> wrote:
>>>>>K&R is beautiful in this respect. In contrast, never managed to
>>>>>bite into Stroustrup's description.
>>>> Ok, now I'll get provocative:
>>>> Then why do so many people have a problem understanding C?
>>>Are you saying that there is a significant number of people
>>>who understand C++ but not C?
>>I wasn't saying anything, I was asking a question. :)
>
>Ah, I misunderstood. The question about why people don't understand
>C on the heels of a reference to Stroustrup led me to think that
>was a suggestion C++ was easier to understand than C.

That wasn't the original motivation, although that can be true.

>Of course, I may be a little too sensitive to such a claim,
>because of what I've been hearing in the academic community for
>a while.

Understood.

>Some keep saying that we should use more complex languages in
>the introductory course because they're in some way easier.
>But I've yet to understand their definition of easier.

I've seen this before. It's usually a combo of people
not knowing what they're talking about, making stuff up
as they go along, generalizing their personal programming
universe, being elite, and miscommunicating their point.

>Well, actually I do kind of realize they are suggesting that a
>tinkertoy approach makes it easier for a beginner to see something happen.
>The problem I have is that's not the point of teaching that material.

It's not. But that doesn't have to mean throwing the other
parts out the window either.

>Just getting something to happen might be training, but it sure isn't
>education.

No, and theory and practical experience are two different things
too. I would not necessarily say only one or only the other,
but probably often some balanced combination.

As to easier/harder above, that can be slippery.
However, with the right approach, different steppingstones
can be provided depending upon the strategy chosen.
Of course, that can be a prescription for being doomed
to fail, but it's a juggle for sure even with success,
whatever that is, and with even the easier approach,
whatever that is :)

Greg Comeau

unread,
Sep 7, 2009, 5:07:05 AM9/7/09
to
In article <fe41879c0909050422s772...@mail.gmail.com>,
Akshat Kumar <aku...@mail.nanosouffle.net> wrote:
>> http://www.paulgraham.com/avg.html
> "Programming languages are just tools, after all."

>
>Considering that Plan 9 has only two inherent languages,
>and its users often push for work to be done in only those,
>what is the Plan 9 perspective of languages and tools in
>relation to each other?
>Is it in agreement with this statement?

It's certainly true that cultures and mindsets build up
around different things, some of it reasonable, some of it not,
and I observe Plan 9 is not much different in this regard.
That said, saying "push" and "inherent" are probably
inherently pushing things :)

Greg Comeau

unread,
Sep 7, 2009, 5:07:15 AM9/7/09
to
In article <BB8E3A2E5419E566D0361D29@[192.168.1.2]>,
Eris Discordia <eris.di...@gmail.com> wrote:
>I think I am sure C
>was a lot easier to learn and comprehend than either Pascal

Might depend how you define easier.

>What seems to distinguish--pedagogically, at least--C is, as I noted on
>that other thread, its closeness to how the small computer, not the actual
>small computer but the mental model of a small computer, works. Pointers?
>They're just references to "pigeonholes" in a row of such holes. Scope?
>It's just how long your variables are remembered. Invocation? Just a way to
>regurgitate your own cooking. If one has to solve a problem, implement an
>algorithm, on a small computer one needs to be able to explain it in terms
>of the primitives available on that computer. That's where C shines.
>There's a close connection between language primitives and the primitives
>of the underlying computer. I'm not saying this is something magically
>featuring in C--it's a property that _had_ to feature in some language some
>time, C became that. In a different time and place, on different machines,
>another language would/will be that (and it shall be called C ;-))

It is indeed true that C can "hug the hardware" and the rest is
history, as they say. However, the notion that an algorithm is
implemented solely in terms of the primitives of a computer is,
if I understand you correctly, not something I can agree to as a
carte blanche statement.

>...There's a great deal of arbitrariness in how a computer language
>might look. It is epistemologically, aesthetically, and pragmatically
>advantageous to "remove arbitrariness" by fitting a language to either its
>target platform or its target problem, preferably both. C did and continues
>to do so, LISP doesn't (not anymore, to say the least).

These bits and pieces do come into play; however, I think your
conclusion greatly exaggerates the situation, at least as I
understand what you said.

Akshat Kumar

unread,
Sep 7, 2009, 5:42:46 AM9/7/09
to
>>Considering that Plan 9 has only two inherent languages,
>>and its users often push for work to be done in only those,
>>what is the Plan 9 perspective of languages and tools in
>>relation to each other?
>>Is it in agreement with this statement?
>
> It's certainly true that cultures and mindsets build up
> around different things, some of it reasonable, some of it not,
> and I observe Plan 9 is not much different in this regard.
> That said, saying "push" and "inherent" are probably
> inherently pushing things :)

I suppose my wording showed emphasis in all the wrong places.
The question I meant to ask was (given the context of the
referenced article):
In the Plan 9 environment, are languages considered to be "tools"?

The rest was just my reasoning for asking.


Best,
ak

Daniel Lyons

unread,
Sep 7, 2009, 5:43:32 AM9/7/09
to

On Sep 7, 2009, at 2:54 AM, Paul Donnelly wrote:

> or perhaps A-list games programming


The Jak and Daxter series was written largely in GOAL, a Lisp dialect (whose compiler was itself written in Common Lisp).

http://en.wikipedia.org/wiki/Game_Oriented_Assembly_Lisp


Daniel Lyons


Vinu Rajashekhar

unread,
Sep 7, 2009, 5:43:02 AM9/7/09
to
Write Haskell as fast as C: exploiting strictness, laziness and recursion
- http://cgi.cse.unsw.edu.au/~dons/blog/2008/05/16
From the article

Lesson 1: To write predictably fast Haskell -- the kind that competes
with C day in and out
-- use tail recursion, and ensure all types are inferred as simple
machine types, like Int, Word,
Float or Double, that have simple machine representations. The performance
is there if you want it.

Lesson 2: Laziness has an overhead -- while it allows you to write new
kinds of programs
(where lists may be used as control structures), the memory traffic
that results can be a
penalty if it appears in tight inner loops. Don't rely on laziness to
give you performance in your inner loops.

Lesson 3: For heavy optimisation, the C backend to GHC is still the
way to go. Later this
year a new bleeding edge native code generator will be added to GHC,
but until then,
the C backend is still an awesome weapon.

--
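
A small illustration of Lesson 1 (and the caution in Lesson 2), assuming
GHC with the BangPatterns extension; the function is invented for the
example. The accumulator is kept strict and everything is a plain Int, so
the recursion compiles to a tight loop instead of building a chain of
lazy thunks:

{-# LANGUAGE BangPatterns #-}

-- Tail-recursive sum of 1..n with a strict Int accumulator: the shape
-- of inner loop the article above is describing.
sumTo :: Int -> Int
sumTo n = go 0 1
  where
    go !acc !i
      | i > n     = acc
      | otherwise = go (acc + i) (i + 1)

main :: IO ()
main = print (sumTo 10000000)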

Daniel Lyons

unread,
Sep 7, 2009, 6:07:36 AM9/7/09
to

On Sep 7, 2009, at 3:05 AM, Greg Comeau wrote:

>> Some keep saying that we should use more complex languages in
>> the introductory course because they're in some way easier.
>> But I've yet to understand their definition of easier.
>
> I've seen this before. It's usually a combo of people
> not knowing what they're talking about, making stuff up
> as they go along, generalizing their personal programming
> universe, being elite, and miscommunicating their point.


I have a friend who insists that every other language has been harder
on him than macro assembler. And I think that's true, if you cannot
understand how to program a machine other than by thinking about
what's happening at the instruction level of the processor.

Each language provides its own view of the land. If you have a strong
understanding of the hardware and wish to think in those terms you
will probably find assembler or C to be your best friend. If you have
a strong mathematical inclination Haskell will probably suit you
better. I find Scheme introduces a model of computation which is a
compromise between the two; close to the machine in memory, simple in
syntax, and rather far from the machine in terms of continuations, but
most of the code being in the middle anyway.

Part of what makes computing so interesting to me is that we can
remodel it to suit different needs, tastes or problems. If we had to
write our schedulers in rc, we'd probably find it obnoxious; similarly
if we had to write trivial pipelines in C. The nice thing about Lisp,
Haskell and Java is that when you're in their little world everything
works the way you'd expect it to in their little world; the nice thing
about Plan 9 and Unix is that most everything is designed to interact
sanely and simply with the rest of the world. I find writing puzzle
solvers simpler in Prolog than in Haskell despite Haskell's list
comprehensions being essentially the same in power and somewhat more
straightforward to reason about.
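
As a concrete instance of the list-comprehension style of puzzle solving
mentioned above, here is a standard n-queens search; a generic sketch,
not any particular solver from the thread:

-- All ways to place n non-attacking queens, one per column; a solution
-- lists the row chosen for each column.
queens :: Int -> [[Int]]
queens n = place n
  where
    place 0 = [[]]
    place k = [ q : qs | qs <- place (k - 1)
                       , q  <- [1 .. n]
                       , safe q qs ]
    -- q is safe if it shares no row or diagonal with earlier queens.
    safe q qs = and [ q /= r && abs (q - r) /= d
                    | (d, r) <- zip [1 ..] qs ]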

For some reason, the fact that we program rational machines in logic-
based languages deludes us into thinking our experience is the same as
everyone else's or our situation must be the same as everyone else's.
I don't know anyone who likes to debate a programmer and isn't also a
programmer; we are undoubtedly the most self-assured and non-
empathetic group of people on the planet. We have every opportunity to
be free of dogma, but our reason and our aesthetic reactions seem
somehow to be soldered directly onto our emotions.

A problem is that the world isn't as rational as we are. It often
chooses based on expedience, popularity, rumor, or emotion. Often the
good is devoured by good marketing. I never would have expected to
find defenders of Lisp or Haskell here in the Plan 9 mailing list. I
am happy about that. But the hindsight has not been 20/20. Lisp and
Plan 9 are in the same situation for exactly the same reason: they're
both conceptually rigorous and short on eye candy, and the market
chose other alternatives long ago, and now those alternatives define
the question in a way that precludes these answers.


Daniel Lyons

erik quanstrom

unread,
Sep 7, 2009, 7:43:18 AM9/7/09
to
i agree the computer industry as a whole tends
to be long on dogma and yet suffers from an acute
inability to recall previous mistakes.

> For some reason, the fact that we program rational machines in logic-
> based languages deludes us into thinking our experience is the same as
> everyone else's or our situation must be the same as everyone else's.
> I don't know anyone who likes to debate a programmer and isn't also a
> programmer; we are undoubtedly the most self-assured and non-
> empathetic group of people on the planet. We have every opportunity to
> be free of dogma, but our reason and our aesthetic reactions seem
> somehow to be soldered directly onto our emotions.

but having lived in washington, dc for a decade, i think
i met a few groups that can be more self-assured.

- erik

Greg Comeau

unread,
Sep 7, 2009, 11:49:30 AM9/7/09
to
In article <fe41879c0909070239q184...@mail.gmail.com>,

From what I've seen that are (well, implementations of them).
Some thing they're fun too :) Generally universal comments anyway.

J.R. Mauro

unread,
Sep 7, 2009, 11:52:07 AM9/7/09
to
On Sun, Sep 6, 2009 at 2:03 PM, Rob Pike <rob...@gmail.com> wrote:
>> Are you implying Doug McIlroy hadn't been taught about (and inevitably
>> occupied by) Church-Turing Thesis or even before that Ackermann function and
>> had to wait to be inspired by a comment in passing about FORTRAN to realize
>> the importance of recursion?! This was a rhetorical question, of course.
>
> Doug loves that story. In the version he told me, he was a (math) grad
> student at MIT in 1956 (before FORTRAN) and the discussion in the lab
> was about computer subroutines - in assembly or machine language of
> course.  Someone mused about what might happen if a subroutine called
> itself.  Everyone looked bemused.  The next day they all returned and
> declared that they knew how to implement a subroutine that could call
> itself although they had no idea what use it would be.  "Recursion"
> was not a word in computing.  Hell, "computing" wasn't even much of a
> word in math.

It's nice to know that it's a bit of lore that changes with each telling.

Greg Comeau

unread,
Sep 7, 2009, 12:00:50 PM9/7/09
to
In article <936A4BAB-7D9A-4B65...@storytotell.org>,

Daniel Lyons <fus...@storytotell.org> wrote:
>On Sep 7, 2009, at 3:05 AM, Greg Comeau wrote:
>>> Some keep saying that we should use more complex languages in
>>> the introductory course because they're in some way easier.
>>> But I've yet to understand their definition of easier.
>>
>> I've seen this before. It's usually a combo of people
>> not knowing what they're talking about, making stuff up
>> as they go along, generalizing their personal programming
>> universe, being elite, and miscommunicating their point.
>
>I have a friend who insists that every other language has been harder
>on him than macro assembler.

We all suffer some level of this syndrome from time to time.
It's easy to get set in our ways, easy to think we know it all
and have done everything, easy to think there isn't much else,
easy to think that a thing or idea we don't understand must
therefore be a piece of garbage.  Etc etc.  So yeah, the harder
factor comes into play here too.  Other planes of thinking
need to compete with already existing and ingrained ways
of thinking, whether the existing ones are poor, incomplete,
wrong, limited, or ignorant.

>And I think that's true, if you cannot
>understand how to program a machine other than by thinking about
>what's happening at the instruction level of the processor.

Probably.

>Each language provides its own view of the land. If you have a strong
>understanding of the hardware and wish to think in those terms you
>will probably find assembler or C to be your best friend. If you have
>a strong mathematical inclination Haskell will probably suit you
>better. I find Scheme introduces a model of computation which is a
>compromise between the two; close to the machine in memory, simple in
>syntax, and rather far from the machine in terms of continuations, but
>most of the code being in the middle anyway.

Something like that.  And the different mentalities can
often be a real mental block.

>For some reason, the fact that we program rational machines in logic-
>based languages deludes us into thinking our experience is the same as
>everyone else's or our situation must be the same as everyone else's.

The malleability involved is both a band-aid and a sword :)

>I don't know anyone who likes to debate a programmer and isn't also a
>programmer; we are undoubtedly the most self-assured and non-
>empathetic group of people on the planet. We have every opportunity to
>be free of dogma, but our reason and our aesthetic reactions seem
>somehow to be soldered directly onto our emotions.

Hehe.

>A problem is that the world isn't as rational as we are. It often
>chooses based on expedience, popularity, rumor, or emotion.

I find that programmers do this just as much, if not more,
with their own twists.

erik quanstrom

unread,
Sep 7, 2009, 12:01:43 PM9/7/09
to
> From what I've seen that are (well, implementations of them).
> Some thing they're fun too :) Generally universal comments anyway.

is this english++? i just can't parse it.

- erik
