Language intolerance (was Re: Is Scheme a `Lisp'?)

Ray Blaak

Feb 28, 2001, 1:44:48 AM
Marco Antoniotti <mar...@cs.nyu.edu> writes:
> Christian Lynbech <c...@tbit.dk> writes:
> >[scheme has...]
> Then why not use CL? (Instead of reimplementing the wheel over and over and
> over and over , until you get the last wheel on the block avec forced
> indentation :) )

Because then they wouldn't be programming in Scheme, of course :-).

You could just as easily say "Java has [feature], but so does CL, so why not
use CL?" But the urge to convert programmers of other languages seems to be
less than that for Scheme.

<SoapBox>

Scheme is just a programming language. People use it for the same reasons they
use any other language: they like it, it meets their needs, whatever.

That there are Scheme freaks who would like to take over the world does not
mean all those who use Scheme are insufferable fools.

That there are problems with Scheme does not mean all those who use it are
clueless idiots.

All languages have their rough points, and just how rough is a source of
endless debates, often illuminating, often pointless.

But the level of hostility in this newsgroup toward Scheme, and the certainty
with which lispers here know the One True Way, are quite astonishing.

I lurk in a fair number of comp.lang.* newsgroups, and this one seems to me to
have the most intolerance/smugness. Even in comp.lang.scheme, when comparisons
to CL come up, the discussions point out the differences, opinions and
preferences are expressed, but then everyone just moves on to discuss some more
Scheme. No big deal.

I used to think comp.lang.ada was the worst, with everyone there just being
amazed at all the silly people refusing to understand the benefits of salvation
by using Ada (hallelujah). These days, however, they seem to have understood
that other languages have their place, and it is in fact useful to work with
Ada and the other evil languages.

Sometimes I feel like I am at a fundamentalist meeting, where the slightest
voice of dissent will brand you as a heretic.

Lighten up people. Use Common Lisp because you can use no other, but at least
be aware that the other folks' ways can be learned from (even their mistakes).

</SoapBox>

--
Cheers,                        The Rhythm is around me,
                               The Rhythm has control.
Ray Blaak                      The Rhythm is inside me,
bl...@infomatch.com            The Rhythm has my soul.

Xah Lee

Feb 28, 2001, 2:11:32 AM
Lighten up folks, there's no need to be intolerant of sibling's tone. I have
now composed a poem dedicated to _all_ common lispers.

My name is Scheme
pure and beautiful,
but in the real world,
i rape and beat lisps,
all of them to death.

Xah
x...@xahlee.org
http://xahlee.org/PageTwo_dir/more.html


> From: Ray Blaak <bl...@infomatch.com>
> Newsgroups: comp.lang.lisp
> Date: 27 Feb 2001 22:44:48 -0800
> Subject: Language intolerance (was Re: Is Scheme a `Lisp'?)

Bruce Hoult

Feb 28, 2001, 6:22:32 AM
In article <31923469...@naggum.net>, Erik Naggum <er...@naggum.net>
wrote:

> > I lurk in a fair number of comp.lang.* newsgroups, and this
> > one seems to me to have the most intolerance/smugness.
>

> Try comp.lang.dylan. Talk about Common Lisp and prefix syntax.
> Ask them why they dropped it.

I understand that it was intended to allow both the infix and prefix
syntaxes to be used, at the user's discretion. But then I guess even
the extremely experienced Lisp people such as Scott Fahlman and David
Moon found that actually an infix syntax wasn't so bad after all. When
difficulties arose in defining a macro facility that could be mapped
mechanically between infix and prefix syntaxes, they decided to drop the
prefix one.

If you know a different history I'd be interested to hear it.

-- Bruce

Jason Trenouth

Feb 28, 2001, 7:34:29 AM

Ironically for this discussion, Dylan is more like (+ Scheme CLOS infix).
:-j

__Jason

Xah Lee

Feb 28, 2001, 9:04:27 AM
Dear Erik Naggum,

you wrote:
> I think you're missing an important point. Whenever anyone tries to sell
> Common Lisp, argue for Common Lisp, etc, the bad experiences people have
> with Scheme become a problem for us before we can move on to do what we
> want. Whenever some Scheme freak says "Scheme is a Lisp", he holds
> "Lisp" back in the minds of whoever has been exposed to Scheme and thinks
> he knows something about "Lisp", 1960-style. Bad teachers or pedagogy,
> which only shows people the unique Scheme features, some of which are
> pretty bizarre, making it appear _Lisp_ doesn't have iteration, arrays,
> strings, etc, exacerbate the problem of selling Common Lisp. In an
> important way, Scheme is _in_the_way_ when a Common Lisp proponent tries
> to talk about his favorite choice. Because of the need to move Scheme
out of the way all the time, hostility grows. This is a situation the
> Scheme freaks have created all on their own, by insisting that Scheme is
> a Lisp, way beyond any useful comparisons to any other extant Lisps.


That's not Scheme's problem; life's a survivalism. Fight fight fight, fight
fight fight, fight the myths for your life.

(and just don't forget what imperative languagessss do to you and your food
source. Pray that Scheme is everywhere before Common Lisp becomes extinct.
Despite Common Lisper's problems, i betcha ass Scheme is still A Lisp!)

Xah
x...@xahlee.org
http://xahlee.org/PageTwo_dir/more.html


> From: Erik Naggum <er...@naggum.net>
> Organization: Naggum Software, Oslo, Norway
> Newsgroups: comp.lang.lisp
> Date: 28 Feb 2001 11:02:37 +0000
> Subject: Re: Language intolerance (was Re: Is Scheme a `Lisp'?)

Ray Blaak

Feb 28, 2001, 12:11:54 PM
Erik Naggum <er...@naggum.net> writes:
> [a fair post, except for]
> Your very bad code examples shows that you are willing to post completely
> unfounded beliefs, and do not want to make the effort to check your
> conclusions or indeed premises for correctness.

Not at all. My very bad code example was an honest attempt to learn something
about CL. And I did. There were no beliefs or premises being held. I *knew* it
was incorrect, but assumed people would get the gist of what I was getting at.

If I had actually had a CL on my machine, I could have answered the question
myself. Next time I will likely take the trouble to install CL instead.

Xah Lee

Feb 28, 2001, 2:35:36 PM
In my last message I dedicated a poem to all common lispers:

My name is Scheme,


pure and beautiful,
but in the real world,
i rape and beat lisps,
all of them to death.

Now, i wish to complement it by a dedication to the Scheme dialect
programers:

The fire in my eyes,
burning fierce and bright,
reflecting flare before my eyes,
Common Lispers' asses alight.

In conclusion to the topic of this thread, i dedicate this one to all
siblings in-fight:

My brethren lessons a learnt,
never utter truths a bent,
lest Xah elder wrath a birth,
thy buttocks =puff=, ignite.


(translation:
Oh, my dear Common Lisp loving programers, you all have learned a lesson
this week: Never say things you know that's not exactly true, for example:
do not say to your brother: "My mom is not your mom, my dad is not your
dad.". Because if you do, the loving person Xah Lee will know and will be
angry, and make your ugly behavior clearly understood by everybody.
)

Xah
x...@xahlee.org
http://xahlee.org/PageTwo_dir/more.html

Marco Antoniotti

Feb 28, 2001, 4:14:38 PM

Ray Blaak <bl...@infomatch.com> writes:

> Marco Antoniotti <mar...@cs.nyu.edu> writes:
> > Christian Lynbech <c...@tbit.dk> writes:
> > >[scheme has...]
> > Then why not use CL? (Instead of reimplementing the wheel over and over and
> > over and over , until you get the last wheel on the block avec forced
> > indentation :) )
>
> Because then they wouldn't be programming in Scheme, of course :-).
>
> You could just as easily say "Java has [feature], but so does CL, so why not
> use CL?" But the urge to convert programmers of other languages seems to be
> less than that for Scheme.

You are forgetting the main points of the issue.

CL has N (for a very large positive N) features that Scheme simply
*does not have*. The inverse is essentially limited to call/cc.

Yet the number of hours sunk into making yet another (L)GPL or Open
Source or whatever license Scheme implementation, or Scheme library
covering this or that piece of CL is *staggering*.

This is what irks Common Lispers like me.

Apart from that what you say is true. :)

Finally, allow me to add: I am light-hearted when making these comments
on Scheme. It's like a sport for me :) It's fun and easy.

> <SoapBox>
>
> Scheme is just a programming language. People use it for the same
> reasons they use any other language: they like it, it meets their
> needs, whatever.
>

...


>
> Lighten up people. Use Common Lisp because you can use no other, but at least
> be aware that the other folks' ways can be learned from (even their
> mistakes).
>
> </SoapBox>

The last is particularly true. We learned that a big, fat and
incomplete standard is better than a thin, trimmed and incomplete
standard.

Cheers

--
Marco Antoniotti ========================================================
NYU Courant Bioinformatics Group tel. +1 - 212 - 998 3488
719 Broadway 12th Floor fax +1 - 212 - 995 4122
New York, NY 10003, USA http://bioinformatics.cat.nyu.edu
Like DNA, such a language [Lisp] does not go out of style.
Paul Graham, ANSI Common Lisp

Lieven Marchand

Feb 28, 2001, 12:43:12 PM
Bruce Hoult <br...@hoult.org> writes:

> I understand that it was intended to allow both the infix and prefix
> syntaxes to be used, at the user's discretion. But then I guess even
> the extremely experienced Lisp people such as Scott Fahlman and David
> Moon found that actually an infix syntax wasn't so bad after all. When
> difficulties arose in defining a macro facility that could be mapped
> mechanically between infix and prefix syntaxes, they decided to drop the
> prefix one.
>
> If you know a different history I'd be interested to hear it.

What a beautiful history that has given Dylan misfix syntax and an
overly complex and underpowered macro system. But we've had that
discussion ;-)

--
Lieven Marchand <m...@wyrd.be>
Glaðr ok reifr skyli gumna hverr, unz sinn bíðr bana.

Eugene Zaikonnikov

Feb 28, 2001, 5:53:27 PM
* "Xah" == Xah Lee <x...@xahlee.org> writes:

Xah> but in the real world, i rape and beat lisps, all of them to
Xah> death.

Many before you tried; most of them rest by gravestones now,
outlived by Lisp. Lisp is the eternal magic of greater orders, and
petty mortals like you may not harm it.

Xah> The fire in my eyes,
Xah> burning fierce and bright,
Xah> reflecting flare before my eyes,
Xah> Common Lispers' asses alight.

Eat Flaming Death, The lesser being, who Never Heard A Word Of True
Poetry.

DIVERSE IMAGE
(The Song of Expressions)

Eval has a Lispy Heart,
And Macros make a Lispy Face,
Class the Lispy Form Divine,
And Syntax is the Lispy Dress.

The Scheme Dress is Forged Syntax,
The Scheme Form is Tail Recurse,
The Scheme Face is Funcall seal'd,
The Scheme Heart is hungry Cons.


--
Eugene

Bruce Hoult

Feb 28, 2001, 6:03:34 PM
In article <m38zmqp...@localhost.localdomain>, Lieven Marchand
<m...@wyrd.be> wrote:

> Bruce Hoult <br...@hoult.org> writes:
>
> > I understand that it was intended to allow both the infix and prefix
> > syntaxes to be used, at the user's discretion. But then I guess even
> > the extremely experienced Lisp people such as Scott Fahlman and David
> > Moon found that actually an infix syntax wasn't so bad after all. When
> > difficulties arose in defining a macro facility that could be mapped
> > mechanically between infix and prefix syntaxes, they decided to drop
> > the
> > prefix one.
> >
> > If you know a different history I'd be interested to hear it.
>
> What a beautiful history that has given Dylan misfix syntax and a
> overly complex and under powered macro system. But we've had that
> discussion ;-)

Under-powered? A strange thing to say about something that is
(regrettably, IMHO) Turing-complete...

-- Bruce

Joe Marshall

Feb 28, 2001, 6:55:41 PM
Bruce Hoult <br...@hoult.org> writes:

> Under-powered? A strange thing to say about something that is
> (regrettably, IMHO) Turing-complete...

A lot of things are Turing-complete (for instance, a Turing machine).
That doesn't necessarily imply ease of programming.

sendmail config files are Turing complete, too.



Bruce Hoult

Feb 28, 2001, 8:19:03 PM
In article <itlucq...@content-integrity.com>, Joe Marshall
<j...@content-integrity.com> wrote:

> Bruce Hoult <br...@hoult.org> writes:
>
> > Under-powered? A strange thing to say about something that is
> > (regrettably, IMHO) Turing-complete...
>
> A lot of things are Turing-complete (for instance, a Turing machine).
> That doesn't necessarily imply ease of programming.
>
> sendmail config files are Turing complete, too.

I don't believe I said or implied that powerful implies easy -- in fact
I believe the opposite is often true.

In particular, I don't like a language having Turing-complete processes
going on at compile-time. This applies equally whether it is a
self-contained sublanguage (such as Dylan macros or C++ templates) or
the same language as you use for normal programming (as in CL procedural
macros). I think that macros are the appropriate tool for doing trivial
syntactic rearrangements and that if you want to do something complex
then the appropriate method is to use a tool that explicitly generates
source code that the programmer can examine -- and debug.

-- Bruce

Rahul Jain

Feb 28, 2001, 10:55:54 AM
In article <bruce-0A20C3....@news.nzl.ihugultra.co.nz>, "Bruce Hoult"
<br...@hoult.org> wrote:
> I think that macros are the appropriate tool for doing trivial syntactic
> rearrangements and that if you want to do something complex then the
> appropriate method is to use a tool that explicitly generates source
> code that the programmer can examine -- and debug.

The code generated by macros can be examined and debugged, just like any
code one writes "normally". The difference is that CL macros are powerful
enough that one can do all the external tool-generated source generation
from inside CL. CL macros are also pervasive throughout the language, as
they are an easy way to implement multiple forms which are semantically
similar. I don't see the benefit of running a separate lisp VM
just to do macroexpansions and save them to disk.

BTW, do you use LOOP or SERIES or ITERATE or FORMAT at all? I think those
would qualify as doing "something complex".
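
For instance (a small sketch; WITH-TIMING and EXPENSIVE-COMPUTATION are
invented for the example, and the exact expansions printed vary by
implementation):

  ;; A user-defined macro expands into ordinary code that can be
  ;; inspected at the REPL before it is ever run.
  (defmacro with-timing (&body body)
    (let ((start (gensym "START")))
      `(let ((,start (get-internal-real-time)))
         (multiple-value-prog1 (progn ,@body)
           (format t "~&Elapsed: ~D ticks~%"
                   (- (get-internal-real-time) ,start))))))

  (macroexpand-1 '(with-timing (expensive-computation)))
  ;; => (LET ((#:START1 (GET-INTERNAL-REAL-TIME))) ...)

  ;; LOOP is "just" a macro too, so its generated code can be examined
  ;; and debugged the same way:
  (macroexpand-1 '(loop for i from 1 to 3 collect (* i i)))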

--
-> -/- - Rahul Jain - -\- <-
-> -\- http://linux.rice.edu/~rahul -=- mailto:rahul...@usa.net -/- <-
-> -/- "I never could get the hang of Thursdays." - HHGTTG by DNA -\- <-
|--|--------|--------------|----|-------------|------|---------|-----|-|
Version 11.423.999.220020101.23.50110101.042
(c)1996-2000, All rights reserved. Disclaimer available upon request.

Pierre R. Mai

Mar 1, 2001, 5:20:09 AM
Bruce Hoult <br...@hoult.org> writes:

> I don't believe I said or implied that powerful implies easy -- in fact
> I believe the opposite is often true.

Given that Turing-completeness is so easy to achieve (often even
by accident), I don't think the expressive power of a language can be
defined by examining what can be computed in the language, but rather
by what can reasonably be expressed in the language by a human.

> In particular, I don't like a language having Turing-complete processes
> going on at compile-time. This applies equally whether it is a
> self-contained sublanguage (such as Dylan macros or C++ templates) or
> the same language as you use for normal programming (as in CL procedural
> macros). I think that macros are the appropriate tool for doing trivial
> > syntactic rearrangements and that if you want to do something complex
> then the appropriate method is to use a tool that explicitly generates
> source code that the programmer can examine -- and debug.

But you can examine the source code generated by CL macros, and you
can debug that stuff. More importantly most implementations will even
try to give you both views simultaneously (the expanded and the
original source) when debugging, something that few external tools
achieve. I don't see why making the tool external does
in any way improve the situation. Having seen many tools that
generate source code for traditional languages, I've very often found
those tools to be much harder to debug, especially since their
transformation is often performed in one huge step, whereas the
expansion of macros is a step-by-step process that is supported by
reasonable tools: if you want to find out what a particular piece of
input code is transformed into, you normally have little external tool
support, whereas in CL you just macroexpand that part.

Many of those external tools work as if the code was written like
this:

(defmacro do-this-in-augmented-language
;; Whole program/module/file
)

Debugging such a thing is a nightmare. While you could argue that
this is just the fault of the tool creator, I'd argue that this is
often the direct result of missing support for the tool creator:
Given the external nature of the tool, as well as missing parsers,
compiler hooks, etc. it is quite natural that the author will create a
one-pass 'compiler', rather than a well-structured, segmented and
traceable process.

So I can see (and have experienced) a whole set of disadvantages to
the external tool approach. I'd be interested in hearing about the
disadvantages of the extendible compiler approach that CL's macro
system is part of. Note that I, too, think that special
macro-languages (like those of Dylan, Scheme, and especially the C++
template system) often suffer from problems similar to those of the
external tool approaches.

Regs, Pierre.

--
Pierre R. Mai <pm...@acm.org> http://www.pmsf.de/pmai/
The most likely way for the world to be destroyed, most experts agree,
is by accident. That's where we come in; we're computer professionals.
We cause accidents. -- Nathaniel Borenstein

Kent M Pitman

Mar 1, 2001, 8:17:44 AM
"Pierre R. Mai" <pm...@acm.org> writes:

> Bruce Hoult <br...@hoult.org> writes:
>
> > I don't believe I said or implied that powerful implies easy -- in fact
> > I believe the opposite is often true.
>
> Given that being turing-completeness is so easy to achieve (often even
> by accident), I don't think the expressive power of a language can be
> defined by examining what can be computed in the language, but
> rather what can reasonably be expressed in the language by a human.

I think that "turing power" is a near useless concept because of the problem
you cite. I agree another term that is unrelated to turing needs to be
devised, and I agree that the term "expressive" should be used in this way,
not defined by computability but defined by something else more related
to practical reality. It's underspecified even so, but I usually tend to
add "per unit time" in my mind when talking about expressiveness.

For example, one CAN write an object-oriented program in Fortran or C
[statement of turing equivalence], but it takes longer than in Lisp
[statement of comparative expressive power].

> > In particular, I don't like a language having Turing-complete processes
> > going on at compile-time. This applies equally whether it is a
> > self-contained sublanguage (such as Dylan macros or C++ templates) or
> > the same language as you use for normal programming (as in CL procedural
> > macros). I think that macros are the appropriate tool for doing trivial
> > syntactic rearrangements and that if you want to do something complex
> > then the appropriate method is to use a tool that explicitly generates
> > source code that the programmer can examine -- and debug.

This is like saying that some programs should always have to be done
interactively and it should never be possible to write batch scripts.
Ugh.

> But you can examine the source code generated by CL macros, and you
> can debug that stuff.

Indeed. And what about things that ARE debugged? And what about
potential users of the complicated macro that are not capable of
debugging it even if they could? (Look at Microsoft IE and its option
to turn off debugging of Javascript errors, which I'm sure most users
check.) It's pretty plain that even if you can't prove it 100%
debugged, you can still easily get to a point where the cumulative sum
of the certain time lost by running it always in this kind of debug
mode just in case there's a problem far exceeds the possible time lost
due to the occasional bug.

Lieven Marchand

Mar 1, 2001, 3:58:06 PM
Bruce Hoult <br...@hoult.org> writes:

MACROEXPAND is in the standard. Most implementations also have
code-walker functionality that will recursively expand inner forms until
everything referenced is primitive.
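
For example (MACROEXPAND-1 and MACROEXPAND are standard; the recursive
walker is implementation-provided, and the name MACROEXPAND-ALL mentioned
in the comment below is only an assumption about how a given
implementation spells it):

  ;; MACROEXPAND-1 rewrites only the head of the form; the inner DOLIST
  ;; stays unexpanded.  Getting everything down to primitives is the job
  ;; of the implementation's walker (commonly exported as some flavour of
  ;; MACROEXPAND-ALL in an implementation-specific package).
  (macroexpand-1
   '(with-open-file (s "/tmp/out" :direction :output)
      (dolist (x '(1 2 3)) (print x s))))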

Would you really want to write LOOP or SERIES in Dylan macros, and
could it be done?

Paul Dietz

Mar 1, 2001, 6:29:07 PM
Bruce Hoult wrote:

> I think that macros are the appropriate tool for doing trivial
> syntactic rearrangements and that if you want to do something complex
> then the appropriate method is to use a tool that explicitly generates
> source code that the programmer can examine -- and debug.

I could not disagree more. The ability to layer new language
features on top of lisp and to use them interactively (vs.
going through a batch-oriented preprocessor) is a feature
I use all the time.

If anything, Common Lisp macros are not powerful enough
for my taste. I'd like to see environment information (for
example, declared and inferred types of expressions) be available
to the macros (or, at least, to compiler macros).
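
Compiler macros themselves are already standard CL; what they see is the
literal call, though, not the declared or inferred types asked for above.
A minimal sketch (ADD2 is invented for the example):

  (defun add2 (x) (+ x 2))

  ;; A compiler macro may rewrite calls to ADD2 at compile time.
  (define-compiler-macro add2 (x)
    (if (constantp x)
        (+ (eval x) 2)   ; constant argument: fold the whole call
        `(+ ,x 2)))      ; otherwise just open-code the addition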

If you want to examine the macro expanded code, you can use
a code walker/expander.

Paul

cbbr...@hex.net

Mar 1, 2001, 2:20:52 PM
>>>>> "Kent" == Kent M Pitman <pit...@world.std.com> writes:

Kent> "Pierre R. Mai" <pm...@acm.org> writes:
>> Bruce Hoult <br...@hoult.org> writes:
>>
>> > I don't believe I said or implied that powerful implies easy
>> -- in fact > I believe the opposite is often true.
>>
>> Given that being turing-completeness is so easy to achieve
>> (often even by accident), I don't think the expressive power of
>> a language can be defined by examining what can be computed in
>> the language, but rather what can reasonably be expressed in
>> the language by a human.

Kent> I think that "turing power" is a near useless concept because of
Kent> the problem you cite. I agree another term that is unrelated to
Kent> turing needs to be devised, and I agree that the term
Kent> "expressive" should be used in this way, not defined by
Kent> computability but defined by something else more related to
Kent> practical reality. It's underspecified even so, but I usually
Kent> tend to add "per unit time" in my mind when talking about
Kent> expressiveness.

Kent> For example, one CAN write an object-oriented program in Fortran
Kent> or C [statement of turing equivalence], but it takes longer than
Kent> in Lisp [statement of comparative expressive power].

Invoking the gremlin of "turing completeness" or "turing power" or
what have you seems to be a monkey that comes up to give the excuse
that some computing system displaying a gaping paucity of
expressiveness is, by some wild isomorphism, "as good as everything
else."

Some of the oft-not-so-gentle readers here might quickly throw that as
a dung-ball against the notion that Scheme is good for anything; I
would think it far more appropriate to throw the dung at the languages
that are _vastly_ less expressive such as Visual Basic and its ilk.

It would be nice to try to talk about some sort of "expressiveness
measure;" the fact that mathematicians and physicists and such keep on
creating new notations to describe the new areas of their disciplines
that pop up suggests to me that this would be a futile exercise.
(I'll allude to Godel and incompleteness here, but would certainly not
suggest any _provable_ connection...)

My suspicion is that "expressiveness" is a multiple-edged sword in any
case, thinking in the Kolmogorov complexity direction of things.
"Kolmogorov complexity" refers to the size of the minimum UTM capable
of producing a particular string of symbols.

You might add some nifty operations to a UTM that allow the number of
instructions required to fall dramatically; if that leads to the
program being incomprehensible, it's not evident that you've got
something of improved expressiveness.

Heading to the more concrete, while APL is capable of doing vast
quantities of stuff in a single line of code, most people don't think
in the vector space terms required to properly harness that. Which
probably explains the fact that there is a rather minuscule niche of
APL programmers.

The fair degree of complexity to the CL specification similarly makes
it unsurprising that a lot of people find it too much to cope with.
It's expressive, to the point to which people get frightened...
--
(concatenate 'string "cbbrowne" "@acm.org")
http://vip.hex.net/~cbbrowne/finances.html
Rules of the Evil Overlord #177. "If a scientist with a beautiful and
unmarried daughter refuses to work for me, I will not hold her
hostage. Instead, I will offer to pay for her future wedding and her
children's college tuition." <http://www.eviloverlord.com/>

Barry Margolin

Mar 1, 2001, 7:38:18 PM
In article <wkpug1t...@mail.hex.net>, <cbbr...@hex.net> wrote:
>Invoking the gremlin of "turing completeness" or "turing power" or
>what have you seems to be a monkey that comes up to give the excuse
>that some computing system displaying a gaping paucity of
>expressiveness is, by some wild isomorphism, "as good as everything
>else."

Can you imagine if carpenters were like computer scientists? Some of them
would argue that it's not necessary to own a hammer because the butt of a
screwdriver is naildriver-complete. :)

No wonder all other engineering disciplines laugh at us....

--
Barry Margolin, bar...@genuity.net
Genuity, Burlington, MA
*** DON'T SEND TECHNICAL QUESTIONS DIRECTLY TO ME, post them to newsgroups.
Please DON'T copy followups to me -- I'll assume it wasn't posted to the group.

cbbr...@hex.net

Mar 1, 2001, 11:06:14 PM
Barry Margolin <bar...@genuity.net> writes:
> In article <wkpug1t...@mail.hex.net>, <cbbr...@hex.net> wrote:
> >Invoking the gremlin of "turing completeness" or "turing power" or
> >what have you seems to be a monkey that comes up to give the excuse
> >that some computing system displaying a gaping paucity of
> >expressiveness is, by some wild isomorphism, "as good as everything
> >else."
>
> Can you imagine if carpenters were like computer scientists? Some of them
> would argue that it's not necessary to own a hammer because the butt of a
> screwdriver is naildriver-complete. :)
>
> No wonder all other engineering disciplines laugh at us....

The flip side of this is that people can, and do, do pretty
appallingly powerful things using the "duct tape" of Perl scripts.
That generally _doesn't_ include writing things resembling Maple or
Macsyma, but the list of "useful things done" is probably pretty high
nonetheless...

--
(concatenate 'string "aa454" "@freenet.carleton.ca")
http://www.ntlug.org/~cbbrowne/rdbms.html
Rules of the Evil Overlord #9. "I will not include a self-destruct
mechanism unless absolutely necessary. If it is necessary, it will not
be a large red button labelled "Danger: Do Not Push". The big red
button marked "Do Not Push" will instead trigger a spray of bullets on
anyone stupid enough to disregard it. Similarly, the ON/OFF switch
will not clearly be labelled as such." <http://www.eviloverlord.com/>

Janis Dzerins

Feb 28, 2001, 5:25:00 AM
Xah Lee <x...@xahlee.org> writes:

> Lighten up folks, there's no need to be intolerant of sibling's tone. I have
> now composed a poem dedicated to _all_ common lispers.
>
> My name is Scheme
> pure and beautiful,
> but in the real world,
> i rape and beat lisps,
> all of them to death.

And my name is the Reaper,
Don't call me a lisp.

--
Janis Dzerins

If million people say a stupid thing it's still a stupid thing.

Tim Bradshaw

Mar 2, 2001, 7:58:23 AM
Barry Margolin <bar...@genuity.net> writes:

>
> Can you imagine if carpenters were like computer scientists? Some of them
> would argue that it's not necessary to own a hammer because the butt of a
> screwdriver is naildriver-complete. :)
>

Another contingent of engineers would construct large civil
engineering projects with gaffa tape as a crucial component.
Periodically these would collapse in poorly-understood circumstances
with huge loss of life.

Others would refuse to install safety features such as guards on
machinery, seatbelts in cars, lifeboats on ships and so on, as it
results in loss of performance and increased cost and weight. This
contingent would produce most of the engineered products we use, often
in league with the gaffa-tape contingent (who they secretly despise).

Others still would argue -- apparently seriously -- that all
engineering should be done based on quantum gravity. Unfortunately,
lacking a satisfactory theory of quantum gravity, their projects would
be somewhat limited, if beautifully made.

> No wonder all other engineering disciplines laugh at us....

No wonder

--tim

Marco Antoniotti

Mar 2, 2001, 9:46:15 AM

Barry Margolin <bar...@genuity.net> writes:

> In article <wkpug1t...@mail.hex.net>, <cbbr...@hex.net> wrote:
> >Invoking the gremlin of "turing completeness" or "turing power" or
> >what have you seems to be a monkey that comes up to give the excuse
> >that some computing system displaying a gaping paucity of
> >expressiveness is, by some wild isomorphism, "as good as everything
> >else."
>
> Can you imagine if carpenters were like computer scientists? Some of them
> would argue that it's not necessary to own a hammer because the butt of a
> screwdriver is naildriver-complete. :)
>
> No wonder all other engineering disciplines laugh at us....

Yeah. But then many engineers go ahead and either re-invent the wheel
or keep using the butt of the screwdriver.

Informatically yours....

Paul Wallich

Mar 2, 2001, 9:58:59 AM
In article <nkjn1b4...@tfeb.org>, Tim Bradshaw <t...@tfeb.org> wrote:

>Barry Margolin <bar...@genuity.net> writes:
>
>>
>> Can you imagine if carpenters were like computer scientists? Some of them
>> would argue that it's not necessary to own a hammer because the butt of a
>> screwdriver is naildriver-complete. :)
>>
>
>Another contingent of engineers would construct large civil
>engineering projects with gaffa tape as a crucial component.
>Periodically these would collapse inporrly-understood circumstances
>with huge loss of life.
>
>Others would refuse to install safety features such as guards on
>machinery, seatbelts in cars, lifeboats on ships and so on, as it
>results in loss of performance and increased cost and weight. This
>contingent would produce most of the engineered products we use, often
>in league with the gaffa-tape contingent (who they secretly despise).

Which part of the 19th and early 20th centuries doesn't this describe?
The main difference is that mass production of bits is so much easier
than mass production of bits of metal.

Of course turing-equivalence also goes the other way: all nonworking
systems are equivalent regardless of how elegant or inelegant their
design...

paul

Tim Bradshaw

Mar 2, 2001, 8:55:17 PM
* Paul Wallich wrote:

> Which part of the 19th and early 20th centuries doesn't this decribe?
> The main difference is that mass production of bits is so much easier
> than mass production of bits of metal.

I think that's a good point -- software development is between one and
two hundred years behind engineering disciplines. Unfortunately we
don't seem to be very good at taking any notice of what they learned
in that one to two hundred years.

--tim

Kent M Pitman

Mar 3, 2001, 9:27:31 AM
Tim Bradshaw <t...@cley.com> writes:

> I think that's a good point -- software development is between one and
> two hundred years behind engineering disciplines. Unfortunately we
> don't seem to be very good at taking any notice of what they learned
> in that one to two hundred years.

That's double-edged--i.e., sometimes a good thing.

The one thing I'm fearful they'll pick up from that other discipline
is accreditation and/or licensing. We see it being sold for
particular products, and maybe that's ok, but it should not be sold
for the whole information industry or a Bad Thing will happen. The
first thing that would happen, apropos this newsgroup, would be the
closing down of alternate languages.

Tim Bradshaw

Mar 3, 2001, 2:57:59 PM
* Kent M Pitman wrote:

> The one thing I'm fearful they'll pick up from that other discipline
> is accreditation and/or licensing. We see it being sold for
> particular products, and maybe that's ok, but it should not be sold
> for the whole information industry or a Bad Thing will happen. The
> first thing that would happen, apropos this newsgroup, would be the
> closing down of alternate languages.

I think that this is wrong, but in a peculiar way -- I'll try
to explain what I think but probably get it wrong.

I agree with you that if accreditation became something you needed it
would result in a lot of bad things, like closing down of less-used
languages, rigid and stupid development methodologies and so on.

*But* I think that accreditation is not itself harmful -- it's harmful
because software development is at such a rudimentary stage. I don't
think that accreditation for a civil engineer is harmful, because what
it says is something like `this person knows how to build structures
which are safe and maintainable and so on'. It *doesn't* say that
`this person will rigidly insist that all the things they build are
made of concrete and will not consider steel tension structures'.

I think that the problem is that software development, as it currently
stands, consists of various squabbling cults (we call them
`methodologies'), and accreditation would be a way for one of these
cults to oust the others. Engineering disciplines aren't like that.

--tim

Kent M Pitman

Mar 3, 2001, 3:59:49 PM
Tim Bradshaw <t...@cley.com> writes:

> I think that the problem is that software development, as it currently
> stands, consists of various squabbling cults (we call them
> `methodologies'), and accreditation would be a way for one of these
> cults to oust the others. Engineering disciplines aren't like that.

This is what I meant.

Computer Science is mostly Computer Religion. Hence your use of the term
"cult" is dead on. The last thing we need is a standardized religion.

David Thornley

Mar 5, 2001, 4:48:01 PM
In article <sfwd7by...@world.std.com>,

Computer Science is a legitimate science, in its way, but it comprises
too many things. Imagine if physics departments taught Mechanical and
Electrical Engineering - then imagine any sort of complicated useful
device designed by physics majors. There will eventually be a known
corpus of knowledge to build Software Engineering around, but I
don't think we're there yet.

The interesting part here is that most shops (in my experience, and
according to what statistics I've seen) are at about a Nineteenth
Century engineering level (without the option to overbuild things,
a la the Brooklyn Bridge). Any attempt, at this time, to license
software engineers properly would mean that some 90% of computer
shops would have nobody that would qualify, and would not be able
to hire a licensed engineer to certify processes without a great
deal of disruption, and I don't think that's politically possible
right now. (I assume we're talking about government-backed
certification efforts. There have been other certification processes
in the past, and AFAICT nobody ever paid much attention to these.)

So, if there were a serious certification effort, it couldn't be
based on what we know about software engineering now (which
wouldn't be a bad thing, really). In order to be exclusive
enough to be politically worth doing, it would therefore have to
concentrate on other things, likely including knowledge of specialized
things. To make a vague effort at being on-topic, it would likely
define object orientation to be encapsulation, inheritance, and
polymorphism, and therefore CLOS would legally not be an object-oriented
system.

So, while it would be really interesting to see ISO attempt to
standardize religion (ANSI standardization would be too boring -
arguably the US has a standard religion that could be codified
using Billy Graham to write the base document), I don't want to
see any effort to certify Software Engineers any time soon.

--
David H. Thornley | If you want my opinion, ask.
da...@thornley.net | If you don't, flee.
http://www.thornley.net/~thornley/david/ | O-

Russell Wallace

Mar 6, 2001, 1:44:23 PM
On 28 Feb 2001 16:14:38 -0500, Marco Antoniotti <mar...@cs.nyu.edu>
wrote:

>CL has N (for a very large positive N) features that Scheme simply
>*does not have*. The inverse is essentially limited to call/cc.
>
>Yet the number of hours sunk into making yet another (L)GPL or Open
>Source or whatever license Scheme implementation, or Scheme library
>covering this or that piece of CL is *staggering*.

There's a causal relationship between these facts.

Writing your own little language implementation is a valuable learning
experience. But which would you rather try to implement as a hobby
project, Scheme or Common Lisp? :)

--
"To summarize the summary of the summary: people are a problem."
mailto:rwal...@esatclear.ie
http://www.esatclear.ie/~rwallace

Marco Antoniotti

Mar 6, 2001, 2:20:26 PM

rwal...@esatclear.ie (Russell Wallace) writes:

> On 28 Feb 2001 16:14:38 -0500, Marco Antoniotti <mar...@cs.nyu.edu>
> wrote:
>
> >CL has N (for a very large positive N) features that Scheme simply
> >*does not have*. The inverse is essentially limited to call/cc.
> >
> >Yet the number of hours sunk into making yet another (L)GPL or Open
> >Source or whatever license Scheme implementation, or Scheme library
> >covering this or that piece of CL is *staggering*.
>
> There's a causal relationship between these facts.
>
> Writing your own little language implementation is a valuable learning
> experience. But which would you rather try to implement as a hobby
> project, Scheme or Common Lisp? :)

You can always use Common Lisp to implement a Scheme. That would be
an educating experience. :)
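
A minimal sketch of that exercise, assuming nothing beyond standard CL (a
toy one-namespace evaluator, nothing like a real Scheme; ZEVAL and the
starter environment are invented for the illustration):

  (defun zeval (form env)
    (cond ((symbolp form)
           (cdr (or (assoc form env) (error "Unbound: ~S" form))))
          ((atom form) form)                ; numbers, strings, ...
          (t (case (first form)
               (quote  (second form))
               (if     (if (zeval (second form) env)
                           (zeval (third form) env)
                           (zeval (fourth form) env)))
               (lambda (let ((params (second form)) (body (cddr form)))
                         (lambda (&rest args)
                           (zeval (cons 'progn body)
                                  (append (mapcar #'cons params args) env)))))
               (progn  (let (result)
                         (dolist (f (rest form) result)
                           (setf result (zeval f env)))))
               ;; One namespace: the operator position is evaluated
               ;; exactly like any other sub-form.
               (t      (apply (zeval (first form) env)
                              (mapcar (lambda (f) (zeval f env))
                                      (rest form))))))))

  (zeval '((lambda (x) (* x x)) 6)
         (list (cons '* #'*)))              ; => 36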

Mike McDonald

Mar 6, 2001, 4:09:49 PM
In article <3aa53059....@news.iol.ie>,
rwal...@esatclear.ie (Russell Wallace) writes:

> Writing your own little language implementation is a valuable learning
> experience. But which would you rather try to implement as a hobby
> project, Scheme or Common Lisp? :)

As a hobby project? Either ZetaLisp or LispM lisp! (Anyone know of a CL
implementation of flavors?)

Mike McDonald
mik...@mikemac.com

Andras Simon

Mar 6, 2001, 5:56:41 PM
mik...@mikemac.com (Mike McDonald) writes:


> As a hobby project? Either ZetaLisp or LispM lisp! (Anyone know of a CL
> implementation of flavors?)

International Allegro CL Trial Edition
6.0 [Linux (x86)] (Dec 7, 2000 16:15)
Copyright (C) 1985-2000, Franz Inc., Berkeley, CA, USA. All Rights
Reserved.

This copy of Allegro CL is licensed to:
Andras Simon, Technical University, Budapest


; Loading home .clinit.cl file.
;; Optimization settings: safety 1, space 1, speed 1, debug 2.
;; For a complete description of all compiler switches given the current
;; optimization settings evaluate (EXPLAIN-COMPILER-SETTINGS).
CL-USER(1): (require :flavors)
; Fast loading from bundle code/flavors.fasl.
; Fast loading from bundle code/vanilla.fasl.
T
CL-USER(2):

Andras

Kent M Pitman

Mar 6, 2001, 5:12:09 PM
mik...@mikemac.com (Mike McDonald) writes:

> (Anyone know of a CL implementation of flavors?)

I vaguely think I might have heard that Franz has one? (They could comment
better than I could.)

I wrote one for the probably-now-defunct CLOE project at old Symbolics.
(I don't know what happened to the CLOE assets after the old Symbolics
liquidation, but my default assumption is it went to the new Symbolics.)
It wasn't heavy on error checking but implemented most of the features
needed for product delivery. It was kinda fun to write, as I recall.

romeo bernardi

Mar 6, 2001, 7:24:24 PM

"Mike McDonald" <mik...@mikemac.com> ha scritto nel messaggio
news:xmcp6.304$a3.1...@typhoon.aracnet.com...

It is in the standard distribution of ACL.

P.

Mike McDonald

Mar 6, 2001, 8:07:59 PM
In article <sfwae6y...@world.std.com>,
Kent M Pitman <pit...@world.std.com> writes:
> mik...@mikemac.com (Mike McDonald) writes:
>
>> (Anyone know of a CL implementation of flavors?)
>
> I vaguely think I might have heard that Franz has one? (They could comment
> better than I could.)

I should have been more specific: does anyone know of the source to a CL
implementation of flavors? I know of the ACL version and I've played with it
some. (It's a subset of the LispM's. No special instance variables if I
remember right.) I'm currently playing with CMUCL so I'd kind of like to use
that. At one time, there was one in the CMU Lisp archives but I can't get
in lately. Also, the old Franz Lisp had a toy version.

> I wrote one for the probably-now-defunct CLOE project at old Symbolics.
> (I don't know what happened to the CLOE assets after the old Symbolics
> liquidation, but my default assumption is it went to the new Symbolics.)
> It wasn't heavy on error checking but implemented most of the features
> needed for product delivery. It was kinda fun to write, as I recall.

I got to evaluate CLOE once! Does that count? :-) We wanted to deliver on
Suns instead of PCs so it wasn't for us. We had a UX400 for a while though.

Mike McDonald
mik...@mikemac.com

Rob Warnock

Mar 6, 2001, 11:34:53 PM
Marco Antoniotti <mar...@cs.nyu.edu> wrote:
+---------------

| rwal...@esatclear.ie (Russell Wallace) writes:
| > Writing your own little language implementation is a valuable learning
| > experience. But which would you rather try to implement as a hobby
| > project, Scheme or Common Lisp? :)
|
| You can always use Common Lisp to implement a Scheme. That would be
| an educating experience. :)
+---------------

Part V "The Rest of Lisp", Chapter 22 "Scheme: An Uncommon Lisp",
in "Paradigms of Artificial Intelligence Programming: Case Studies
in Common Lisp", by Peter Norvig <URL:http://www.norvig.com/paip.html>.

[Also Chapter 23 "Compiling Lisp", which develops a compiler for the
Scheme in Chapter 22.]


-Rob

-----
Rob Warnock, 31-2-510 rp...@sgi.com
SGI Network Engineering <URL:http://reality.sgi.com/rpw3/>
1600 Amphitheatre Pkwy. Phone: 650-933-1673
Mountain View, CA 94043 PP-ASEL-IA

Russell Wallace

Mar 7, 2001, 11:34:39 AM
On 07 Mar 2001 02:27:54 +0000, Erik Naggum <er...@naggum.net> wrote:

>* Russell Wallace


>> Writing your own little language implementation is a valuable learning
>> experience. But which would you rather try to implement as a hobby
>> project, Scheme or Common Lisp? :)
>

> Common Lisp. Implementing a Scheme is no challenge. Figuring out how to
> implement a fundamental core of Common Lisp is quite interesting work,
> and starting to do it, so you can test your conclusions. Figuring out
> the impact of implementing another feature, and like projects, will also
> yield very valuable information. Figuring out how to layer features in
> Common Lisp so you can implement them in stages is also very useful --
> many domain-specific languages grow that way, and growing implementation
> from the bottom up and from the top down at the same time will yield a
> lot of exciting insight. Then there's realizing the support environment
> around the language. How much do you really need to get anywhere is not
> something people know a priori.

Valid points. However, I'm still of the opinion that for a typical
postgrad student, say, a Scheme is about the right level of challenge;
it's simple enough that you can wrap your mind around the whole
language without needing many years of experience in it, and a full
implementation isn't too much work to be in the scope of what one
person can do.

> If you don't make the mistake of starting out with a single-namespace
> Lisp with no real symbols (i.e, a Scheme), you can grow with a lot less
> pain and suffering than if you start out with a serious design flaw.

Hmm, unless I'm missing something, the main advantage of a separate
function namespace seems to be that it allows generating more
efficient code, without needing declarations or general type
inferencing? For commercial applications this is important, but for
academic purposes it would seem to be much less so.

Kent M Pitman

Mar 7, 2001, 11:59:33 AM
rwal...@esatclear.ie (Russell Wallace) writes:

> On 07 Mar 2001 02:27:54 +0000, Erik Naggum <er...@naggum.net> wrote:
>
> >* Russell Wallace
> >> Writing your own little language implementation is a valuable learning
> >> experience. But which would you rather try to implement as a hobby
> >> project, Scheme or Common Lisp? :)
> >
> > Common Lisp. Implementing a Scheme is no challenge. Figuring out how to
> > implement a fundamental core of Common Lisp is quite interesting work,
> > and starting to do it, so you can test your conclusions. Figuring out
> > the impact of implementing another feature, and like projects, will also
> > yield very valuable information. Figuring out how to layer features in
> > Common Lisp so you can implement them in stages is also very useful --
> > many domain-specific languages grow that way, and growing implementation
> > from the bottom up and from the top down at the same time will yield a
> > lot of exciting insight. Then there's realizing the support environment
> > around the language. How much do you really need to get anywhere is not
> > something people know a priori.
>
> Valid points. However, I'm still of the opinion that for a typical
> postgrad student, say, a Scheme is about the right level of challenge;
> it's simple enough that you can wrap your mind around the whole
> language without needing many years of experience in it, and a full
> implementation isn't too much work to be in the scope of what one
> person can do.

But what if that's not the nature of the world? I don't think it is.
The world is not something you can wrap your head around without
needing years of experience. And every year you waste not having that
experience is a year denied to the world of the time that you might
actually be productive solving the real problems of society instead of
the made-up problems of the ivory tower.

I'm not knocking research, mind you--I think the ivory tower has a place.
But reimplementing Scheme is not research. Research has long since moved
beyond that.

I'm curious--I don't actually know--but would this fly in other fields
of science? Would, for example, a "postgrad" student spend time
wrapping his head around a simplified model of the world? I know it's
done in Physics sometimes, to simplify the very high number of
variables involved, in ordinary "applied" areas like engineering, for
example. Would it be a good way to train to be a general purpose
chemist to specialize on only a restricted space and close one's eyes
to what other chemists were doing because it upset their personal
sense of aesthetics? Would it make you a good doctor to study only
diseases that seemed well-formed because they were easier to wrap
one's head around and it was messy to think about diseases and
syndromes we couldn't quite characterize? Would it make you a good
lifeguard to study rescue techniques only for people who were going to
be cooperative in their struggle for survival? Can lawyers study
idealized legal systems that don't have messy problems like judges
with an attitude or juries that don't like or understand the way the
law is written? Why ought computer science folks get a free pass to be
what amounts to techno-bums and not have to take grief for it? ;-)

Society makes a big investment in training and teaching people. One
then has a debt to society to pay back. Is this debt well-served by
people doing the self-indulgent thing of one after another solving
problems that others have already confronted and patting oneself on
the back about it? That seems to me exactly the appropriate role of a
"pregrad", not a "postgrad". One graduates having demonstrated mastry
in what is known, and moves on to what is not known.

It's been said before but it bears repeating: The world does not need yet
another GPL'd scheme implementation. There are enough. There are many
more Scheme than CL implementations, and there are enough CL implementations.
Languages don't need to be implemented and reimplemented and reimplemented.
They need to be used.

I suspect people like implementing Scheme because it is easier to implement
than to use.

CL is definitely easier to use than to implement. I think that's more
appropriate design. The hard work has been factored out as a loop
invariant and is done only once. Design once, use many times.

Paolo Amoroso

Mar 7, 2001, 12:35:48 PM
On Tue, 6 Mar 2001 22:12:09 GMT, Kent M Pitman <pit...@world.std.com>
wrote:

> I wrote one for the probably-now-defunct CLOE project at old Symbolics.

What is CLOE?


Paolo
--
EncyCMUCLopedia * Extensive collection of CMU Common Lisp documentation
http://cvs2.cons.org:8000/cmucl/doc/EncyCMUCLopedia/

Kent M Pitman

Mar 7, 2001, 1:47:22 PM
Paolo Amoroso <amo...@mclink.it> writes:

> On Tue, 6 Mar 2001 22:12:09 GMT, Kent M Pitman <pit...@world.std.com>
> wrote:
>
> > I wrote one for the probably-now-defunct CLOE project at old Symbolics.
>
> What is CLOE?

It stood for Common Lisp Operating Environment. It was a Symbolics->386
native delivery solution for applications using primarily portable common
lisp. (It was supposed to have window system support, but that never
worked very well.) The idea was to develop on the Lisp Machine with a
special environment that was very conservative and signaled lots of errors
to keep you in line so that you developed code that would deliver well on a
386.

The lingering vestige of this today in the Symbolics system is the
CLTL syntax (and accompanying package, which masquerades as the LISP
package when in the CLTL syntax), which is similar to the SCL package,
but is conservative rather than liberal in its interpretation of gray
areas.

Conservative readings of a spec enhance portability outward; liberal readings
enhance inward portability. The native LispM environment is liberal in
its readings, IMO.

Marco Antoniotti

Mar 7, 2001, 3:11:36 PM

rwal...@esatclear.ie (Russell Wallace) writes:

> On 07 Mar 2001 02:27:54 +0000, Erik Naggum <er...@naggum.net> wrote:
>

... stuff Erik wrote deleted ...


>
> Valid points. However, I'm still of the opinion that for a typical
> postgrad student, say, a Scheme is about the right level of challenge;
> it's simple enough that you can wrap your mind around the whole
> language without needing many years of experience in it, and a full
> implementation isn't too much work to be in the scope of what one
> person can do.

But why shouldn't s/he use Common Lisp to do the implementation?

I strongly advocate this approach. At least you can get across the
point that Scheme is a strict subset of Common Lisp. (And no! You are not
allowed to mention call/cc in this context :) ).

Will Hartung

Mar 7, 2001, 4:15:17 PM

"Erik Naggum" <er...@naggum.net> wrote in message
news:31929208...@naggum.net...

> Common Lisp. Implementing a Scheme is no challenge. Figuring out how
to
> implement a fundamental core of Common Lisp is quite interesting work,
> and starting to do it, so you can test your conclusions.

This sort of rings back to the "minimal Lisp" thread of a little while ago.

What is considered the fundamental core of Common Lisp?

I think it's safe to say that the Package System is not a fundamental piece,
CLOS is questionably fundamental, and exceptions, restarts, etc are Core.

CLOS is the best example of something that isn't fundamental, but it doesn't
perform or behave well if it's not really tightly bound into the system. It
works much better when it is fundamental and designed into the
implementation in the first place. No doubt folks like Franz and Xanalys
have specific optimizations for CLOS that won't happen if you just load the
PCL package.

So, what's core?

Regards,

Will Hartung
(wi...@msoft.com)

Craig Brozefsky

Mar 7, 2001, 5:16:50 PM
rwal...@esatclear.ie (Russell Wallace) writes:

> > Common Lisp. Implementing a Scheme is no challenge. Figuring out how to
> > implement a fundamental core of Common Lisp is quite interesting work,
> > and starting to do it, so you can test your conclusions. Figuring out
> > the impact of implementing another feature, and like projects, will also
> > yield very valuable information. Figuring out how to layer features in
> > Common Lisp so you can implement them in stages is also very useful --
> > many domain-specific languages grow that way, and growing implementation
> > from the bottom up and from the top down at the same time will yield a
> > lot of exciting insight. Then there's realizing the support environment
> > around the language. How much do you really need to get anywhere is not
> > something people know a priori.
>
> Valid points. However, I'm still of the opinion that for a typical
> postgrad student, say, a Scheme is about the right level of challenge;
> it's simple enough that you can wrap your mind around the whole
> language without needing many years of experience in it, and a full
> implementation isn't too much work to be in the scope of what one
> person can do.

As someone who started with Scheme, and then has seen several people
around me skip Scheme and learn Common Lisp, I just haven't seen this
difference in acquisition and comprehension speed. Skill acquisition
for the core of either language is very similar. The difference is
that the CL system has a complete language built around it, so one can
just read the very excellent documentation for the rest of CL, and not
have to spend time shopping for or building those features all over
again.

In other words, the parts of the languages which are critical for
learning and have the steepest curve are similar between Lisp and
Scheme modulo some minor details. But acquisition time for learning
the language and its tools sufficient for larger projects is
definitely in CL's favor.

I've built sizeable projects in both languages (the largest, by an order
of magnitude, is a CL app) and I find portable (R5RS and SRFI) Scheme
is quite a pain to work in. Things like slib and guile ease the pain
somewhat, but it's still oppressive.

> Hmm, unless I'm missing something, the main advantage of a separate
> function namespace seems to be that it allows generating more
> efficient code, without needing declarations or general type
> inferencing? For commercial applications this is important, but for
> academic purposes it would seem to be much less so.

Well, I like it because it allows me to have intuitive and short names
for my methods and functions, and intuitive and short names for
variables and arguments, and there is no name conflict. The biggest
example is the list symbol, as it's an intuitive variable name, and a
standard function.
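
A tiny illustration of that point (PAIR-UP is invented for the example):

  ;; The argument LIST (variable namespace) and the standard function
  ;; LIST (function namespace) coexist without any renaming.
  (defun pair-up (list)
    (list (first list) (second list)))

  (pair-up '(a b c))   ; => (A B)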

--
Craig Brozefsky <cr...@red-bean.com>
In the rich man's house there is nowhere to spit but in his face
-- Diogenes

Russell Wallace

Mar 7, 2001, 5:29:45 PM
On 07 Mar 2001 18:57:31 +0000, Erik Naggum <er...@naggum.net> wrote:

> I consider the main advantage the fact that I don't have to stop using
> certain nouns because I have used the same spelling for some verbs or
> vice versa.

*nod* Fair enough. I generally find I don't particularly want to use
the same names for functions and data, though I do want to use the
same names for _types_ and data, so that if I have an object of type
FOO, I can call it FOO instead of MY-FOO or whatever.
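
(CL already lets me do that, since type names and variable names live
in separate namespaces -- POINT below is just an invented example:)

(defstruct point x y)

(let ((point (make-point :x 1 :y 2)))
  (point-x point))              ; => 1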

> I would consider myself to have a
> problem if I had to _invent_ new names or use some arbitrary conventions
> to name various objects the same modulo the arbitrary noise. I dislike
> that in Ada, for instance, and I positively hate C++ used with Hungarian/
> Microsoftian gobbledygook, which simply fakes a _lot_ of namespaces.

Yeah, I can't stand Hungarian notation either. What I've ended up
doing for my C++ code is adopting the Java convention where types are
named LikeThis and functions and data are named likeThis, which is a
fairly painless way to get a separate namespace for types.

Russell Wallace

unread,
Mar 7, 2001, 5:37:17 PM3/7/01
to
On Wed, 7 Mar 2001 16:59:33 GMT, Kent M Pitman <pit...@world.std.com>
wrote:

>I'm curious--I don't actually know--but would this fly in other fields
>of science?

Well, we start off by teaching Newton's laws of physics and the ball
and stick model of chemistry rather than the more complex general
laws, because they're simpler, therefore easier to deal with. One
can't restrict oneself to the simple models forever, but they do have
their uses.

>Society makes a big investment in training and teaching people. One
>then has a debt to society to pay back. Is this debt well-served by
>people doing the self-indulgent thing of one after another solving
>problems that others have already confronted and patting oneself on
>the back about it? That seems to me exactly the appropriate role of a
>"pregrad", not a "postgrad". One graduates having demonstrated mastry
>in what is known, and moves on to what is not known.

A valid point; one could reasonably argue that writing a Scheme
implementation is a good undergrad project, but by the time you've
graduated you should be spending your time inventing new wheels rather
than reinventing old ones.

>I suspect people like implementing Scheme because it is easier to implement
>than to use.

That's a major part of it. Also it's simpler, therefore easier to
tinker with.

Let's take objects, for example. If you want to get on with writing OO
code in a Lisp family language, obviously the sensible thing to do is
to use CLOS which already exists and works well. But what if you want
to experiment with a new object system design you've thought of?
Writing or modifying a Scheme implementation for the purpose is a
reasonable thing to do.

>CL is definitely easier to use than to implement. I think that's more
>appropriate design. The hard work has been factored out as a loop
>invariant and is done only once. Design once, use many times.

I agree.

Russell Wallace

unread,
Mar 7, 2001, 5:38:33 PM3/7/01
to
On 07 Mar 2001 15:11:36 -0500, Marco Antoniotti <mar...@cs.nyu.edu>
wrote:

>But why shouldn't s/he use Common Lisp to do the implementation?

Well, in that case the Common Lisp vendor has already done 99% of the
work for you. In the context of a student project, that's cheating :)

Russell Wallace

unread,
Mar 7, 2001, 5:53:00 PM3/7/01
to
On 07 Mar 2001 16:16:50 -0600, Craig Brozefsky <cr...@red-bean.com>
wrote:

>In other words, the parts of the languages which are critical for
>learning and have the steepest curve are similar between Lisp and
>Scheme, modulo some minor details. But acquisition time for learning
>the language and its tools well enough for larger projects is
>definitely in CL's favor.

Yep.

However, the context of the discussion is why there are more
implementations of Scheme, and I think you'll agree it's much easier
to _implement_ Scheme than Common Lisp (or make major changes to an
implementation).

Marco Antoniotti

unread,
Mar 7, 2001, 5:57:05 PM3/7/01
to

rwal...@esatclear.ie (Russell Wallace) writes:

...

> >I suspect people like implementing Scheme because it is easier to implement
> >than to use.
>
> That's a major part of it. Also it's simpler, therefore easier to
> tinker with.

Of course. It is very easy to tinker with a CL implementation of
Scheme. :)

> Let's take objects, for example. If you want to get on with writing OO
> code in a Lisp family language, obviously the sensible thing to do is
> to use CLOS which already exists and works well. But what if you want
> to experiment with a new object system design you've thought of?
> Writing or modifying a Scheme implementation for the purpose is a
> reasonable thing to do.

If you "have a new idea about a new OO idea" it is much easier to
tinker with a CL implementation of the new OO idea that to modify a
Scheme implementation. That is a given.
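
For instance, the skeleton of a message-passing object system can be
sketched in a few lines of portable CL (MAKE-COUNTER is just a toy,
invented for illustration):

(defun make-counter (&key (count 0))
  "A toy 'object': a closure that dispatches on message keywords."
  (lambda (message &rest args)
    (ecase message
      (:increment (incf count (if args (first args) 1)))
      (:value     count))))

;; (let ((c (make-counter)))
;;   (funcall c :increment)
;;   (funcall c :increment 10)
;;   (funcall c :value))         ; => 11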

Marco Antoniotti

unread,
Mar 7, 2001, 6:03:53 PM3/7/01
to

rwal...@esatclear.ie (Russell Wallace) writes:

> On 07 Mar 2001 15:11:36 -0500, Marco Antoniotti <mar...@cs.nyu.edu>
> wrote:
>
> >But why shouldn't s/he use Common Lisp to do the implementation?
>
> Well, in that case the Common Lisp vendor has already done 99% of the
> work for you. In the context of a student project, that's cheating :)

But in that way you are not exposing the student to the simple and
true fact that Scheme is a (very) small subset of Common Lisp.

Apart from that, I would be surprised if an educational institution
today required a student to write from scratch (meaning, in some form
of portable assembler, like C :) ) a Scheme interpreter. I'd bet that
the standard exercise "let's write a metacircular interpreter in Lisp"
is far more common. Why not start with CL then?
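
To make that concrete, here is roughly the kind of toy evaluator I
mean, in portable CL (META-EVAL is an invented name; a real exercise
would grow it with DEFINE, SET!, and so on):

(defun meta-eval (form env)
  "Evaluate a tiny Scheme-ish subset: symbols, self-evaluating data,
QUOTE, IF, LAMBDA, and application.  ENV is an alist of (symbol . value)."
  (cond ((symbolp form) (cdr (assoc form env)))
        ((atom form) form)
        (t (case (first form)
             ((quote)  (second form))
             ((if)     (if (meta-eval (second form) env)
                           (meta-eval (third form) env)
                           (meta-eval (fourth form) env)))
             ((lambda) (let ((params (second form))
                             (body   (third form)))
                         (lambda (&rest args)
                           (meta-eval body (append (mapcar #'cons params args)
                                                   env)))))
             (t        (apply (meta-eval (first form) env)
                              (mapcar (lambda (arg) (meta-eval arg env))
                                      (rest form))))))))

;; (meta-eval '((lambda (x) (if x 'yes 'no)) 't) '())   => YES
;; (meta-eval '(+ 1 2) (list (cons '+ #'+)))            => 3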

Kent M Pitman

unread,
Mar 7, 2001, 6:13:47 PM3/7/01
to
"Will Hartung" <wi...@msoft.com> writes:

> "Erik Naggum" <er...@naggum.net> wrote in message
> news:31929208...@naggum.net...
> > Common Lisp. Implementing a Scheme is no challenge. Figuring out
> > how to implement a fundamental core of Common Lisp is quite
> > interesting work, and starting to do it, so you can test your
> > conclusions.
>
> This sort of rings back to the "minimal Lisp" thread of a little
> while ago.
>
> What is considered the fundamental core of Common Lisp?
>
> I think it's safe to say that the Package System is not a
> fundamental piece,

Why? (I don't have a position on this, I'm just curious why you're so
definite about it.)

> CLOS is questionably fundamental, and exceptions, restarts, etc are
> Core.

Again why?

It's almost as if you're building your idea of core as you go...

> CLOS is the best example of something that isn't fundamental, but it
> doesn't perform or behave well if it's not really tightly bound into
> the system. It works much better when it is fundamental and designed
> into the implementation in the first place. No doubt folks like
> Franz and Xanalys have specific optimizations for CLOS that won't
> happen if you just load the PCL package.
>
> So, what's core?

This discussion used to come up a lot in my discussion with Eulisp
designers. I identified two competing meanings of "core" that seemed
incompatible yet held simultaneously by people working together on the
project. I want to at least observe the potential for people to talk
at crossed purposes:

I think "core" means "encapsulates something that I could not write
myself without going out of the language". So to me, the function
OPEN is core because without it I'd better have an FFI as core plus
also a manual for every operating system I'm allowed to run in and a
way to detect which operating system is out there so I can call the
right OS-level primitive. To me, as a user of CL, a language that
doesn't come with OPEN doesn't talk to the file system and is
fundamentally limited in what it can do. No matter how many macro
packages or higher order functions I write, I'm still stuck not
talking to the operating system. Ditto for UNWIND-PROTECT. And
WITHOUT-PREEMPTION or WITHOUT-INTERRUPTS. And DRAW-LINE. And so on.
CL, if anything, has too small a core. The Lisp Machine, I believe,
had my notion of "core". It knew that anytime there was a primitive
that it didn't give the user access to, it was denying the user
control over the ability to do things in that programming domain.

There is another meaning of "core" that is incompatible with this. It
assumes that there is a generous god out there that makes "libraries"
and is continuously attaching them to your language using some
extralinguistic glue that is not part of your language, but that
causes you to miraculously be able to open files on one day even
though you couldn't the day before. You just call some new function
OPEN (never mind how it got there) and it would do this service for
you. In this universe, the notion of "core" is "anything that must be
done with a special form". As long as something doesn't require new
language glue to express, it is not "core".

Russell Wallace

unread,
Mar 7, 2001, 6:26:57 PM3/7/01
to
On 07 Mar 2001 18:03:53 -0500, Marco Antoniotti <mar...@cs.nyu.edu>
wrote:

>I'd bet that


>the standard exercise "let's write a metacircular interpreter in Lisp"
>is far more common. Why not start with CL then?

Yes, that's perfectly reasonable.

(Though I suspect implementing call/cc might be a major headache.)

Marco Antoniotti

unread,
Mar 7, 2001, 6:55:02 PM3/7/01
to

rwal...@esatclear.ie (Russell Wallace) writes:

> On 07 Mar 2001 18:03:53 -0500, Marco Antoniotti <mar...@cs.nyu.edu>
> wrote:
>
> >I'd bet that
> >the standard exercise "let's write a metacircular interpreter in Lisp"
> >is far more common. Why not start with CL then?
>
> Yes, that's perfectly reasonable.
>
> (Though I suspect implementing call/cc might be a major headache.)

It is a major headache no matter what. At that point the level of
complexity is already such that you might as well go for the full
CL. :)
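
The escape-only uses of CALL/CC, at least, are cheap to fake in CL
(CALL/EC below is just an invented name for illustration); it is the
re-entrant, "call it again later" continuations that bring on the real
headache:

(defmacro call/ec (fn)
  "Call FN with an escape-only (one-shot, upward) continuation."
  (let ((tag (gensym "ESCAPE")))
    `(block ,tag
       (funcall ,fn (lambda (value) (return-from ,tag value))))))

;; (call/ec (lambda (k) (+ 1 (funcall k 42) 1000)))  => 42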

Will Hartung

unread,
Mar 7, 2001, 7:58:57 PM3/7/01
to

"Kent M Pitman" <pit...@world.std.com> wrote in message
news:sfwwva1...@world.std.com...

> "Will Hartung" <wi...@msoft.com> writes:
>
> > "Erik Naggum" <er...@naggum.net> wrote in message
> > news:31929208...@naggum.net...
> > > Common Lisp. Implementing a Scheme is no challenge. Figuring out
> > > how to implement a fundamental core of Common Lisp is quite
> > > interesting work, and starting to do it, so you can test your
> > > conclusions.
> >
> > This sort of rings back to the "minimal Lisp" thread of a little
> > while ago.
> >
> > What is considered the fundamental core of Common Lisp?
> >
> > I think it's safe to say that the Package System is not a
> > fundamental piece,
>
> Why? (I don't have a position on this, I'm just curious why you're so
> definite about it.)

I think, as you mention below, it's a matter of what needs to be done
"outside" the language.

I think that the Package system could be built and defined completely within
CL itself, without having to touch the OS interface, or the compiler, as the
package system appears to "only" affect the Lisp Reader.
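
A crude sketch of that claim (TOY-INTERN and friends are invented
names; a real package system would also need exporting, shadowing, use
lists, and hooking into the reader):

(defvar *toy-packages* (make-hash-table :test #'equal)
  "Package name -> (symbol-name -> symbol) table.")

(defun toy-make-package (name)
  (setf (gethash name *toy-packages*) (make-hash-table :test #'equal)))

(defun toy-intern (symbol-name package-name)
  "Return the unique symbol named SYMBOL-NAME in the named toy package,
creating it on first use -- the essential job the reader delegates to
the package system."
  (let ((table (gethash package-name *toy-packages*)))
    (or (gethash symbol-name table)
        (setf (gethash symbol-name table) (make-symbol symbol-name)))))

;; (toy-make-package "APP")
;; (eq (toy-intern "FOO" "APP") (toy-intern "FOO" "APP"))  => T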

> > CLOS is questionably fundamental, and exceptions, restarts, etc are
> > Core.
>
> Again why?

As PCL demonstrates, CLOS can be built up within the language itself and
simply loaded, but then it's not quite right (as demonstrated by CMUCL's
implementation). If CLOS is there from the start, and the compiler is
made aware of it, then it Works Better. So I think it is arguable that
CLOS as we know it today cannot be done simply from within the language
(but I could be mistaken).

> It's almost as if you're building your idea of core as you go...

I was just trying to come up with examples of current CL constructs that
I felt were on one side of the "core" line, on the line, and on the
other side of the line.

> This discussion used to come up a lot in my discussion with Eulisp
> designers. I identified two competing meanings of "core" that seemed
> incompatible yet held simultaneously by people working together on the
> project. I want to at least observe the potential for people to talk
> at crossed purposes:
>
> I think "core" means "encapsulates something that I could not write
> myself without going out of the language".

So, this brings me to, with clarification (thank you Kent), my question.
Erik mentioned the "Fundamental Core".

HASHTABLEs don't make CL what it is. They're nice, and they're in the
standard so they're portable, but if they didn't exist, they could
easily be crafted. They're not part of the core that I think Erik was
talking about.
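
For instance, a crude chained hash table is a few dozen lines of
portable CL (MAKE-TABLE and friends are invented names; a real
implementation would also resize, support different test functions,
and so on):

(defun make-table (&optional (size 97))
  "A vector of association-list buckets."
  (make-array size :initial-element '()))

(defun table-get (key table)
  (cdr (assoc key (aref table (mod (sxhash key) (length table)))
              :test #'equal)))

(defun table-put (key value table)
  (let* ((bucket (mod (sxhash key) (length table)))
         (entry  (assoc key (aref table bucket) :test #'equal)))
    (if entry
        (setf (cdr entry) value)
        (push (cons key value) (aref table bucket)))
    value))

;; (let ((table (make-table)))
;;   (table-put "x" 1 table)
;;   (table-get "x" table))     => 1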

Another example: I think it's pretty clear that you cannot make a CL out
of a Scheme without redoing the implementation, so Scheme is not a
"core" of CL. Perhaps it's close; or perhaps it's way off but merely
LOOKS close to the casual observer.

So, I was curious what people thought the CL core was.

I guess in simple terms regarding the LispM, the answer is "Whatever is
written in micro-code" or protoLisp, or whatever they used to bootstrap
those things.

Regards,

Will Hartung
(wi...@msoft.com)

cbbr...@hex.net

unread,
Mar 7, 2001, 10:03:23 PM3/7/01
to
Marco Antoniotti <mar...@cs.nyu.edu> writes:
> rwal...@esatclear.ie (Russell Wallace) writes:
>
> > On 07 Mar 2001 02:27:54 +0000, Erik Naggum <er...@naggum.net> wrote:
> >
> ... stuff Erik wrote deleted ...
> >
> > Valid points. However, I'm still of the opinion that for a typical
> > postgrad student, say, a Scheme is about the right level of challenge;
> > it's simple enough that you can wrap your mind around the whole
> > language without needing many years of experience in it, and a full
> > implementation isn't too much work to be in the scope of what one
> > person can do.

> But why shouldn't s/he use Common Lisp to do the implementation?

> I strongly advocate this approach. At least you can get across the
> point that Scheme is a strict subset of Common Lisp. (And no! You
> are not allowed to mention call/cc in this context :) ).

The downside is that the task is fairly much trivial, largely
exercising one's understanding of CL and Scheme. It may be more
educational to implement atop a fundamentally "dumber" language in
that you have to build:
- A garbage collector;
- A model for mapping stuff like C stack frames onto the "Lispy"
code storage model;
- A name space manager...

I'd observe one further thing about the "challenge" of it: since
implementing Scheme (and CL, for that matter) has been done before,
unless there is something Rather Particular about the implementation
approach that is special, this is a task that commonly won't
contribute much to "new learning" in the discipline.

"New learning" is _quite_ necessary to Ph.D studies, but not nearly so
much for M.Sc studies. As a result, for a Master's student to construct
Yet Another Scheme is probably "educationally adequate." The same is
not true for a Ph.D student, and I'd definitely distinguish between
the two...
--
(reverse (concatenate 'string "gro.gultn@" "enworbbc"))
http://vip.hex.net/~cbbrowne/lisp.html
Rules of the Evil Overlord #58. "If it becomes necessary to escape, I
will never stop to pose dramatically and toss off a one-liner."
<http://www.eviloverlord.com/>

Kent M Pitman

unread,
Mar 7, 2001, 11:28:10 PM3/7/01
to
cbbr...@hex.net writes:

> "New learning" is _quite_ necessary to Ph.D studies, but not nearly so
> much for M.Sc studies. As a result, for a Master's student to construct
> Yet Another Scheme is probably "educationally adequate." The same is
> not true for a Ph.D student, and I'd definitely distinguish between
> the two...

Obviously this is a matter on which people could disagree. But I
doubt MIT would accept yet another Scheme as educationally adequate
for a master's student to spend time on...

I think undergrad work is the place for duplicating both goals and techniques
that have been tried before. For example, implementing Scheme at all.

I think masters work should seek to apply new techniques to old areas or
old techniques to new areas. For example, applying some well-understood
register allocation technique or data flow analysis to Scheme when such
has not been done before.

I think phd should be about thinking up a new subject area and
identifying ways of thinking about it. For example, inventing the
notion of a reflective lisp or some such thing, before such had been
thought of, and then saying which existing techniques for parsing,
compiling, etc. might be relevant or interesting to it, or why they'd
have to be modified to make sense.

Kellom{ki Pertti

unread,
Mar 8, 2001, 3:16:14 AM3/8/01
to
Marco Antoniotti <mar...@cs.nyu.edu> writes:
> Apart from that, I would be surprised if an educational institution
> today required a student to write from scratch (meaning, in some form
> of portable assembler, like C :) ) a Scheme interpreter.

I would be surprised if more than a handful of the Scheme
implementations around were in fact student projects. My bet is that
apart from the research implementations, the rest were born out of the
joy of hacking. Not that doing research in any way precludes joy, of
course.

If I had had access to Scheme literature when I implemented my Lisp
interpreter, I would have implemented Scheme. I didn't, so I ended up
implementing a bastard cousin of CL instead.

> I'd bet that the standard exercise "let's write a metacircular
> interpreter in Lisp"
> is far more common. Why not start with CL then?

The learning experience one gets from implementing garbage collection
and figuring out how to represent cons cells, numbers, etc. at the low
level is considerably different from what one would get from writing a
metacircular interpreter in CL.

Do you consider it a bad thing for students to get hands-on experience
with language implementation techniques?
--
Pertti Kellom\"aki, Tampere Univ. of Technology, Software Systems Lab

Lars Lundbäck

unread,
Mar 8, 2001, 8:48:36 AM3/8/01
to
Will Hartung wrote:

>
> > Kent Pitman wrote:
> >
> > I think "core" means "encapsulates something that I could not write
> > myself without going out of the language".
>
> So, this brings me to, with clarification (thank you Kent), my question.
> Erik mentioned the "Fundamental Core".
>

Are you perhaps thinking about some Lisp OS, that "can of worms"? I am
very sure that Erik had no such thing in mind since he has not been
gulled into discussing Common Lisp from that viewpoint, yet the first
paragraph in his post illustrates the Lisp OS complex of problems so
well.

> I guess in simple terms regarding the LispM, the answer is "Whatever is
> written in micro-code" or protoLisp, or whatever they used to bootstrap
> those things.
>

Are we discussing the language core or a Common Lisp System core?

To see the boundaries of any core, I think one would have to state what
the functionality of that core is to be. The CL specification (meaning
the HyperSpec, I haven't read the ANSI spec) is non-layered. Is it
possible and reasonable to layer the language, so that any function in a
layer only depends on definitions(?) in some lower layer? The bottom
layer(s) may serve as a core _for the language_ but not for the
_system_. That kind of core would have CL code as well as protoLisp, and
some of it would be very hidden from the user.

Anyway, I'm sure you had something in the back of your mind that made
you think in terms of CL Core, perhaps you can tell us?

Regards,
Lars

Pierre R. Mai

unread,
Mar 8, 2001, 10:26:31 AM3/8/01