
(describe-differences "Scheme" "Common Lisp")


David Steuber The Interloper

Sep 12, 1998
Perhaps I should also subscribe to the scheme group. In fact, I think
I will.

Meanwhile, could someone please tell me the differences between Scheme
and Common Lisp? I am interested in syntax, primitive functions (like
cons, car, cdr, etc), philosophical differences, size of interpreter
implementation, that sort of thing.

I keep getting suggestions to consider Scheme for a scripting
language. There must be something to it. Currently, I am biased
towards CL because the reference material I have is CL related and I
don't mind implementing a subset.

Thanks.

P.S. A pointer to an online resource that describes the differences
would be more than satisfactory.


--
David Steuber
http://www.david-steuber.com
To reply by e-mail, replace trashcan with david.

When the long night comes, return to the end of the beginning.
--- Kosh (???? - 2261 AD) Babylon-5

Ray Dillinger

Sep 13, 1998
David Steuber The Interloper wrote:
>
> Perhaps I should also subscribe to the scheme group. In fact, I think
> I will.
>
> Meanwhile, could someone please tell me the differences between Scheme
> and Common Lisp?

Scheme syntax, on the surface, looks just like CL syntax -- fully
parenthesized prefix notation.

Scheme macros are different, because they protect you from
inadvertent variable capture (which can happen in a CL macro
where a variable introduced by the macro has the same name
as a variable in scope where the macro is called).
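
For instance, here's a rough (untested) sketch of the capture problem
and the usual CL workaround -- MY-OR is a made-up example, not anything
from the standard:

    ;; Naive version: the macro's TMP can shadow a caller's TMP.
    (defmacro my-or (a b)
      `(let ((tmp ,a))
         (if tmp tmp ,b)))

    ;; (my-or nil tmp) now sees the macro's TMP, not the caller's.
    ;; The manual CL fix is a fresh, uninterned symbol:
    (defmacro my-or (a b)
      (let ((tmp (gensym)))
        `(let ((,tmp ,a))
           (if ,tmp ,tmp ,b))))

Scheme's hygienic macro system does that bookkeeping for you.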

Scheme has one namespace -- a procedure is just a value that
a variable can take on, the same as any other value.
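
To make the contrast concrete from the CL side (a rough, untested
sketch; APPLY-TWICE is a made-up name):

    ;; CL keeps functions in a separate namespace, so you write:
    (defun apply-twice (f x)
      (funcall f (funcall f x)))
    (apply-twice #'1+ 3)        ; => 5

    ;; In Scheme a procedure is just a value, so the same thing is
    ;; (define (apply-twice f x) (f (f x))) and
    ;; (apply-twice (lambda (n) (+ n 1)) 3) -- no FUNCALL, no #'.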

Scheme uses the same rules to evaluate all the forms in a
combination -- the first position object can be either a
procedure or an expression that returns a procedure as its
result -- it makes no difference to Scheme.

Scheme does check, if the first element in a combination is a
symbol, to see whether that symbol is a macro keyword (aka a
syntactic keyword) or a variable name; the difference being
that, in an ordinary function call (where the leading symbol
is a variable name) the function and *ALL* of the arguments
will be evaluated - in an UNSPECIFIED order - before the function
is called, whereas in a macro call the order and method of
evaluation depend on the macro expansion. The unspecified
order is so that compilers are free to try to extract the
maximum parallelism available from the algorithm; however
the compiler is also constrained that it must keep side
effects and interferences between the parallel threads
controlled to the extent that nothing happens unless it
could have happened by evaluating these forms in some
concrete non-intermingled order.
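
A tiny illustration of why the order can matter (untested sketch):

    ;; In CL this is guaranteed to be (1 2), because arguments are
    ;; evaluated left to right:
    (let ((n 0))
      (list (incf n) (incf n)))

    ;; The analogous Scheme expression, written with set!, could
    ;; legally evaluate either argument first, so portable code
    ;; avoids argument expressions whose side effects interact.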

> I am interested in syntax, primitive functions (like
> cons, car, cdr, etc), philosophical differences, size of interpreter
> implementation, that sort of thing.

Primitive functions like CAR, CONS, and CDR work in the same
way you'll be used to in CL, except you'll never be sure which
argument is evaluated first -- of course, if the order matters, then
the odds are that what you have is a coding mistake.

Scheme has far *fewer* functions and libraries than CL -- the
language has been deliberately kept small by requiring a
unanimous consensus of the standards committee rather than a
majority vote before new features are added. In general,
a thing does not become a part of the language until it is
pretty much agreed both that it is needed and that the proposal
is clearly the best, most elegant, and most consistent way to
do it.

There are a lot of things that there is general agreement that
scheme *needs* -- such as an object system and a module system --
but which have not been added to the language because no one
proposal for them has been clearly "the one right thing to do".

Interpreters are fairly rare in the scheme world -- compilation
has pretty much taken over. But scheme systems tend to be MUCH
smaller than comparable CL systems just because of the smallness
of the language.

> I keep getting suggestions to consider Scheme for a scripting
> language. There must be something to it.

Indeed there is. The way procedures and macros are handled in
scheme makes it considerably easier and/or clearer to implement
other language features, or interpreters and compilers for them,
than in CL (note that this is just what I observe in textbooks;
researchers familiar with both almost always pick scheme to
express such things in rather than CL -- not being a CL user I
can't really draw comparisons from personal experience). Also
there are several implementations of scheme (SIOD because it's
tiny, Guile because that's what it was designed for) which are
particularly well-suited for embedding into another application.
I don't think there are any CL systems which fit under 500
kbytes, but I could be wrong.
ray



Darius Bacon

Sep 13, 1998
tras...@david-steuber.com (David Steuber "The Interloper") writes:

>P.S. A pointer to an online resource that describes the differences
>would be more than satisfactory.

http://www.well.com/user/djello/scheme-for-lispers.html is a quick
overview of what's needed for a Common Lisp user to understand a
Scheme program.

--
Darius Bacon http://www.well.com/~djello

Erik Naggum

Sep 13, 1998
* tras...@david-steuber.com (David Steuber "The Interloper")

| Meanwhile, could someone please tell me the differences between Scheme
| and Common Lisp? I am interested in syntax, primitive functions (like

| cons, car, cdr, etc), philosophical differences, size of interpreter
| implementation, that sort of thing.

the difference is easily observed to be cultural. you cannot compare
cultures, you can only critique them. the Common Lisp culture is
generally very welcoming -- proof is that all the good ideas in Scheme were
adopted in Common Lisp, but none of the bad ones, which Scheme sadly has
refused to let go of.

| I keep getting suggestions to consider Scheme for a scripting language.
| There must be something to it.

Scheme as a culture is much more combative than Common Lisp. Schemers
fight each other over how unclean they are, and only the purest of pure
are admitted into the language -- only causing much _more_ unclean code
to live in "user space", instead. of course they fight everybody else
with this cleanliness obsession, too: they have this _odd_ need to tell
people to use Scheme instead of Common Lisp and impute to Common Lisp all
of the unhealthy properties of Scheme were it ever to become as big and
supremely useful to as many programmers and uses as Common Lisp is, but
Scheme does _not_ scale well. Scheme's beauty is that easily achieved by
the immature, like kittens, puppies, baby seals, etc -- it's like a
pedophile whose arrested sexual development has removed from him the
ability to find attraction in the mature.

avoid Scheme, but do know it first. the same goes for other languages
that made seriously misguided decisions at one point or another in their
youth, like C, Perl, etc. they have their uses, but the chance they
overlap with your needs is _very_ small. yet, you must learn to eschew
their mistakes: "small is beautiful" actually has it reversed: it is very
often the case that the beautiful is small, but mere lack of size is no
guarantee of beauty. absence of features is _not_ good. unless, that
is, you're into job security and reimplementing several kinds of wheels
in incompatible ways -- then you will find Scheme even better than C and
Perl. _nothing_ is standard in Scheme, except for the ridiculously small
language core. it also appears to be much more fun to implement Scheme
than to actually use it. the useful Scheme systems are often larger than
the comparable Common Lisp systems, and this is no accident. small
languages require more verbiage to express the same ideas than large
languages do, and on top of that a much smaller part of the system has
been left to people who were paid _only_ to implement fundamentals well
and optimize them heavily. small languages violate the concept of
division of labor. only fools and geniuses insist on implementing their
own languages, and you can never tell which is which until afterwards.
in Scheme, as in C, every programmer has to be a genius, but often comes
out a fool because he is so far from competent at every task required.

I think I dislike Scheme more because of the way Scheme folks denigrate
Common Lisp, which they invariably haven't used, than anything else. a
language that does that to good people's minds should carry a warning
from the Surgeon General.

I know C very well and Scheme well. I became exhausted over the prospect
of doing the same low-level cruft over and over and over. Scheme is
often worse than C in this regard, because you will mostly find
interpreted Scheme environments, and abstraction carries a heavy
performance penalty, so you need to be much too clever. (this is like
byte-compiled Common Lisps, which penalize you severely for using your
own functions instead of the internal functions that are very heavily
optimized.)

if you really want to implement your own language, implement Scheme. if
you really want to _use_ a good language, choose Common Lisp. corollary:
implement Scheme in Common Lisp.

#:Erik
--
http://www.naggum.no/spam.html is about my spam protection scheme and how
to guarantee that you reach me. in brief: if you reply to a news article
of mine, be sure to include an In-Reply-To or References header with the
message-ID of that message in it. otherwise, you need to read that page.

David Steuber The Interloper

Sep 14, 1998
On 13 Sep 1998 20:21:25 GMT, dje...@well.com (Darius Bacon) claimed or
asked:

% tras...@david-steuber.com (David Steuber "The Interloper") writes:
%
% >P.S. A pointer to an online resource that describes the differences
% >would be more than satisfactory.
%
% http://www.well.com/user/djello/scheme-for-lispers.html is a quick
% overview of what's needed for a Common Lisp user to understand a
% Scheme program.
%
% --
% Darius Bacon http://www.well.com/~djello

<blockquote>
What's Missing
Scheme lacks several standard facilities of a modern general-purpose
language: modules, exception-handling, a rich standard library, and an
object system. It's possible to build them all on top of the base
language with little extra support from the implementation, but
there's no consensus on what the resulting system should look like.
Scheme isn't likely to supplant CL for industrial-strength use;
instead it's most useful for research and education and as a glue
language for other systems.
</blockquote>

Presumably, I can implement the missing features (in theory). I don't
know that I have the knowledge to do the job. In order to
satisfactorily interact with the Java Runtime, I may need to do
multithreading, synchronization, exception handling, and packages
(modules). I would do that in the form of obvious implementation
specific extensions. If I go the compiler route (I've never even
written an interpreter let alone a compiler), I need to be able to
generate .class files in the format defined by "The Java Virtual
Machine Specification".

Guy Steele seems to show up all over the place. He invents Scheme, he
writes a Common Lisp ANSI standard book, and is co-author of "The Java
Language Specification."

I wonder if, with the support of some Java code, a Scheme compiler (or
CL) could be written to produce a .class file. The byte code and file
format is biased heavily towards Java. But it would be cool to see if
Scheme can be translated to Java byte code. It would be even cooler
to see what a Java decompiler makes of it.

Clearly CL is Scheme's big brother in terms of size.

Brian M. Moore; esq.

Sep 14, 1998
David Steuber "The Interloper (tras...@david-steuber.com) wrote:

: I wonder if, with the support of some Java code, a Scheme compiler (or


: CL) could be written to produce a .class file. The byte code and file
: format is biased heavily towards Java. But it would be cool to see if
: Scheme can be translated to Java byte code. It would be even cooler
: to see what a Java decompiler makes of it.

It is possible to compile a large subset of Scheme to Java
bytecodes. The result, however, is not properly tail-recursive,
as the JVM itself is not properly tail recursive. This is due to
restrictions in the security manager and classloader. There are
also restrictions on continuations captured with call/cc.

Per Bothner has created a compiler of this nature known as
Kawa. See http://www.cygnus.com/~bothner/kawa.html for more
details.


--
--Brian

+---------------------------------------------------------+
| Brian M. Moore procedural epistemologist |
| moo...@iname.com http://www.ukans.edu/home/mooreb |
+---------------------------------------------------------+

Klaus Schilling

Sep 14, 1998
tras...@david-steuber.com (David Steuber "The Interloper") writes:
>
> I wonder if, with the support of some Java code, a Scheme compiler (or
> CL) could be written to produce a .class file. The byte code and file
> format is biased heavily towards Java. But it would be cool to see if
> Scheme can be translated to Java byte code. It would be even cooler
> to see what a Java decompiler makes of it.

Per Bothner from Cygnus Solutions is working on that (the Kawa project).


>
> Clearly CL is Scheme's big brother in terms of size.

But Common Lisp lacks call-with-current-continuation.

Klaus Schilling

Darius Bacon

Sep 14, 1998
tras...@david-steuber.com (David Steuber "The Interloper") writes:

>I wonder if, with the support of some Java code, a Scheme compiler (or
>CL) could be written to produce a .class file.

Actually, no Java is needed -- there's Scheme code to read and write
classfiles on my home page.

Kent M Pitman

Sep 14, 1998
[comp.lang.scheme removed. i don't cross-post and in fact i don't
usually reply to cross-posts. i'm only sending this at all because i
spent enough time on it before noticing it was a cross-post that i
didn't want to throw it away. see
http://world.std.com/~pitman/pfaq/cross-posting.html
for an explanation of why i don't participate in cross-posted
discussions.]

Klaus Schilling <Klaus.S...@home.ivm.de> writes:

> > Clearly CL is Scheme's big brother in terms of size.
>
> But Common Lisp lacks call-with-current-continuation.

You shouldn't assume all CL users are sad about this.

I, for one, am glad not to see continuations in CL. If someone
proposed adding them, I would argue strongly against the move. I feel
about them the same way as others feel about GO. They encourage
unstructured programming and I think they're a semantic mess. (No, I
don't mean they can't be defined in a formal semantics. I mean there
is no proof that being modelable in a formal semantics implies being a
good way for a human being to think about or describe something.)
Critically absent from continuations (i.e., procedurally embedded in
them, and therefore not always "apparent"--halting problems and all,
you know) is the answer to the question "is this the last time this
continuation will return?", which makes a mess of any attempt to
attach a rational semantics to unwind-protect. Scheme doesn't
offer unwind-protect, of course; it leaves it as an exercise to the
(human) reader. But you can see hints of how ugly the situation would be
in 6.6.1 Ports in the Revised^5 Report on Scheme by realizing that
unwind-protect or its equivalent is subprimitive to call-with-input-file,
and realizing the war between unwind-protect and continuations is what
leads to the mega-yuck wording there.
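
(For readers who haven't met it, the idiom in question is roughly the
following -- a sketch only, and WITH-INPUT-FILE is a made-up name, not
the standard WITH-OPEN-FILE:

    (defmacro with-input-file ((var filename) &body body)
      `(let ((,var (open ,filename :direction :input)))
         (unwind-protect
              (progn ,@body)
           (close ,var))))

The CLOSE in the cleanup form runs no matter how control leaves the
body; the Scheme report has to describe call-with-input-file's behavior
without being able to appeal to such an operator.)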

I argued vociferously to the Scheme committee that either Scheme
continuations should take an extra arg saying final-p or else
call-with-current-continuation should take an extra arg saying whether
the continuation given should be a one-time-use continuation (i.e.,
one that marks itself invalid after one use, allowing it to run its
unwinds at unwind time instead of later at gc time, which is--though
the scheme report doesn't say it outright--the only time the "proof"
can ever be done that the continuation is no longer in use) or a
multi-use continuation (i.e., one that never runs unwinds because it
might go "return again"). Either of those "fixes" would fix the
problem in a clean way. But the majority of authors of the Revised^N
Scheme Report seem to like the present definition of Scheme as is
because they're used to it, in spite of the mess I claim it makes.

I have often said that the way to tell a usable language is the
presence of unwind-protect. It's not just that unwind-protect is a
useful thing in and of itself, but its presence or absence says a
great deal about the mindset of a language's designers.

Don't get me wrong. I think the meta-notion of continuations offers
interesting expression/power. But so does GO. (Coincidentally, the
two are pretty related.) My point is that one doesn't have to deny its
expressiveness or flexibility, nor does one have to fail to understand
something, in order to not like it. So please no lectures about all
the wonders of call/cc. I understand the wonders. But I don't
put things in languages to make people wonder--I put them in because
they allow me to express things clearly. And absent a way of distinguishing
multi-use continuations from one-use continuations, or absent a way of
distinguishing non-final calls to continuations from final calls,
the notion of call-with-current-continuation is broken.

Of course, we could put it into CL with a fix, but then it wouldn't
be compatible and people would still say we weren't Scheme's big brother
because we weren't buggy like Scheme is. So what's the point?

I'd rather have a working unwind-protect than scheme continuations any day.

Please don't forward this to comp.lang.scheme. This message is not
about what Scheme should do. I argue that elsewhere than in public
forums and don't want to stir it up with them again. Scheme will do
what Scheme will do and I for one am tired of beating my head against
the wall on it. My remarks here are offered only as an explanation of
why I think it's in CL's best interests not to follow Scheme's lead.
And perhaps as a partial explanation of why I'm so unimpressed by
people who insist on telling me that Scheme is more "noble" than CL
because of its so-called simplicity and its claim that somehow a
formal semantics makes it morally superior.

Enough controversy for one evening. Time for me to head off to bed.
--Kent

brl...@my-dejanews.com

Sep 14, 1998
In article <6th9g5$qku$1...@its.hooked.net>,
> overview of what's needed for a Common Lisp user to understand a
> Scheme program.

Nice guide. As a schemer, I also found it useful for learning something about
CL.

One nit: Exclamation points are specifically about mutation, not side effects
in general. For example, the write and display procedures don't have
exclamation points.

-----== Posted via Deja News, The Leader in Internet Discussion ==-----
http://www.dejanews.com/rg_mkgrp.xp Create Your Own Free Member Forum

Mark Watson

Sep 14, 1998
Hello everyone,

re:
>> Erik Naggum wrote:
> But Scheme does _not_ scale well.

Well, I must admit that I have never written any extremely
large systems in Scheme, but I think that I will strongly
disagree with this statement anyway! Years ago, at
SAIC, I used to spend about 1/4 of my time doing IR&D
projects (mostly NLP, some expert systems, reasoning,
etc.) in Common LISP and Scheme. (I have also written
two application-oriented LISP books for Springer-Verlag;
one in Common LISP, and one in Scheme).

I used a simple procedure in Scheme to keep moderately
large systems tidy: after I implemented a framework that might
include a dozen or so functions (and debugged them!), I would
write a "wrapper" function and copy all of the "worker"
functions into the "wrapper" function, using LEXICAL SCOPING
to keep my namespace clean (i.e., the worker functions are
now invisible to the global name space).
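
(In CL terms the same trick might look roughly like this -- an untested
sketch with made-up names, using LABELS so the workers stay lexically
private:

    (defun mean-and-range (numbers)
      (labels ((mean (xs) (/ (reduce #'+ xs) (length xs)))
               (range (xs) (- (reduce #'max xs) (reduce #'min xs))))
        (list (mean numbers) (range numbers))))

    ;; (mean-and-range '(2 4 6 8))  => (5 6)

Only MEAN-AND-RANGE is visible at top level; MEAN and RANGE live
inside it.)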

I could never think of a reason why this was a bad idea, and it
allowed me to keep systems tidy: especially important if I would
have to occasionally set aside work on a Scheme project for
a month or so.

-- Mark Watson www.markwatson.com

PS. I did do one large system in Scheme, but it was not really
my code: once I spent three long days converting OPS5 from
Common LISP to Scheme (it was an old Scheme dialect with a
built in "eval" function, so this was not too difficult - mostly
hand coding loops to use recursion).

Dimitrios Souflis

Sep 14, 1998
Erik Naggum wrote:
>
> I think I dislike Scheme more because of the way Scheme folks denigrate
> Common Lisp, which they invariably haven't used, than anything else.
>

Every good programmer knows at least a handful of languages to get
along, so a "Scheme folk" is rather an abstraction than an actual label.

Now, on to the point.

Scheme is small. I guess R5RS is about one tenth the size of CLTL.
That makes it suitable for
1) scripting. Yes, people _need_ scripting. They do it in Perl,
   Javascript, TCL, whatever.
2) embedded applications? If Java can make it, why not Scheme?
3) explaining ideas simply.

Like it or not, the mere size of a language can make a difference.
My first LISP was Acornsoft LISP. At the time, small was beautiful
for LISPers.

--
Dimitrios Souflis dsou...@altera.gr
Altera Ltd. http://www.altera.gr/dsouflis

*** Reality is what refuses to disappear when you stop believing
*** in it (VALIS, Philip K. Dick)

Barry Margolin

Sep 14, 1998
In article <35FD2832...@altera.gr>,

Dimitrios Souflis <dsou...@altera.gr> wrote:
>Scheme is small. I guess R5RS is about one tenth the size of CLTL.
>That makes it suitable for
>1) scripting. Yes, people _need_ scripting. They do it in Perl,
>Javascript
>TCL, whatever.

What does the size of the language spec have to do with whether it's
suitable for scripting? Perl is one of the most popular scripting
languages, and its book is about the size of CLtL. Part of what makes it
popular is the wealth of built-in, high-level facilities (e.g. associative
arrays and regular expressions) -- also much more in the style of CL
(associative arrays are analogous to CL hash tables).
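
For example (a quick, untested sketch):

    ;; CL hash tables fill the role of Perl's associative arrays:
    (let ((ages (make-hash-table :test #'equal)))
      (setf (gethash "lisp" ages) 40)
      (gethash "lisp" ages))     ; => 40, T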

--
Barry Margolin, bar...@bbnplanet.com
GTE Internetworking, Powered by BBN, Burlington, MA
*** DON'T SEND TECHNICAL QUESTIONS DIRECTLY TO ME, post them to newsgroups.

Barry Margolin

Sep 14, 1998
In article <87btoj5...@ivm.de>,

Klaus Schilling <Klaus.S...@home.ivm.de> wrote:
>But Common Lisp lacks call-with-current-continuation.

call/cc is mostly usable for implementing higher-level control
abstractions. CL includes most of them as built-in features.

BTW, call/cc can be implemented as a macro in CL. It won't permit
returning from a block twice, but that's an overall limitation of CL, not
specific to call/cc. Basically, CL permits the dynamic control stack to be
a traditional stack, and only requires variable bindings to be saved in
heap-allocated lexical environments.
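
Such a macro might look roughly like this (an untested sketch; the name
CALL/EC is made up):

    (defmacro call/ec (fn)
      (let ((tag (gensym "ESCAPE")))
        `(block ,tag
           (funcall ,fn #'(lambda (&rest vals)
                            (return-from ,tag (values-list vals)))))))

    ;; (call/ec (lambda (k) (+ 1 (funcall k 42))))  => 42

The continuation it hands you is only good for escaping outward while
the BLOCK's dynamic extent is still live, which is exactly the
limitation described above.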

Rainer Joswig

Sep 14, 1998

> Erik Naggum wrote:
> >
> > I think I dislike Scheme more because of the way Scheme folks denigrate
> > Common Lisp, which they invariably haven't used, than anything else.
> >
>
> Every good programmer knows at least a handful of languages to get
> along,
> so a "Scheme folk" is rather an abstraction than an actual label.
>
> Now, on the point.
>

> Scheme is small. I guess R5RS is about one tenth the size of CLTL.
> That makes it suitable for
> 1) scripting. Yes, people _need_ scripting. They do it in Perl,
> Javascript
> TCL, whatever.

We are using CL for scripting.

Dimitrios Souflis

Sep 14, 1998
Barry Margolin wrote:
>
> What does the size of the language spec have to do with whether it's
> suitable for scripting? Perl is one of the most popular scripting
> languages, and its book is about the size of CLtL. Part of what makes it
> popular is the wealth of built-in, high-level facilities (e.g. associative
> arrays and regular expressions) -- also much more in the style of CL
> (associative arrays are analogous to CL hash tables).

I'm more concerned with extension languages, so the use of the word
"scripting" was somewhat misleading. But even for the ordinary,
Perlish meaning, having a smaller, orthogonal language
is worthwhile. Watch what Olin Shivers and others are doing with SCSH.

The size of the language spec is correlated to the size of the average
implementation. Perl is used for scripting, but
1) it must be fully installed on the machine (not everybody can/wants)
2) it's too big at runtime compared to a small program (yes, Apache does it)

My company is releasing v.2.0 of a small-footprint client-server SQL
server with embedded HTTP server. Had we used Perl as a scripting
language point (1) above would mean that installing our products
scripting language would double the disk requirements, and
point (2) that we would increase our memory requirements. Admit that
it would seem silly to have a scripting language comparable in size to
the rest of the program.

I'm not opposed to CL per se, but having to port a whole CL
implementation (we target Win32 and several Un*xes) would be a burden,
even if I knew of free ones (pointers welcome).

Stig Hemmer

Sep 14, 1998
The only certain conclusion to this debate is that crossposting
between comp.lang.lisp and comp.lang.scheme generates flame wars.

Please don't.

Stig Hemmer,
Jack of a Few Trades.

PS: Note followup.

Dorai Sitaram

Sep 14, 1998
In article <sfw7lz7...@world.std.com>,

Kent M Pitman <pit...@world.std.com> wrote:
>
>I argued vociferously to the Scheme committee that either Scheme
>continuations should take an extra arg saying final-p or else
>call-with-current-continuation should take an extra argsaying whether
>the continuation given should be a one-time-use continuation (i.e.,
>one that marks itself invalid after one use, allowing it to run its
>unwinds at unwind time instead of later at gc time, which is--though
>the scheme report doesn't say it outright--the only time the "proof"
>can ever be done that the continuation is no longer in use) or a
>multi-use continuation (i.e., one that never runs unwinds because it
>might go "return again"). Either of those "fixes" would fix the
>problem in a clean way. But the majority of authors of the Revised^N
>Scheme Report seem to like the present definition of Scheme as is
>because they're used to it, in spite of the mess I claim it makes.

Some dozen years ago, Friedman and Haynes showed
(proved!) that call/cc-one-shot is no less expressive
than call/cc. Making continuations one-shot is not
going to make your headaches go away. The
Pitman/Naggum line should be drawn firmly at "escaping"
continuations. Beyond that, there be dragons.

--d


Frank Adrian

Sep 14, 1998
>Clemens wrote in message ...
>>Dimitrios Souflis <dsou...@altera.gr> writes:

>> Scheme is small. I guess R5RS is about one tenth the size of CLTL.

>But you have to mention that the Common Lisp standard goes into great
>length of explaining things and giving examples and implementation
>hints. R5RS does not do this.

This is simply not true. The denotational semantics section of R5RS gives
more detail than the Common Lisp standard does (and in a much less ambiguous
manner, as well). As for implementation hints, I really have to wonder if a
language standards document is a place for this, given that it should be
targeted towards the largest readership base, i.e., the language users,
rather than the implementors. Implementation pragmata belong in an
implementation manual, not in the standard.

P.S. Please don't take this as a slam against the CL standard. I am aware
that many people have different opinions as to what should go into a
standard and that the people who created the CL standard had a monumental
task in front of them. I just think that it is unfair to slam a standard
that (for the most part) is a benchmark for non-ambiguity, clarity, and
minimalism for not "explaining things" (which it clearly does) in a manner
that would lead to more ambiguity and verbosity.
--
Frank A. Adrian
First DataBank

frank_...@firstdatabank.com (W)
fra...@europa.com (H)

This message does not necessarily reflect those of my employer,
its parent company, or any of the co-subsidiaries of the parent
company.


Barry Margolin

Sep 14, 1998
In article <6tjr3o$kab$1...@client2.news.psi.net>,

Frank Adrian <frank_...@firstdatabank.com> wrote:
>>Clemens wrote in message ...
>>>Dimitrios Souflis <dsou...@altera.gr> writes:
>
>>> Scheme is small. I guess R5RS is about one tenth the size of CLTL.
>
>>But you have to mention that the Common Lisp standard goes into great
>>length of explaining things and giving examples and implementation
>>hints. R5RS does not do this.
>
>This is simply not true. The denotational semantics section of R5RS gives
>more detail than the Common Lisp standard does (and in a much less ambiguous
>manner, as well). As for implementation hints, I really have to wonder if a
>language standards document is a place for this, given that it should be
>targeted towards the largest readership base, i.e., the language users,
>rather than the implementors. Implementation pragmata belong in an
>implementation manual, not in the standard.

I haven't checked specifically, but I don't think the denotational
semantics includes details like branch cuts for mathematical functions. It
just covers basic control and data flow primitives. And by design,
Scheme's control and data flow primitives are relatively simple (virtually
everything can be defined in terms of lambda and call/cc), which is why
it's feasible to write a DS (X3J13 had as one of its original goals a DS
for Common Lisp, but the person we tasked with it gave up).
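
(The standard illustration -- and it works in CL too, since a lambda
form is allowed in the operator position:

    (let ((x 1) (y 2)) (+ x y))
    ;; is just sugar for
    ((lambda (x y) (+ x y)) 1 2)      ; => 3

Once binding forms reduce to LAMBDA like this, a semantics has far
fewer primitives to pin down.)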

It's easy to be precise when you're describing something simple. CLtL
takes several pages just describing all the features of CL's lambda list
parameters (&OPTIONAL, &KEY, etc.).

Also, R*RS makes extensive use of references to other documents. For
instance, the macro section simply says that macros must be hygienic and
points the reader to a paper on this subject, rather than going into
extensive detail about what it means.

Kent M Pitman

Sep 14, 1998
[oh, darn. i almost replied to another cross-posted conversation.
i gotta remember to killfile this conversation so it stops snagging me.
http://world.std.com/~pitman/pfaq/cross-posting.html
]

"Frank Adrian" <frank_...@firstdatabank.com> writes:

> As for implementation hints, I really have to wonder if a
> language standards document is a place for this, given that it should be
> targeted towards the largest readership base, i.e., the language users,
> rather than the implementors. Implementation pragmata belong in an
> implementation manual, not in the standard.

We discussed this explicitly. The standard is not an implementation
guide and I would never have thought to even suggest it gives
implementation advice. I guess it does sometimes offer sample
implementations, but not in enough quantity. In the original design
(pre X3J13), we referred to the "white pages" as the main document and
"yellow pages" as shared libraries" and there were red and blue pages
that were implementation notes and design rationales that were
supposed to happen but no color of pages ever emerged but the white
ones. We struck a balance by not designing the CL document as an
implementation guide, but we also didn't make a specific policy of
removing info just because it was useful to implementation.

It would probably not have been financially viable to make an
implementation guide. Only implementors care, and they all have a
financial stake. Unlike Scheme, CL is not about college, it's about
business. (I'm simplifying, yes. But to make a point.) Businesses
had big stakes in CL and the whole process was about stabilizing
investments, not creating new entrants into the market. Certainly all
vendors have been more helpful than they might be in many industries
when new entrants have tried to come along, so it's not a vicious
thing. It's just that coping with the task of stabilizing the
enormous investment which was already CL was very large. So it seemed
a shame to defer such information to an implementation guide that would
never come.

And to an extent I can recall discussions (not at X3J13 specifically
but in the Lisp community generally for years even predating the
standards work) where the key element was that languages don't need to
be implemented really a lot. Programs get written a lot, compared to
implementations. If people need to keep reimplementing the language,
there is either something wrong with the language or the key
contribution of the language is not the programs that people will
write because programs don't need infinite implementations to run on.
The idea in CL was to take a bunch of stuff people do over and over
and centralize that in one place so that it wasn't repeated by each
programmer. (Dick Gabriel, in his "worse is better" articles, argues
that this might not be a "good idea". But it doesn't make it not a
"possible idea".) There is a potential pitfall where people rely on
the "myth of the sufficiently clever compiler" (sometimes just "SCC")
to solve all problems. It's tough. You want to rely on things you
might not know how to do fully optimally today, and it's tricky drawing
the line. Scheme has a similar problem. ("Adopt only one thing--Lambda--
and optimize it to pieces and hope that's enough") CL's problem is
slightly different. ("Adopt many things and optimize each according to
idiosyncratic but practically popular patterns of actual usage").
Anyway, that CL is hard to implement is not a surprising result--it's
almost designed in. What CL doesn't implement, users would have.
So it's a kind of modularity thing--better to do it centrally.
You might disagree it's the right approach, but your disagreement
would have to be pragmatic based on how you weigh the options.
I think in principle it's a solid theory, just different than the
path Scheme takes.

I model CL as the function f(x)=1000000*x and Scheme as the function
f(x)=0.000001*x^1.000001 where f is how well the language uses my
time and effort, represented somehow as x. Big-O notation
tells me Scheme should win, but it doesn't tell me whether Scheme
will win within my lifetime. Practicality tells me to use CL.

I liked Erik Naggum's advice to people wanting to implement a language,
that they should implement scheme, and to people wanting to use a language,
that they should use common lisp. The only thing I would add is that,
for the most part, since there seems to be some confusion on the point,
it's more useful if people implement programs than languages.
Somewhere lost in the soup of all this discussion is that the only reason
computers exist is to better quality of life, and I personally want to
have done something to better my intrinsic quality of life before I die.
I've exhausted the quantum allocated to getting me the language necessary
to do that. You can, like the serfs of the middle ages, take solace in
the idea that you're improving the quality of life for people in some
subsequent era. But if you do, probably the best part of that plan will
be dying before you find out how much of what you could have contributed
you've wasted.

A few of us should implement languages. Big powerful languages
with big powerful compilers that do big important things that people
really need. The rest of us should get down to the business of finding
out whether computers can do anything other than be big paperweights
and enormous timesinks. So far, in all of computer science, the benefits
have been few compared to the number of people doing things. Most has
been frittered away by bad choice of programming language that tried
to be "minimal" instead of "functional". Yeah, there's the occasional
medical diagnosis program or desktop publishing thing or web browser.
But with the massive parallelism of human effort, we could be doing better.
And the key to that is not "just making sure we have the perfect language".
The cost of doing that if we'd applied that theory to english would be
the loss of everything from Homer's works to Shakespeare's to Tom Clancy.
Human languages are not perfect. But their value comes from our willingness
to agree on things and move on, whether in Spanish which mostly gets
verb conjugation right but has a dreadfully small vocabulary or in French
which has atrocious spelling but very pretty sound to it or in English which
is not pretty at all but very adaptable. The language characteristics don't
matter nearly as much as that certain basic power is there and people are
willing to just use it and get on with life. Any additional moralizing is
like that little committee that tries to decide if Spanish should add
this or that word to the language while English and other non-committee-run
languages outpace it in anarchistically-organized expressiveness. Small and
pretty is nice if small and pretty is your goal. If usefulness is nice,
ask someone who's got a serious commitment to being a user and a real
goal of doing something other than design his whole life.

(Note: I tell people when they ask that I design programming languages, but
I always admit that languages are only designed cyclically and that most of
being a language designer is about writing applications so you'll know when
the next round of questions come up what things matter and what don't.
Because you don't get that by sitting in committee. So nine years out of
ten in being a language designer is not spent designing languages.)

> P.S. Please don't take this as a slam against the CL standard. I am aware
> that many people have different opinions as to what should go into a
> standard and that the people who created the CL standard had a monumental
> task in front of them. I just think that it is unfair to slam a standard
> that (for the most part) is a benchmark for non-ambiguity, clarity, and
> minimalism for not "explaining things" (which it clearly does) in a manner
> that would lead to more ambiguity and verbosity.

No offense taken. But I'll add another defense for CL as well.

The Scheme standard uses two independent languages which it claims have
the same implications. This is hard to test. The English is readable by
more people than the formal semantics, I bet. What if there is a bug?
Which prevails? Procedurally, we decided to drop as much of redundancy
as possible from the ANSI spec. It's why there is no chart of the
class arrangement, for example. If I made a mistake, would the chart be
right or the class precedence list in the class definition? In such a
big document, given that it was already written when started and was
merely morphed into shape (an engineering feat of its own), the cost of
removing all such redundancy was prohibitive. The most obvious place
you see this is in the inconsistent application of glossary terms. The
original goal was to use only italic (glossary terms) and never roman
(english terms). At the point where everything was in italic, we could
remove the italic and just say any use of a glossary word meant what the
glossary said. But it was too costly. So we made an engineering choice
and the result leaves you on each case having to know "did we think about
this sentence and this particular use of the glossary word" and if you
see it in italic, the answer is yes. (You can imagine the tedium and why
it took years for my predecessor, Kathy Chapman, who started the glossary
and for me who probably doubled its size and widened the scope of usage
in the running text.) It was an informal goal of the process to make
the standard not internally redundant. Medium success. It was an
informal goal of the process to make the standard accessible to a lay
audience, to allow the community as a whole to criticize it.

I, for example, can't read a formal semantics. I have, I think, a
clear understanding of what it offers. But I hate the notation and
have never been good at infix, and have never had the patience to slug
through learning it well enough to apply reasoning about it at any
speed. (I consider it a bug that the formal semantics is not
s-expression-based. Ah well. Ironic, perhaps, that my semantic
understanding of scheme should be so impaired by syntax, as one
usually thinks of the point of a semantics as being to separate the language
from its syntax.) So I occasionally trip over something that is not
well-specified in the Scheme standard and get in an argument over
whether it's well-specified in the standard because it's in the formal
semantics. Some probably think this means I'm weak-minded and my
opinion shouldn't be counted about Scheme. Maybe so. But there are
more weak-minded people where I came from, and I'm happy to support
those of us more feeble members of the society who are disenfranchised
by the use of a formal semantics. As such, I reject any claim that
the use of English is what makes the CL spec not clear. I think
English is up to the task. Time, budget, and the author's skill
is another matter. I will happily apologize for the places where
my English didn't leave it in a perfect state. But I don't think
going to half-English/half-formal-semantics fixes the problem. It
doesn't get rid of the English, AND it adds the possibility of
a confusion where one half says one thing and one says another.
To solve this, I'm told some other standards are written without English.
I have not seen it, but I'm told PL/1 (I think it was) had a definition
written in a formal language (only). Yuck.

Use of English brings the language to the users rather than requiring
the users to come to the language. (Sorry it's English. That's just
the international interchange convention. The ambiguity of talking
about my own language where I mean "the canonical natural language" is
embarrassing to me sometimes. Do understand I'm aware there are other
languages out there. I even speak a couple of them.) Anyway, I think
in general computers do all too much of making people think like
machines instead of vice versa, and I'm proud to have been involved
in a little push in the other direction.

Moreover, languages don't exist in isolation. I consider it an
essential criterion of a programming language that its constructs be
pronounceable (hence my indulging a pronunciation guide in a few
places where a dictionary would not have one) and that people have
useful interchange terms for common situations (hence the 70 page
glossary--itself longer than the scheme spec). When it was done, I
looked through it and said "gee, this is big," but then I looked for
terms that weren't already in common use and there weren't many. The
few there were (like "designator" and "generalized boolean", which I
made up) were concepts already present but just in search of a name.
So I regarded this as a process of helping programmers interchange,
not just programs. It didn't need that justification. Just the
saved text size of making the definitions centralized was enough.
(And I'd have had to invent a lot more terms if I hadn't overloaded
terms exactly as they're overloaded in English.) But it was still
nice to know it was contributing to stabilizing the terminology
in the culture.

This is really longer than I meant it to be. Sorry about that.
If I had more time I'd shorten it.

The above is my personal opinion and not necessarily the official
position of my company or country.

Jrm

Sep 14, 1998

>I have often said that the way to tell a usable language is the
>presence of unwind-protect. It's not just that unwind-protect is a
>useful thing in and of itself, but its presence or absence says a
>great deal about the mindset of a language's designers.


What about dynamic-wind?

>Please don't forward this to comp.lang.scheme.

That'll keep 'em in the dark.


Jrm

Sep 14, 1998

Kent M Pitman wrote in message ...

>I liked Erik Naggum's advice to people wanting to implement a language,
>that they should implement scheme, and to people wanting to use a language,
>that they should use common lisp.

I made more money when I hacked common lisp, but I have more fun when I
hack scheme.

(/ (- (f common-lisp) (f scheme)) (f C)) => float underflow

David Steuber The Interloper

Sep 15, 1998
On 14 Sep 1998 08:17:01 GMT, dje...@well.com (Darius Bacon) claimed or
asked:

% tras...@david-steuber.com (David Steuber "The Interloper") writes:
%
% >I wonder if, with the support of some Java code, a Scheme compiler (or
% >CL) could be written to produce a .class file.
%
% Actually, no Java is needed -- there's Scheme code to read and write
% classfiles on my home page.

Can the Scheme code compile itself to run in the JVM?

David Steuber The Interloper

Sep 15, 1998
On 13 Sep 1998 19:33:06 +0000, Erik Naggum <cle...@naggum.no> claimed
or asked:

% I know C very well and Scheme well. I became exhausted over the prospect
% of doing the same low-level cruft over and over and over. Scheme is
% often worse than C in this regard, because you will mostly find
% interpreted Scheme environments, and abstraction carries a heavy
% performance penalty, so you need to be much too clever. (this is like
% byte-compiled Common Lisps, which penalize you severely for using your
% own functions instead of the internal functions that are very heavily
% optimized.)
%
% if you really want to implement your own language, implement Scheme. if
% you really want to _use_ a good language, choose Common Lisp. corollary:
% implement Scheme in Common Lisp.

It seems to be implied that if you want to implement Common Lisp, it
should be done in Common Lisp. Also, why can't byte compiled Common
Lisp optimize your functions as well as its own?

I should have just made up my mind before being confused with the
facts.

One requirement I can safely say I have is that I need to be able to
express constructs of arbitrary complexity in a uniform way. I also
need to be able to compile or interpret the code. The other main
thing I need to do is generate code. I have already decided that it
would be highly desirable from a code reuse point of view to use
whichever Lisp I go with as a file format as well as an extension or
scripting language.

I have to wonder how much of Common Lisp must be implemented as
native, and how much can be implemented in Common Lisp.

As far as inventing languages goes, I would really, really, like to
avoid that. It is already beyond anything I have done to implement a
language that already has a specified standard. To invent a language,
at least a good one, must be more difficult still.

Paul Graham says in his book that one of the features of Common Lisp
is that it can extend itself. I am only a little way into the book,
but I like what I have seen so far. I just wish I had a free ANSI
Common Lisp implementation on NT that I could play with his examples
on. Any recommendations?

Barry Margolin

Sep 15, 1998
In article <35ffc713....@news.newsguy.com>,

David Steuber "The Interloper" <tras...@david-steuber.com> wrote:
>I have to wonder how much of Common Lisp must be implemented as
>native, and how much can be implemented in Common Lisp.

Take a look at one of the Common Lisp implementations where source is
available, such as CMUCL, and I think you'll find that an enormous number
of the functions are implemented in CL. Most of the sequence and list
functions, hash tables, the reader and printer, CLOS, the condition system,
and the evaluator are frequently implemented as library functions in Lisp.
Probably more than half the functions in CL.
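
For a feel of what such library code looks like, here is a toy (and
deliberately naive) definition in that style; MY-REMOVE-IF is a made-up
name:

    (defun my-remove-if (pred list)
      (cond ((null list) '())
            ((funcall pred (car list))
             (my-remove-if pred (cdr list)))
            (t (cons (car list)
                     (my-remove-if pred (cdr list))))))

    ;; (my-remove-if #'oddp '(1 2 3 4))  => (2 4)

Real implementations are more careful about general sequences, keyword
arguments, and efficiency, but the point stands: most of CL can be
written in CL.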

Kent M Pitman

Sep 15, 1998
"Jrm" <j...@rebol.com> writes:

> >I have often said that the way to tell a usable language is the
> >presence of unwind-protect. It's not just that unwind-protect is a
> >useful thing in and of itself, but its presence or absence says a
> >great deal about the mindset of a language's designers.
>
> What about dynamic-wind?

In spite of the implementation similarity, they don't implement the
same things. what unwind-protect implements is the concept of
"remember to clean up when i'm done". this is an enormously simple
and basic thing people say and do all the time. it is presented
as system magic becuase it becomes associated with stacks
or continuations, but it has nothing to do with these--it merely
interacts with them. "when i'm done" is adequate to describe
the "when" for most people. anything else is mere refinement.

by contrast, dynamic-wind implements "remember to save and
restore this state each time i'm working on a certain project"
mostly people streamline their activities to avoid state changes,
so they don't find needs to articulate descriptions of state changes
because they work to make them not happen. i'm talking about the
real world, not computers. projects and task interleaving do occur
in the real world but are rarer than just projects un-interleaved.

my comment was NOT meant to say that other operators aren't useful.
just that i think unwind-protect is primitively important even to
non-programmers in routine and very mundane tasks in ways that dynamic
wind is not. i think unwind-protect is "conceptually more primitive"
(in that most people have the concept before learning how to
articulate computerized computation) than dynamic wind (where i think
many people don't have the concept until taught to think that way).
i'm not making a statement about the relative usefulness of these
operators per se. i'm saying that a decision by a language to deny
the ability to express a "conceptually primitive" concept is more
telling than a decision to deny a "learned" concept.

just my opinion. you're welcome to disagree. (i've never heard
anyone agree with me, actually. it's a pretty odd "test" for a
language, i guess. but it's one that has held up for me over time.)

if you don't like my suggestion, though, what small test or tests would
you do if someone tossed you a manual to look at for ten minutes and
then asked you to say whether the language you were looking at was thoughtful?

Darius Bacon

Sep 15, 1998
tras...@david-steuber.com (David Steuber "The Interloper") writes:

>On 14 Sep 1998 08:17:01 GMT, dje...@well.com (Darius Bacon) claimed or
>asked:

>% tras...@david-steuber.com (David Steuber "The Interloper") writes:
>%
>% >I wonder if, with the support of some Java code, a Scheme compiler (or
>% >CL) could be written to produce a .class file.
>%
>% Actually, no Java is needed -- there's Scheme code to read and write
>% classfiles on my home page.

>Can the Scheme code compile itself to run in the JVM?

No. If I had that I'd probably be advertising it a bit more
heavily. :) It's just a no-frills classfile parser/unparser, which you
could use to bootstrap such a compiler without writing a line of Java.
I also have some Scheme code with a table of JVM opcodes, their stack
effects and operands, etc., but it didn't really seem worth releasing.
Write me if you have a burning desire to see it anyway.

(What's this doing in comp.lang.lisp?)

Barry Margolin

Sep 15, 1998
In article <sfwlnnl...@world.std.com>,

Kent M Pitman <pit...@world.std.com> wrote:
>"Jrm" <j...@rebol.com> writes:
>> What about dynamic-wind?
>
>In spite of the implementation similarity, they don't implement the
>same things. what unwind-protect implements is the concept of
>"remember to clean up when i'm done".
...

>by contrast, dynamic-wind implements "remember to save and
>restore this state each time i'm working on a certain project"

Well, they seem conceptually similar, given the context of the languages.
Common Lisp doesn't let you resume a dynamic environment after you've
exited, so it's only necessary to clean up. Scheme's continuations allow
you to go back in, so you need to clean up when exiting and "messy up" when
resuming.

If you're not planning on resuming, you can simply provide a function
that signals an error as the before-thunk to dynamic-wind, and you have
something equivalent to unwind-protect.

Erik Naggum

Sep 15, 1998
* tras...@david-steuber.com (David Steuber "The Interloper")

| It seems to be implied that if you want to implement Common Lisp, it
| should be done in Common Lisp. Also, why can't byte compiled Common Lisp
| optimize your functions as well as its own?

because the byte-code interpretation machinery must add _some_ overhead
to the execution, and the more byte-code, the slower it runs, as in
wholly user-defined functions. the more stuff is implemented in the real
implementation language, the faster it runs. this skews the way people
look at performance, and which features they'll use. when abstraction is
expensive, people don't do it. when abstraction is cheaper than not
doing it, the benefits become more obvious.

| I should have just made up my mind before being confused with the facts.

that's always good advice when the world is too complex to answer one's
questions simply. (I think it helps to understand complexity and then go
back and decide to ignore some of it, but far too many frown at the
complexity of the world and actively denounce tools and languages and
techniques intended to deal with it.)

| I have to wonder how much of Common Lisp must be implemented as native,
| and how much can be implemented in Common Lisp.

this depends on how you implement your "native" functions. it's possible
to write a proto-Lisp that a simple compiler compiles to machine code and
then implement enough Common Lisp in proto-Lisp to be able to build a new
system entirely in Common Lisp. (this proto-Lisp can in fact be another
language entirely, but then it's _more_ work to build the compiler.)

| Paul Graham says in his book that one of the features of Common Lisp is
| that it can extend itself. I am only a little way into the book, but I
| like what I have seen so far. I just wish I had a free ANSI Common Lisp
| implementation on NT that I could play with his examples on. Any
| recommendations?

why do you need NT? it takes less effort to install Linux and run
Allegro CL 5.0 Trial Edition (an uncrippled version recently released)
than it takes to find something free and high quality under NT.

#:Erik
--
http://www.naggum.no/spam.html is about my spam protection scheme and how
to guarantee that you reach me. in brief: if you reply to a news article
of mine, be sure to include an In-Reply-To or References header with the
message-ID of that message in it. otherwise, you need to read that page.

Kent M Pitman

Sep 15, 1998
Barry Margolin <bar...@bbnplanet.com> writes:

> In article <sfwlnnl...@world.std.com>,
> Kent M Pitman <pit...@world.std.com> wrote:
> >"Jrm" <j...@rebol.com> writes:
> >> What about dynamic-wind?
> >
> >In spite of the implementation similarity, they don't implement the
> >same things. what unwind-protect implements is the concept of
> >"remember to clean up when i'm done".
> ...
> >by contrast, dynamic-wind implements "remember to save and
> >restore this state each time i'm working on a certain project"
>
> Well, they seem conceptually similar, given the context of the languages.
> Common Lisp doesn't let you resume a dynamic environment after you've
> exited, so it's only necessary to clean up. Scheme's continuations allow
> you to go back in, so you need to clean up when exiting and "messy up" when
> resuming.


i've seen ports of lisp machine letf (uses dynamic-wind style technology)
to use letf-globally (uses approximately unwind-protect) instead
and it makes a mess. people tend to debug it in single-processing mode
and then get screwed in multitasking.

When using the word conceptually, though, I was
talking about how human beings describe processes in the real world.
My point was that humans don't have (that they speak about) stacks,
so when they do context switches, they don't do (that they speak of)
this operation. Consequently, linguistically, they do not use these
terms "primitively" (without training to do so). The world is indeed
prone to having re-entrant processes. People might read a book and
watch TV, for example. But where state changes are required, people
try (i observe informally) to avoid having to do the state change
(see the literature on "practice effects" and think about the kinds of
optimizations people make to operations). if i use reading glasses
to read, i don't want to read and watch TV at the same time because
i have to keep putting on and removing my glasses. i avoid that.
or i invent bifocals so that i don't have to change them in between.
but either way, i avoid the need to linguistically discuss the state
change. human linguistics is driven by experience, not vice versa.
you can try to teach people terms they have no experience about in order
to broaden their horizons, but in my opinion all but the intellectual
class will resent you for it. all just my armchair observations
and conclusions-of-the-day. feel free to disagree. i may be wrong
about the data or wrong about the conclusions that it's not linguistic,
but those are both premises to my real claim which is that if
you believe (as i do, right or wrong) that dynamic-wind is not
"conceptually primitive", then you understand why it is that i
think it's not as fundamental a "tell" to the thoughtfulness
of the language in other regards.

as to the usefulness of dynamic-wind, i'm not saying it's unimportant.
ever since i heard about it in 1981, i've repeatedly reminded people
of its existence and advocated its adoption on occasion. it doesn't
just trouble me that it's absent. it also troubles me that "places"
aren't just as good as "variables" in special binding. if you believe
in special binding at all, and i do, it should seem clear that it's
about "the process of dynamically instantiating within a task that is
possibly interleaved with other tasks for which the dynamic instantiation
has not been done" and that doing it only for symbols is a hack
reminiscent of maclisp sublis, which as i recall (might be wrong)
only allowed you to sublis for symbols because it was possible to
attach a property to symbols (and not to all objects) allowing
constant (or almost that) time access to the substitution object.
So instead of O(m*n) performance [a size m object and n replacements]
you'd get O(m). But at the price that you couldn't sublis other
objects at all. If they'd just fallen out to using alists or
hash tables or some such in the other case, it could have been defined
on anything. (Some have made the same criticism about GET and
property lists.)
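
(for reference, the SUBLIS that CL ended up with is the general,
alist-driven one, and the keys need not be symbols:

  (sublis '((x . 1) (y . 2)) '(+ x (* y x)))                ; => (+ 1 (* 2 1))
  (sublis '(("old" . "new")) '("old" other) :test #'equal)  ; => ("new" OTHER)

which is exactly the generality the maclisp hack gave up.)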

optimization is important, but it shouldn't come at the price of not
allowing the unoptimized cases. there may be ways to optimize other
cases in the future and the language should plan for that, leaving it
to users to make a judgment about which features they exercise at what
point in time based on their market needs.

> If you're not planning on resuming, you can simply provide a function
> that signals an error as the before-thunk to dynamic-wind, and you have
> something equivalent to unwind-protect.

"Resuming" happens on stack group switch. unless you're going to
turn off scheduling to avoid the resuming, you're going to lose with
this approach unless i misunderstand you. as soon as your quantum is
exhausted, the dynamic-wind after-thunk is going to run, which may
be ok in some cases and not in others, and as soon as you get
rescheduled, you'll be in the error handler (perhaps in the scheduler).
Or am I not understanding you?

Steve Gonedes

unread,
Sep 15, 1998, 3:00:00 AM9/15/98
to

tras...@david-steuber.com (David Steuber "The Interloper") writes:


< Paul Graham says in his book that one of the features of Common Lisp
< is that it can extend itself. I am only a little way into the book,
< but I like what I have seen so far. I just wish I had a free ANSI
< Common Lisp implementation on NT that I could play with his examples
< on. Any recommendations?

CLISP will run under DOS and windows. Also there is LispWorks for
windows (I think it will work with the New Technology - as windows
always seems to take an aggressive stance to be backwards compatible :)

I have no url for LispWorks (sorry) maybe someone will shout it out;
a search engine will certainly pick it up.

Enjoy the rest of your book.

Mike McDonald

unread,
Sep 15, 1998, 3:00:00 AM9/15/98
to
In article <g0nL1.33$yt.5...@burlma1-snr1.gtei.net>,

Barry Margolin <bar...@bbnplanet.com> writes:
> In article <35ffc713....@news.newsguy.com>,
> David Steuber "The Interloper" <tras...@david-steuber.com> wrote:
>>I have to wonder how much of Common Lisp must be implemented as
>>native, and how much can be implemented in Common Lisp.
>
> Take a look at one of the Common Lisp implementations where source is
> available, such as CMUCL, and I think you'll find that an enormous number
> of the functions are implemented in CL. Most of the sequence and list
> functions, hash tables, the reader and printer, CLOS, the condition system,
> and the evaluator are frequently implemented as library functions in Lisp.
> Probably more than half the functions in CL.

With the exception of the code to call into/out of C/Lisp, some low level
interrupt and mmap C code, all of CMUCL is written in lisp. Having to have a
working CL in order to get a working CL makes bootstrapping CMUCL lots of fun!
:-)

Mike McDonald
mik...@mikemac.com


Gary T. Leavens

unread,
Sep 15, 1998, 3:00:00 AM9/15/98
to
This may not be what you're looking for,
but here's a handout I use in my classes...

PRINCIPLES OF PROGRAMMING LANGUAGES:
COMMON LISP VS. SCHEME

Gary T. Leavens, Department of Computer Science,
Iowa State University, Ames, Iowa 50011-1040 USA
lea...@cs.iastate.edu
$Date: 1995/09/23 18:38:01 $

The table below highlights the important differences between
Common Lisp and Scheme for those that know a more standard
Lisp dialect. Only the differences are highlighted here;
there are many more similarities.

Common Lisp                          Scheme
---------------------------------------------------------------------
()  ; eq to nil                      ()
nil                                  #f
t                                    #t

(eq x y)                             (eq? x y)
(equal x y)                          (equal? x y)

(atom x)                             (not (pair? x))
(consp x)                            (pair? x)
(null x)                             (null? x)
(symbolp x)                          (symbol? x)
(zerop x)                            (zero? x)

(setf (car x) y)                     (set-car! x y)
(setf (cdr x) y)                     (set-cdr! x y)

(mapcar #'f l)                       (map f l)
(map 'list #'f l)                    (map f l)

(mapcar #'(lambda (x) (g x)) l)      (map (lambda (x) (g x)) l)

#'(lambda (x) y)                     (lambda (x) y)

(setq var y)                         (set! var y)
(setf var y)                         (set! var y)

(cond                                (cond
  ((null x) 0)                         ((null? x) 0)
  ((eq x y) (f x))                     ((eq? x y) (f x))
  (t (g y)))                           (else (g y)))

(defun square (x)                    (define (square x)
  (* x x))                             (* x x))
                                     ; or
                                     (define square (lambda (x) (* x x)))

(funcall (g f) x)                    ((g f) x)

(defun compose (f g)                 (define (compose f g)
  #'(lambda (x)                        (lambda (x)
      (funcall f                         (f (g x))))
        (funcall g x))))


Acknowledgments

Thanks to Erik Naggum and Richard Gabriel for correcting my ancient
LISP accent. Of course any errors are not their responsibility but mine.

Gary
--
229 Atanasoff Hall, Department of Computer Science
Iowa State Univ., Ames, Iowa 50011-1040 USA
lea...@cs.iastate.edu phone: +1 515 294-1580 fax: +1 515 294-0258
URL: http://www.cs.iastate.edu/~leavens/homepage.html

Jrm

unread,
Sep 15, 1998, 3:00:00 AM9/15/98
to

Kent M Pitman wrote in a message ...
[comparison about dynamic-wind elided]

I think I agree with Barry (barmar) that it is pretty easy to view
unwind protect as a dynamic-wind with an empty entry thunk (in
most cases).

I was just concerned with the syllogism:

1. Languages without unwind-protect are not well thought out.
2. Scheme doesn't have unwind protect.

I know you disagree with much of what the Scheme people say, but
you can't say they don't think about what they are doing.

>if you don't like my suggestion, though, what small test or tests would you
>do if someone tossed you a manual to look at for ten minutes and then asked
>you to say if the language you were looking at was thoughtful?

I look for the ability to write continuation-passing style code without
too much pain. If you can do that, then the language is pretty powerful
even if the designers didn't intend or understand it. (I use the term
"designer" loosely; too many languages these days are not designed in any
sense of the word.)

Barry Margolin

unread,
Sep 15, 1998, 3:00:00 AM9/15/98
to
In article <LJwL1.1820$K02.1...@news.teleport.com>,

Mike McDonald <mik...@mikemac.com> wrote:
> With the exception of the code to call into/out of C/Lisp, some low level
>interrupt and mmap C code, all of CMUCL is written in lisp. Having to have a
>working CL in order to get a working CL makes bootstrapping CMUCL lots of fun!
>:-)

So how do you implement CAR and CDR in Lisp? I recall that Lisp Machines
had definitions like:

(defun car (x)
  (car x))

and then depend on the compiler to generate the appropriate code. I don't
really think these trivial cases count. Nor would LAP. I'm not sure
whether to count "subprimitives", e.g. something like:

(defun car (x)
  (case (system:%typecode x)
    ((1) (system:%memory-ref x 0)) ; cons
    (t (error ...))))
(defun car (x)
  (case (system:%typecode x)
    ((1) (system:%memory-ref x 1)) ; cons
    (t (error ...))))

Barry Margolin

unread,
Sep 15, 1998, 3:00:00 AM9/15/98
to
In article <6tm5us$6pr$1...@news.iastate.edu>,

Gary T. Leavens <lea...@cs.iastate.edu> wrote:
>#'(lambda (x) y) (lambda (x) y)

FYI, ANSI Common Lisp includes a LAMBDA macro so that #' is no longer
needed for lambda expressions. Essentially it's

(defmacro lambda ((&rest arglist) &body body)
  `#'(lambda (,@arglist) ,@body))

But it's probably good to have it in the table, since you're likely to see
it in old code (and new code by people who want to be portable to old
environments, don't like the macro, or just can't break the habit).
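
With the macro in place, these two should behave identically in an ANSI
implementation:

  (mapcar (lambda (x) (* x x)) '(1 2 3))    ; => (1 4 9)
  (mapcar #'(lambda (x) (* x x)) '(1 2 3))  ; => (1 4 9), the pre-ANSI spelling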

It would probably be a good idea to show an example of #'<function-name> in
the table. Perhaps after the definitions of compose, you could demonstrate
this in a use:

CL:

(defun *-+ (x y)
(funcall (compose #'* #'+) x y)

Scheme:

(define (*-+ x y)
((compose * +) x y))

Barry Margolin

unread,
Sep 15, 1998, 3:00:00 AM9/15/98
to
In article <sfwvhmp...@world.std.com>,

Kent M Pitman <pit...@world.std.com> wrote:
>Barry Margolin <bar...@bbnplanet.com> writes:
> >> If you're not planning on resuming, you can simply provide a function
>> that signals an error as the before-thunk to dynamic-wind, and you have
>> something equivalent to unwind-protect.
>
>"Resuming" happens on stack group switch. unless you're going to
>turn off scheduling to avoid the resuming, you're going to lose with
>this approach unless i misunderstand you. as soon as your quantum is
>exhausted, the dynamic-wind after-thunk is going to run, which may
>be ok in some cases and not in others, and as soon as you get
>rescheduled, you'll be in the error handler (perhaps in the scheduler).
>Or am I not understanding you?

How did stack groups and quanta enter this discussion? By "resuming" I
meant "re-entering the dynamic environment". Sure, Scheme continuations
provide a way to implement coroutines, but if you're not planning on
re-entering the routine then the continuation is just an escape procedure,
like CL THROW or RETURN-FROM. Basically, I think I'm referring to the same
kind of distinction you made a day or two ago about one-shot continuations
-- there's no need to provide a useful before-thunk if you're escaping the
dynamic-wind via a one-shot continuation.

Mike McDonald

unread,
Sep 15, 1998, 3:00:00 AM9/15/98
to
In article <HJxL1.46$yt.8...@burlma1-snr1.gtei.net>,

Barry Margolin <bar...@bbnplanet.com> writes:
> In article <LJwL1.1820$K02.1...@news.teleport.com>,
> Mike McDonald <mik...@mikemac.com> wrote:
>> With the exception of the code to call into/out of C/Lisp, some low level
>>interrupt and mmap C code, all of CMUCL is written in lisp. Having to have a
>>working CL in order to get a working CL makes bootstrapping CMUCL lots of fun!
>>:-)
>
> So how do you implement CAR and CDR in Lisp? I recall that Lisp Machines
> had definitions like:
>
> (defun car (x)
> (car x))

Yup, that's what it does.

(defun car (list) "Returns the 1st object in a list." (car list))

> and then depend on the compiler to generate the appropriate code. I don't
> really think these trivial cases count.

Well, since the compiler and its optimizations are written in lisp, seems
close enough for me. :-)

>Nor would LAP. I'm not sure
> whether to count "subprimitives", e.g. something like:
>
> (defun car (x)
> (case (system:%typecode x)
> ((1) (system:%memory-ref x 0)) ; cons
> (t (error ...))))
> (defun car (x)
> (case (system:%typecode x)
> ((1) (system:%memory-ref x 1)) ; cons
> (t (error ...))))

This seems reasonable to me too. Sure, at some point, you have to have some
primitives that the compiler "knows" about or some "foreign" implementation of
them.

BTW, shouldn't this last one be cdr?

Mike McDonald
mik...@mikemac.com


Gareth McCaughan

unread,
Sep 15, 1998, 3:00:00 AM9/15/98
to
Barry Margolin wrote:

> It would probably be a good idea to show an example of #'<function-name> in
> the table. Perhaps after the definitions of compose, you could demonstrate
> this in a use:
>
> CL:
>
> (defun *-+ (x y)
> (funcall (compose #'* #'+) x y)
>
> Scheme:
>
> (define (*-+ x y)
> ((compose * +) x y))

But preferably not that example, since it won't work. (Evaluating
(*-+ x y) will try to apply (lambda (x) ...) to args x and y.) :-)
Oh, and emacs just bracket-matched the rparen in my smiley with
the lparen before "defun", so it looks like something's wrong
with the parentheses in the CL definition of *-+.

(Yes, of course I know these are trivial errors.)
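
One possible repair, sketched with a variadic COMPOSE so the two-argument
call goes through (just a suggestion, not what Barry wrote):

  (defun compose (f g)
    #'(lambda (&rest args)
        (funcall f (apply g args))))

  (defun *-+ (x y)
    (funcall (compose #'* #'+) x y))   ; (*-+ 2 3) => 5, i.e. (* (+ 2 3))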

--
Gareth McCaughan Dept. of Pure Mathematics & Mathematical Statistics,
gj...@dpmms.cam.ac.uk Cambridge University, England.

David Rush

unread,
Sep 15, 1998, 3:00:00 AM9/15/98
to
lea...@cs.iastate.edu (Gary T. Leavens) writes:
> (funcall (g f) x) ((g f) x)
>
> (defun compose (f g) (define (compose f g)
> #'(lambda (x) (lambda (x)
> (funcall f (f (g x))))
> (funcall g x))))

For my money, these two examples are nearly my *entire* reason for
preferring Scheme over CL. call-with-current-continuation is the rest
of it, although call/cc is not very commonly useful.

Then again, I also really like Smalltalk and SML.

david rush
--
Unfortunately nobody will *pay* for much besides C/C++ (bleah)

Gary T. Leavens

unread,
Sep 15, 1998, 3:00:00 AM9/15/98
to
Barry Margolin <bar...@bbnplanet.com> writes:

>In article <6tm5us$6pr$1...@news.iastate.edu>,
>Gary T. Leavens <lea...@cs.iastate.edu> wrote:
>>#'(lambda (x) y) (lambda (x) y)

>FYI, ANSI Common Lisp includes a LAMBDA macro so that #' is no longer
>needed for lambda expressions. Essentially it's

>(defmacro lambda ((&rest arglist) &body body)
> `#'(lambda (,@arglist) ,@body))

>But it's probably good to have it in the table, since you're likely to see
>it in old code (and new code by people who want to be portable to old
>environments, don't like the macro, or just can't break the habit).

Thanks, I didn't know that.

>It would probably be a good idea to show an example of #'<function-name> in
>the table. Perhaps after the definitions of compose, you could demonstrate
>this in a use:

>CL:

>(defun *-+ (x y)
> (funcall (compose #'* #'+) x y)

>Scheme:

>(define (*-+ x y)
> ((compose * +) x y))

I put in the following:

(defun my-cadr (ls)                  (define my-cadr (ls)
  (funcall                             ((compose car cdr)
    (compose #'car #'cdr)               ls))
    ls))

I thought that composing two functions that are usually considered binary
would be confusing.

Thanks for the help,

Gary Leavens

Frank Adrian

unread,
Sep 15, 1998, 3:00:00 AM9/15/98
to
Mike McDonald wrote in message ...

> This seems reasonable to me too. Sure, at some point, you have to have some
> primitives that the compiler "knows" about or some "foreign" implementation of
> them.

If you want to know how another language implementation group did this (for
another language) check out the Squeak page at http://squeak.cs.uiuc.edu/,
especially the HTML version of the OOPSLA paper "Back to the Future"
(ftp://st.cs.uiuc.edu/Smalltalk/Squeak/docs/OOPSLA.Squeak.html).
All-in-all, a reasonable paper on bootstrapping a language in general (even
if the language they bootstrap into is incurably low-level :-).

Phil Stubblefield

unread,
Sep 15, 1998, 3:00:00 AM9/15/98
to Gary T. Leavens
Gary T. Leavens wrote:
>
> Barry Margolin <bar...@bbnplanet.com> writes:
>
> >It would probably be a good idea to show an example of
> >#'<function-name> in the table. [...]

>
> I put in the following:
>
> (defun my-cadr (ls)                  (define my-cadr (ls)
>   (funcall                             ((compose car cdr)
>     (compose #'car #'cdr)               ls))
>     ls))

It might make sense to change the name of the CL parameter from "ls"
to "list". Another poster said that the lack of funcall and #' was
nearly his entire reason for preferring Scheme over CL. In the same
vein, I can imagine many newcomers looking at these examples and
questioning the additional cruft on the CL side. The answer, of
course, is that this cruft is the cost you pay for the luxury of
separate namespaces for "variable values" and "function values," if
I may use that terminology. Changing the name of the CL parameter
to "list" helps the examples to spotlight the strengths and
weaknesses of the different approaches taken by the two languages.


Phil Stubblefield
Rockwell Palo Alto Laboratory 650/325-7165
http://www.rpal.rockwell.com/~phil ph...@rpal.rockwell.com

d s f o x @ c o g s c i . u c s d . e d u

unread,
Sep 15, 1998, 3:00:00 AM9/15/98
to
Phil Stubblefield <ph...@rpal.rockwell.com> writes:

> to "list". Another poster said that the lack of funcall and #' was
> nearly his entire reason for preferring Scheme over CL. In the same
> vein, I can imagine many newcomers looking at these examples and
> questioning the additional cruft on the CL side. The answer, of
> course, is that this cruft is the cost you pay for the luxury of
> separate namespaces for "variable values" and "function values," if
> I may use that terminology.

Why is this a luxury? I honestly thought it was considered a mistake.
--
David Fox http://www.cat.nyu.edu/fox xoF divaD
UCSD HCI Lab ds...@cogsci.ucsd.edu baL ICH DSCU

Pierre Mai

unread,
Sep 15, 1998, 3:00:00 AM9/15/98
to
Erik Naggum <cle...@naggum.no> writes:

> | Paul Graham says in his book that one of the features of Common Lisp is
> | that it can extend itself. I am only a little way into the book, but I
> | like what I have seen so far. I just wish I had a free ANSI Common Lisp
> | implementation on NT that I could play with his examples on. Any
> | recommendations?
>

> why do you need NT? it takes less effort to install Linux and run
> Allegro CL 5.0 Trial Edition (an uncrippled version recently released)
> than it takes to find something free and high quality under NT.

Although I'd also highly recommend switching to Linux, Harlequin _is_
offering a high-quality, "free"[1] implementation of ANSI Common Lisp
for NT. Just go visit http://www.harlequin.com/, and follow the links
to Harlequin Lispworks Personal Edition (IIRC)...

Regs, Pierre

Footnotes:
[1] free as in "no money" for personal use, _not_ free as in GPL'd
software.

--
Pierre Mai <de...@cs.tu-berlin.de> http://home.pages.de/~trillian/
"Such is life." -- Fiona in "Four Weddings and a Funeral" (UK/1994)

Kent M Pitman

unread,
Sep 16, 1998, 3:00:00 AM9/16/98
to
Steve Gonedes <jgon...@worldnet.att.net> writes:

> I have no url for LispWorks (sorry) maybe someone will shout it out;
> a search engine will certainly pick it up.

http://www.harlequin.com/products/ads/lisp/

If you forget, of course, you can do like i just did and go to the
Harlequin web page at http://www.harlequin.com/


Kent M Pitman

unread,
Sep 16, 1998, 3:00:00 AM9/16/98
to
Barry Margolin <bar...@bbnplanet.com> writes:

> CL:
>
> (defun *-+ (x y)
> (funcall (compose #'* #'+) x y)
>
> Scheme:
>
> (define (*-+ x y)
> ((compose * +) x y))

And, in fairness, it's worth noting there's a price to this. Lots of
scheme code is afraid to use LIST, CONS, and other function names as
variables for fear of collisions. Hence, the following style of
dodge, which some of us find tacky, comes up a lot:

CL:

(defun list-cons (element list)
  (list element list))

Scheme:

(define (list-cons element lst)
  (list element lst))

You see it even in code that doesn't use the conflicting function,
such as

(define (adjoin element lst)
  (if (member? element lst) lst (cons element lst)))

because, I think, you worry it's too hard to see a stray use of a
function in a large body of code or because you worry a call to LIST
will later be added.


Kent M Pitman

unread,
Sep 16, 1998, 3:00:00 AM9/16/98
to
d s f o x @ c o g s c i . u c s d . e d u (David Fox) writes:

> Phil Stubblefield <ph...@rpal.rockwell.com> writes:
>
> > ... The answer, of


> > course, is that this cruft is the cost you pay for the luxury of
> > separate namespaces for "variable values" and "function values," if
> > I may use that terminology.
>
> Why is this a luxury? I honestly thought it was considered a mistake.

Because all evidence suggests that the human brain has no trouble at
all entertaining multiple namespaces and routinely seeks them. By
reducing the number of namespaces, you decrease the space of available
names on the basis of an unproven claim that the human mind is unable
to make distinctions like "the f in (f x) is different than the f in
(g f)". The fact that "buffalo buffalo buffalo" and even "buffalo
buffalo buffalo buffalo buffalo" and "buffalo buffalo buffalo buffalo
buffalo buffalo buffalo" are valid sentences in english (and speakers
can tell you which are the nouns and which are the verbs) suggests that
it is natural to make strong positional distinctions. In spanish, the
sentence "como como como" (i eat how i eat) is another example. It's
easy to generate less pathological examples. Thinking for about ten
seconds I can come up with: "A box cannot box a box but a man can box a
box."

The point not being that one always uses words together. Rather, the
point being that sentences like "I want to box." and "That is a box."
connote different meanings of box based on part of speech. Yet no
ambiguity results because people are trained from birth to know that
in different parts of speech, words may mean different things. This
allows one to reuse words in places where no ambiguity will result.
Then computer language designers come along and make foolish,
unfounded sentences like "to avoid confusion, we don't allow multiple
namespaces" when they really mean "because we think you'll avoid
confusion by it, we have cut off your flexibility to use certain words
in certain places and we now request that you tell others they'll be
confused until you've confused them into believing they are in fact
confused by the confusion i've confused you with."

Kent M Pitman

unread,
Sep 16, 1998, 3:00:00 AM9/16/98
to
"Jrm" <j...@rebol.com> writes:

>
>
> Kent M Pitman wrote in a message ...
> [comparison about dynamic-wind elided]
>
> I think I agree with Barry (barmar) that it is pretty easy to view
> unwind protect as a dynamic-wind with an empty entry thunk (in
> most cases).
>
> I was just concerned with the syllogism:
>
> 1. Languages without unwind-protect are not well thought out.
> 2. Scheme doesn't have unwind protect.
>
> I know you disagree with much of what the Scheme people say, but
> you can't say they don't think about what they are doing.

Heh. Let's be careful here. I didn't mean to say they don't think
about what they're doing. I meant to say it suggests to me they don't
think about what *I* (as a user of their language) am doing.

I saw a special on TV once about Bitstream, the font company, in which
someone pointed out that font design is not really about the design of
letters. It's about the design of words. The letters are right when
if you compose them, the words look good. By analogy, I believe
a language is right if, when you compose sentences in it, the sentences
look good. I tend to believe a language that doesn't offer me a way
to say the unwind-protect thing leaves me looking stupid when I want to
express this concept, and I personally feel undervalued by languages
that do not accommodate my need in this regard.
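
The concept, written out; the file name is invented, obviously:

  (let ((stream (open "/tmp/scratch.text" :direction :output
                      :if-exists :supersede)))
    (unwind-protect
        (write-line "do x" stream)   ; do x
      (close stream)))               ; be sure to do y when you're done,
                                     ; even if "do x" exits abnormally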

It's just my personal judgment. Your mileage may, of course, vary.

> >if you don't like my suggestion, though, what small test or tests would you
> >do if someone tossed you a manual to look at for ten minutes and then asked
> >you to say if the language you were looking at was thoughtful?
>
> I look for the ability to write continuation-passing style code without
> too much pain. If you can do that, then the language is pretty powerful
> even if the designers didn't intend or understand it. (I use the term
> "designer" loosely; too many languages these days are not designed in any
> sense of the word.)

This is a fair thing for any individual to want. I just think the set
of people wanting to say "do x and be sure to do y when you're done"
is large because it uses only words taught to first-graders. And the
set of people wanting to say "construct me a representation of the
function which represents the things i'd planned to do later so that
later i can do (or redo) those things even if i am already in the
process of doing them" is, I suspect, much smaller. It's not that I doubt your desire or ability
to say this. I guess it just reflects my belief that call/cc is much
farther down the luxury chain and that even though unwind-protect is
often not taught to novice programmers, it should be. I've heard
people make the same claim about continuations. But I'll believe it's
appropriate when I see pre-programming children adept in the concept.
Same with tail recursion over loop, btw. The things I think belong as
ESSENTIALS in the language are the terms people arrive to programming
with intuitions about. Programming should be an extension to our
basic way of thinking, not a replacement. It's fine to add call/cc to
the set of ways you can think about things, but before you call it a
natural way to think about things, I think the burden is to show
examples of ordinary people who naturally DO think about things this
way long before they are programmers.

Just my opinion.

David Steuber The Interloper

unread,
Sep 16, 1998, 3:00:00 AM9/16/98
to
On 15 Sep 1998 06:42:39 GMT, dje...@well.com (Darius Bacon) claimed or
asked:

% (What's this doing in comp.lang.lisp?)

I cross posted the original question. The results turned out to be
the perfect demonstration of why _not_ to cross post. You can bet I
won't make that mistake again for a while.

--
David Steuber
http://www.david-steuber.com
To reply by e-mail, replace trashcan with david.

When the long night comes, return to the end of the beginning.
--- Kosh (???? - 2261 AD) Babylon-5

David Steuber The Interloper

unread,
Sep 16, 1998, 3:00:00 AM9/16/98
to
On 15 Sep 1998 10:01:24 +0000, Erik Naggum <cle...@naggum.no> claimed
or asked:

% * tras...@david-steuber.com (David Steuber "The Interloper")
% | It seems to be implied that if you want to implement Common Lisp, it
% | should be done in Common Lisp. Also, why can't byte compiled Common Lisp
% | optimize your functions as well as its own?
%
% because the byte-code interpretation machinery must add _some_ overhead
% to the execution, and the more byte-code, the slower it runs, as in
% wholly user-defined functions. the more stuff is implemented in the real
% implementation language, the faster it runs. this skews the way people
% look at performance, and which features they'll use. when abstraction is
% expensive, people don't do it. when abstraction is cheaper than not
% doing it, the benefits become more obvious.

I would like to see Common Lisp running in the Java Virtual Machine.
That means all of it will be byte code anyway. That should make
everything equal.

% | I have to wonder how much of Common Lisp must be implemented as native,
% | and how much can be implemented in Common Lisp.
%
% this depends on how you implement your "native" functions. it's possible
% to write a proto-Lisp that a simple compiler compiles to machine code and
% then implement enough Common Lisp in proto-Lisp to be able to build a new
% system entirely in Common Lisp. (this proto-Lisp can in fact be another
% language entirely, but then it's _more_ work to build the compiler.)

By "native", I mean something the underlying machine understands. In
my case, the JVM. For example, there needs to be some way to
represent a CONS, CAR, and CDR. In Java, I might write something like
this:

public class cons {
    public Object car;   // can hold any object, including another cons
    public Object cdr;
}

The variables car and cdr can hold any object, including a cons. This
gives me my primitive cons type. In a real Java class, I would make
car and cdr private and define a bunch of members. I could provide
for a consp predicate that returns t if the object is a cons.
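If I understand the CL side right, the behavior that predicate needs to
match is just the standard CONSP:

  (consp (cons 1 2))   ; => T
  (consp '(a b c))     ; => T, a list is a chain of conses
  (consp '())          ; => NIL, the empty list is not a cons
  (consp "a string")   ; => NIL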

% why do you need NT? it takes less effort to install Linux and run
% Allegro CL 5.0 Trial Edition (an uncrippled version recently released)
% than it takes to find something free and high quality under NT.

I have a whole bunch of software, including this news reader, that
requires NT or Windows 95 to run. If I could drop Microsoft's OS, I
would.

David Steuber The Interloper

unread,
Sep 16, 1998, 3:00:00 AM9/16/98
to
On Tue, 15 Sep 1998 17:46:15 GMT, Barry Margolin
<bar...@bbnplanet.com> claimed or asked:

% So how do you implement CAR and CDR in Lisp? I recall that Lisp Machines
% had definitions like:
%
% (defun car (x)
% (car x))

What what what? Isn't that a tautology? And how would defun be
defined?

(defun defun (x)
(defun x))

Jussi Piitulainen

unread,
Sep 16, 1998, 3:00:00 AM9/16/98
to
Kent M Pitman <pit...@world.std.com> writes:

> And, in fairness, it's worth noting there's a price to this. Lots
> of scheme code is afraid to use LIST, CONS, and other function names
> as variables for fear of collisions. Hence, the following style of
> dodge, which some of us find tacky, comes up a lot:

.
.


> (define (adjoin element lst)
> (if (member? element lst) lst (cons element lst)))

It's a pretty small price, at least to those of us who find this CL
practice distasteful anyway.

I'm a Schemer but I feel that the name `lst' is ugly; I never use it
but I wouldn't use `list' either. Instead, I borrow from Prolog
culture here and use plurals of more or less mnemonic variable names
for lists,

(define (adjoin e es) (if (member e es) es (cons e es)))
;; yes, I often use such short variable names

or else I use some name that describes the collection better than just
the fact that it is implemented as a list,

(define (adjoin element set)
(if (member element set)
set
(cons element set))),

which, of course, is another example of the tension, `set' being an
assignment form in some related languages.

The bottom line for me is that good choice of names requires careful
thought anyway, whatever the number of namespaces, since the code is
for humans to read, and there will always be clashes. It's a pretty
small price.

(Re clashes, a friend once tried to use `auto' as a variable name in C
code that dealt with automobiles. It's a reserved word in C, though
never? actually used.)
--
Jussi

Marco Antoniotti

unread,
Sep 16, 1998, 3:00:00 AM9/16/98
to

Well! Since I see the Scheme/CL flame war re-igniting, I'll jump in. :)

MORE COMPARISONS BETWEEN SCHEME (R^nRALS) AND CL

Common Lisp Scheme

(defclass bird ()                               ; Let's go ahead and
  ((plumage-color :accessor color)))            ; implement yet
                                                ; another OO Scheme extension
                                                ; hoping it'll make it into
                                                ; the next^n Revised^n Report

(defclass ostrich (bird) ())

(defmethod fly ((b bird))
  (format *standard-output*
          "Volare! Oh-oh!~%"))

(defmethod fly ((b ostrich))
  (error 'non-flying-bird :bird b))

(define-condition non-flying-bird (error)       ; Conditions/Exceptions?
  ((bird :reader grounded-bird :initarg :bird)) ; In Scheme?
  (:report (lambda (c s)
             (format s "Bird ~S can't fly."
                     (grounded-bird c)))))

Of course I am being unfair. But if you need OO and you want
parenthesis, you have no choice. :)

--
Marco Antoniotti ===========================================
PARADES, Via San Pantaleo 66, I-00186 Rome, ITALY
tel. +39 - (0)6 - 68 10 03 16, fax. +39 - (0)6 - 68 80 79 26
http://www.parades.rm.cnr.it

Klaus Schilling

unread,
Sep 16, 1998, 3:00:00 AM9/16/98
to
Marco Antoniotti <mar...@galvani.parades.rm.cnr.it> writes:

> Well! Since I see the Scheme/CL flame war re-igniting, I'll jump in. :)
>
> MORE COMPARISONS BETWEEN SCHEME (R^nRALS) AND CL
>
> Common Lisp Scheme
>
> (defclass bird () ; Let's go ahead and
> ((plumage-color :accessor color)) ; implement yet
> ; another OO Scheme extension
> ; hoping it'll make in
> ; the next^n Revised^n Report
>
> (defclass ostrich (bird))
>
> (defmethod fly ((b bird))
> (format *standard-output*
> "Volare! Oh-oh!~%"))
>

(defmethod sing ((b bird))
  (format *standard-output*
          "Cantare! Oh-oh-oh-oh!~%"))


> (defmethod fly ((b ostrich))
> (error 'non-flying-bird :bird b))
>
> (define-condition non-flying-bird (error) ; Conditions/Exceptions?
> ((bird :reader grounded-bird :initarg :bird)); In Scheme?
> (:report (lambda (c s)
> (format s "Bird ~S can't fly."
> (grounded-bird c))))
> )
>

(format s "Nel blu`, negli occhi tuoi blu`, felice di stare quaggiu`.")

> Of course I am being unfair. But if you need OO and you want
> parenthesis, you have no choice. :)
>

The current situation in scheme with many different object systems, one for
each taste, is much better: yasos, bos, mos, tiny-clos, meroon, meroonet,
scoops, miniops, ...

Klaus Schilling

brl...@my-dejanews.com

unread,
Sep 16, 1998, 3:00:00 AM9/16/98
to
In article <6tmh2c$ufe$1...@news.iastate.edu>,

lea...@cs.iastate.edu (Gary T. Leavens) wrote:
> (defun my-cadr (ls)                  (define my-cadr (ls)
>   (funcall                             ((compose car cdr)
>     (compose #'car #'cdr)               ls))
>     ls))

(define (my-cadr ls) ...)

-----== Posted via Deja News, The Leader in Internet Discussion ==-----
http://www.dejanews.com/rg_mkgrp.xp Create Your Own Free Member Forum

Reini Urban

unread,
Sep 16, 1998, 3:00:00 AM9/16/98
to
brl...@my-dejanews.com wrote about:
>> http://www.well.com/user/djello/scheme-for-lispers.html

>One nit: Exclamation points are specifically about mutation, not side effects
>in general. For example, the write and display procedures don't have
>exclamation points.

Are there any more (written or unwritten) rules about special prefixes
for internal lisp functions, or even docs in sources describing them?
esp. the "%" prefix caught my attention.

cmucl is thankfully very conservative, using only % for internals, but i
know other lisps or schemes which sometimes use other special prefixes
on internal functions.
some use them so heavily that they really look like a c mess. esp.
scheme, but some lisps also.

% cmucl, openlisp: internal helpers not exported to the user
cmucl, clisp: system specific internal funcs (c side) also
sometimes also used for hooks even on the user side.
xlisp: only opcode operators (there are no internals,
everything is defined in c)

%% probably very internal (very dirty c habits :)

! only seen as suffix in scheme with mutations so far.
in gambit as prefix also.

& yes! i've seen that with functions too.

$ ?? ie in guile: $cos
gambit: $code

_ very bad c habit, seen in gambit

__ bigloo: a feature? module names
gambit: c defines

# scheme: special symbols as #f #t ...
    thankfully reserved for the reader in lisp

## gambit: Unstable additions (user side) and all
system internals

this is of course highly implementation specific.
I usually don't care about internals like this, since lisp thankfully
uses declarative prefixes on the user side.
I personally use only % for internals in my library.
just curious how these guys arrived at these, or whether there is anything
carved into stone, meaning written down.

---
Reini Urban
http://xarch.tu-graz.ac.at/autocad/news/faq/autolisp.html

Reini Urban

unread,
Sep 16, 1998, 3:00:00 AM9/16/98
to
some corrections, more insights:

I posted:


>! only seen as suffix in scheme with mutations so far.
> in gambit as prefix also.

corr: not found in gambit as prefix. sorry.

>& yes! i've seen that with functions too.
>
>$ ?? ie in guile: $cos
> gambit: $code

marc feeley: old lisp convention for names introduced by a macro.

>_ very bad c habit, seen in gambit

corr: not found in gambit. sorry.

>__ bigloo: a feature? module names
> gambit: c defines

corr: in gambit only in c-code as ____

># scheme: special symbols as #f #t ...
> thankfully reserved for the reader in lisp
>
>## gambit: Unstable additions (user side) and all
> system internals

very gambit specific, because it's illegal in other lisps/schemes.
Now that's a GOOD reason!

Klaus Schilling

unread,
Sep 16, 1998, 3:00:00 AM9/16/98
to
Pierre Mai <de...@dent.isdn.cs.tu-berlin.de> writes:

> Erik Naggum <cle...@naggum.no> writes:
>
> > | Paul Graham says in his book that one of the features of Common Lisp is
> > | that it can extend itself. I am only a little way into the book, but I
> > | like what I have seen so far. I just wish I had a free ANSI Common Lisp
> > | implementation on NT that I could play with his examples on. Any
> > | recommendations?
> >

> > why do you need NT? it takes less effort to install Linux and run

> > Allegro CL 5.0 Trial Edition (an uncrippled version recently released)

> > than it takes to find something free and high quality under NT.
>

> Although I'd also highly recommend switching to Linux, Harlequin _is_
> offering a high-quality, "free"[1] implementation of ANSI Common Lisp
> for NT. Just go visit http://www.harlequin.com/, and follow the links
> to Harlequin Lispworks Personal Edition (IIRC)...
>
> Regs, Pierre
>
> Footnotes:
> [1] free as in "no money" for personal use, _not_ free as in GPL'd
> software.

The GNU sense of free is the only acceptable one.
gcl and clisp are free (but maybe not ANSI-compliant), Harlequin ain't.

Klaus Schilling

Ray Dillinger

unread,
Sep 16, 1998, 3:00:00 AM9/16/98
to
Erik Naggum wrote:

<a long and convoluted flame on schemers.... deleted for reasons
of taste and space>


Geez, Erik, what did you have to go and do that for?

Here someone posted something that I thought might lead somewhere
useful, and you had nothing better to do than make sure that the
flames drown out any useful info that might be posted?

Some of us are actually interested in hearing about the design
philosophies and differences, you know? Now because of your
post we'll have to slog through mounds and mounds of crap to
find anything.

Ray

Stig Hemmer

unread,
Sep 16, 1998, 3:00:00 AM9/16/98
to
Klaus Schilling <Klaus.S...@home.ivm.de> writes:
> The GNU sense of free is the only acceptable one.

So, you want to restrict the way people use the English language.

Why this censorship?

Stig Hemmer,
Jack of a Few Trades.

Erik Naggum

unread,
Sep 16, 1998, 3:00:00 AM9/16/98
to
* Ray Dillinger <be...@sonic.net>

| Now because of your post we'll have to slog through mounds and mounds of
| crap to find anything.

step back one article in the thread and try this line again. it should
fit equally well. it might actually fit equally well to your article.

I wonder why Scheme adherents are helpless to respond, while Common Lisp
users who don't like the Scheme wars on Common Lisp are causing them to
respond the way they do. it looks like you use disagreement as the basis
of your blaming people for most everything you don't like. just quit it.

#:Erik
--
http://www.naggum.no/spam.html is about my spam protection scheme and how
to guarantee that you reach me. in brief: if you reply to a news article
of mine, be sure to include an In-Reply-To or References header with the
message-ID of that message in it. otherwise, you need to read that page.

Barry Margolin

unread,
Sep 16, 1998, 3:00:00 AM9/16/98
to
In article <35ff1e74....@news.newsguy.com>,

David Steuber "The Interloper" <tras...@david-steuber.com> wrote:
>On Tue, 15 Sep 1998 17:46:15 GMT, Barry Margolin
><bar...@bbnplanet.com> claimed or asked:
>
>% So how do you implement CAR and CDR in Lisp? I recall that Lisp Machines
>% had definitions like:
>%
>% (defun car (x)
>% (car x))
>
>What what what? Isn't that a tautology? And how would defun be

Most compilers open-code calls to functions like CAR, so the above means
"Create a function object that contains the code that the compiler
generates for CAR and put it in the global function binding of the CAR
symbol."

--
Barry Margolin, bar...@bbnplanet.com
GTE Internetworking, Powered by BBN, Burlington, MA
*** DON'T SEND TECHNICAL QUESTIONS DIRECTLY TO ME, post them to newsgroups.

Kent M Pitman

unread,
Sep 16, 1998, 3:00:00 AM9/16/98
to
Klaus Schilling <Klaus.S...@home.ivm.de> writes:

> Pierre Mai <de...@dent.isdn.cs.tu-berlin.de> writes:
>
> > [1] free as in "no money" for personal use, _not_ free as in GPL'd
> > software.
>

> The GNU sense of free is the only acceptable one.

Hmmm. Are you saying it's the `only acceptable sense of free' for
your personal situation right now, or are you making a
political/religious statement that implicitly seeks to limit the
entire Lisp community to this point of view?

Barry Margolin

unread,
Sep 16, 1998, 3:00:00 AM9/16/98
to
In article <sfw90jk...@world.std.com>,

Kent M Pitman <pit...@world.std.com> wrote:
>The point not being that one always uses words together. Rather, the
>point being that sentences like "I want to box." and "That is a box."
>connote different meanings of box based on part of speech. Yet no
>ambiguity results because people are trained from birth to know that
>in different parts of speech, words may mean different things. This
>allows one to reuse words in places where no ambiguity will result.
>Then computer language designers come along and make foolish,
>unfounded sentences like "to avoid confusion, we don't allow multiple
>namespaces" when they really mean "because we think you'll avoid
>confusion by it, we have cut off your flexibility to use certain words
>in certain places and we now request that you tell others they'll be
>confused until you've confused them into believing they are in fact
>confused by the confusion i've confused you with."

Maybe it's retroactive justification, but the impression I've gotten from
many "Lisp-1'ers" is that it's more than just "to avoid confusion", but
there's a real philosophical reason. In the Lisp family, since functions
are first-class, they're often considered as nouns, not verbs, and it would
be best to use the same phrasing in both contexts.

However, like Kent, I don't happen to agree with this. In English we don't
have much problem turning a verb into a noun by adding the "to" prefix,
i.e. "box" is a verb and must be used in the predicate of a sentence, while
"to box" is a noun phrase and may be used most places where nouns are
allowed. The "#'" prefix of Common Lisp is analogous to the "to" prefix of
English (just as the "'" prefix of practically the entire Lisp family of
languages is analogous to quoting in many natural languages -- they allow
you to talk *about* tokens rather than use them for their meaning).
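
To make the analogy concrete:

  (length '(a b c))   ; => 3 -- LENGTH used as a verb
  #'length            ; the function object itself, roughly "to length"
  'length             ; the symbol LENGTH, i.e. talking *about* the token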

Kent M Pitman

unread,
Sep 16, 1998, 3:00:00 AM9/16/98
to
Barry Margolin <bar...@bbnplanet.com> writes:

> Maybe it's retroactive justification, but the impression I've gotten
> from many "Lisp-1'ers" is that it's more than just "to avoid
> confusion", but there's a real philosophical reason. In the Lisp
> family, since functions are first-class, they're often considered as
> nouns, not verbs, and it would be best to use the same phrasing in
> both contexts.

(Note also I've made the claim that making this change virtually forces
the need to have hygienic macros, and they don't seem to mind that where
CL people would flip out. I won't repeat that argument here. I've promised
to write it up sometime and will, eventually.)
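
A tiny illustration of the interaction, with a made-up macro:

  (defmacro pair-up (a b) `(list ,a ,b))

  (let ((list 'not-the-function))
    (pair-up list list))
  ;; => (NOT-THE-FUNCTION NOT-THE-FUNCTION) in CL, because the LIST in the
  ;; expansion's function position can't be captured by the variable LIST.
  ;; Collapse the namespaces and you need hygiene to get the same guarantee.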

I think languages pick some axioms from which to start. Scheme picked
the idea of "functional abstraction" as central, and that's what leads
to functions being passed all over in the manner you describe. It's
used in all kinds of places CL programmers would use different
solutions (CLOS; powerful iteration notations like DO, DOLIST,
DOTIMES, and LOOP; multiple inheritance; custom macros; keyword args;
special variables; overloaded types, such as symbols, which you can
both funcall and put properties on; and so on). The decision by
Scheme leaves functions called so much that they feel it's
inappropriate not to make the syntax utterly streamlined. The
decision by CL leaves functional arguments only one of many ways to
solve the general problem of abstraction--CL programmers use
functional arguments [I use them often many times in a day] but not to
the exclusion of all else.

CL has a syntax that is streamlined to
(a) alert you to the presence of functional uses when present and (b)
keep you from paying the namespace and macro price scheme pays for
having made the decision differently and (c) allow you easy access to
both the function and the variable in cases where you want to
write (defun foo (list) (list list)). (Of course, the real case is
never this simple. The real case is usually
(defun foo (list)
... first and only use of list ...
...50 lines of code...
... first and only use of function list ...))
The idea that there might be a call to LIST ever in your code causes
Scheme programmers never to use LIST as a variable name out of what
in CL would seem like "random paranoia", though in Scheme the reason
makes sense. Perosnally, I think it violates referential transparency
if any free reference to an apparent variable might turn out to be
"customized" (or "interfered with", depending on your point of view)
by a surrounding variable binding. I like the fact that I can look
at Lisp code and mostly know what's going on without worrying any
call to an apparent function might be a variable.


I doubt you and I are disagreeing. I'm just clarifying that any
claim about the "statistics" of how often functions are called
and therefore how often you need Lisp1-style syntax are tainted
by the kinds of things Scheme people program about and the kinds of
things Lisp people do. Admittedly it's also biased by what's
there, but you rarely hear complaints from users who are not
trying to either create for the 10,000th time an implementation
of the lambda calculus or church numerals or some such, or who are
not "used to programming in scheme and unable to get back into
any other style". Users who have not been through Scheme and who
learn Lisp directly from someone who is not biased usually just
make note of the fact and get on with their programming. Even in
algebraic-syntax langauges, when I run across a new language,
I always ask a few basic questions that include "are semicolons
separators or terminators", "are names case-sensitive", and
"are function names in the same space as variable names". I don't
a priori care which the answer is for new languages I learn
not because it doesn't make a difference, but because I (perhaps
naively) assume that the designer of the language in question
made a good decision for the context of the language and then I
look to see if that turns out to be so. I think CL stands up
to the test. In particular, it permits but does not favor functional
abstraction, which leads people to examine all of their options and
to use an appropriate option for the circumstance. That choice
is influenced by a great many things in CL, which offers me the
flexibility I want.

Scheme probably made a good decision for what it's trying to do. Some
of the things we did in T with function names and object message
handlers wouldn't have worked in a Lisp2 because they relied on
punning in a way that Lisp2 doesn't let you. And some of the things
CL does won't work in Lisp1 because of other decisions. That's
exactly why I always emphasize these are different languages. They
are trying to do different things for different reasons. Forcing
them into the same mold because they have some remaining operator
names in common confuses the matter.

I don't like the idea of things being "the right way to go"
without a surrounding context. (Not that you, Barry, are doing
that. But it is done.) So identifying the context and the
relationship of the operators helps you see why the decision
could be reasonable in both cases.

Again a favorite quote:
There are two kinds of people in the world:
People who think there are two kinds of people,
and people who don't.

I feel that many Scheme people are often trying to dig out the latter
camp, leaving me no place to be happy with alternatives. Often they
use "CL is bad" as a veritable lemma, requiring no demonstration.
Many never seem satisfied by just having made good decisions for
themselves, but often go on to assert what is right for all of Lisp.
And that often leads me to want to both partition Lisp from them, and
in general to get contentious. But I don't want to tell them what to
do (except with my Scheme author's hat on, working in their framework,
where I still like having my vote). They're entitled to their
language and even though I have quibbles with it, I don't spend any
time at all telling them they should have two namespaces. I respect
decisions like that and wish they would be more respectful in return.

Some great things have been done in Scheme. But the same is true of CL.
So I kinda like the "there are two (perhaps many) kinds of people (or
namespaces)" approach.


Per Bothner

unread,
Sep 16, 1998, 3:00:00 AM9/16/98
to
In article <87btoj5...@ivm.de>,
Klaus Schilling <Klaus.S...@home.ivm.de> wrote:
>tras...@david-steuber.com (David Steuber "The Interloper") writes:
>>
>> I wonder if, with the support of some Java code, a Scheme compiler (or
>> CL) could be written to produce a .class file. The byte code and file
>> format is biased heavily towards Java. But it would be cool to see if
>> Scheme can be translated to Java byte code. It would be even cooler
>> to see what a Java decompiler makes of it.
>
>Per Bothner from Cygnus Solutions is working on that . (Kawa project).

True, in the sense that I'm continually improving Kawa, but some
people may not realize from your phrasing that Kawa *is* available
(freely), and a number of people are using Kawa for real work.

See http://www.cygnus.com/~bothner/kawa.html.
--
--Per Bothner
Cygnus Solutions bot...@cygnus.com http://www.cygnus.com/~bothner

Pierre Mai

unread,
Sep 17, 1998, 3:00:00 AM9/17/98
to
Klaus Schilling <Klaus.S...@home.ivm.de> writes:

> The current situation in scheme with many different object systems, one for
> each taste, is much better: yasos, bos, mos, tiny-clos, meroon, meroonet,
> scoops,miniops, ...

Klaus Schilling <Klaus.S...@home.ivm.de> writes:

> The GNU sense of free is the only acceptable one.

> gcl and clisp are free (but maybe not ansi-compliant) , harlequin
> ain't.

Although maybe I shouldn't comment, I'll do anyway ;) :

IMHO you seem a tad too certain of your conclusions, without giving
any kind of supportive arguments.

Also, whilst handling GPL'd software, you seem to have gotten infected
with MSC (Moral Superiority Complex), a spreading disease in the
free-software community, it seems.

I've written and contributed to programs under the GPL and other
(arguably "more free") licences, several of my clients are using free
software because of my advice, yet I really think that you are doing
the free software community a great disservice by putting the GPL and
it's "definition" of free above anything else. Sorry, but you are not
going to convince me, or most anyone else here, that the GPL is the
"only true and good" licence or something of this sort. Especially
you are not going to convince me, that CMU CL, which is in the public
domain, is any less free or "acceptably free" than clisp or gcl.

And what's more, there are more dimensions to the evaluation of
software and companies/authors than their licencing policies. You are
also not going to convince me that I should value perl more than
Harlequin's Lispworks or Dylan, or Franz' ACL, just because perl is
distributed under the GPL (and/or the Artistic Licence).

For further comments on your attitude, re-read Erik Naggum's response
to your previous post on CL & CORBA, <31139157...@naggum.no>.

Regs, Pierre.

Christopher Browne

unread,
Sep 17, 1998, 3:00:00 AM9/17/98
to
On Wed, 16 Sep 1998 22:04:26 GMT, Kent M Pitman <pit...@world.std.com>
wrote:
>Klaus Schilling <Klaus.S...@home.ivm.de> writes:
>
>> Pierre Mai <de...@dent.isdn.cs.tu-berlin.de> writes:
>>
>> > [1] free as in "no money" for personal use, _not_ free as in GPL'd
>> > software.
>>
>> The GNU sense of free is the only acceptable one.
>
>Hmmm. Are you saying it's the `only acceptable sense of free' for
>your personal situation right now, or are you making a
>political/religious statement that implicitly seeks to limit the
>entire Lisp community to this point of view?

He's stating "true reality." Any other perception of reality is evil and
unacceptable, at least according to his perceptions...

See other postings in gnu.discuss.misc if you wish to interpret this
more clearly; it's certainly not intended to be a limitation "merely" to
the Lisp community...

--
"...Deep Hack Mode--that mysterious and frightening state of
consciousness where Mortal Users fear to tread." (By Matt Welsh)
cbbr...@ntlug.org- <http//www.ntlug.org/~cbbrowne/lsf.html>

David Steuber The Interloper

unread,
Sep 17, 1998, 3:00:00 AM9/17/98
to
On 15 Sep 1998 14:34:35 +0200, Pierre Mai
<de...@dent.isdn.cs.tu-berlin.de> claimed or asked:

% Although I'd also highly recommend switching to Linux, Harlequin _is_
% offering a high-quality, "free"[1] implementation of ANSI Common Lisp
% for NT. Just go visit http://www.harlequin.com/, and follow the links
% to Harlequin Lispworks Personal Edition (IIRC)...
%
% Regs, Pierre
%
% Footnotes:
% [1] free as in "no money" for personal use, _not_ free as in GPL'd
% software.

I am downloading it now. It is cripple ware. It may be enough for me
to learn on, but I don't think I can create another compiler with it.

Marco Antoniotti

unread,
Sep 17, 1998, 3:00:00 AM9/17/98
to
Klaus Schilling <Klaus.S...@home.ivm.de> writes:

> Marco Antoniotti <mar...@galvani.parades.rm.cnr.it> writes:
>
> > Well! Since I see the Scheme/CL flame war re-igniting, I'll jump in. :)
> >
> > MORE COMPARISONS BETWEEN SCHEME (R^nRALS) AND CL
> >

> > Of course I am being unfair. But if you need OO and you want
> > parenthesis, you have no choice. :)
> >
>

> The current situation in scheme with many different object systems, one for
> each taste, is much better: yasos, bos, mos, tiny-clos, meroon, meroonet,
> scoops,miniops, ...

The situation in CL is similar: you have Flavors, CLOS, QSL (ok, you
don't know what it is, but I do :) ), KRL, etc. etc. etc., plus all the
reimplementations (if you want to do them) of most of the Scheme OO stuff
you just mentioned.

But your CLOS application runs everywhere, it is ANSI, and it is
consistent in form and function. Furthermore, it is already
implemented in all the CLs out there. Try to beat that.

Now you have the following situation: in your Scheme system you need
to use application (A) which uses yasos, but you also need application
(B) which uses scoops. This is bound to happen much more often than
not in the Scheme world, and suddenly you have to deal with a lot of
complications which would not be there, had the R^nRALS group
decided on *an* object system.

There is value in diversity of applications and languages, but there
is also value in being able to communicate easily. Spedire messaggi in
Italiano e` semanticamente equivalente allo spedire messaggi in
Inglese su una Bacheca Virtuale. (Sending messages in Italian is
semantically equivalent to sending messages in English on a Virtual
Bulletin Board.) Yet, using English facilitates conveying my intentions :)

Cheers

Tim Bradshaw

unread,
Sep 17, 1998, 3:00:00 AM9/17/98
to
* Kent M Pitman wrote:
> Because all evidence suggests that the human brain has no trouble at
> all entertaining multiple namespaces and routinely seeks them. By
> reducing the number of namespaces, you decrease the space of available
> names on the basis of an unproven claim that the human mind is unable
> to make distinctions like "the f in (f x) is different than the f in
> (g f)". The fact that "buffalo buffalo buffalo" and even "buffalo
> buffalo buffalo buffalo buffalo" and "buffalo buffalo buffalo buffalo
> buffalo buffalo buffalo" are valid sentences in english (and speakers
> can tell you which are the nouns and which are the verbs) suggests that
> it is natural to make strong positional distinctions.

Perhaps in American English -- can someone tell me what these mean, as
I think they must depend on terms I don't know!

I always liked: `John had had had where Jane had had had had had had
had had the examiner's approval'. But that needs lots of quotes and
punctuation to fly really.

Anyway, the multiple namespace / single namespace thing caused me to
spend many hours trying to debug some scheme program where I'd bound
something like `car'. This was in code I'd semi-mechanically
translated from CL I think, and it wasn't helped by the very
uninformative error the implementation gave (which I think was a core
dump...).

All this, of course, is as nothing to the awful consequences of
binding something like `car' in a single-namespace dynamically-scoped
lisp. I think that Cambridge Lisp was like that: at least I remember
*very* obscure things caused by accidentally binding some internal
function of a system we used that was implemented in Cambridge Lisp.
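
Here is a minimal sketch, not from the original post, of the failure mode
described above -- an accidental local binding of `car' in a
one-namespace Lisp:

  ;; In a single-namespace Lisp a local variable named CAR shadows the
  ;; procedure CAR for every call in its scope -- including calls buried
  ;; in code translated from CL.
  (define (firsts lists)
    (let ((car '(a b c)))          ; innocent-looking local binding
      (map (lambda (l) (car l))    ; error: CAR is now a list, not a procedure
           lists)))
  ;; (firsts '((1 2) (3 4)))  =>  "attempt to apply non-procedure", or worse

In a Lisp-2, a variable and a function of the same name live in different
namespaces, so this particular mistake cannot arise in this way.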

--tim

Dimitrios Souflis

unread,
Sep 17, 1998, 3:00:00 AM9/17/98
to
Erik Naggum wrote:
>
> step back one article in the thread and try this line again. it should
> fit equally well. it might actually fit equally well to your article.
>
> I wonder why Scheme adherents are helpless to respond, while Common Lisp
> users who don't like the Scheme wars on Common Lisp are causing them to
> respond the way they do. it looks like you use disagreement as the basis
> of your blaming people for most everything you don't like. just quit it.

To all flamers:
I never could understand why obviously intelligent and professionally
competent people flame each other. I can't imagine the actual designers of
CL and Scheme calling each other names or being impolite over which
language is "superior".

Scheme has its place in the world. It builds upon elementary lambda-calculus
notions and expresses many constructs as derived from them. For example LET,
LETREC and LET* are expressible using function application. To some, myself
included, this minimalism is considered elegant. Maybe it is that my feeble
mind cannot hold many things at once, and having a small number of orthogonal
concepts to deal with is the only way I can get on.
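
As a concrete illustration of that minimalism (an added example, not part
of the original post), LET and LET* reduce to nothing more than lambda
and application:

  ;; (let ((x 1) (y 2)) (+ x y)) is just the application of an anonymous
  ;; procedure to the initial values:
  ((lambda (x y) (+ x y)) 1 2)                      ; => 3

  ;; LET* is nested LETs, i.e. nested applications:
  ((lambda (x) ((lambda (y) (+ x y)) (* x 2))) 1)   ; => 3
  ;; same as (let* ((x 1) (y (* x 2))) (+ x y))

  ;; LETREC needs one extra ingredient (assignment or a fixed-point
  ;; combinator) on top of plain application, but the spirit is the same.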

Common Lisp also has its place in the world. It's designed to cater to every
need LISP programmers have encountered in the past 30 years. For some people,
having a wide variety of ready-made constructs to confront their problems
with is considered powerful, and mastering them all is considered rewarding.

--
Dimitrios Souflis dsou...@altera.gr
Altera Ltd. http://www.altera.gr/dsouflis

*** Reality is what refuses to disappear when you stop believing
*** in it (VALIS, Philip K. Dick)

Dimitrios Souflis

unread,
Sep 17, 1998, 3:00:00 AM9/17/98
to
Kent M Pitman wrote:
>
> Programming should be an extension to our
> basic way of thinking, not a replacement. It's fine to add call/cc to
> the set of ways you can think about things, but before you call it a
> natural way to think about things, I think the burden is to show
> examples of ordinary people who naturally DO think about things this
> way long before they are programmers.

Engineers of any kind *do not* think in a "natural way" in matters of
their profession. They are educated people whose understanding of
processes and phenomena (an understanding missing from everyday
endeavors, where the "natural way to think about things" is amply
adequate) enables them to make effective use of scientific knowledge
that is counter-intuitive from the viewpoint of a layman.

If engineering were done the way "ordinary people <...> think about
things <...> long before they are" engineers/scientists, we wouldn't
have gone very far.

Pierre Mai

unread,
Sep 17, 1998, 3:00:00 AM9/17/98
to
tras...@david-steuber.com (David Steuber "The Interloper") writes:

> % to Harlequin Lispworks Personal Edition (IIRC)...
> %
> % Regs, Pierre
> %
> % Footnotes:
> % [1] free as in "no money" for personal use, _not_ free as in GPL'd
> % software.
>
> I am downloading it now. It is cripple ware. It may be enough for me
> to learn on, but I don't think I can create another compiler with it.

You asked for a free version of ANSI CL for Windows which would allow
you to learn:

> | like what I have seen so far. I just wish I had a free ANSI Common Lisp
> | implementation on NT that I could play with his examples on. Any
> | recommendations?

And the only restrictions of LispWorks PE that I know about are the
limitation of the runtime to (IIRC) 5 hours, and I think you cannot dump
core (but there I may be wrong). Plenty for getting to know ANSI CL,
and even to develop a lot of code for personal use!

And you have the simple path of upgrading to the commercial versions
of LispWorks!

If you want a free, mostly-compliant version of ANSI CL for personal
use, Allegro CL 5.0 for Linux is available at Franz's site. BTW, wrt
your comments that you cannot switch to Linux: a) You can run Linux
side-by-side with NT, if you choose to do so, and b) especially wrt
Internet software, but also wrt most other non-specialised software,
there is plenty of free[1] software to fill all your needs, most of
the time more functional, stable and user-friendly[2].

If OTOH your needs can be satisfied by a somewhat outdated,
byte-compiled implementation of CL (CLTL1 with some additions from
CLTL2), clisp is available for Windows, and is free under the GPL.

Maybe GCL has also been ported to Windows, but I don't know.

Regs, Pierre.

Footnotes:
[1] this time free as in GPL, or even more free, as in BSD-style or
public domain software.

[2] This message is being written using Gnus, a News and Mail-reader
for Emacs/XEmacs, one of several really useable News/Mail-readers for
Unix, while I've yet to meet a really useable News/Mail-reader for
Windows (besides Gnus, that is, which also sort of runs under
NTEmacs... ;).

Kent M Pitman

unread,
Sep 17, 1998, 3:00:00 AM9/17/98
to
[ replying to comp.lang.lisp only. reason in
http://world.std.com/~pitman/pfaq/cross-posting.html ]

Dimitrios Souflis <dsou...@altera.gr> writes:

> I never could understand why obviously intelligent and professionally

> competent people flame each other. [...]

Because from time to time, certain individuals expound opinions
that are framed in a way that requires certain people to either
tacitly agree through silence or to react in an alarmed way.
(The provocation is called "passive aggressiveness" and the
act of it often goes unnoticed, sometimes by accident and
sometimes by design, by others around. The idea is to provoke
someone in a subtle way that makes it look like they started
it.)

On newsgroups, the term "flame bait" is sometimes used and has a
sometimes-relation to passive aggressiveness. (I think if you made a
Venn diagram you'd find them as partly overlapping circles, each of
which sometimes but not always implies the other.)

I happen not to ever use the word "flame" because I find it a
dismissive term. It's one of those adjectives that is only ever used
in the second and third person, like "egotistical", and always to
diminish. (I am proud. You are egotistical. He/she/it is
egotistical. We are proud. You are egotistical. They are
egotistical. Similarly: I am stating my case. You are flaming.
He/she/it is flaming. We are stating our case. You are flaming. They
are flaming.) I think the only additional information "flame" carries
over "states one case" is not about the person appearing to be spoken
about but about the speaker's attitude toward that person. This isn't
an attempt to make you feel bad for having used it here; only an
attempt to sensitize you to the idea that you are doing little but
pouring gasoline on a fire when you find someone who is in a defensive
mode because they feel threatened and tell them they're not entitled
to defend themselves.

There may be other uses of "flame" as well. I'm addressing the sense
of it as used above. Even in cases where the one addressed is taking
a more proactive attack--which again I emphasize is hard to detect
because you may not see the provocation, and defense always looks like
attack when you miss the initial cue--it usually helps to pick a different
word that is more precise. "Flaming" carries only-negative connotation
and so is hard to accept as constructive criticism. I would almost
go so far as to say that using it virtually guarantees that someone will
assume your purpose in saying it is not constructive and will then assume
you to be passive aggressive, thus refueling the fire.

> Scheme has its place in the world. [...]
> Common Lisp also has its place in the world. [...]

I like this conception of things, but a lot of what starts this
problem is very clear statements by certain people that Common Lisp
has no such place. And worse, some have heard it in the classroom,
which I consider a travesty. In my experience, most of the acid
discussion among the communities is caused by quite a lot of Scheme
people, some of them the designers and many their (sometimes unwitting)
disciples, insisting that CL is confused and has no place.

The problem, as I see it, was bootstrapped as follows: No one was
advocating tail calls and continuations as a way of viewing control
structure. This was a shame. Scheme decided to do that. This was
good because it added to the available world views a useful way of
thinking. No one was advocating tail recursion as an alternate
paradigm for iteration. I don't think Scheme invented the idea, but
they did decide to feature it as an option in the language. This was
good because again it was a new and useful way of framing problems.
There were so many places in the world already teaching other models
that the Scheme people decided to teach tail recursion more or less
exclusively to try to achieve some balance. This was
well-intentioned, but it started a process in motion that I think had
some unintended ill effects. Students who went through these classes
saw that their instructors taught only tail recursion and never simple
looping like (dotimes (i 5) (print i)). These students inferred in
many cases, and very occasionally were actually told, that this was
because loops were bad and only tail recursion was good. (Consider
that if they used looping constructs in their homework or on tests,
they were marked down.) After some amount of time, under a Skinnerian
model of behavior modification, students learned to react with horror
at the idea of paradigms that were not taught in their class.
What had started with all good intention as an exercise in expanding
one's view of the world had ended up, for lack of attention to the
mere importance of "variety of paradigms", as a way of limiting
one's thinking. It's as if someone invented a mathematical transform
space to make some problem easier and then someone said "well, if
it always makes problems easier, why not live in the transform space
all the time". Well, that defeats the purpose. The reason one has
alternate ways of thinking about things is so you can shift ways of
thinking, not because there's one single way that's always better.
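
To make the contrast concrete (an added example, not Kent's), here is the
same trivial iteration written both ways:

  ;; Simple looping construct (Common Lisp):
  (dotimes (i 5) (print i))

  ;; The same iteration expressed as tail recursion (Scheme).  A conforming
  ;; Scheme runs this in constant space because the recursive call is in
  ;; tail position:
  (define (count-up i n)
    (if (< i n)
        (begin
          (display i) (newline)
          (count-up (+ i 1) n))))   ; tail call -- effectively a jump
  (count-up 0 5)

Neither version is the "right" one; they are two framings of the same loop.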

If certain very vocal people in the Scheme community spent half as
much time learning to study and learn from CL as they do bashing it,
the acid rivalry would fall away. I don't even think they have to
come away choosing to use it. But they should learn a respect for the
fact that others do use it for a legitimate reason, and they should
know it's available as an option.

It's one thing to think English or German or Spanish or Mandarin
are hard or unpleasant to learn or use if you're not a native
speaker. It's quite another to make the inductive leap that people
using those languages are confused in their enjoyment of them
and if they only realized the error of their ways would and should
use your own language. And it doesn't help when people add to
their arguments the fact that one has gained popularity, as Scheme
has in classrooms. It's great there is a lot of literature people
can share in schools and it's not surprising people gravitate toward
one out of the practical need to have common materials. Sometimes,
though, the choice is more arbitrary than the winners might
have you believe. But considered how wrong it would
be for me to suggest English (the world interchange language of
the day) is superior because it's "catching on" everywhere. Having it
be ubiquitous is an advantage to English speakers, but it's no proof
English is somehow morally superior. It was a combination of chance,
"well-timed" imperialism, and perhaps some learnability characteristics
that got it to where it was. But it's not obvious that it's the
best choice. All that's obvious is that it's good to have SOME
choice. And beyond that, one can only hope it doesn't become the
only choice because much of value would be lost if that happens.
The same is true for Scheme. It's fine for it to be taught in schools,
but I hope they aren't teaching fairytales like "there is only one
right language" or "Scheme is morally superior to Lisp" or
"the only way to do iteration is by tail recursion" or even
"there is no good excuse for the absence of good handling of tail
calls".

On the point of proper tail-call handling, for example, Java doesn't have it.
Now, mind you, Steele was in on the design and I don't think it's
just an accident that it was omitted. It's not present because it
could lead to problems with the security model, so far as I know.
Again reinforcing my point. It's not that it's a thing that's either
good or bad--it's a thing with positives and negatives. And within
the context of a language, you can choose whether the negatives are
prohibitive. But not from outside.


> It's designed to cater to every
> need LISP programmers have encountered in the past 30 years.

Don't get me wrong. At a very strict level, what you say is true. And
it satisfies my needs on a regular basis. But, surprising as it
sounds, there are more ways to think about things than how you and I
and other Lisp programmers do. Not that you don't compute the same
things in the end, but languages necessarily make some things hard in
the process of making others easy. C makes pointer manipulation easy
but GC hard. Java makes certain object things easy but others
(multiple inheritance) hard. CL makes a great many library operations
and interesting notations easy, but the creation of a CL
implementation hard. Scheme makes writing Scheme interpreters easy
but makes portability hard (compared to CL). Both Scheme and Lisp
make various exact large math operations easy (bignums) but make
access to low-level machine instructions for fixed-bit arithmetic
(modulo machine word size) or bit manipulation hard. C makes the
low-level bit twiddling and math easy, but makes correctness of math
on large numbers a worry and makes extended data structures like
rationals or bignums hard. Pascal makes type inferencing easy but
tagged dynamic datatyping hard. Teco and Perl make string operations
easy but other operations hard. [Not impossible; I once wrote a lisp
compiler in Teco. But it's not normally done.] Prolog makes it easy
to have programs start but hard to have programs stop. :-) Nothing comes
without a cost. We may like the costs and use one, but it doesn't
mean it's a universal tool--except in the Turing sense, which I find a
little dull.

> For some people, having a wide variety of ready-made constructs to
> confront their problems with is considered powerfull, and mastering
> them all is considered rewarding.

Yes. But if you're trying to understand the tension, it's from
the either accidental or intentional "evangelism" of the "there is
one right way" view, which seems to me to come from the Scheme
community and to be mostly directed at "CL".

In fact, this phenomenon is so strong that when Dick Gabriel and I
wrote a paper a while back debating the issue of 1 namespace vs 2, we
started off using "Scheme" vs "Common Lisp" and I found I (defending
CL) had lost the argument from the get-go because many people
associate so many "warm fuzzies" with Scheme that they would give
Scheme an automatic win on the issue for reasons unrelated to the
debate. So I created, for defensive reasons, the notion of two
abstract dialects "Lisp1" and "Lisp2" and we debated these as a way of
dissociating ourselves from the languages. I noted that since you
could imagine a two-namespace Scheme and a one-namespace CL, that it
was proper to just debate the Lisp1 vs Lisp2 issue independent of the
question of what the rest of the subroutine base was. At that point,
the debate is less clear because a two-namespace Scheme might still be
better in some people's minds than a one-namespace CL, "proving" (as a
thought exercise anyway) that the key feature which wins clients for
Scheme is not the namespace issue. And at that point you can have an
honest debate about the particular language feature and its various
effects, positive and negative.

And not that we're debating Lisp1/Lisp2 here, but it's worth pointing
out that that debate is not symmetric. It's not between people who
think one namespace is right and people who think two is right. It's
between people who think one namespace is right and people who think the
number of namespaces should be chosen according to need. In the
language of the abortion debate, there are pro-1 and pro-choice
participants. In the abortion debate, it's sometimes useful to the
pro-life people to suggest that opponents of pro-life are pro-death.
As if they were advocating that all pregnancies be aborted. When
in fact, no pro-choice person is arguing that. All they want is space
for some abortions. And whether you agree with the position of allowing
abortions or not, you have to admit that pro-choice people aren't trying
to lock out the idea of keeping a baby to term, they are only trying to
lock out the idea of ALWAYS keeping a baby to term. Similarly here,
and the debate gets just that contentious because it's near and dear
to the hearts of people on both sides, the pro-1 people are often
trying to keep pro-choice people from existing even though the pro-choice
people are not trying to say there should never be a lisp1.
If the pro-1 people understand that, then they are not threatened
unless they consider it morally repugnant for someone in the other
camp to ever have his way just on principle. But the lisp1 advocates
often use very charged "death to lisp2; it's wrongheaded" kind of
battle cries.

And, since you asked, that's what causes reasonable people to
sometimes raise their virtual voices. The key isn't to ask them
to lower their voices. The key is to ask those who are threatening
them to learn a little tolerance.

I could make a similar argument about GPL advocacy I've seen here
recently. But I'll leave it as an exercise to the reader.
Suffice it to say that beginning a discussion with a statement
like "any software not covered by GPL is useless" doesn't win
friends or advance the cause of GPL usage.

Barry Margolin

unread,
Sep 17, 1998, 3:00:00 AM9/17/98
to
In article <ey3af3z...@haystack.aiai.ed.ac.uk>,

Tim Bradshaw <t...@aiai.ed.ac.uk> wrote:
>* Kent M Pitman wrote:
>> (g f)". The fact that "buffalo buffalo buffalo" and even "buffalo
>> buffalo buffalo buffalo buffalo" and "buffalo buffalo buffalo buffalo
>> buffalo buffalo buffalo" are valid sentences in english (and speakers
>> can tell you which are the nouns and which are the verbs) suggest that
>> it is natural to make strong positional distinctions.
>
>Perhaps in American English -- can someone tell me what these mean, as
>I think they must depend on terms I don't know!

In "buffalo buffalo buffalo", the first and last words are nouns, and I
assume you know that a buffalo is an animal, and that "buffalo" can also be
used as the plural. The middle word is a transitive verb meaning
"bewilder, baffle" (from Merrian-Webster Online: WWWebster Dictionary
<http://www.m-w.com/cgi-bin/dictionary>). So it translates to "buffaloes
bewilder buffaloes." It's still somewhat ambiguous, as it could mean
"buffaloes bewilder other buffaloes" or "buffaloes find other buffaloes
bewildering."

The second one has a useful, but optional, function word elided. Adding
it back in results in "buffalo that buffalo buffalo that buffalo buffalo buffalo
buffalo." This is still hard to understand, so I'll add parentheses around
the nested noun phrase and also use the alternate plural form (as above) to
distinguish the nouns from the verbs:

buffaloes that buffalo (buffaloes that buffalo buffaloes) buffalo buffaloes.

Finally, using the synonym "bewilder" as above, results in:

buffaloes that bewilder (buffaloes that bewilder buffaloes) bewilder buffaloes.

Keith Wright

unread,
Sep 17, 1998, 3:00:00 AM9/17/98
to
Dimitrios Souflis <dsou...@altera.gr> writes:

> To all flamers:


> I never could understand why obviously intelligent and professionally

> competent people flame each other. I can't imagine the actual designers of
> CL and Scheme calling each other names or being impolite over which
> language is "superior".
>

They couldn't. Guy Steele would have refused to write either of their manuals.
While the Usenet flamers insult each other, _the_ _same_ _people_ are doing
the work on both languages.
--
--Keith

This mail message sent by GNU emacs and Linux.
Food, Shelter, Source code.

Klaus Schilling

unread,
Sep 17, 1998, 3:00:00 AM9/17/98
to
I recently stumbled over a rudimentary lisp interpreter written in awk ... Is
awk Turing-complete?

Klaus Schilling

Christopher B. Browne

unread,
Sep 18, 1998, 3:00:00 AM9/18/98
to
On 17 Sep 1998 20:50:33 +0200, Klaus Schilling
<Klaus.S...@home.ivm.de> posted:

>I recently stumbled over a rudimentary lisp interpreter written in awk
>... Is awk Turing-complete?

AWK is surely "Turing complete." It offers variables and branching,
which is pretty much enough in practice, if not a precise proof. :-).

The problem of "sterilitity" to the "Turing complete" notion is that
it is normally not a terribly useful result from a *prescriptive* perspective
to know that a system is "Turing complete." The fact that it is, in theory,
powerful enough to compute any computable thing implies neither:
a) That it can do so in a *useful* period of time, or
b) That you'll be able to write any particular program with any great
degree of efficiency. Writing a Lisp in TECO may be *possible;* the
idea fills me with fear and trepidation as a *REALLY TOUGH TASK.*

You may be able to, given *highly* clever thinking, build a Lisp interpreter
in awk. Performance is likely to be poor, and there will be little
opportunity to improve on that. You may win the "geek of the week" award
for doing something others can't do, but you'll probably not want to use
the resulting Lisp system to rewrite Emacs...
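
To make the "variables and branching" point concrete, here is an added
example (assuming a POSIX awk; the original 1977 awk lacked user-defined
functions) showing that recursion alone already gives general computation:

  $ awk 'function fact(n) { return n <= 1 ? 1 : n * fact(n - 1) }
         BEGIN { print fact(10) }'
  3628800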

--
Those who do not understand Unix are condemned to reinvent it, poorly.
-- Henry Spencer <http://www.hex.net/~cbbrowne/lsf.html>
cbbr...@hex.net - "What have you contributed to Linux today?..."

David Steuber The Interloper

unread,
Sep 18, 1998, 3:00:00 AM9/18/98
to
On 17 Sep 1998 16:12:45 +0200, Pierre Mai <pm...@acm.org> claimed or
asked:

% You asked for a free version of ANSI CL for Windows which would allow
% you to learn:

Be careful what you wish for, you just might get it! I asked. I got.
I appreciate it.

% If you want a free, mostly-compliant version of ANSI CL for personal
% use, Allegro CL 5.0 for Linux is available at Franz's site. BTW, wrt
% your comments that you cannot switch to Linux: a) You can run Linux
% side-by-side with NT, if you choose to do so, and b) especially wrt to
% Internet software, but also wrt most other non-specialised software,
% there is plenty of free[1] software to fill all your needs, most of
% the time more functional, stable and user-friendly[2].

I've decided to go ahead and get Linux. I've been hesitating over
doing it for well over a year. But people keep telling me how great
it is. So fine, I'll get it. I just need to get another hard drive
so that I will have room for the darn thing. I am getting System
Commander Deluxe so that I can triple boot. (Yes, I have Win95 as
well).

% [2] This message is being written using Gnus, a News and Mail-reader
% for Emacs/XEmacs, one of several really useable News/Mail-readers for
% Unix, while I've yet to meet a really useable News/Mail-reader for
% Windows (besides Gnus, that is, which also sort of runs under
% NTEmacs... ;).

I am connected to the Internet by I$DN. My mode of reading news
messages is to grab all the headlines, then grab all the bodies, then
get off the line. I read off line, compose replies off line. I go
online to post. Can I do that with Gnus? What about for cases where
people post MIME with base64 or uuencoded binary attachments?

% "Such is life." -- Fiona in "Four Weddings and a Funeral" (UK/1994)

"It's like ping pong, only you use smaller balls." FWAAF (five of the
same thing -- Al Bundy)

David Steuber The Interloper

unread,
Sep 18, 1998, 3:00:00 AM9/18/98
to
On Thu, 17 Sep 1998 16:04:03 GMT, Kent M Pitman <pit...@world.std.com>
claimed or asked:

<most of article cut because I hate to over quote>

% some unintended ill effects. Students who went through these classes
% saw that their instructors taught only tail recursion and never simple
% looping like (dotimes (i 5) (print i)). These students inferred in
% many cases, and very occasionally were actually told, that this was
% because loops were bad and only tail recursion was good. (Consider
% that if they used looping constructs in their homework or on tests,
% they were marked down.) After some amount of time, under a Skinnerian
% model of behavior modification, students learned to react with horror
% at the idea of paradigms that were not taught in their class.

Are students actually taught this? It is just as well that I never
got a CS degree.

As I think about computer languages, I come to one particularly
significant conclusion. At the risk of stating the bleeding obvious,
a computer language is an interface between the human and the machine.
The machine only understands its particular machine code. Humans use
formalized languages to communicate a process in such a way that the
description of the process can be converted into the machine code and
executed by the machine.

It is easy to determine which is the best machine representation of
a particular process: you can measure it. It is more difficult to
measure which is the best language for describing a process. That
introduces human psychology into the equation. As an engineering
professor kept saying in class, "If you can't assign a number to it,
you don't know it." While it may be blindingly obvious as to which
language is best suited to a specific task (in some cases), it is
really an exercise in futility to claim that some language is a better
general purpose language than another.

To take an example of two completely different languages: C and Common
Lisp (this way we don't have to argue about scheme); which language is
best suited towards the writing of an operating system and support
software? I know it can be done in C. I assume it can also be done
in Common Lisp. The operating system has to do some low level stuff
that can be easily expressed in C. The operating system also may do
more complicated tasks that are more easily expressed in Common Lisp.
This example is a bit contrived. How about choosing a language to
write an application such as a spreadsheet or word processor? I
suspect choosing a language at this point is more a question of which
one you are most familiar with. There is no right answer because the
machine code can be processed so that it is as optimized as is
mathematically possible, so that speed and size are not an issue.

This brings the language debate down to the level of dogmatism.
Nothing can be gained from it. When I asked about the differences
between Scheme and Common Lisp in my original post, I was asking just
that. I certainly wasn't expecting a debate as to which language is
the best at everything. My interest in Lisp is based on relative ease
of implementation compared to Java or JavaScript (ECMAScript). I am
also curious from an intellectual point of view of having a different
perspective from which I can think about problems. I still think in
C++ with a hybridized blend of structured and object oriented
analysis. This ties me to a mode of thinking that is powerful. But
it also ties me to a mode of thinking that is limited. Lisp is a
completely different language with a completely different way of
expressing things. Whatever the flavor, it is a new way to think.
Not better, not worse, just different.

For me at least, the issue is how else can I approach a difficult
problem?

Jussi Piitulainen

unread,
Sep 18, 1998, 3:00:00 AM9/18/98
to
David Steuber writes:

> When I asked about the differences between Scheme and Common Lisp in
> my original post, I was asking just that. I certainly wasn't
> expecting a debate as to which language is the best at everything.

The original question was, if I still remember, just as you say. It's
a pity that such an interesting question produces such hostility, but
that is indeed quite predictable here.

There is a cultural difference between Scheme and Common Lisp. I
regret to say that Kent Pitman's observations seem true to me: some
vocal Schemers tend to denigrate Common Lisp, even in comp.lang.lisp,
for not being the right thing, and they did learn that attitude inside
the Scheme culture. Then some Common Lisp folk respond in kind.

I resent it when people from outside come and declare that Scheme
should be something else. I'm ashamed when Scheme folk do that to
Common Lisp. The uncompromising attitude of the designers of Scheme is
a great thing; it's what has made Scheme the great thing it is. The
other kind of politics that led to Common Lisp is another kind of
great thing. Mistaking one for the other is boring and, on Usenet,
often results in insults.

The contrast, between languages and between language cultures, is
indeed fascinating. Any student of programming languages should learn
several.

Thank you for being civil.
--
Jussi

Erik Naggum

unread,
Sep 18, 1998, 3:00:00 AM9/18/98
to
* tras...@david-steuber.com (David Steuber "The Interloper")

| The operating system has to do some low level stuff that can be easily
| expressed in C.

actually, this is false. the low-level stuff you need for an operating
system is as hard to express in C as in Common Lisp, and you may well
have to resort to assembler at the same points in both languages. (you
get an opportunity to realize this for yourself when you get Linux. :)

as for how easily low-level things can be expressed in C, I had fun
replying to someone who needed a portable way to store floating-point
numbers. I suggested a byte stream, consisting of a total length byte, a
mantissa length byte, bytes of the left-adjusted mantissa in big-endian
order, an exponent length byte, bytes of the right-adjusted exponent in
big-endian order, and a byte with the signs of exponent and mantissa.
the core of this function is CL's INTEGER-DECODE-FLOAT. C doesn't have
this function. writing the bytes out from a C program is not trivial,
and is certainly not portable. this is a case of much better support for
low-level operations in CL than in C. just a data point...
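
A rough sketch of the encoder described above (added here; the function
name is invented, the left-adjustment detail is skipped, and this is not
Erik's actual code):

  ;; INTEGER-DECODE-FLOAT hands back mantissa, exponent and sign as plain
  ;; integers, independent of the machine's float format.
  (defun float-to-bytes (x)
    (multiple-value-bind (mantissa exponent sign) (integer-decode-float x)
      (flet ((unsigned-bytes (n)           ; big-endian bytes of N
               (loop with len = (max 1 (ceiling (integer-length n) 8))
                     for i from (1- len) downto 0
                     collect (ldb (byte 8 (* 8 i)) n))))
        (let ((m-bytes (unsigned-bytes mantissa))
              (e-bytes (unsigned-bytes (abs exponent)))
              (signs   (logior (if (minusp sign) 1 0)
                               (if (minusp exponent) 2 0))))
          ;; total length, mantissa length, mantissa bytes,
          ;; exponent length, exponent bytes, sign byte
          (append (list (+ 4 (length m-bytes) (length e-bytes))
                        (length m-bytes))
                  m-bytes
                  (list (length e-bytes))
                  e-bytes
                  (list signs))))))

The corresponding C code has no portable way to obtain the mantissa and
exponent in the first place, which is the point being made.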

#:Erik
--
ATTENTION, all abducting aliens! you DON'T need to RETURN them!

Erik Naggum

unread,
Sep 18, 1998, 3:00:00 AM9/18/98
to
* David Steuber

| When I asked about the differences between Scheme and Common Lisp in my
| original post, I was asking just that. I certainly wasn't expecting a
| debate as to which language is the best at everything.

* Jussi Piitulainen <jpii...@ling.helsinki.fi>


| The original question was, if I still remember, just as you say. It's
| a pity that such an interesting question produces such hostility, but
| that is indeed quite predictable here.

imagine a question "what is the difference between capitalism and
communism?" posted to newsgroups dedicated to each political theory.

that would be known as a "troll" in USENET jargon, regardless of the
honest intents of the requestor to learn about the differences. USENET
is simply not a forum suitable for such inquiries.

however, asking for the history, evolution, goals, and purposes of each
language or political theory in separate messages to each group would
shed light that an intelligent reader could use to produce a list of
useful differences for his _own_ needs. requesting such a list from
others can _only_ produce different lists from each contributor, not one
of them being right for anybody else if the list is indeed useful for a
person's choice of language, and no agreement is possible.

Tim Bradshaw

unread,
Sep 18, 1998, 3:00:00 AM9/18/98
to
* Klaus Schilling wrote:

> I recently stumbled over a rudimentary lisp interpreter written in
> awk ... Is awk turing-complete?

Yes, it is. So is vi actually!

--tim


Ray Dillinger

unread,
Sep 18, 1998, 3:00:00 AM9/18/98
to
Jeffrey Mark Siskind wrote:
>
> > The unspecified
> > order is so that compilers are free to try to extract the
> > maximum parallelism available from the algorithm;
>
> I have been told that this is expressly *not* the reason for
> underspecification of argument evaluation order. Precisely for the reason that
> you subsequently state
<clip>
> But rather the reason for the underspecification that I have been told is that
> when the standard was defined, different preexisting implementations used
> different evaluation orders and there was no agreement on what the order
> should be because no one wanted to change their implementation so the standard
> was made sufficiently flexible that all those implementations could conform to
> it.

Well, when I read it, I certainly took it as permission for implementors
to choose the most efficient evaluation order available (including
threaded) for the subexpressions, or to parallelize them on multiple
CPUs if they don't have side effects.

And in fact that's exactly what I'm writing an optimizer to do. I don't
really know that much about the historical rationales; I'm just reading
the spec and taking every optimization I can get -- and I've been assuming
that the rationales, where applicable, are for purposes of optimization.

Ray

Jeffrey B. Siegal

unread,
Sep 18, 1998, 3:00:00 AM9/18/98
to
Ray Dillinger wrote:

> or to parallelize them on multiple
> CPUs if they don't have side effects.

If they don't have side effects, then the order is both irrelevant and not
determinable by a program, so an implementation could perform the evaluation any way
it likes.

be...@softwell.se

unread,
Sep 18, 1998, 3:00:00 AM9/18/98
to
In article <k1aL1.7148$1g4.7...@news.giganews.com>, "Mark Watson"
<ma...@sedona.net> wrote:

...deleted
>I used a simple procedure in Scheme to keep moderately
>large systems tidy: after I implement a framework that might
>include a dozen or so functions (and debug them!), I used
>to write a "wrapper" function, and copy all of the "worker"
>functions into the "wrapper" function, using LEXICAL SCOPING
>to keep my namespace clean (i.e., the worker functions are
>now invisible to the global name space).
>
>I could never think of a reason why this was a bad idea, and it
>allowed me to keep systems tidy: especially important if I would
>have to occasionally set aside work on a Scheme project for
>a month or so.

This is a good idea.
I only wonder if it will be practical if I try to use other people's code
and they haven't done it? Or, perhaps worse(?), if they have done this but
used the same names for their different wrappers?
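
For readers who want to see the technique spelled out, here is a minimal
sketch of the wrapping being discussed (an added example with made-up
names, not Mark's code):

  ;; Before wrapping, SQUARE and SUM would be top-level definitions.
  ;; After wrapping, only SUM-OF-SQUARES is globally visible; the workers
  ;; live in its lexical scope (internal DEFINEs, i.e. a LETREC).
  (define (sum-of-squares numbers)
    (define (square x) (* x x))
    (define (sum lst)
      (if (null? lst)
          0
          (+ (car lst) (sum (cdr lst)))))
    (sum (map square numbers)))

  ;; (sum-of-squares '(1 2 3))  =>  14

Since the workers are not global, two wrappers that happen to use the same
internal names cannot collide, which speaks to the second worry above.
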
--

Kent M Pitman

unread,
Sep 19, 1998, 3:00:00 AM9/19/98
to
[comp.lang.scheme removed.
http://world.std.com/~pitman/pfaq/cross-posting.html]

Ray Dillinger <be...@sonic.net> writes:

> Well, when I read it, I certainly took it as permission for implementors
> to choose the most efficient evaluation order available (including

> threaded) for the subexpressions, or to parallelize them on multiple

> CPUs if they don't have side effects.

I think all of the reasons I've heard cited here are among those I've
heard used at Scheme meetings over the years when this issue has been
raised. That is:
- Pre-existing implementations.
- Tradition.
- Parallelism.

Also (though not mentioned earlier in this thread), an intent to make
you notate side-effects because they are expected to be uncommon and
they want them flagged as they go by. What's funny about this
as I think about it today is that this is nearly identical to the
reason CL requires (funcall foo x) instead of (foo x) to call a
parameter foo as a function. CL doesn't want to keep you from doing
functions but doesn't expect you to do huge numbers of them, so it
makes you notate them when you do. (There are other reasons, too,
but this is one of the reasons for the use of funcall.) CL programmers
balk at having to explicitly notate flow-of-control because they
are used to left-to-right meaning something and they think
writing explicit sequentialization code is tedious.

Neither of these issues, the FUNCALL/Lisp2 issue and the arg-order
issue, has a right or wrong answer--the decisions are made by looking
at the language and what its intended use is. But if you're a Scheme
programmer trying to understand why CL programmers don't mind writing
FUNCALL when calling functions, think about why you don't mind writing
explicit calls to BEGIN to force order of evaluation. And if you're a
CL programmer who doesn't understand why Scheme people balk at
writing FUNCALL, think about how you'd feel if you had to take a
common thing like ordered side-effects and write explicit calls to
PROGN everywhere you wanted to guarantee arg order. The situations
are very symmetric and there's a big opportunity to learn some
appreciation for someone else's culture here.
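
A tiny side-by-side sketch of the two notational costs being compared
(an added example, not Kent's):

  ;; Common Lisp (a Lisp-2): calling a function held in a variable needs
  ;; an explicit FUNCALL, but argument evaluation is already left-to-right.
  (defun apply-twice (f x)
    (funcall f (funcall f x)))

  ;; Scheme (a Lisp-1): no call marker is needed, but when the order of
  ;; two side-effecting expressions matters, you sequence them yourself:
  (define (apply-twice f x)
    (f (f x)))

  (define (show-both thunk-a thunk-b)
    (let* ((a (thunk-a))       ; LET* (or BEGIN) spells out the order
           (b (thunk-b)))
      (list a b)))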

David Steuber The Interloper

unread,
Sep 19, 1998, 3:00:00 AM9/19/98
to
On 18 Sep 1998 09:56:31 +0000, Erik Naggum <er...@naggum.no> claimed or
asked:

% that would be known as a "troll" in USENET jargon, regardless of the
% honest intents of the requestor to learn about the differences. USENET
% is simply not a forum suitable for such inquiries.

Yeah, that is something that just didn't occur to me at the time.
I've never liked trolls, and I don't like being one of them. Not even
by accident.

David Steuber The Interloper

unread,
Sep 19, 1998, 3:00:00 AM9/19/98
to
On 18 Sep 1998 09:32:24 +0000, Erik Naggum <er...@naggum.no> claimed or
asked:

% * tras...@david-steuber.com (David Steuber "The Interloper")
% | The operating system has to do some low level stuff that can be easily
% | expressed in C.
%
% actually, this is false. the low-level stuff you need for an operating
% system is as hard to express in C as in Common Lisp, and you may well
% have to resort to assembler at the same points in both languages. (you
% get an opportunity to realize this for yourself when you get Linux. :)

Maybe. But I know how to bang bits in C which I don't know how to do
in CL (yet).

% as for how easily low-level things can be expressed in C, I had fun
% replying to someone who needed a portable way to store floating-point
% numbers. I suggested a byte stream, consisting of a total length byte, a
% mantissa length byte, bytes of the left-adjusted mantissa in big-endian
% order, an exponent length byte, bytes of the right-adjusted exponent in
% big-endian order, and a byte with the signs of exponent and mantissa.
% the core of this function is CL's INTEGER-DECODE-FLOAT. C doesn't have
% this function. writing the bytes out from a C program is not trivial,
% and is certainly not portable. this is a case of much better support for
% low-level operations in CL than in C. just a data point...

Hmmm. What's so non portable about IEEE 704? I would have just said,
"damn the bandwidth, print it out in ascii!"

% ATTENTION, all abducting aliens! you DON'T need to RETURN them!

But leave our cows alone!

Sorry for over quoting.

Erik Naggum

unread,
Sep 19, 1998, 3:00:00 AM9/19/98
to
* tras...@david-steuber.com (David Steuber "The Interloper")
| Hmmm. What's so non portable about IEEE 704?

I have no idea what IEEE 704 is. it is not listed in IEEE Standards
Status Report, either.

| I would have just said, "damn the bandwidth, print it out in ascii!"

sometimes, you can't damn the bandwidth.

#:Erik
--

Barry Margolin

unread,
Sep 19, 1998, 3:00:00 AM9/19/98
to
In article <36153f05....@news.newsguy.com>,

David Steuber "The Interloper" <tras...@david-steuber.com> wrote:
>Maybe. But I know how to bang bits in C which I don't know how to do
>in CL (yet).

See the section of CLtL about byte specifiers. It's at least as easy to
"bang bits" in CL as it is in C (often easier -- C requires you to do all
the shifting and masking yourself). The only thing CL doesn't have that C
does in this regard is structures with bit-fields.
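
To illustrate the first point (an added example, not Barry's), the
byte-specifier functions pull a field out of an integer without
hand-written masks:

  (ldb (byte 4 4) #xABCD)        ; => 12, i.e. #xC -- extract bits 4..7
  (dpb #x5 (byte 4 4) #xABCD)    ; => #xAB5D -- replace that field

  ;; The C equivalents need explicit shifting and masking:
  ;;   (x >> 4) & 0xF
  ;;   (x & ~(0xF << 4)) | (0x5 << 4)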

>Hmmm. What's so non portable about IEEE 704? I would have just said,


>"damn the bandwidth, print it out in ascii!"

Not all systems use IEEE 704, and neither C nor CL provide any way to find
out, so it's not really portable to write out your FP values as raw bytes.
And printing in ASCII can result in round-off errors accumulating.

David Steuber The Interloper

unread,
Sep 20, 1998, 3:00:00 AM9/20/98
to
On Sat, 19 Sep 1998 17:59:26 GMT, Barry Margolin
<bar...@bbnplanet.com> claimed or asked:

% Not all systems use IEEE 704, and neither C nor CL provide any way to find
% out, so it's not really portable to write out your FP values as raw bytes.
% And printing in ASCII can result in round-off errors accumulating.

I meant to type IEEE 754. You are right if the number is printed out
with less precision than is needed. In binary arithmetic, 0.1 can't
be represented with a finite number of bits.

I would think that ascii digits would be quite portable. Presumably
anything else requires agreeing on a representation. An alternative
to IEEE 754 without the waste of ascii might be some form of BCD.

Of course, if CL can do bit banging as easily as C, then IEEE 754 is
portable if that is the agreed upon representation for exchange. If
you need more precision, there is Intel's 80 bit representation. I
don't know if it specifies NaN, or +- infinity.

David Steuber The Interloper

unread,
Sep 20, 1998, 3:00:00 AM9/20/98
to
On 19 Sep 1998 10:28:56 +0000, Erik Naggum <er...@naggum.no> claimed or
asked:

% * tras...@david-steuber.com (David Steuber "The Interloper")
% | Hmmm. What's so non portable about IEEE 704?
%
% I have no idea what IEEE 704 is. it is not listed in IEEE Standards
% Status Report, either.

Damn my memory! I should have typed IEEE 754-1985, the widely used
standard for representing 32- and 64-bit floating-point numbers.

Barry Margolin

unread,
Sep 20, 1998, 3:00:00 AM9/20/98
to
In article <36056367...@news.newsguy.com>,

David Steuber "The Interloper" <tras...@david-steuber.com> wrote:
>Of course, if CL can do bit banging as easily as C, then IEEE 754 is
>portable if that is the agreed upon representation for exchange. If
>you need more precision, there is Intel's 80 bit representation. I
>don't know if it specifies NaN, or +- infinity.

The problem is that C doesn't provide any way to get the components of an
FP number -- it has nothing equivalent to FLOAT-MANTISSA. So if you've
agreed upon IEEE 754 as the interchange format, there's still no portable
way to write the native_to_ieee754() and ieee754_to_native() functions in
C. It's relatively simple in CL.

BTW, even if you know the machine uses IEEE 754 representation natively,
the conversion functions still aren't portable because you have to worry
about byte ordering.

David Steuber The Interloper

unread,
Sep 21, 1998, 3:00:00 AM9/21/98
to
On Sun, 20 Sep 1998 06:33:55 GMT, Barry Margolin
<bar...@bbnplanet.com> claimed or asked:

% The problem is that C doesn't provide any way to get the components of an
% FP number -- it has nothing equivalent to FLOAT-MANTISSA. So if you've
% agreed upon IEEE 754 as the interchange format, there's still no portable
% way to write the native_to_ieee754() and ieee754_to_native() functions in
% C. It's relatively simple in CL.
%
% BTW, even if you know the machine uses IEEE 754 representation natively,
% the conversion functions still aren't portable because you have to worry
% about byte ordering.

I thought IEEE 754 specified byte ordering. Anyway, one could agree
to use the byte representation that Intel uses for 64 bit floats. It
is then a question of implementing the native_to_ieee754() and
ieee754_to_native() functions, which by definition are platform
specific.

Given a pointer to a memory location, and knowledge of the size of the
type and its representation, I can shove the bits in. It isn't hard
to do in C. CL makes it easy because the compiler did the work for
you. Or so I assume.

Kent M Pitman

unread,
Sep 21, 1998, 3:00:00 AM9/21/98
to
tras...@david-steuber.com (David Steuber "The Interloper") writes:

> CL makes it easy because the compiler did the work for
> you. Or so I assume.

Whether or not the "compiler" does it is up to the implementation.
The language requires the semantics. The market dictates the
performance requirements.
