
Which one, Lisp or Scheme?


Yunho Jeon

Jan 20, 1997

Hello,
I have some idea to experiment with, and the idea needs a language which can
execute code produced by the program itself. Naturally, I thought LISP would
be the best choice. But while reading the LISP FAQ, I found that there are
some variants of the language. Notably, Scheme seemed to be a more modern and
cleaner language than LISP. Because I have almost no experience in those
languages, I would like to get answers from LISP/Scheme experts
for the following questions.

1) Which language is easier to learn (for C/C++ programmer)?
Scheme seems to be easier because it is much smaller than common lisp,
but how big is the difference (in terms of learning curve)?

2) The command to parse and execute a source program is 'eval', right?
If it is, is it a standard feature of Scheme? The Scheme FAQ was not
very clear on this point. Surely some implementations have it, but
the language seems to be more suited for compilers. Does the run-time
have an embedded compiler? What's the overhead of it (size/speed)?
If I can't use a compiler for my purposes, what do I lose and how much
(again in terms of speed)?

3) What about foreign language (C/C++) interfaces and GUIs?
It's not essential for now, but it may be needed in the future.

4) I am going to use Linux as the experiment platform, and don't want to
buy any commercial software - I'm only a student and it's hard to buy
and get support for such software in Korea, where I live.
Both languages have a lot of free implementations, but are they mature
enough?

Thanks in advance for any help.
Best regards,
------------------------------------------------------------------------------
Yunho Jeon Tel: +82-2-880-6482~90 ext) 416
Intelligent Control Lab +82-2-875-9183
School of Electrical Engineering Fax: +82-2-888-4182
Seoul National University, Korea Email: yu...@csl.snu.ac.kr


Rainer Joswig

Jan 20, 1997

> 1) Which language is easier to learn (for C/C++ programmer)?
> Scheme seems to be easier because it is much smaller than common lisp,
> but how big is the difference (in terms of learning curve)?

Common Lisp comes with a larger library, has some rough edges,
is more "industrial strength", ...

Scheme is much smaller, has some very different implementations,
has a lot of non-standard extensions, ..

Both Common Lisp and Scheme basics are relatively easy to learn.

> 2) The command to parse and execute a source program is 'eval', right?

Common Lisp has "READ", "EVAL" and "COMPILE".
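
For illustration, a minimal sketch of how the three fit together in any
Common Lisp (the forms below are just examples):

    ;; READ turns text into Lisp data, EVAL executes that data.
    (let ((form (read-from-string "(+ 1 2 3)")))
      (eval form))                              ; => 6

    ;; COMPILE turns a constructed LAMBDA form into a compiled function.
    (let ((fn (compile nil '(lambda (x) (* x x)))))
      (funcall fn 7))                           ; => 49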

> If it is, is it a standard feature of Scheme? The Scheme FAQ was not
> very clear on this point.

EVAL is not a standard feature of Scheme. Still, most Scheme systems
have it.

> 3) What about foreign language (C/C++) interfaces and GUIs?
> It's not essential for now, but it may be needed in the future.

FFI and GUI functionality is available. But no standard in sight.


> 4) I am going to use Linux as the experiment platform, and don't want to
> buy any commercial software - I'm only a student and it's hard to buy
> and get support for such software in Korea, where I live.
> Both languages have a lot of free implementations, but are they mature
> enough?

You may want to use Allegro CL 4.3 from Franz Inc. for Linux.
It is ***free*** for non-commercial use. ACL 4.3 should be mature enough. ;-)
See http://www.franz.com/ for how to get it. Got my CD-ROM (thanks, Franz!),
but haven't tried it yet.


Rainer Joswig

Jussi Mantere

Jan 20, 1997

Yunho Jeon (yu...@csl.snu.ac.kr) wrote:
: 2) The command to parse and execute a source program is 'eval', right?
: If it is, is it a standard feature of Scheme? The Scheme FAQ was not
: very clear on this point. Surely some implementations have it, but
: the language seems to be more suited for compilers. Does the run-time
: have an embedded compiler? What's the overhead of it (size/speed)?
: If I can't use a compiler for my purposes, what do I lose and how much
: (again in terms of speed)?
You're thinking way too C here.
When you install a Scheme (or any LISP) package on your computer,
you install the _interpreter_, or evaluator.
If you run the interpreter and type any command or procedure, the
evaluator automatically evaluates it and displays whatever value it's
supposed to display.

So, a "source program" is actually just a huge procedure which you'll execute
like any command.

(eval <op> <env>) as such is, afaik, a standard feature in scheme, defined in
R4RS. It evaluates whatever you feed to it in a given environment.

See SICP for more info;)


: 4) I am going to use Linux as the experiment platform, and don't want to
: buy any commercial software - I'm only a student and it's hard to buy
: and get support for such software in Korea, where I live.
: Both languages have a lot of free implementations, but are they mature
: enough?

They are mature enough. Don't buy anything commercial.
Use Emacs and whatever scheme implementation you find convenient.
Guile, MIT Scheme... whatever.

-obs
--
(define me '((jussi mantere) (jmt 6b112a 02150 espoo) (09-468 2718)
(o...@iki.fi) (http://www.iki.fi/~obs)
(TiK abilobbari '97)))
Mikä Ihmeen Tiainen? Saakeli Cun Häiritsee - En Muista Enää!

Thant Tessman

Jan 20, 1997

Yunho Jeon wrote:

> I have some idea to experiment with, and the idea needs a language which can
> execute code produced by the program itself. [...]

> 2) The command to parse and execute a source program is 'eval', right?

Yes, but it's probably not what you need. The magical part of Scheme is
"lambda" which is how functions build other functions.

Actually, Scheme contains three levels of enlightenment. The first is
higher-order functions (lambda). The second is continuations
(call-with-current-continuation), and the third is macros.

Each will thoroughly hurt your brain, but re-birth is a painful process.
If you persevere you will be transformed, thrice, into a higher being.

-thant

Howard R. Stearns

Jan 20, 1997

Yunho Jeon wrote:
>
> Hello,

> I have some idea to experiment with, and the idea needs a language which can
> execute code produced by the program itself. Naturally, I thought LISP would
> be the best choice. But while reading the LISP FAQ, I found that there are
> some variants of the language. Notably, Scheme seemed to be a more modern and
> cleaner language than LISP. Because I have almost no experience in those
> languages, I would like to get answers from LISP/Scheme experts
> for the following questions.
> ...
> 2) The command to parse and execute a source program is 'eval', right?
> ...

Sort of.

The function READ parses information from a character stream and creates
lisp data (lists of literals, symbols, and more lists) that represent
the program "text".

The function EVAL can be used to execute such data as a program. EVAL
does not operate on characters, strings or streams.

In practice, EVAL is usually not necessary. In my experience, EVAL is
used in most textbooks only in discussing the implementation of an
interpreter (for Lisp or some other language). This is NOT the only way
to have a program execute utilities that are produced by the program
itself. Most projects are more cleanly and efficiently written using
compiled closures.
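
As an illustration of that last point, a small hedged sketch (MAKE-SCALER is
a made-up example): the program can build and return a closure instead of
building a list and handing it to EVAL.

    ;; EVAL on data built at run time (works, but usually unnecessary):
    (eval (list '+ 1 2))                 ; => 3

    ;; The same kind of run-time construction as a compiled closure:
    (defun make-scaler (factor)
      (lambda (x) (* x factor)))         ; closes over FACTOR

    (funcall (make-scaler 10) 4)         ; => 40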

Will Hartung

Jan 20, 1997

yu...@csl.snu.ac.kr (Yunho Jeon) writes:

>Hello,
>I have some idea to experiment with, and the idea needs a language which can
>execute code produced by the program itself. Naturally, I thought LISP would
>be the best choice. But while reading the LISP FAQ, I found that there are
>some variants of the language. Notably, Scheme seemed to be a more modern and
>cleaner language than LISP. Because I have almost no experience in those
>languages, I would like to get answers from LISP/Scheme experts
>for the following questions.

I'm no expert, but that hasn't stopped me before.

>1) Which language is easier to learn (for C/C++ programmer)?
> Scheme seems to be easier because it is much smaller than common lisp,
> but how big is the difference (in terms of learning curve)?

Frankly, I would say that Lisp would be easier to learn than Scheme
for a C/C++ programmer.

Scheme is a lovely, elegant language, and is, I believe, simpler and
easier to learn in its own right. It is hard not to like Scheme. But,
for someone who has a lot of history with C/C++, the way Scheme is
presented could throw you for a loop. You, as the student, would
probably take the approach of trying to learn "Scheme Syntax", whereas
the books spend more time on the "Scheme Way".

The "Scheme Way" of programming is very functional, lots of recursion,
local helper functions, etc. It is really a pretty nice way to go
about the task of coding. However, it's not the way MOST people
(particularly C/C++ people) write code. The idioms are all wrong.

If you look at how Lisp is presented, especially in something like
Paul Graham's "ANSI Common Lisp" book, it is easier to see how your
entrenched C/C++ idioms translate into Lisp syntax and structures.

Once you get past the hurdle of the fact that you don't need pointers,
it's pretty easy to write C/C++ code in a Lisp syntax. And Common Lisp
has an enormous catalog of functions to do all sorts of things. All of
the structures you are used to are in the language.
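
For instance, here is a hedged sketch of the same small task (reversing a
list, a made-up example) written first in a C-ish imperative style and then
in the recursive "Scheme Way":

    ;; Imperative, C-like style using a loop and assignment:
    (defun my-reverse (lst)
      (let ((result '()))
        (dolist (x lst result)
          (push x result))))

    ;; Recursive style with an accumulator:
    (defun my-reverse-rec (lst &optional (acc '()))
      (if (null lst)
          acc
          (my-reverse-rec (rest lst) (cons (first lst) acc))))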

If you want to change the way you think about programming and problem
solving, then grab all of the Scheme books, dig in, strap yourself
down, and hang on for a wild ride. It's quite a trip.

If you just want to work on your task, using your current mindset,
then get into Common Lisp, and let your fingers do the talking. You can
treat CL like C/C++ a lot easier. However, I do suggest you go in with
an open mind for the new, more powerful ways of solving problems that CL
can provide for you.

>2) The command to parse and execute a source program is 'eval', right?
> If it is, is it a standard feature of Scheme? The Scheme FAQ was not
> very clear on this point. Surely some implementations have it, but
> the language seems to be more suited for compilers. Does the run-time
> have an embedded compiler? What's the overhead of it (size/speed)?
> If I can't use a compiler for my purposes, what do I lose and how much
> (again in terms of speed)?

'eval' is standard in CL, not standard in Scheme, but as has been
mentioned, many Schemes provide it. 'eval' can incur a pretty dramatic
hit on the size of a program's executable image. How much overhead depends on
the implementation.

>3) What about foreign language (C/C++) interfaces and GUIs?
> It's not essential for now, but it may be needed in the future.

Many systems provide foreign function interfaces. GUIs are available,
though less prominent.

>4) I am going to use Linux as the experiment platform, and don't want to
> buy any commercial software - I'm only a student and it's hard to buy
> and get support for such software in Korea, where I live.
> Both languages have a lot of free implementations, but are they mature
> enough?

There are several systems available for Linux. The FAQ lists most of
them. I like Aubrey Jaffer's SCM Scheme package, and the latest Gambit-C
2.2 Scheme compiler is getting a lot of good press. Scheme packages
differ wildly, so check the details.

There is a lot of "real" work going on in many of these packages, and
most are very mature.

Good Luck!

--
Will Hartung - Rancho Santa Margarita. It's a dry heat. vfr...@netcom.com
1990 VFR750 - VFR=Very Red "Ho, HaHa, Dodge, Parry, Spin, HA! THRUST!"
1993 Explorer - Cage? Hell, it's a prison. -D. Duck

Erik Naggum

Jan 21, 1997

* Jussi Mantere

| When you install a Scheme (or any LISP) package on your computer,
| you install the _interpreter_, or evaluator.

this is factually wrong.

| (eval <op> <env>) as such is, afaik, a standard feature in scheme,
| defined in R4RS.

this is factually wrong.

| They are mature enough. Don't buy anything commercial.
| Use Emacs and whatever scheme implementation you find convenient.
| Guile, MIT Scheme... whatever.

this is disquieting. the strongest effects of free Lisp implementations to
date have been to turn people away from Lisp due to low performance, high
memory usage, etc; to perpetrate the _myth_ that all Lisps are interpreted,
that the language is slow, etc; to make people believe that Lisps don't fit
in with the rest of the operating system, that you can't make executables;
etc ad nauseam.

commercial implementations have taken Lisps out of the experimental lab and
made them shippable and supportable as useful systems. apparently, the
discipline needed to do this is not available for free, so it is safe to
assume it is hard, mostly uninspiring, work. (note, however, that my
experience is with Common Lisp. I don't know Scheme very well.)

#\Erik
--
1,3,7-trimethylxanthine -- a basic ingredient in quality software.

aro...@momotombo.austin.ibm.com

Jan 21, 1997

Thant Tessman wrote:
> Actually, Scheme contains three levels of enlightenment. The first is
> higher-order functions (lambda). The second is continuations
> (call-with-current-continuation), and the third is macros.

Let's rename Scheme 'Scheme Trismegistus'.

Chris Bitmead

Jan 22, 1997

In article <30627981...@naggum.no> Erik Naggum <er...@naggum.no> writes:

>| They are mature enough. Don't buy anything commercial.
>| Use Emacs and whatever scheme implementation you find convenient.
>| Guile, MIT Scheme... whatever.
>
>this is disquieting. the strongest effects of free Lisp implementations to
>date have been to turn people away from Lisp due to low performance, high
>memory usage, etc; to perpetrate the _myth_ that all Lisps are interpreted,
>that the language is slow, etc; to make people believe that Lisps don't fit
>in with the rest of the operating system, that you can't make executables;
>etc ad nauseam.
>
>commercial implementations have taken Lisps out of the experimental lab and
>made them shippable and supportable as useful systems. apparently, the
>discipline needed to do this is not available for free, so it is safe to
>assume it is hard, mostly uninspiring, work. (note, however, that my
>experience is with Common Lisp. I don't know Scheme very well.)

There are free Scheme and Lisp compilers capable of producing binary
executables. So you don't need a commercial product. (Although I'm
sure Franz lisp is an excellent product).

Erik Naggum

Jan 22, 1997

* Chris Bitmead

| There are free Scheme and Lisp compilers capable of producing binary
| executables. So you don't need a commercial product. (Although I'm sure
| Franz lisp is an excellent product).

it may say more about my experience than anything else, but I grabbed all
the (free) Common Lisp implementations I could get my hands on for my
SPARC, including akcl, gcl, wcl, clisp, cmucl, and since I didn't have any
experience from any "real" Lisp systems, didn't know what I missed outside
of CLtLn (n = 1 (akcl, gcl, wcl) or 2 (clisp, cmucl)). I don't want to go
advertising any products, but when I got my first commercial Lisp system
six weeks ago, I stopped working on my (Lisp) projects and sat down to
learn the _rest_ of the Lisp systems, as documented in about 1200 pages.
this has indeed paid off _very_ handsomely, yet it tells me that if all you
have ever seen are the free Lisps, you might be in for a very big surprise
when you get a Lisp-machine-like commercial implementation of Lisp.

(however, I might easily have missed similar software for free Lisps -- I
didn't know what to look for. maybe it would be useful if somebody who
knows what to look for in each compared free and commercial Lisp?)

Martin Cracauer

Jan 23, 1997

Erik Naggum <er...@naggum.no> writes:

>| They are mature enough. Don't buy anything commercial.
>| Use Emacs and whatever scheme implementation you find convenient.
>| Guile, MIT Scheme... whatever.

>this is disquieting. the strongest effects of free Lisp implementations to
>date have been to turn people away from Lisp due to low performance, high
>memory usage, etc; to perpetrate the _myth_ that all Lisps are interpreted,
>that the language is slow, etc; to make people believe that Lisps don't fit
>in with the rest of the operating system, that you can't make executables;
>etc ad nauseam.

CMUCL is as fast as and even smaller than most commercial implementations
on Unix. The only things I miss are threads and a better garbage
collector (although CMUCL's superior warnings help not to produce as
much garbage in the first place).

A commercial implementation has a nice environment, nice browsers,
maybe an editor in Common Lisp and therefore controllable from Lisp
and I found it very valuable to have such a visualization toolkit
around when I learned Common Lisp. But I think Erik missed the point
here.

I agree with the point that many free Lisp implementations are slow
and fail to point out in their documentation that one can run the same
program faster. In fact, I already had an argument with the author of
Xlisp about it after some magazine compared Lisp and perl (that is
Xlisp and perl) and headlined that Lisp is slow.

The problem here is that people choose a free implementation by other
criteria than speed and then complain it is too slow because they
underestimated the amount of efficiency they give up.

The Scheme community, with eval sometimes implemented as a write/read
to disk, a total lack of declarations, and implementations that
treat numbers as 32-bit-limited without the programmer's permission, is
another issue. While the author is not responsible, slib was what
turned me away from Scheme rather quickly. Some slib functionality is
unbelievably slow in many implementations - functionality that is
standard in Common Lisp and Perl and therefore implemented either in C
or as overhead-free, declared code.

Martin
--
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
Martin_...@wavehh.hanse.de http://cracauer.cons.org Fax.: +4940 5228536
"As far as I'm concerned, if something is so complicated that you can't ex-
plain it in 10 seconds, then it's probably not worth knowing anyway"- Calvin

Erik Naggum

Jan 24, 1997

* Martin Cracauer

| CMUCL is as fast and even smaller than most commercial implementations on
| Unix. The only things I miss are threads and a better garbage collector
| (although CMUCL's superior warnings help not to produce as much gargabe
| in first place).

ah, agreed, but CMUCL is a breed apart from the rest. it compiles to
native code directly for a number of platforms, which is a formidable task,
and it is exceptionally good at helping the programmer declare types of
values when it could have used a hint. the code it generates is also quite
good. however, this is not the norm for free Lisp implementations.
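
For example, a small sketch of the kind of declarations CMUCL's compiler can
exploit (the function is made up):

    (defun dot3 (a b)
      (declare (type (simple-array double-float (3)) a b)
               (optimize (speed 3) (safety 0)))
      (+ (* (aref a 0) (aref b 0))
         (* (aref a 1) (aref b 1))
         (* (aref a 2) (aref b 2))))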

out of the several hundred free Lisps and Schemes out there, most are toys.
they may be fun toys, but they are still toys. I think this is because it
is so easy to write a toy Lisp or Scheme, and so hard to write a fully
functional Lisp system.

| A commercial implementation has a nice environment, nice browsers, maybe
| an editor in Common Lisp and therefore controllable from Lisp and I found
| it very valuable to have such a visualization toolkit around when I
| learned Common Lisp. But I think Eric missed the point here.

well, those have been immaterial to me, but I'm sure we meet different
aspects of Lisp systems according to our experience. I have never used a
real Lisp system for real until I got Allegro CL, so I was impressed by how
much the system knew about itself, how much time the source code manager
saved me, how well the debugging is integrated with the cross-referencing
utilities, etc. example: I have about 200 functions in a package I'm
writing, spread across several files, and I can ask who calls a given
function, directly or indirectly. I know which functions and files to
recompile after changing a macro by asking for who uses it. I can ask the
system to tell me which functions bind, set, and/or reference a variable.
I have needed this functionality for _years_, not just in Lisp. Allegro
also has search lists that can do wondrous things, too. it's everything
_outside_ of the common Common Lisp stratum that impresses me most.

| In fact, I already had an argument with the author of Xlisp about it
| after some magazine compared Lisp and perl (that is Xlisp and perl) and
| headlined that Lisp is slow.
|
| The problem here is that people choose a free implementation by other
| criteria than speed and then complain it is too slow because they
| underestimated the amount of efficiency they give up.

I see a possible pattern here. if C was slow on some machine or in some
particular implementation, nobody would blame C or headline that C is slow.
the myth is that Lisp is slow, and every time somebody meets a slow Lisp,
that myth is reinforced. that you can compile with CMUCL at high speed
settings and beat the sh*t out of C is just as much an aberration as C
running under some bounds-checking and memory-tracking package is slow.

yes, we all know that Lisp can be incredibly fast. we all know that Scheme
can be statically typed, too, as in the Stalin compiler. neither changes
the naive perceptions and the myths that will be reinforced the next time a
small, neat, fun toy, somewhat like SIOD, is released and tested by
somebody who carries those myths with him.

that's how I meant that the free Lisps have mostly worked to turn people
away from Lisp. I didn't mean that you can't find free Lisps that people
would have flocked to if they would only get over their prejudices. I
meant that they don't, because of the many toys they have used and think
are the norm. it seems, for instance, that in educational settings, Lisp
and Scheme are not at all presented with anything resembling speed in mind,
and so students who are used to C and C++ and ads reading "X compiles Java
at over 10,000 lines per second", will have trouble _not_ remembering that
speed was never discussed, that their compilers and interpreters were slow,
etc, etc. I mean, we have people come in here and state that Lisp is an
interpreted language at least once a week! it's an image problem, and it's
a tragic element of that story that as Lisp implementers focus on other
things, students and amateur programmers are turned into speed fanatics
because that is the only forte of those other languages and systems. (and
never mind that Allegro CL for Linux produces code that runs rings around
C++ under Windows NT on the same machine. *sigh*)

Marco Antoniotti

Jan 24, 1997

I am enjoying this thread and I believe that it would be helpful to do
some classification in the Lisp/Scheme field and to produce some
"advice paper" (or whatever you want to call it.) Any magazine
article which wouldn't refer to such a pamphlet (as the aforementioned
Lisp vs. Perl bogus comparison) should immediately be dismissed
for lack of parentheses :)

First of all (and here I already know, there will be chaos) there are
only three dialects to be considered.

Common Lisp
Emacs Lisp (which, btw. can always be turned almost to a CLtL1 by
(require 'cl))
Scheme

Xlisp claims to be more and more CL (at least CLtL1) compliant.
Therefore it should not be considered as a standalone dialect.

Now we list the only reasonable alternatives for a free (gratis)
Common Lisp implementation and rank them for speed.

1 - CMUCL (Sparc, HP-PA, MIPS, X86)
4 - GCL, ECL (most Unix flavors and architectures - KCL and AKCL
are *OLD*)
5 - CLISP (Sorry Bruno! :) )
6 - Xlisp

The gap is intended :) (even if it may not be as wide as I'd like) The
latest version of Allegro CL for Linux is free, but who knows what
the marketing policies of Franz Inc. will be.

In the Scheme world, though I never tried it, I hear that the Stalin
compiler could be a 1 or a 2 in my previous scale. AFAIK all the
other Schemes (apart from being incompatible with each other at some
level) rank at 6 OR WORSE.

Emacs Lisp is reasonably fast and I would rank it 5.

In the commercial field, it looks like there are only three
alternatives for Common Lisp and their relative speed in my modest
experience would be the following.

1/2 Lucid (Un*x)
2/3 ACL, LispWorks (Un*x)
2/3 MCL (MacOS)

I left out Genera/Symbolics because it is in a league of its own. And
I am not familiar with any of the other (if surviving) commercial CL
implementations.

I have not been using ACL/PC enough to have a good idea of it, but the
environment sure looks as appealing as the fantastic MCL on the Mac
(of which I am a big fan :) )

--
Marco Antoniotti - Resistente Umano
===============================================================================
...it is simplicity that is difficult to make.
Bertholdt Brecht

Michael Sperber [Mr. Preprocessor]

Jan 24, 1997

>>>>> "DB" == David Betz <db...@xlisper.mv.com> writes:

DB> [ ... ] If Lisp is going to be compared with
DB> other languages for speed, it should be the commercial implementations
DB> that are used in the comparison. (Although perl isn't commercial as far
DB> as I know.)

Still, there are free implementations of Scheme that are *VERY* fast.
Both Gambit and Bigloo can actually compete with C on at least some
applications. I'd be surprised if, say, Chez Scheme were
significantly faster.

Cheers =8-} Mike


Guillermo (Bill) J. Rozas

Jan 24, 1997

In article <30631029...@naggum.no> Erik Naggum <er...@naggum.no> writes:

| From: Erik Naggum <er...@naggum.no>
| Date: 24 Jan 1997 13:56:07 +0000

| that's how I meant that the free Lisps have mostly worked to turn people
| away from Lisp. I didn't mean that you can't find free Lisps that people
| would have flocked to if they would only get over their prejudices. I
| meant that they don't, because of the many toys they have used and think
| are the norm. it seems, for instance, that in educational settings, Lisp
| and Scheme are not at all presented with anything resembling speed in mind,
| and so students who are used to C and C++ and ads reading "X compiles Java
| at over 10,000 lines per second", will have trouble _not_ remembering that
| speed was never discussed, that their compilers and interpreters were slow,
| etc, etc. I mean, we have people come in here and state that Lisp is an
| interpreted language at least once a week! it's an image problem, and it's
| a tragic element of that story that as Lisp implementers focus on other
| things, students and amateur programmers are turned into speed fanatics
| because that is the only forte of those other languages and systems. (and
| never mind that Allegro CL for Linux produces code that runs rings around
| C++ under Windows NT on the same machine. *sigh*)

Actually, I think that the speed of the implementation, although
important, is nowhere near as critical as other components.

Lisp/Scheme gives you so much rope that it is so much easier to hang
yourself with, especially with respect to performance.

There is an old adage that goes something like "Lisp programmers know
the value of everything and the cost of nothing", and I think that it
is very true.

I have seen people complain about slow implementations only to
discover quadratic algorithms because they were treating lists as
arrays, or some other similar problem. Because C lacks lists, they
would never have written the program that way. Even if they used
lists, they would have to implement list-ref and list-set! themselves,
and would immediately realize how expensive they are, so they would
rethink their strategy.
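
A small Common Lisp sketch of the kind of trap meant here (the functions are
made up): indexing a list with NTH is O(n) per access, so the first loop is
quadratic, while walking the list once is linear.

    ;; Quadratic: NTH walks the list from the front on every iteration.
    (defun sum-by-index (lst)
      (let ((total 0))
        (dotimes (i (length lst) total)
          (incf total (nth i lst)))))

    ;; Linear: traverse the list once.
    (defun sum-by-walking (lst)
      (reduce #'+ lst))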

Three issues that help C in efficiency (in my view) are:

- Good efficiency model in the small. It is very clear without much
thought to most C programmers how expensive or cheap the primitive
operations are. As a counterpart, the cost of closures and
continuations in Lisp/Scheme (and even simple addition) is much harder
to tell (even by an experienced programmer) because they depend so
much more on how well the compiler was able to optimize the code,
leading to small transformations affecting performance in
non-negligible ways.

- The standard library in C is very small. Thus C programmers reinvent
the wheel (hash tables, etc.) over and over in their programs. In so
doing, the cost of these operations (which might be primitives in
Lisp/Scheme) becomes painfully obvious to them, so they use them much
more judiciously than they might if given to them freely.

- Compilers still largely compile Fortran-style code much better than
small-procedure-intensive code. In as much as the prevailing
programming style in C is closer to Fortran than it is to Scheme's
preferred style, the performance of the code will be better. Remember
that most data flow algorithms (and so-called "global" optimizations)
work within procedures. Register allocation and instruction
scheduling are understood (to the degree they are) only within
procedures, etc. There is just not that much that a compiler can do
(short of inlining which is difficult with computed function calls) if
the procedures are really small. Lisp/Scheme compilers try to do a
good job at procedure calls, often better than C/Fortran, but they
can't generally compensate for bad style (with respect to efficiency
if not modularity and elegance). That is why it is not unusual to
find that a Lisp/Scheme compiler will beat some C compiler in
Fibonacci, Tak, or some such, but not in more realistic programs.

Scott Draves

Jan 24, 1997

David Betz <db...@xlisper.mv.com> wrote:

> If Lisp is going to be compared with
> other languages for speed, it should
> be the commercial implementations

why? there are plenty of free&fast
implementations of *both* C (GCC, LCC)
and Lisp/scheme (CMUCL, gcl, gambit, cscheme).

--
balance
equilibrium
death

http://www.cs.cmu.edu/~spot

Juergen Nickelsen

Jan 25, 1997

Marco Antoniotti <mar...@crawdad.icsi.berkeley.edu> wrote:

> Emacs Lisp (which, btw. can always be turned almost to a CLtL1 by
> (require 'cl))

While Emacs Lisp is perhaps the most widespread Lisp dialect (with most
of its users unaware of its existence), I would not consider it one of
the major Lisp dialects -- its scope (no pun intended) is just too
small.

cl.el does not turn Emacs Lisp into a CLtL1 Lisp. It does provide some
of the convenience functions like caadr, backquote and comma (but as
macros, not reader macros, like in "(` (3 4 (, (+ 5 6)) 7))"), and some
others.

--
Juergen Nickelsen

Erik Naggum

Jan 26, 1997

* Juergen Nickelsen

| cl.el does not turn Emacs Lisp into a CLtL1 Lisp. It does provide some
| of the convenience functions like caadr, backquote and comma (but as
| macros, not reader macros, like in "(` (3 4 (, (+ 5 6)) 7))"), and some
| others.

I'm getting _real_ sick and tired of old prejudice against various Lisps.

next time, check your facts with a recent Emacs. the backquote package was
completely rewritten 1994-03-06. that's nearly three years ago! the Lisp
reader now accepts both the old and the new style equally well.

*sigh*

Juergen Nickelsen

Jan 26, 1997

By mistake (I was too stupid to handle my newsreader) I posted the
following before the article was complete:

> Marco Antoniotti <mar...@crawdad.icsi.berkeley.edu> wrote:
>
> > Emacs Lisp (which, btw. can always be turned almost to a CLtL1 by
> > (require 'cl))
>
> While Emacs Lisp is perhaps the most widespread Lisp dialect (with most
> of its users unaware of its existence), I would not consider it one of
> the major Lisp dialects -- its scope (no pun intended) is just too
> small.
>

> cl.el does not turn Emacs Lisp into a CLtL1 Lisp. It does provide some
> of the convenience functions like caadr, backquote and comma (but as
> macros, not reader macros, like in "(` (3 4 (, (+ 5 6)) 7))"), and some
> others.

To be precise, it adds a lot of convenience functions, and this is what
is intended by cl.el. Its author Dave Gillespie writes in the cl.el
documentation:

> * Some features are too complex or bulky relative to their benefit
> to Emacs Lisp programmers. CLOS and Common Lisp streams are fine
> examples of this group.
>
> * Other features cannot be implemented without modification to the
> Emacs Lisp interpreter itself, such as multiple return values,
> lexical scoping, case-insensitive symbols, and complex numbers.
> The "CL" package generally makes no attempt to emulate these
> features.

cl.el does indeed make Emacs Lisp programming easier for programmers
familiar with Common Lisp. Emacs Lisp with cl.el loaded is still way
different from Common Lisp, though.

--
Juergen Nickelsen

Juergen Nickelsen

Jan 26, 1997

Erik Naggum <er...@naggum.no> wrote:

> * Juergen Nickelsen


> | cl.el does not turn Emacs Lisp into a CLtL1 Lisp. It does provide some
> | of the convenience functions like caadr, backquote and comma (but as
> | macros, not reader macros, like in "(` (3 4 (, (+ 5 6)) 7))"), and some
> | others.

[...]


> the backquote package was completely rewritten 1994-03-06. that's nearly
> three years ago! the Lisp reader now accepts both the old and the new
> style equally well.

Please excuse this error; it is true that I didn't follow
the development of Emacs closely.

But that only seemed to be my major point here; in fact I wanted to
write more (and did in a follow-up article), but due to a mistake the
article slipped out too early. In the follow-up I quote Dave Gillespie,
the author of the recent cl.el, from the documentation of cl.el (as contained
in XEmacs 19.14), which clearly states the reasons why cl.el does not turn
Emacs Lisp into Common Lisp.

--
Juergen Nickelsen

Martin Cracauer

Jan 26, 1997

Erik Naggum <er...@naggum.no> writes:

>* Martin Cracauer


>| A commercial implementation has a nice environment, nice browsers, maybe
>| an editor in Common Lisp and therefore controllable from Lisp and I found
>| it very valuable to have such a visualization toolkit around when I
>| learned Common Lisp. But I think Eric missed the point here.

>well, those have been immaterial to me, but I'm sure we meet different
>aspects of Lisp systems according to our experience. I have never used a
>real Lisp system for real until I got Allegro CL, so I was impressed by how
>much the system knew about itself, how much time the source code manager
>saved me, how well the debugging is integrated with the cross-referencing
>utilities, etc. example: I have about 200 functions in a package I'm
>writing, spread across several files, and I can ask who calls a given
>function, directly or indirectly. I know which functions and files to
>recompile after changing a macro by asking for who uses it. I can ask the
>system to tell me which functions bind, set, and/or reference a variable.
>I have needed this functionality for _years_, not just in Lisp. Allegro
>also has search lists that can do wondrous things, too. it's everything
>_outside_ of the common Common Lisp stratum that impresses me most.

Again, I think the difference between commercial and free
implementations is not so much that free implementations don't offer
such functionality, but that in the commercial ones all this nice
functionality is offered in a very easy way. Tools like this are
available, for example, in Mark Kantrowitz' (spelling?) tools. Since
many people want to design their own tools - maybe for several
implementations - the available source for free tools can become even
more important.

As I stated earlier, I think it can be very important for a new user
to have such functionality offered without further thinking and
therefore to be free to concentrate on language issues. This is at
least where I found my commercial implementation useful (Lispworks).

I think the most important thing about the free commercial Lisp for Windows
and ACL for Linux is that new Common Lisp users get a much better
chance to become productive in Common Lisp before they give up.

Later, I found CMUCL and free tools not to be second rate anymore.

Comfortable CLOS browsing is another issue when talking about free
alternatives, though.

>| In fact, I already had an argument with the author of Xlisp about it
>| after some magazine compared Lisp and perl (that is Xlisp and perl) and
>| headlined that Lisp is slow.
>|
>| The problem here is that people choose a free implementation by other
>| criteria than speed and then complain it is too slow because they
>| underestimated the amount of efficiency they give up.

>I see a possible pattern here. if C was slow on some machine or in some
>particular implementation, nobody would blame C or headline that C is slow.
>the myth is that Lisp is slow, and every time somebody meets a slow Lisp,
>that myth is reinforced. that you can compile with CMUCL at high speed
>settings and beat the sh*t out of C is just as much an aberration as C
>running under some bounds-checking and memory-tracking package is slow.

>yes, we all know that Lisp can be incredibly fast. we all know that Scheme
>can be statically typed, too, as in the Stalin compiler. neither changes
>the naive perceptions and the myths that will be reinforced the next time a
>small, neat, fun toy, somewhat like SIOD, is released and tested by
>somebody who carries those myths with him.

In my opinion, the problem is that many Lisp and especially Scheme
tool implementors couple their tools to an inefficient
implementation. No C programmer would do so.

Of course, many Scheme tools require extending the language, whereas C
tools usually can force the programmer to use clumsy interfaces.

But I think Lisp and especially Scheme tool implementors overrated
language elegance and gave up integrating their tools into the best
available language implementation too early.


>that's how I meant that the free Lisps have mostly worked to turn people
>away from Lisp. I didn't mean that you can't find free Lisps that people
>would have flocked to if they would only get over their prejudices. I
>meant that they don't, because of the many toys they have used and think
>are the norm. it seems, for instance, that in educational settings, Lisp
>and Scheme are not at all presented with anything resembling speed in mind,
>and so students who are used to C and C++ and ads reading "X compiles Java
>at over 10,000 lines per second", will have trouble _not_ remembering that
>speed was never discussed, that their compilers and interpreters were slow,
>etc, etc. I mean, we have people come in here and state that Lisp is an
>interpreted language at least once a week! it's an image problem, and it's
>a tragic element of that story that as Lisp implementers focus on other
>things, students and amateur programmers are turned into speed fanatics
>because that is the only forte of those other languages and systems. (and
>never mind that Allegro CL for Linux produces code that runs rings around
>C++ under Windows NT on the same machine. *sigh*)

I think it's not *that* bad.

In my opinion, people tend to rate Lisp as interpreted because they
can't get the idea of a dynamic language that is compiled. They hear
of Lisp features and automatically say "interpreted" where they may
have thought "dynamic".

In other words, the problem is not a bad image of Lisp, but the fact
that people are ignorant and tend to fail to recognize what
performance problems are really involved with a dynamic language.

Georg Bauer

Jan 26, 1997

Hi!

MA> AFAIK all the
MA> other Schemes (apart from being incompatible with each other at some
MA> level) rank at 6 OR WORSE.

Wrong. Many Scheme implementations are quite fast. And most of them
implement R4RS, so you have a standard. Ok, PC-Scheme in its two flavours
only implements R3RS. My favorites with Scheme are Gambit-C and S88. The
first is a highly portable implementation that delivers good performance,
the latter one is a special DOS version with a native code compiler and
_very_ good performance. S88 is so fine because it runs on my little
Quaderno (sub-notebook computer). :-)

bye, Georg

--
Message from Mausefalle

Alexey Goldin

Jan 27, 1997

crac...@wavehh.hanse.de (Martin Cracauer) writes:
>
> Comfortable CLOS browsing is another issue when talking about free
> alternatives, though.
>


OOBR (object-oriented browser) for Emacs/XEmacs helps a lot. It comes
with the XEmacs distribution, but works with Emacs too.

Cyber Surfer

Jan 27, 1997

In article <30631029...@naggum.no> er...@naggum.no "Erik Naggum" writes:

> that's how I meant that the free Lisps have mostly worked to turn people
> away from Lisp. I didn't mean that you can't find free Lisps that people
> would have flocked to if they would only get over their prejudices. I
> meant that they don't, because of the many toys they have used and think
> are the norm. it seems, for instance, that in educational settings, Lisp
> and Scheme are not at all presented with anything resembling speed in mind,
> and so students who are used to C and C++ and ads reading "X compiles Java
> at over 10,000 lines per second", will have trouble _not_ remembering that
> speed was never discussed, that their compilers and interpreters were slow,
> etc, etc. I mean, we have people come in here and state that Lisp is an
> interpreted language at least once a week! it's an image problem, and it's
> a tragic element of that story that as Lisp implementers focus on other
> things, students and amateur programmers are turned into speed fanatics
> because that is the only forte of those other languages and systems. (and
> never mind that Allegro CL for Linux produces code that runs rings around
> C++ under Windows NT on the same machine. *sigh*)

I agree with all the above. ACL can effortlessly beat the sh*t out
of C++ even on a Windows platform, and it does it using far less
memory. Perhaps I'm being unfair by comparing ACL for Windows with
VC++ and MFC, but C++ and MFC is a popular tool for developing
Windows code, so it seems like a reasonable comparison to me.
VC++ compiling MFC code demands more than 5 times as much RAM as
ACL, and still sucks in terms of compilation speed (ACL is fast
as lightning, while VC++ crawls), and ease of development.

All of which can translate into more development time for C++
than with Lisp. Why then am I not using Lisp to develop
the code I'm paid for? Simply because I'm still saving up for
ACL for Windows, which costs about 8 times more than VC++,
which I got for _free_, as that's what I'm paid to use.

While the value of Lisp is obvious to me, this doesn't write
cheques. The money has to come first, then the tools, then the
code. Not that this is a real problem, as I can get by using
free tools (the ones with no strings attached), and then say,
"Hey, this works - in spite of not using C++ - and I _know_
I can do the same in Lisp, only better and faster."

I like Rainer's point about stats and numeric accuracy. I may
well use that argument to justify replacing our existing stats
code with something written in Lisp (or Haskell, which provides
similar numerical advantages as Lisp).

I recently took another look at a Lisp interpreter that I wrote
in the late 80s. I learned a lot from writing that, but I've
learned even more since then. Revising it so it could be compiled
with GNU C - for any platform that GNU C runs on - was a very
satisfying experience.

However, I'll probably need to do a lot more work before I make
it available on the Internet. It needs some docs, and a heavy
disclaimer about how and when I wrote it, i.e. to help me learn
more about Lisp. As a result, it's no speed demon.

Thanks for the thought-provoking posts!
--
<URL:http://www.wildcard.demon.co.uk/> You can never browse enough
Martin Rodgers | Developer and Information Broker | London, UK
Please remove the "nospam" if you want to email me.
"Blow out the candles, HAL."


Reini Urban

Jan 27, 1997

On Mon, 20 Jan 1997 11:19:30 +0100, jos...@lavielle.com (Rainer Joswig) wrote:
>Both Common Lisp and Scheme basics are relatively easy to learn.

In my eyes Common Lisp is quite hard to learn
(compared to standard lisp or scheme)
--
To prevent spam replies, my newsreader has misspelled my email address.
To send me email, remove the final "!". Sorry for the inconvenience!

Reini You can never surf too much!

Erik Naggum

Jan 27, 1997

* Reini Urban

| In my eyes Common Lisp is quite hard to learn
| (compared to standard lisp or scheme)

what is "standard lisp"?

Martin Cracauer

Jan 27, 1997

g...@hplgr2.hpl.hp.com (Guillermo (Bill) J. Rozas) writes:

>In article <30631029...@naggum.no> Erik Naggum <er...@naggum.no> writes:

>| From: Erik Naggum <er...@naggum.no>
>| Date: 24 Jan 1997 13:56:07 +0000

>| that's how I meant that the free Lisps have mostly worked to turn people
>| away from Lisp. I didn't mean that you can't find free Lisps that people
>| would have flocked to if they would only get over their prejudices. I
>| meant that they don't, because of the many toys they have used and think
>| are the norm. it seems, for instance, that in educational settings, Lisp
>| and Scheme are not at all presented with anything resembling speed in mind,
>| and so students who are used to C and C++ and ads reading "X compiles Java
>| at over 10,000 lines per second", will have trouble _not_ remembering that
>| speed was never discussed, that their compilers and interpreters were slow,
>| etc, etc. I mean, we have people come in here and state that Lisp is an
>| interpreted language at least once a week! it's an image problem, and it's
>| a tragic element of that story that as Lisp implementers focus on other
>| things, students and amateur programmers are turned into speed fanatics
>| because that is the only forte of those other languages and systems. (and
>| never mind that Allegro CL for Linux produces code that runs rings around
>| C++ under Windows NT on the same machine. *sigh*)

>Actually, I think that the speed of the implementation, although
>important, is nowhere near as critical as other components.

While I think the rest of your posting is very valid, this statement
is not.

An existing performance problem in an implementation is usually a sign
of a misdesign. Several times I thought I should be clever enough to
work around such problems, only to find out that the implementors
aren't stupid either and the problem is a hard one.

I found myself quite often in a situation where apparently minor
performance problems with a given language implementation (or OS, for
that matter) persisted and got worse and worse as a project continued.

I found CMUCL to be the only free CL implementation without major
performance showstoppers, and only when not taking PCL/CLOS into
account.

William D Clinger

Jan 27, 1997

The speed of a Lisp or Scheme implementation is tricky to
characterize, because an implementation may very well be
quite fast at some things but slow at others. For example,
I was the primary author of MacScheme, which was fast on tight
loops and generic arithmetic (especially fixnum arithmetic),
but was slow on non-tail calls. An implementor of Lisp or
Scheme has a lot more scope for both creativity and stupidity
than does an implementor of C, which tends to be implemented
approximately the same way by all compilers. See Dick Gabriel's
"Performance and Evaluation of Lisp Systems" for more on this.

I don't think it's useful to get too involved in a discussion
of which implementation is faster than another, because it
usually depends on precisely what you're trying to do and also
on your particular coding style. Having wasted a fair amount
of my life studying this sort of thing, however, I feel an urge
to offer some real albeit useless information.

Marco Antoniotti <mar...@crawdad.icsi.berkeley.edu> wrote:
> In the Scheme world, though I never tried it, I hear that the Stalin
> compiler could be a 1 or a 2 in my previous scale. AFAIK all the
> other Schemes (apart from being incompatible with each other at some
> level) rank at 6 OR WORSE.

Chez Scheme and Larceny (our unreleased implementation) perform
in the same league with the commercial implementations of Common
Lisp that Antoniotti ranked as better than a 1. Stalin may be
in that league as well, but I haven't tried it. Gambit-C and
Bigloo would be a 1 or a 2; as noted below, their performance
is limited by the fact that they compile to C. MIT Scheme might
not be quite as fast, but it is probably on the order of 100 times
as fast as xlisp, which Antoniotti ranked at 6. I haven't used
Macintosh Common Lisp in recent years, but in 1988 its performance
(on the 68020) was roughly comparable to that of MacScheme, though
its performance profile was somewhat the opposite: MCL was faster
on non-tail calls, but slower on inner loops and arithmetic.

See http://www.ccs.neu.edu/home/will/Twobit/benchmarks1.html for
the kind of numbers that should not be taken very seriously, but
are better than hearsay. In particular these numbers illustrate
how the ranking of an implementation will vary depending on the
nature of the benchmark.

Michael Sperber wrote:
> Both Gambit and Bigloo can actually compete with C on at least some
> applications. I'd be surprised if, say, Chez Scheme were
> significantly faster.

Chez Scheme is roughly twice as fast as Gambit-C on many programs,
mainly because Gambit-C compiles to C instead of to native code,
and you lose a factor of two because of the hoops that you have to
jump through to guarantee proper tail-recursion when generating
C code. This factor of two is acknowledged by Marc Feeley, the
author of Gambit. Bigloo also compiles to C, but may not suffer
quite as much because it doesn't try as hard to conform to the
IEEE/ANSI standard for Scheme.

William D Clinger

Jeff Barnett

Jan 27, 1997

In article <30633807...@naggum.no>, Erik Naggum <er...@naggum.no> writes:
|> | In my eyes Common Lisp is quite hard to learn
|> | (compared to standard lisp or scheme)
|>
|> what is "standard lisp"?

In the current context, I guess it means an "uncommon Lisp".
So any lisp with a small distribution must be standard!

Jeff Barnett

PS It's been that kind of day.


Espen Vestre

Jan 28, 1997

Erik Naggum <er...@naggum.no> writes:

> | In my eyes Common Lisp is quite hard to learn
> | (compared to standard lisp or scheme)
>
> what is "standard lisp"?

Standard Lisp was a pre-Common Lisp attempt to standardize Lisp.
There is a reference to The Standard Lisp Report in CLtL II.

--

Espen Vestre
Telenor Online AS

John Fitch

Jan 28, 1997

Standard LISP is still very much alive. It is the basis of the REDUCE
algebra system, whose author realised a need for a standard basis
for his programs way back in the 60s. The second Standard LISP report
was written in the late 1970s; I was responsible for the IBM370
implementation at that time.

But it still is active with CSL and PSL.

==John ffitch
Bath and Codemist Ltd

Rainer Joswig

Jan 28, 1997

In article <30633807...@naggum.no>, Erik Naggum <er...@naggum.no> wrote:

> * Reini Urban


> | In my eyes Common Lisp is quite hard to learn
> | (compared to standard lisp or scheme)
>
> what is "standard lisp"?

Maybe Standard Lisp? See http://www.rrz.uni-koeln.de/REDUCE/3.6/doc/sl/
for the Standard Lisp Report.

http://www.lavielle.com/~joswig/lisp.html

--
http://www.lavielle.com/~joswig/

Cyber Surfer

Jan 28, 1997

In article <1997Jan26.1...@wavehh.hanse.de>
crac...@wavehh.hanse.de "Martin Cracauer" writes:

> In my opinion, people tend to rate Lisp as interpreted because they
> can't get the idea of a dynamic language that is compiled. They hear
> of Lisp features and automatically say "interpreted" where they may
> have thought "dynamic".

This is why I can answer almost any attack on Lisp from a C/C++
programmer by suggesting that they read PJ Brown's book. Some
people may have even forgotten that this stuff can be done even
in Basic. Yes, Basic was once interactive. I can't remember the
last time I read anything about a commercial Basic implementation
that was interactive. (Perhaps because today almost everything for
Windows is _batch_ oriented - ironic, eh?)



> In other words, the problem is not a bad image of Lisp, but the fact
> that people are ignorant and tend to fail to recognize what
> performance problems are really involved with a dynamic language.

This is why I so often find myself recommending Brown's book.
Too many people don't have any idea what "interactive" means!
You're right, they think it means "interpreted". Even worse,
they think that "compiled" always means "native code compiled".

Erik sometimes calls people stupid, but if he's right, then we're
in a hopeless situation. If you're right, and I think you are, then
it's a case of ignorance, and we can fix that. It'll take a lot
of effort and time, but it can be done.

Perhaps Java is helping to make this possible, but it might also
make things much worse if it fails. After all, the JVM is seen as
"interpreted", which it isn't. Implementations may, but they also
may be compiled to native code. Not enough people realise this,
and this may be another ignorance problem.

If Java can suffer in this case, think about what ignorance can
do to Lisp, which is much harder for the average C hacker to
understand. It's going to be hard work.

Rainer Joswig

Jan 28, 1997

> On Mon, 20 Jan 1997 11:19:30 +0100, jos...@lavielle.com (Rainer Joswig) wrote:
> >Both Common Lisp and Scheme basics are relatively easy to learn.
>

> In my eyes Common Lisp is quite hard to learn
> (compared to standard lisp or scheme)

Really?

Perhaps some people who try to tell you about CL don't understand
it themselves (because they don't use it, for example).
Then some people try to tell you that CL lacks pattern matching
like some other functional languages. Not only is it easy
to integrate pattern matching, but they don't understand
that for larger software libraries pattern-based invocation
is not very maintainable. Then people begin to tell you
that CL does not allow you to return "tuples" of values.
Again this is easy (use VALUES, or structures, whatever).
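
For example, a minimal sketch of returning and receiving several values
(MIN-AND-MAX is a made-up name; FLOOR is a standard function that already
returns two values):

    (defun min-and-max (list)
      (values (reduce #'min list) (reduce #'max list)))

    (multiple-value-bind (lo hi) (min-and-max '(3 1 4 1 5))
      (list lo hi))                   ; => (1 5)

    (floor 7 2)                       ; => 3 and 1, as two values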

Common Lisp is relatively easy to understand. Not
full Common Lisp - you don't need to tell them about
DEFSETF or about meta classes. But Common Lisp
has the same basic properties as Scheme.
It additionally has value and function cells and
also supports dynamic binding (aka FLUID-LET in Scheme).
Well, that is no big deal. Then Common Lisp has a
small set of special forms, some macros and functions.
The basic evaluation model is easy.
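
And a tiny sketch of the dynamic binding mentioned above (a special variable
rebound with LET for the extent of a call; *BASE* and SHOW are made up):

    (defvar *base* 10)                ; special, i.e. dynamically bound

    (defun show (n)
      (format nil "~vR" *base* n))    ; print N in radix *BASE*

    (show 255)                        ; => "255"
    (let ((*base* 16))                ; the rebinding is visible inside SHOW
      (show 255))                     ; => "FF"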

Then you start programming. You will need some library
functions. Well, Common Lisp has a lot of stuff
in the language. You want to print something -
use PRINC (or whatever). It's already there. If you
need something complicated - it's there, too.

You just need an overview of the CL libraries. In case
you need something - just look into the manual.
Why should it be more difficult to program
a student project for searching a maze and
printing the results as ASCII text in Common Lisp
than in Scheme? All you need of CL looks
similar to the Scheme stuff.

I don't get it. I always thought, that CL is really
easy to master (compared to, say, Haskell or C++).

Rainer Joswig

--
http://www.lavielle.com/~joswig/

Cyber Surfer

Jan 28, 1997

In article <30633807...@naggum.no> er...@naggum.no "Erik Naggum" writes:

> * Reini Urban


> | In my eyes Common Lisp is quite hard to learn
> | (compared to standard lisp or scheme)
>

> what is "standard lisp"?

Reini may be referring to a Lisp dialect called Standard Lisp,
which I believe dates from 1966. The version I used on the Atari
ST included features like backquote, which I suspect is a more
recent Lisp feature, but I don't know.

Think of it as a predecessor to Common Lisp. In fact, you can
find it on the front cover of the paperback edition of CLtL1,
just after Zetalisp, and before NIL.

Cyber Surfer

Jan 29, 1997

In article <joswig-ya0231800...@news.lavielle.com>
jos...@lavielle.com "Rainer Joswig" writes:

> Perhaps some people try to tell you about CL which doesn't understand
> it themselves (because they don't use it for example).
> Then some people try to tell you that CL lacks pattern matching
> like some other functional language. Not only is it easy
> to integrate pattern matching, but they don't understand,

> that CL does not allow to return "tuples" of values.
> Again this is easy (use VALUES, or structures, whatever).

I agree that pattern matching can be added to CL, and other
Lisps too, but I'm not sure why you say that "for larger
software libraries pattern based invocation is not very
maintainable." The issue of code maintenance isn't affected
by pattern matching, as far as I'm aware.

Perhaps some people prefer the functional syntax, using
tuples. That doesn't necessarily imply that a tuple object
is CONstructed. It might be, but I think that's a detail
of the implementation, not the language.

It looks like you're defending CL by attacking another style
of programming. I hope this is not the case, as I don't believe
that it's necessary. No language is so good that it justifies
such an attack, and I know that you can find better ways
of defending CL.

So I'm assuming that you just phrased your point badly,
by appearing to include some language politics.



> I don't get it. I always thought, that CL is really
> easy to master (compared to, say, Haskell or C++).

Haskell is _also_ easy to understand. It's just a little
different from CL (and C++), but I find I can apply a great
deal of my experience in Lisp to Haskell. Like Lisp, it
helps to learn it using a good book, and to know people who
can answer your questions.

Mastering any language worth learning takes time. After
more than 10 years, I'm still learning CL. I'm also still
learning C++, tho I feel I've used it enough by now.
I haven't _begun_ to master Scheme!

My favourite languages are still CL and Scheme, but I'm
beginning a love affair with Haskell. I don't feel that
I'm cheating any of them, as they're only tools. Damn fine
tools, just the same.

Tim Bradshaw

unread,
Jan 30, 1997, 3:00:00 AM1/30/97
to

* Reini Urban wrote:
> On Mon, 20 Jan 1997 11:19:30 +0100, jos...@lavielle.com (Rainer Joswig) wrote:
>> Both Common Lisp and Scheme basics are relatively easy to learn.

> In my eyes Common Lisp is quite hard to learn

> (compared to standard lisp or scheme)

If it's possible to ask this question without provoking endless futile
discussion, could you say why? I've taught courses on Common Lisp,
and it would be interesting to know what people find hard about basic CL,
especially compared to scheme.

I can see that CL is very *large* & therefore intimidating compared to
Scheme, but that can be fixed by teaching it the right way. Scope &
extent people find hard, but that is common to both. Different function
and variable namespaces make CL harder, I think. call/cc makes Scheme
very much conceptually harder for many people, though.

So it would be quite interesting to know why CL is harder (or why
scheme is harder) and how that could be fixed, if it can.

--tim

Marc Feeley

unread,
Jan 30, 1997, 3:00:00 AM1/30/97
to

William D Clinger <wi...@ccs.neu.edu> wrote:

> Chez Scheme is roughly twice as fast as Gambit-C on many programs,
> mainly because Gambit-C compiles to C instead of to native code,
> and you lose a factor of two because of the hoops that you have to
> jump through to guarantee proper tail-recursion when generating
> C code. This factor of two is acknowledged by Marc Feeley, the
> author of Gambit.

This isn't quite right. I acknowledge that Gambit-C is "on average" a
factor of 2 slower than Gambit when compiling directly to native code
(for details check the old and unpublished paper
http://www.iro.umontreal.ca/~feeley/papers/stc.ps). This factor of 2
is essentially due to the way Gambit-C implements proper
tail-recursive behavior in C.

Gambit-C has about the same performance as Chez Scheme (when Gambit-C
uses "block" compilation to unsafe code with fixnum/flonum specific
arithmetic). Of course run time performance is only one part of the
story since many other characteristics are important to compare as
well in a practical setting (compile time, portability,
interoperability, adherence to the standards, language extensions,
debugging, etc, etc).

Marc

Earl & Daniella Harris

unread,
Jan 30, 1997, 3:00:00 AM1/30/97
to Rainer Joswig

Rainer Joswig wrote:

>
> In article <32ecf05f...@news.sime.com>, rur...@sbox.tu-graz.ac.at! wrote:
>
> > On Mon, 20 Jan 1997 11:19:30 +0100, jos...@lavielle.com (Rainer Joswig) wrote:
> > >Both Common Lisp and Scheme basics are relatively easy to learn.
> >
> > In my eyes Common Lisp is quite hard to learn
> > (compared to standard lisp or scheme)
>
> Really?

>
> Perhaps some people try to tell you about CL which doesn't understand
> it themselves (because they don't use it for example).

Ok. I don't understand CL. When I tried to teach myself CL, I was
expecting a simple language like Scheme. The details of CL seemed
overwhelming. I kept asking myself, why is this SO complicated?
This is supposed to be Lisp?

IMHO, CL looks like it is much harder to master than Scheme. The
following is my rebuttal.

> Then some people try to tell you that CL lacks pattern matching
> like some other functional language. Not only is it easy
> to integrate pattern matching, but they don't understand,

> that for larger software libraries pattern based invocation

If you need to understand pattern matching to master CL, this is one
strike against CL. Scheme doesn't have this; it isn't necessary.

Why would I want (or need) pattern matching in Lisp? Is this like
pattern matching in ML? Can you use CL and avoid using patterns?

> is not very maintainable. Then people begin to tell you


> that CL does not allow to return "tuples" of values.
> Again this is easy (use VALUES, or structures, whatever).

If you need to understand tuples and structures to master CL, this is
one strike against CL. Scheme doesn't have tuples; it isn't necessary.
Some Scheme implementations have structures, but you don't need to
learn them.

Tuples? Values? Structures? If I want to return more than one value,
I return them in a list (or vector). Why would I want tuples?
Are structures like structures in C? How are tuples different from
structures and lists? Are values a new data type in CL?

>
> Common Lisp is releatively easy to understand. Not
> full Common Lisp - you don't need to tell them about
> DEFSETF or about meta classes. But Common Lisp
> has the same basic properties like Scheme.

If you have to learn and differentiate between several DEFs, this is
a strike against CL.

While Scheme has essentially "define," Common Lisp has several
"DEFsomethings". Why does Common Lisp have so many definitions?

If you need to understand "meta classes" to master CL, this is also
one strike against CL.

Why do you need meta classes in Common Lisp?

> It additionally has values and function cells and
> supports also dynamic binding (aka FLUID-LET in Scheme).
> Well, that is no big deal. Then Common Lisp has a
> small set of special forms, some macros and functions.
> The basic evaluation model is easy.

I'll talk about the evaluation model at the end.

>
> Then you start programming. You will need some library
> functions. Well, Common Lisp has a lots of stuff
> in the language. You want to print something -
> use PRINC (or whatever). Its already there. If you

If you need to understand several flags and options in the print
functions, this is one strike against CL.

Scheme's printing options are much clearer to me. You apply the
print function to the value and it prints it.

CL has really exotic print functions with lots of flags and options.
It reminds me of C's printf function. There are more details.

> need something complicated - its there, too.
>
> You just need an overview over the CL libraries. In case
> you need something - just look into the manual.

I'm not sure a library makes a language easier to master.
A library can be a convenience to the programmer. It saves
me the trouble of writing some programs.

> Why should it be more difficult to program
> a student project for searching a maze and
> printing the results to ASCII text in Common Lisp,
> then in Scheme? All you need of CL looks
> similar to the Scheme stuff.
>

> I don't get it. I always thought, that CL is really
> easy to master (compared to, say, Haskell or C++).

Regarding the evaluation model, CL doesn't treat
functions as first class objects. I can't pass functions
around like other values (numbers, lists, etc).

I bet CL is really easy to master, when compared to C++.

However, IMHO, it is hard to defend the claim that CL is easier to
master than Scheme. Just compare the reference manual size.

In CL's defense, one could argue that CL has other advantages over
Scheme. I bet it is easy to print out a number in hexadecimal format.
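
For what it's worth, hexadecimal output really is a one-liner:

  (format nil "~x" 255)        ; => "FF"
  (format nil "~8,'0x" 255)    ; => "000000FF", padded to 8 digits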

>
> Rainer Joswig
>
> --
> http://www.lavielle.com/~joswig/

Earl Harris Jr.

Seth Tisue

unread,
Jan 30, 1997, 3:00:00 AM1/30/97
to

In article <32F06A...@widomaker.com>,

Earl & Daniella Harris <esha...@widomaker.com> wrote:
>Regarding the evaluation model, CL's doesn't treat
>functions as first class objects. I can't pass functions
>around like other values (numbers, lists, etc).

This is totally incorrect. Functions are 100% first class objects in
Common Lisp, same as in Scheme.
--
== Seth Tisue <s-t...@nwu.edu> http://www.cs.nwu.edu/~tisue/

Erik Naggum

unread,
Jan 30, 1997, 3:00:00 AM1/30/97
to

* Earl Esharris Daniella Harris

| IMHO, CL looks like it is much harder to master the Scheme.

FWIW, I found the opposite to be true. my favorite example is `member',
which Common Lisp calls `member', but which Scheme calls `memq', `memv', or
`member' according to which function should do the testing. in Common
Lisp, I can choose the test function with :test, and use `eq', `eql' or
`equal' as I see fit. should I want a different function, such as
`string-equal', I can use that, too. in Scheme, I must implement my own
`member' with `string-equal' as the predicate. in practice, I implement a
new `member' (which must be called something other than `member' since
Scheme doesn't have packages and redefining isn't kosher), which takes a
function as argument. in like manner, I must reimplement everything else I
need with a higher level of abstraction than Scheme provides. I have
concluded that Scheme is a pretty dumb language as standardized. had all
of this hype about functions as first-class arguments been true, wouldn't
Scheme have used them more often, I wonder.
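
a minimal sketch of what the :test argument buys you (the data is made up
for illustration):

  (member 2.0 '(1 2 3))                              ; default test is `eql' => nil
  (member 2.0 '(1 2 3) :test #'=)                    ; => (2 3)
  (member "b" '("A" "B" "C") :test #'string-equal)   ; case-insensitive => ("B" "C")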

| If you need to understand pattern matching to master CL, this is one
| strike against CL. Scheme doesn't have this; it isn't necessary.

you don't need to.

| If you need to understand tuples and structures to master CL, this is one
| strike against CL.

you don't need to.

| Are values a new data type in CL?

no. values are not a separate data type; any function in Common Lisp can
return multiple values.

| If you have to learn and differentiate between several DEFs, this is a
| strike against CL.

you need to know only `defvar' and `defun' to get going.

| While scheme has essentially "define," Common Lisp has several
| "DEFsomethings". Why does Common Lisp have some many definitions?

(1) because Common Lisp recognizes that a single namespace for functions
and variables is bad for you. (2) because Common Lisp has features that
Scheme does not have.

`defsetf' was mentioned. in Common Lisp, if you have a function (foo x)
that returns some piece of information, the typical function to make that
function return some new value known by your program is (setf (foo x)
new-value). e.g., if you retrieve elements from an array with (aref A i),
you store a new value with (setf (aref A i) x). in Scheme, you use
specialized functions to access different kinds of arrays, and you must use
different functions to store values into them, too. you define your own
setf methods (or "setter functions") with defsetf. you can also define
them with (defun (setf foo) ...) just like other functions. I find this
very elegant, and certainly much more so than functions named `set-foo!'.
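
a minimal sketch of a setf-able accessor (the hash table is just an
illustrative backing store):

  (defvar *table* (make-hash-table))
  (defun foo (key) (gethash key *table*))
  (defun (setf foo) (new-value key)
    (setf (gethash key *table*) new-value))

  (setf (foo 'x) 42)
  (foo 'x)                              ; => 42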

| If you need to understand "meta classes" to master CL, this is also one
| strike against CL.

sigh. you don't need to.

| If you need to understand several flags and options in the print
| functions, this is one strike against CL.

sigh. you don't need to.

| Regarding the evaluation model, CL's doesn't treat functions as first
| class objects. I can't pass functions around like other values (numbers,
| lists, etc).

you should have asked a question. the above is as untrue as you can get.
however, functions aren't normally values of variables. this is seldom as
useful as Schemers think it is. the only difference between Scheme and
Common Lisp regarding functions is that in Scheme the first element of an
expression is evaluated like the rest of the elements, whereas in Common
Lisp, it is evaluated specially. evaluating a function call form in Scheme
means (apply #'funcall (mapcar #'eval form)), except that Scheme is allowed
to evaluate arguments in any order, whereas in Common Lisp it means
(apply (car form) (mapcar #'eval (cdr form))), keeping with Common Lisp
syntax in both cases for the sake of comparison.
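
a minimal sketch of that difference in practice (`apply-twice' is just an
illustrative name):

  (defun apply-twice (f x)
    (funcall f (funcall f x)))   ; f is an ordinary variable holding a function object
  (apply-twice #'1+ 3)           ; => 5; #'1+ names the function binding of 1+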

| However, IMHO, it is hard to defend that CL is easier to master than
| Scheme. Just compare the reference manual size.

Scheme is a relatively low-level Lisp. you _can_ build tall abstractions,
but you run into many serious problems in scaling, not to mention the fact
that Scheme is a language of bare necessities, like living in a cave, while
Common Lisp is a language of complex societies where all the infrastructure
you want to take for granted is indeed already implemented.

however, I can sympathize with you on the cognitive level. Common Lisp
seems large and unwieldy. let me illustrate with an anecdote. I moved to
California to work on a project and stayed for five months. when I first
got there, I found that I experienced cognitive overload in supermarkets.
this was the first time I had had to buy my own groceries in the U.S. and
was not at all used to the variety or the brand names or anything. there
were dozens of different maple syrups, a hundred kinds of bread, etc. back
home in Oslo, there is much less variety, and some stores even specialize
in having a small number of products, like 800. gradually, over many
years, I had come to choose around 40 different products that I bought on a
regular basis. I could shop sleeping. however, in California, my nearest
supermarket was quite small by U.S. standards, and offered only 3000 or so
different products, none of which even looked like what I was used to. it
dawned on me after having become quite tired of the first couple of
shopping experiences that I had tried to take in all of them at the same time, that
I had no grounds for comparisons, and that I still didn't think in dollars
so I didn't even have a working economic model to fit things into. in
response to this cognitive overload, I systematically went through a small,
new section of the store every time I went there, after I had found a
subset of their products that I could eat. it still took two months before
I had a working knowledge of brand names, product categories, price levels,
etc. however, my point is that although I recognized a problem in my
approach to something new and very large by my standards, I didn't starve
while I tried to sort out which maple syrup to pour on which breakfast
waffles with which fruits on it. (I learned later that I had skipped the
best maple syrup, too.) when I got home, I had acquired a taste for the
variety, and found myself buying at least 5 times more kinds of goods on a
regular basis than I used to. if you like, Scheme is like a bakery that
produces three kinds of bread according to clearly specified nutritional
models, while Common Lisp is like a supermarket with baked goods from a
large variety of bakeries. once you enjoy the variety, you won't find the
bakery confined to "only what's good for you" to be much of a treat.

it may be worth noting that I had already become really annoyed by C's and
Unix' lack of useful functions and all the manual work that was necessary
to get even simple things done. (such as: in a system-wide shell init
file, you need to set the search list (path) for interactive shells. it is
important that a few directories be present for all users, but their order
is immaterial and may be chosen by the user. if some directory is not
present in the search list, it should be made the new first element of the
sublist that contains only elements from the required list, in other words:
it should be added before any other required directories, but after any
that the user might have set up. exercise: do this in the various shells,
in Scheme, and in Common Lisp. for extra bonus points, do it in perl.)

Matthew R Wette

unread,
Jan 30, 1997, 3:00:00 AM1/30/97
to

Here's another stick on the fire:

On a SPARCStation an Allegro CL uses *11 meg* to print "hello, world".
SCM uses *1 meg* to print "hello, world".

CL requires more $$ for ram and disk.

Matt
--
matthew...@jpl.nasa.gov -- I speak for myself, not for JPL.

Martin Cracauer

unread,
Jan 30, 1997, 3:00:00 AM1/30/97
to

Tim Bradshaw <t...@aiai.ed.ac.uk> writes:

>* Reini Urban wrote:
>> On Mon, 20 Jan 1997 11:19:30 +0100, jos...@lavielle.com (Rainer Joswig) wrote:
>>> Both Common Lisp and Scheme basics are relatively easy to learn.

>> In my eyes Common Lisp is quite hard to learn
>> (compared to standard lisp or scheme)

>If it's possible to ask this question without provoking endless futile


>discussion, could you say why? I've taught courses on Common Lisp,
>and it would be interesting to know what people find hard about basic CL,
>especially compared to scheme.

>I can see that CL is very *large* & therefore intimidating compared to
>Scheme, but that can be fixed by teaching it the right way. Scope &
>extent people find hard, but is common between them. Different fn and
>variable namespaces make CL harder I think. call/cc makes scheme very
>much conceptually harder though for many people.

>So it would be quite interesting to know why CL is harder (or why
>scheme is harder) and how that could be fixed, if it can.

As someone who likes Common Lisp, I also found it quite hard to
learn.

1) Some concepts were quite hard for me to get: scoping, multiple
namespaces, reader macros (understanding existing programs is a lot
easier once you get the idea that all these #-constructs are just the
same as calling an S-expression macro) and, in some ways,
setf-constructs.

2) I learn best by reading existing sources. In Common Lisp, you will
face the whole range of the language.

On the other hand, while it was hard to read such programs, it was
very useful. For example, when a sensible programmer chooses the best
sequence type for a given task. In Common Lisp, he will most likely
use constructs that you can look up in CLtL2. In C++ before STL, he
usually will implement his own stuff or use a non-standard lib, and in
C people are very likely to push everything into arrays.

3) While I liked the syntax, I found it to be pretty uncomfortable
when it comes to accessing sequence members, struct entries and
instance variables.

Also, the syntax for declarations is not really intuitive.

4) It is not easy to get decent performance out of Common Lisp when
you don't have an idea what makes a hashtable different from a
non-hashing association table, why people invented lists, and why
it is useful to keep a pointer to the end of a list.

Of course, you will not write well-performing C programs without that
knowledge either, but in C you are likely to use arrays of inlined data
members, which was fast enough for my applications. In Common Lisp, you
might end up using lists like arrays and get no compile-time
typechecking at all.

5) Usually no source-level debugging. I found it very useful to step
through a C program with variable watches turned on.

6) Profiling requires you to set up the symbols you want traced in
advance. With GNU gprof for C, you just say "profile this program" and
it uses all functions it has an entry for.

7) Environment issues. It takes some time to get used to working in a
permanent image. For example, why can't you load a package that
contains a symbol you just tried to access? Because you just triggered
the symbol to be created in the current package. Not easy to get for a
batch-language user.
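
A sketch of the pitfall (MYLIB is a hypothetical package that exports
FROBNICATE):

  CL-USER> 'frobnicate            ; reading this interns CL-USER::FROBNICATE
  CL-USER> (use-package :mylib)   ; now signals a name conflict between
                                  ; CL-USER::FROBNICATE and MYLIB:FROBNICATE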

I don't think Scheme is much easier to learn. C is at least easier to
understand for someone who has an understanding of how a CPU works and
how data is arranged in a computer (which was the case for me).

I don't want to start a language flame war, either. After all, this is a
description of a past state of mine and you're not going to change
history, no matter how wrong I was :-)

Happy Lisping

cosc...@bayou.uh.edu

unread,
Jan 30, 1997, 3:00:00 AM1/30/97
to

Earl & Daniella Harris (esha...@widomaker.com) wrote:

[Snip]

: Ok. I don't understand CL. When I tried to teach myself CL, I was


: expecting a simple language like Scheme. The details of CL seemed
: overwhelming. I kept asking myself, why is this SO complicated?
: This is suppose to be Lisp?

It's so complicated because it's so powerful. There's so much
that you can do in Common Lisp that there are naturally many
things to learn. Note however that you can put off learning
many of these things and still write effective programs.
A testament to this fact is that many of Common Lisp's
capabilities can be (or may even be) written in Common Lisp
itself, using a few core primitives.


: IMHO, CL looks like it is much harder to master the Scheme. The
: following
: is my rebuttal.

Again it's harder to master, but then that's because there is
more to master. That's like saying "Chess is Harder to
Master than Checkers". Well sure it is, but that's not
a disadvantage of chess!


: > Then some people try to tell you that CL lacks pattern matching


: > like some other functional language. Not only is it easy
: > to integrate pattern matching, but they don't understand,
: > that for larger software libraries pattern based invocation

: If you need to understand pattern matching to master CL, this is one


: strike
: against CL. Scheme doesn't have this; it isn't necessary.

Common Lisp doesn't offer pattern matching. Maybe you should
actually try to learn it or look at it in detail before reaching
such hasty conclusions.


: Why would I want (or need) pattern matching in Lisp? Is this like


: pattern matching in ML? Can you use CL and avoid using patterns?

Scheme is basically a tiny version of Common Lisp (this may be a bit
of an oversimplification), so that alone should give you an
idea of what Common Lisp can and can't do.


: > is not very maintainable. Then people begin to tell you


: > that CL does not allow to return "tuples" of values.
: > Again this is easy (use VALUES, or structures, whatever).

: If you need to understand tuples and structures to master CL, this is
: one strike against CL.

Mastery of any language means that you should know the ins and outs
of that language. Contrast this with being able to effectively
use a language. With Common Lisp you can effectively use the
language without understanding structures (I don't even know
if tuples are supported). You can simply use lists instead of
structures.

Structures however are very simple to use, and very well designed,
and they are there for when you are ready for them.


: Scheme doesn't have tuples; it isn't necessary. Some scheme


: implementations
: have structures, but you don't need to learn it.

I don't think that Common Lisp even has tuples either, and you
don't have to learn structures in Common Lisp. Again you've
got lists, just use them.


: Tuples? Values? Structures? If I want to return more than value,


: I return them in a list (or vector). Why would I want tuples?

You could do the same in Common Lisp.


: Are structures like structures in C? How are tuples different from
: structures and lists? Are values a new data type in CL?

I can't speak for tuples, and I'm not sure what you mean by
"values", but I can tell you how the structures in Lisp work.

Basically you define a structure (much like you would in C), but
instead of using a "." to access structure fields, Lisp creates
specialized functions -- accessor functions for you to access
the structure fields with. This hides the implementation details
and makes it a snap for you to later replace them with another
implementation (if you so desire). Structures in Lisp are
basically ADTs (abstract data types), and so they are like structures
in other languages in that you access particular fields and can
refer to them as a whole, but are different in that they
are abstracted.
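
A minimal sketch (POINT is just an illustrative structure):

  (defstruct point x y)     ; defines make-point, point-p, copy-point, point-x, point-y
  (defvar *p* (make-point :x 1 :y 2))
  (point-x *p*)             ; => 1
  (setf (point-y *p*) 5)    ; the accessors work with setf as well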


: >
: > Common Lisp is releatively easy to understand. Not


: > full Common Lisp - you don't need to tell them about
: > DEFSETF or about meta classes. But Common Lisp

: > has the same basic properties like Scheme.

: If you have to learn and differentiate between several DEFs, this is
: a strike against CL.

Again you can do a lot (possibly more than Scheme) without
differentiating between several DEFs. Again, think of Scheme
as a tiny subset of Common Lisp. With Common Lisp, you can
choose to use a tiny subset, and so it can be very much like
Scheme (in terms of simplicity). No one is forcing you to
use all these features, but they are there for when you
need them.


: While scheme has essentially "define," Common Lisp has several


: "DEFsomethings".
: Why does Common Lisp have some many definitions?

Because they do different things. Again you are not forced
to use all these definitions.


: If you need to understand "meta classes" to master CL, this is also
: one strike against CL.

The same answers apply, let's just fast forward since this is
redundant.

[Snip]

: Scheme's printing options are much clearer to me. You apply the


: print function to the value and it prints it.

: CL has really exotic print functions with lots of flags and options.
: It reminds me of C's print function. There are more details.

Again use what subset makes you feel comfortable. Scheme is simpler
because it's *WEAKER*. Get it?


[Snip]

: I'm not sure a libary makes a language easier to master.


: A libary can be a convenience to the programmer. It saves
: me the trouble of writing some programs.

Think of much of Common Lisp as optional libraries for you
to use when you decide you need them.


[Snip]

: Regarding the evaluation model, CL's doesn't treat


: functions as first class objects. I can't pass functions
: around like other values (numbers, lists, etc).

Yes you can. The thing is with Common Lisp (as contrasted
with Haskell), you'll need a special quoting notation
to keep things clear, but that's it.

That's how functions like funcall and apply work, by
taking functions as arguments. If you couldn't do that
in Common Lisp then how do these functions even exist?
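
A minimal sketch of that quoting notation in action (the data is made up
for illustration):

  (funcall #'max 1 5 2)       ; => 5
  (apply #'+ '(1 2 3))        ; => 6
  (sort (list 3 1 2) #'<)     ; => (1 2 3); library functions take functions directly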


: I bet CL is really easy to master, when compared to C++.

: However, IMHO, it is hard to defend that CL is easier to


: master than Scheme. Just compare the reference manual size.

I'm not arguing that CL is harder to master than Scheme, I'm
merely trying to point out that mastery is one thing, and
using something productively is another, and you can use
CL productively without all that much effort.


: In CL defense, one could argue that CL has other advantages over


: Scheme. I bet it is easy print out a number in hexidecimal format.

That was uncalled for. Common Lisp is vastly more powerful than
Scheme and is therefore larger. It's that simple. If you can't
come to grips with this simple fact, then maybe Common Lisp
really is beyond you and you should stick with something simpler
like Scheme.


[Snip]

: Earl Harris Jr.

--
Cya,
Ahmed

In order to satisfy their mania for conquest, lives are squandered
Discharge

Rainer Joswig

unread,
Jan 30, 1997, 3:00:00 AM1/30/97
to

In article <854545...@wildcard.demon.co.uk>,
cyber_...@wildcard.demon.co.uk wrote:

> I agree that pattern matching can be added to CL, and other
> Lisps too, but I'm not sure why you say that "for larger
> software libraries pattern based invocation is not very
> maintainable." The issue of code maintainence isn't effected
> by pattern matching, as far as I'm aware.

Not? Right now most of these functional languages
have tuples and lists, but not records. Write
large software with lots of datatypes (a window system, ...)
using pattern matching. It is possible, but now try
to change the underlying data representation.
What effects does this have on external users if
the implementation changes, etc.? Could it
be difficult to understand software if you
have lots of patterns and you have to look very
closely to determine which function will be invoked
when?

> It looks like you're defending CL by attacking another style
> of programming. I hope this not the case, as I don't believe
> that it's necessary. No language is so good that it justifies
> such an attack, and I'm know that you can find better ways
> of defending CL.

This is your interpretation. I don't like it.

But some people struggled with
similar approaches years ago. It's
a bit like what people experienced with rule-based
languages in real-world software (like OPS5-based
configuration systems, or even the infamous sendmail).

> Haskell is _also_ easy to understand. It's just a little
> different to CL (and C++), but I find I can apply a great
> deal of my experience in Lisp to Haskell. Like Lisp, it
> helps you learn it using a good book, and know people who
> can answer your questions.

Until someone has understood monadic IO, he may already
have successfully written some 10000 lines of
Common Lisp stream-based IO code. Also,
I might add, the Haskell type system is not *that*
easy to understand.

I'm not saying anything bad about Haskell, it's just that
even with FP knowledge it is not easy to master, and some
books (there is a nice German one) about Haskell do look
like white noise to the uninitiated.

> Mastering any language worth learning takes time. After
> more than 10 years, I'm still learning CL.

After more than 10 years, I'm still writing lots
of code with Common Lisp.

> My favourite languages are still CL and Scheme, but I'm
> beginning a love affair with Haskell. I don't feel that
> I'm cheating any of them, as they're only tools. Damn fine
> tools, just the same.

You still have failed to ground your "love affair" on
rationalism (as far as possible).

Steve Austin

unread,
Jan 31, 1997, 3:00:00 AM1/31/97
to

On 30 Jan 1997 20:15:48 +0000, Erik Naggum <er...@naggum.no> wrote:

>(1) because Common Lisp recognizes that a single namespace for functions
>and variables is bad for you.

Could you clarify this for me please? I'm very much a newcomer to
Common Lisp, and I naively assumed that the originators of Scheme used
a common namespace to simplify the syntax of higher order functions.
What advantages do separate namespaces provide?

Steve Austin
sau...@nf.sympatico.ca


Michael Sperber [Mr. Preprocessor]

unread,
Jan 31, 1997, 3:00:00 AM1/31/97
to

Some misconceptions about Scheme from the view of CL programmers need
clarification.

>>>>> "Erik" == Erik Naggum <er...@naggum.no> writes:

Erik> in Scheme, I must implement my own `member' with `string-equal'
Erik> as the predicate.

In Scheme, equal? works on strings. No need to.

Erik> in practice, I implement a new `member' (which must be called
Erik> something other than `member' since Scheme doesn't have packages
Erik> and redefining isn't kosher), which takes a function as
Erik> argument.

Redefining *is* kosher in Scheme as of the IEEE standard.

Erik> in like manner, I must reimplement everything else I
Erik> need with a higher level of abstraction than Scheme provides.

At least that is easy in Scheme. In Common Lisp, if I want call/cc
(and it is *much* more useful than Common Lisp programmers usually
care to acknowledge), I cannot express it in terms of other Common
Lisp primitives.

The high-level macro system that Scheme has (about to be made a
mandatory feature for R5RS) (or something with equivalent
functionality) is very hard to implement right in Common Lisp. I
doubt that it's been done. (Except maybe as part of PseudoScheme :-})
I'd be happy to be educated on the subject.

Erik> (1) because Common Lisp recognizes that a single namespace for functions
Erik> and variables is bad for you.

Again, that's an assertion without any proof. Multiple namespaces
greatly complicate dealing with names conceptually, especially when
the same name has multiple bindings with disjoint meanings. Possibly
a matter of taste, admittedly.

Erik> (2) because Common Lisp has features that Scheme does not have.

So? Scheme has features that Common Lisp does not have.

Erik> `defsetf' was mentioned.

defsetf is trivial to define with Scheme high-level macros.

Erik> you should have asked a question. the above is as untrue as you can get.
Erik> however, functions aren't normally values of variables. this is seldom as
Erik> useful as Schemers think it is.

Erik, you should have asked a question. It is immensely useful all
the time. I'd be happy to send you oodles of source code where having
to use funcall would greatly screw up the code. Admittedly, code that
CL programmers would

Erik> Scheme is a relatively low-level Lisp. you _can_ build tall abstractions,
Erik> but you run into many serious problems in scaling,

Such as?

Erik> not to mention the fact that Scheme is a language of bare
Erik> necessities, like living in a cave, while Common Lisp is a
Erik> language of complex societies where all the infrastructure you
Erik> want to take for granted is indeed already implemented.

As far as infrastructure for building abstractions is concerned, I
want (and need) call/cc and macros. So?


Erik Naggum

unread,
Jan 31, 1997, 3:00:00 AM1/31/97
to

* Matthew R. Wette

| On a SPARCStation an Allegro CL uses *11 meg* to print "hello, world".
| SCM uses *1 meg* to print "hello, world".

this is very odd. ACL 4.3 on my SPARCstation has a swap footprint close to
5M. CMUCL has a swap footprint of about 1M. scsh uses 9M, and MIT Scheme
eats 12M.

| CL requires more $$ for ram and disk.

some Scheme _implementations_ require far more RAM and disk than some
Common Lisp _implementations_, and vice versa, I'm sure.

Erik Naggum

unread,
Jan 31, 1997, 3:00:00 AM1/31/97
to

* Michael Sperber

| Some misconceptions about Scheme from the view of CL programmers need
| clarification.

that may be, but please do not add more of them.

| >>>>> "Erik" == Erik Naggum <er...@naggum.no> writes:
|
| Erik> in Scheme, I must implement my own `member' with `string-equal'
| Erik> as the predicate.
|
| In Scheme, equal? works on strings. No need to.

`equal' is case sensitive. `string-equal' is not. `equal?' in Scheme is
also case sensitive. if this is not sufficient, choose a different
function, and get the point.

| Erik> in like manner, I must reimplement everything else I
| Erik> need with a higher level of abstraction than Scheme provides.
|
| At least that is easy in Scheme.

sigh. it may be hard, it may be easy. in Common Lisp I don't have to.

| Erik> (1) because Common Lisp recognizes that a single namespace for functions
| Erik> and variables is bad for you.
|
| Again, that's an assertion without any proof. Multiple namespaces
| greatly complicate dealing with names conceptually, especially when
| the same name has multiple bindings with disjoint meanings. Possibly
| a matter of taste, admittedly.

where was the first "assertion without proof"? your own?

| Erik> `defsetf' was mentioned.
|
| defsetf is trivial to define with Scheme high-level macros.

again, you need to roll your own. all those "trivial" things add up.

| Erik> however, functions aren't normally values of variables. this is
| Erik> seldom as useful as Schemers think it is.


|
| Erik, you should have asked a question. It is immensely useful all the
| time.

because in Scheme, you have no other choice. if you need it in Common
Lisp, you've implemented a different evaluation model before all those
trivial issues in Scheme have been implemented.

| I'd be happy to send to oodles of source code where having to use funcall
| would greatly screw up the code.

"greatly screw up the code"? misconceptions, eh? you're marketing.

| Erik> Scheme is a relatively low-level Lisp. you _can_ build tall
| Erik> abstractions, but you run into many serious problems in scaling,
|
| Such as?

lack of a standard package system, for starters.

| Erik> not to mention the fact that Scheme is a language of bare
| Erik> necessities, like living in a cave, while Common Lisp is a
| Erik> language of complex societies where all the infrastructure you
| Erik> want to take for granted is indeed already implemented.
|
| As far as infrastructure for building abstractions is concerned, I
| want (and need) call/cc and macros. So?

as if Common Lisp didn't have macros. sheesh!

call-with-current-continuation is unique to Scheme. somehow, people can
actually get work done in other languages. listening to Schemers, I wonder
how this is at all possible without call-with-current-continuation. could
it be that Scheme has removed all the _other_ mechanisms and replaced them
with a single very complex idea that is then used to reimplement them all?

in Scheme, you have to implement a lot of minor stuff. this creates one
Scheme environment per user or group of users. such is indeed the case.
in Common Lisp, it's there.

Michael Sperber [Mr. Preprocessor]

unread,
Jan 31, 1997, 3:00:00 AM1/31/97
to

>>>>> "Erik" == Erik Naggum <er...@naggum.no> writes:

Erik> * Michael Sperber
Erik> | Some misconceptions about Scheme from the view of CL programmers need
Erik> | clarification.

Still ...

Erik> | Erik> (1) because Common Lisp recognizes that a single namespace for functions
Erik> | Erik> and variables is bad for you.
Erik> |
Erik> | Again, that's an assertion without any proof. Multiple namespaces
Erik> | greatly complicate dealing with names conceptually, especially when
Erik> | the same name has multiple bindings with disjoint meanings. Possibly
Erik> | a matter of taste, admittedly.

Erik> where was the first "assertion without proof"? your own?

The assertion was "a single namespace for functions and variables is
bad for you", to quote you. Yours. No proof.

Erik> lack of a standard package system, for starters.

Admitted, but also possible to build yourself. The code is out there,
Erik, just download it. Of course, all these "little things add up."
A matter of taste if you'd rather be able to choose which ones and how
they work, and then grab them, or if Common Lisp pushes all this stuff
at you. For some programmers (me, for instance), Common Lisp rarely
provides the right abstractions, but rather something which is only
almost right. For others, it may be perfect.

This seems to be the difference in design philosophies between Scheme
and CL. Language elements make it into Scheme only on the unanimous
consent of the RnRS authors, which is pretty good evidence that they are
"The Right Thing".

Read Dick Gabriel's paper on Common Lisp for some evidence on why.
I'm not trying to argue that Scheme is "better" than CL (which Erik is
trying to push at me), I'm just saying that people exist who prefer
Scheme to CL. (And that they, too, are getting serious work done in
Scheme.)

Erik> | As far as infrastructure for building abstractions is concerned, I
Erik> | want (and need) call/cc and macros. So?

Erik> as if Common Lisp didn't have macros. sheesh!

CL's macro system is far less convenient and (worse) far more
unsafe than Scheme's macros.

Erik> call-with-current-continuation is unique to Scheme. somehow, people can
Erik> actually get work done in other languages.

Certain things you can't do without call/cc (or some equivalent
mechanism such as shift/reset), such as building mechanisms for
coroutines, threads etc. With call/cc, however, you can build *any*
control structure.

Erik> listening to Schemers, I wonder how this is at all possible
Erik> without call-with-current-continuation.

I've never seen anybody claim that. Quote someone, Erik, just once!

Erik> could it be that Scheme has removed all the _other_ mechanisms
Erik> and replaced them with a single very complex idea that is then
Erik> used to reimplement them all?

Common Lisp's idea of non-local control transfer is at least as
complex as call/cc, but nevertheless not as powerful. The formal
semantics in the Scheme standard takes up 12 4-inch lines, none of
which has more than 2 inches of stuff on it. Two of those lines are
declaration lines, two are error messages, which leaves 8 operational
lines. Those lines would easily fit on one or two lines on a full
page. How long is the explanation of non-local jumps in CL?

Erik> in Scheme, you have to implement a lot of minor stuff. this creates one
Erik> Scheme environment per user or group of users. such is indeed the case.
Erik> in Common Lisp, it's there.

True. Is this a bad thing for Scheme?

Cheers =8-} Mike

Steinar Bang

unread,
Jan 31, 1997, 3:00:00 AM1/31/97
to

>>>>> spe...@informatik.uni-tuebingen.de (Michael Sperber [Mr. Preprocessor]):

>>>>> "Erik" == Erik Naggum <er...@naggum.no> writes:

Erik> in Scheme, you have to implement a lot of minor stuff. this

Erik> creates one Scheme environment per user or group of users. such
Erik> is indeed the case. in Common Lisp, it's there.

> True. Is this a bad thing for Scheme?

It gets in the way of Scheme becoming a "real" systems programming
language.

Now whether Scheme *should* become one is a completely different
issue.


- Steinar

Bradley J Lucier

unread,
Jan 31, 1997, 3:00:00 AM1/31/97
to

I don't know if I should comment on a thread that I will soon add to
my kill file, but . . .

I know and like Scheme. Perhaps, as a mathematician, Scheme's small
size and uniform notation (single namespace for variables and functions)
appeals to me. Someone once said that a mathematician tries to
forget as much as possible, in fact anything he can look up, so the small
amount of knowledge I need to remember to use Scheme in a reasonable way
is an advantage.

I don't know Common Lisp, but I've recently read two books on CL that
gave me two different impressions of the language. The first, Paradigms
of Artificial Intelligence, by Norvig, is a great book. It made me
realize that certain CL features, like multiple return values and various
iteration macros, are valuable and useful tools to write code. I've started
to use several of the techniques that Norvig lays out in his book. I
can write the iteration macros in Scheme, and I can box and unbox multiple
return values; it's just more of a pain. Still, for the code I've been
writing, the Scheme notation is shorter and clearer to me.
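
For example, CL's LOOP is the kind of iteration macro in question
(a minimal sketch; the data is made up):

  (loop for x in '(1 2 3 4)
        when (evenp x) collect (* x x))   ; => (4 16)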

I'm working through Graham's On Lisp, also a great book, on macros,
nondeterminacy, and other advanced topics in CL programming. My impression
of this book is that Graham starts several chapters saying ``Let's see how
the following would be done in Scheme, where it would be easier, and then
we'll write some CL macros to simulate some limited version of the Scheme
code.'' Consequently, this book gave me a greater appreciation for Scheme's
call-with-current-continuation and what could be done with it.

Brad Lucier luc...@math.purdue.edu

Erik Naggum

unread,
Jan 31, 1997, 3:00:00 AM1/31/97
to

* Michael Sperber

| Erik> where was the first "assertion without proof"? your own?
|
| The assertion was "a single namespace for functions and variables is
| bad for you", to quote you. Yours. No proof.

FYI, you used the word "again", which implies that there is more than one,
and that those other ones precede the instance for which you use "again".
I can find no mention of any such from you, hence my question. I'm
sorry to be pedantic about this, but since you revel in rhetorical devices
and accuse me of assertions without proof, it must have been conscious on
your part, although the above suggests that you didn't even read what you
wrote.

this is also very odd in light of the rest of your articles. you keep
making claims without proof all through them. only when you can't argue
against something do you need proof, it seems.

this is also odd because it is far from clear _how_ one would "prove" that
a single namespace for functions and variables is bad for anyone, even
though it is. or do I have an opponent who believes that anything which
cannot be restated in other terms so as to constitute a proof is of no use
to him? in sum, I conclude that your request for proof is only a rhetorical
device. next, you'll challenge any possible axioms I might employ if you
can't counter the arguments. et cetera ad nauseam.

| Admitted, but also possible to build yourself.

yeah, "full employment" seems to an argument in favor of Scheme.

| I'm not trying to argue that Scheme is "better" than CL (which Erik is
| trying to push at me), I'm just saying that people exist who prefer
| Scheme to CL.

look, you may engage in marketing and other lies as much as you want, but
please don't blame your opponents for it. the next quotation from you
really annoys me when you argue the way you do.

| CL's macro system is by far not as convenient and (worse) far more unsafe
| than Scheme macros.

in addition to being a blatant case of "argue that Scheme is `better' than
CL", this also seems like an assertion without proof. if you want to argue
against these things, please do so with respect to your own articles first.

| I've never seen anybody claim that. Quote someone, Erik, just once!

you're being insufferably silly. if call-with-current-continuation is the
be-all and end-all of control structures, and other control mechanisms are
not satisfying to Scheme programmers, then _obviously_ they are unable to
get their work done in other languages, right? when Scheme programmers
make the claim that they can't do without call-with-current-continuation,
such as you do, the only possible conclusion is that they need it for
things that other languages don't provide. this argument is repeated every
time somebody wants to compare Common Lisp to Scheme. ergo, one must
conclude that Scheme programmers are unable to get their work done in any
other language. since this looks pretty amazing compared to the fact that
people _do_ get their job done in any number of languages, I must conclude
that the argument for the necessity of call-with-current-continuation is
constructed ex post facto, and as such is specious at best.

| Common Lisp's idea of non-local control transfer is at least as
| complex as call/cc, but nevertheless not as powerful.

what does this mean if not that you _need_ this power, and that Common Lisp
(and every other language) would not be able to provide what you need?

| The formal semantics in the Scheme standard takes up 12 4-inch lines,
| none of which has more than 2 inches of stuff on it. Two of those lines
| are declaration lines, two are error messages, which leaves 8 operational
| lines. Those lines would easily fit on one or two lines on a full page.
| How long is the explanation of non-local jumps in CL?

this argument is so charmingly irrelevant. it suggests that people don't
program in Scheme, they only prove how elegant it would have been. but,
let me quote something you said just above: "I'm not trying to argue that
Scheme is `better' than CL", and contrast it to the above paragraph. do
you get what I get? is it a contradiction?

| True. Is this a bad thing for Scheme?

I started to work with Scheme some time ago. I got "the Unix feeling",
i.e., that of a system being sub-minimalistic in all interesting areas.
oh, sure, lots of things could just be downloaded from somewhere, but (1)
the same name was used in different implementations of unrelated features,
(2) everything worked well with standard Scheme, but little else, (3) that
which was "most useful" was not portable or combinable with other "most
useful" features.

moreover, if you start to use a language, and the best answer to your
request for some functionality is not "look it up in the standard", but "the
code is out there, Erik, just download it", or "it's trivial to build", I'm
hard pressed to accept an argument that the language is actually easy to
learn. in fact, I'm more convinced after this brief discussion than before
that if I want to get my job done in finite time, I should not use Scheme.

the curious thing is that the exact same argument (build it yourself if it
is not in the standard) is used of C, another sadly lacking language. I
wanted to get _away_ from C and the "you want a glass of beer? why,
there's sand on the beach to make the glass and all the ingredients you
need to make beer are out there, Erik, just go and collect them"-type of
"do it yourself"-ism.

I can also understand why cave dwellers don't like cities: they're full of
noise and pollution and so many things that are just inherited from the
past without redesigning them to fit a pure, formal model. but, somehow, I
like cities. they make it possible for me to make a living working from
home in my comfortable living-room-cum-office with only a purring cat to
distract me, instead of having to kill the animal whose remains are
sizzling in the pan and go pick the rice that's boiling or the herbs and
spices I think are needed to make the sauce that I instead make from water,
milk, and prefabricated, powdered sauce.

I close with two quotes from Michael A. Padlipsky's Elements of Networking
Style[1], Appendix 3, "the self-framed slogans suitable for mounting":

"Just because you think you need steel-belted radial tires and the store
only has polyglas-belted ones at present is still no excuse for going off
in a corner and reinventing the travois."

"The `it's _my_ ball' syndrome would be more understandable if home-made
sandboxes really were superior to store-bought sandboxes."

#\Erik

-------
[1] ISBN 0-13-268111-0

William Paul Vrotney

unread,
Jan 31, 1997, 3:00:00 AM1/31/97
to

In article <y9liv4e...@modas.informatik.uni-tuebingen.de> spe...@informatik.uni-tuebingen.de (Michael Sperber [Mr. Preprocessor]) writes:

>
> Erik> you should have asked a question. the above is as untrue as you can get.
> Erik> however, functions aren't normally values of variables. this is seldom as
> Erik> useful as Schemers think it is.


>
> Erik, you should have asked a question. It is immensely useful all

> the time. I'd be happy to send to oodles of source code where having
> to use funcall would greatly screw up the code. Admittedly, code that
> CL programmers would
>

Instead of oodles, could you just post one good example in Scheme? I'm not
doubting, I would just like to see other people's view of how funcall is
not as good. Thanks.

--

William P. Vrotney - vro...@netcom.com

Cyber Surfer

unread,
Jan 31, 1997, 3:00:00 AM1/31/97
to

In article <30636876...@naggum.no> er...@naggum.no "Erik Naggum" writes:

> * Matthew R. Wette
> | On a SPARCStation an Allegro CL uses *11 meg* to print "hello, world".
> | SCM uses *1 meg* to print "hello, world".
>
> this is very odd. ACL 4.3 on my SPARCstation has a swap footprint close to
> 5M. CMUCL has a swap footprint of about 1M. scsh uses 9M, and MIT Scheme
> eats 12M.

A more fair comparison would be with CLISP. This is, as far as I
know, a complete CL system, and yet it can use as little as 1.5 MB.
That's not its working set, either. I used to run it very comfortably
in 8 MB of RAM, using _no_ virtual memory at all, but that's probably
6 MB more than necessary. Since I've not yet found a DOS machine
with only 2 MB of RAM, I've not been able to test it myself.



> | CL requires more $$ for ram and disk.
>
> some Scheme _implementations_ require far more RAM and disk than some
> Common Lisp _implementations_, and vice versa, I'm sure.

This "bean counting" proves nothing. It's always the same with
bean counting. The first Lisp that I can remember reading about
ran in 16K of RAM, on a machine with a Z80 clocked at 1.76 Mhz.
I have a C++ compiler and framework that may require more than
80 MB of RAM (my current number of memory beans), but I can also
use it without that framework, or use a totaly different C++
compiler. What do these beans tell us? Nothing. If they did mean
anything, then we might be programming in assembly language,
instead of a high level language.

Leave this kind of squabbling to the C and Pascal programmers,
and instead let us demonstrate the value of our enlightenment.

Cyber Surfer

unread,
Jan 31, 1997, 3:00:00 AM1/31/97
to

In article <joswig-ya0231800...@news.lavielle.com>
jos...@lavielle.com "Rainer Joswig" writes:

> > I agree that pattern matching can be added to CL, and other
> > Lisps too, but I'm not sure why you say that "for larger
> > software libraries pattern based invocation is not very
> > maintainable." The issue of code maintainence isn't effected
> > by pattern matching, as far as I'm aware.
>
> Not? Right now most of these functional languages
> have tuples and lists, but not records. Write
> large software with lots of datatypes (a windows system, ...)
> using pattern matching. It is possible, but now try
> to change the underlying data representation.

Have you read anything about functional languages recently?
You're demonstrating the kind of ignorance C++ programmers
show when discussing Lisp. User-defined aggregate data types have
been available in functional languages since the early 70s,
and perhaps even earlier.

> What effects does this have to external users if
> the implementation changes, etc. Could it
> be difficult to understand software if you
> have lots of patterns and you have to look very
> closely to determine which function will be invoked
> when?

As Erik might say, Bzzzt! Wrong. I'll let you figure out
how to find the appropriate FAQ to read...



> > It looks like you're defending CL by attacking another style
> > of programming. I hope this not the case, as I don't believe
> > that it's necessary. No language is so good that it justifies
> > such an attack, and I'm know that you can find better ways
> > of defending CL.
>
> This is your interpretation. I don't like it.

Fair enough. I don't like your interpretation of the abilities
of functional languages, which is not only wrong, but just a little
bizarre, considering how much FP and Lisp have in common.



> But some people have been
> struggling with similar approaches years ago. Its
> a bit like what people experienced with rule based
> languages in real world software (like OPS5-based
> configuration systems, or even the infamous sendmail).

Huh? These have nothing to do with FP. Try reading some
up to date information for a change.



> > Haskell is _also_ easy to understand. It's just a little
> > different to CL (and C++), but I find I can apply a great
> > deal of my experience in Lisp to Haskell. Like Lisp, it
> > helps you learn it using a good book, and know people who
> > can answer your questions.
>
> Until someone has understood monadic IO, he may already
> have successfully written some 10000 lines of
> Common Lisp stream-based IO code. Also
> I might add the Haskell type system is not *that*
> easy to understand.

I agree that it's not so easy, but this is partly due to
it being different to what we, as Lisp programmers, expect.
This doesn't mean that it doesn't work, or that it isn't
a powerful tool. Haskell is also a very young tool, while Lisp
in general has been around long enough for some excellent
tutorials to be written, and for a strong culture to evolve.

> I'm not saying anything bad about Haskell, it's just that


> even with FP knowledge it is not easy to master and some
> books (there is a nice German one) about Haskell do look
> like white noise to the uninitiated.

I agree that the books available may leave a lot to be
desired. (See above.) It's way too early to use this in
a fair comparison. How many Lisp books were available 30
years ago?

This is a dumb way to judge languages. After all, look at
the vast shelves of books for C++. What does that prove?
Absolutely nothing, but that there's a lot of interest in
that particular language.



> > Mastering any language worth learning takes time. After
> > more than 10 years, I'm still learning CL.
>
> After more than 10 years, I'm still writing lots
> of code with Common Lisp.

So am I. This proves nothing. We don't measure the quality
of a language only by KLOCs written per year.



> > My favourite languages are still CL and Scheme, but I'm
> > beginning a love affair with Haskell. I don't feel that
> > I'm cheating any of them, as they're only tools. Damn fine
> > tools, just the same.
>
> You still have failed to ground your "love affair" on
> rationalism (as far as possible).

Wander over to comp.lang.functional, and a number of us may
explain some of it to you. However, this is not the right place.
I'd like to avoid language politics - life is too damn short.

Just consider this: I'm writing CGI code in Haskell. The fact
that I can do this should tell you a great deal. Everything else
is hot air. If you're not interested in the truth about FP, then
you're only doing what so many C++ programmers like to do to us.
I don't wish to have any part in that!

Alaric B. Williams

unread,
Feb 1, 1997, 3:00:00 AM2/1/97
to

On 31 Jan 1997 08:40:29 +0000, Erik Naggum <er...@naggum.no> wrote:

>| >>>>> "Erik" == Erik Naggum <er...@naggum.no> writes:
>|

>| Erik> in Scheme, I must implement my own `member' with `string-equal'
>| Erik> as the predicate.
>|
>| In Scheme, equal? works on strings. No need to.
>
>`equal' is case sensitive. `string-equal' is not. `equal?' in Scheme is
>also case sensitive. if this is not sufficient, choose a different
>function, and get the point.

So there should be a 'first-that' or something that takes a lambda
parameter. An oversight in the standard procedure library isn't much
of a problem IMHO... things can be added to Scheme, but nothing can be
removed from CL...

>| Erik> Scheme is a relatively low-level Lisp. you _can_ build tall
>| Erik> abstractions, but you run into many serious problems in scaling,
>|
>| Such as?
>
>lack of a standard package system, for starters.

There's at least one good package system I've seen. Can't remember
where, but it's there if anyone wants it :-)

>call-with-current-continuation is unique to Scheme. somehow, people can
>actually get work done in other languages. listening to Schemers, I wonder
>how this is at all possible without call-with-current-continuation. could
>it be that Scheme has removed all the _other_ mechanisms and replaced them
>with a single very complex idea that is then used to reimplement them all?

>in Scheme, you have to implement a lot of minor stuff. this creates one
>Scheme environment per user or group of users. such is indeed the case.
>in Common Lisp, it's there.

This just needs the slow extension of the standard functions. IMHO
there should be a standard set of extension packages, like
dictionaries in FORTH. I.e., a package that standardises an exception
system based on call/cc, etc.
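
(For reference, the built-in Common Lisp machinery that such a package
would be standardising looks roughly like this - the condition and
function names here are invented purely for the example:

  (define-condition negative-input (error)
    ((value :initarg :value :reader negative-input-value)))

  (defun checked-sqrt (x)
    (if (minusp x)
        (error 'negative-input :value x)
        (sqrt x)))

  (handler-case (checked-sqrt -4)
    (negative-input (c) (list :rejected (negative-input-value c))))
  => (:REJECTED -4)

A call/cc-based Scheme package would be standardising the same sort of
non-local exit and handler dispatch.)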

The objective of Scheme is to find the basic core that is needed to
implement everything else - or as much else as possible; who knows
what constructs CS research will devise in the future? Clearly, to
make full use of the fruits of that effort, the libraries that extend
that basic functionality are needed - but if they are all defined in
terms of that basic core, implementing Scheme is easy and
/manipulating/ Scheme is easy. Scheme source can be thought of as the
combination of a small set of primitives - well suited for axiomatic
transformations and all that fun stuff...

ABW
--

"Simply drag your mother in law's cellphone number from the
Address Book to the Laser Satellite icon, and the Targeting
Wizard will locate her. Then follow the onscreen prompts for
gigawattage and dispersion pattern..."

(Windows for Early Warning and Defence User's manual P385)

Alaric B. Williams Internet : ala...@abwillms.demon.co.uk
<A HREF="http://www.abwillms.demon.co.uk/">Hello :-)</A>

Erik Naggum

unread,
Feb 1, 1997, 3:00:00 AM2/1/97
to

* Alaric B. Williams

| So there should be a 'first-that' or something that takes a lambda
| parameter. An oversight in the standard procedure library isn't much of
| a problem IMHO... things can be added to Scheme, but nothing can be
| removed from CL...

sigh. who can choose a language with such proponents, and such arguments
in its favor? it's legitimate for any Scheme user to point out flaws in
Common Lisp like they could win an olympic medal for it, but if you point
out a design problem with Scheme the same people will readily pardon any
and all flaws in Scheme as "oversights" or even worse trivializations.

it is impossible to argue with people who have detached their emotional
involvement in a language (which any language worth using will inspire in
its users) from rational appreciation of its role, relevance, and value.

Scheme is the only language I have ever seen where people will actually
argue in _favor_ of its flaws, explicitly or implicitly by some stupid
non-argument about some other language. once upon a time, I used to think
that a language (SGML) had such wondrous potential that I would ignore all
present flaws and practical problems. I gradually came to understand that
that potential would never be realized, precisely because nobody cared to
fix the present flaws and practical problems -- those who saw the potential
ignored them and talked about how SGML changed the idea of information and
all that fine management-level nonsense, and those who had to deal with
them just found ways to live with them, even arguing against changes!

take a look at Common Lisp's `member' some day. the `first-that' that you
seem to think of is called `member-if' in Common Lisp. it is different
from a `member' with a :test argument. also note the :key argument.
(and _please_ note that :test-not and `member-if-not' are deprecated.)
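
a quick illustration, with made-up data:

  (member "FOO" '("bar" "foo" "baz") :test #'string-equal)
  => ("foo" "baz")

  (member-if #'oddp '(2 4 5 6))
  => (5 6)

  (member 3 '((1 a) (2 b) (3 c)) :key #'first)
  => ((3 C))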

Simon Brooke

unread,
Feb 1, 1997, 3:00:00 AM2/1/97
to

In article <ey3915b...@staffa.aiai.ed.ac.uk>,

Tim Bradshaw <t...@aiai.ed.ac.uk> writes:
> * Reini Urban wrote:
>> On Mon, 20 Jan 1997 11:19:30 +0100, jos...@lavielle.com (Rainer Joswig) wrote:
>>> Both Common Lisp and Scheme basics are relatively easy to learn.
>
>> In my eyes Common Lisp is quite hard to learn
>> (compared to standard lisp or scheme)
>
> If it's possible to ask this question without provoking endless futile
> discussion, could you say why? I've taught courses on Common Lisp,
> and it would be interesting to know what people find hard about basic CL,
> especially compared to scheme.

Hi Tim!

OK, lets start:

(i) LISP2: Why is a function different from a variable? Why is there more
than one name space? (see e.g. Gabriel and Pitman, _Technical Issues
of Separation in Function Cells and Value Cells_, in Lisp and Symbolic
Computation 1, 1 1988).

(ii) Weird lambda-list syntax. I *still* have trouble with this. &KEY,
&OPTIONAL, &REST... Both Scheme and Standard LISP are (to my way
of thinking) much more intuitive in this regard. Having things in
lambda-lists which don't get bound, but which affect the way other
things do get bound, seems ... I don't know. Prejudice, I suppose;
I didn't learn it that way. But I don't like it!

Those are the two major criticisms I would still make about Common
LISP, ten years down the track. I also dislike the way that the reader
(according to the standard) ignores comments, so that comments are not
(according to the standard) available to an in-core editor; and the
way the reader (according to the standard) ignores case. But these are
details.

--
si...@intelligent.co.uk (Simon Brooke) http://www.intelligent.co.uk/~simon

If you really want a Tory government for ever, keep on voting
Labour. If you want a Labour government soon, vote SNP just once.

Alaric B. Williams

unread,
Feb 1, 1997, 3:00:00 AM2/1/97
to

On 01 Feb 1997 02:20:53 +0000, Erik Naggum <er...@naggum.no> wrote:

>* Alaric B. Williams
>| So there should be a 'first-that' or something that takes a lambda
>| parameter. An oversight in the standard procedure library isn't much of
>| a problem IMHO... things can be added to Scheme, but nothing can be
>| removed from CL...

>sigh. who can choose a language with such proponents, and such arguments
>in its favor? it's legitimate for any Scheme user to point out flaws in
>Common Lisp like they could win an olympic medal for it,

If only ;-)

> but if you point
>out a design problem with Scheme the same people will readily pardon any
>and all flaws in Scheme as "oversights" or even worse trivializations.

The thing is, I see Scheme as a "work in progress". People invent
stuff, toy with it, get fed up with it, reinvent it in a better way -
then it makes it into some standard.

>it is impossible to argue with people who have detached their emotional
>involvement in a language (which any language worth using will inspire in
>its users) from rational appreciation of its role, relevance, and value.

Yup!

>Scheme is the only language I have ever seen where people will actually
>argue in _favor_ of its flaws, explicitly or implicitly by some stupid
>non-argument about some other language.

Try assembly language :-)

> once upon a time, I used to think
>that a language (SGML) had such wondrous potential that I would ignore all
>present flaws and practical problems. I gradually came to understand that
>that potential would never be realized, precisely because nobody cared to
>fix the present flaws and practical problems -- those who saw the potential
>ignored them and talked about how SGML changed the idea of information and
>all that fine management-level nonsense, and those who had to deal with
>them just found ways to live with them, even arguing against changes!

It ain't a perfect world - too many bigots and idiots :-(

>take a look at Common Lisp's `member' some day. the `first-that' that you
>seem to think of is called `member-if' in Common Lisp. it is different
>from a `member' with a :test argument. also note the :key argument.
>(and _please_ note that :test-not and `member-if-not' are deprecated.)

Exactly. Now, CL has the background and development to make it a
powerful language. The Scheme philosophy is to start again from the
bottom and build from cleaner foundations to - eventually - make a
nicer end product... the 'mem*' thing will probably be an unremovable
flaw in the standard forevermore, but everyone makes mistakes and
ends up deprecating stuff when the new improved "first-that" comes
out!

Ray S. Dillinger

unread,
Feb 1, 1997, 3:00:00 AM2/1/97
to

Seth Tisue wrote:

> Earl & Daniella Harris <esha...@widomaker.com> wrote:

> >Regarding the evaluation model, CL's doesn't treat
> >functions as first class objects. I can't pass functions
> >around like other values (numbers, lists, etc).
>

> This is totally incorrect. Functions are 100% first class objects in
> Common Lisp, same as in Scheme.

CL's insistence on a separate namespace obscures the issue. It
requires various limitations and constraints to be imposed which
are not needed in Scheme.

In scheme, a function is simply a value, like a number or a
string. In CL, a function is in a separate class -- it has to
be funcalled for example if you're evaluating to it.

((function-that-returns-another-function) argument1 argument2)

is a legal expression in Scheme, because it follows the convention
that *any* value in an expression may be replaced by an expression
which evaluates to it. In CL, you have to write something like

(define-function ?foo (function-that-returns-another-function))
(funcall ?foo argument1 argument2)

I'm just guessing about the use of the define-function syntax; I
must admit it's darn weird as far as I can tell to have different
defines for everything when it's only one operation.

But the point is that functions are treated differently than other
values in CL. You can't insert a subexpression evaluating to your
function in place of the function the way you can with any other
value. You can't use your standard assignment statement to assign
a value to a variable if that value happens to be of type function.

And numerous other warts and inconsistencies.

I value scheme more for what it does *NOT* have -- in the
form of exceptions to its clean, simple rules -- than for
what it does. CL is famous for its libraries, and rightly
so. But Scheme has the cleanest, simplest, and most
consistent rules of operation and evaluation of any language
I've ever come across, and that enables me to write correct
programs much more easily; I always know exactly what an
operation means, because it always means exactly the same
thing, and doesn't have weird exceptions like "you can
always substitute an expression evaluating to a value for
the value itself in any expression -- UNLESS it's a
function..." and "the value of ?foo is 27 -- UNLESS it's
the function ?foo instead of the variable ?foo -- and so
on.

I don't even use the "syntactic sugar" forms in scheme for
defining functions, etc; I *want* to think of lambda as a
function that returns a value, and define as a function
that binds a value to a variable name -- that's what they
do, that's what I want to do, so I use them. It's the same
operation, and therefore the same syntax, as assigning the
value returned from any other function to any variable.
Syntactic sugar, to me, just obscures the issue and introduces
another unnecessary exception to the rules to remember. And
if I don't have to memorize unnecessary junk, I'd rather not.

That, I suppose, is why I love scheme -- no other language
gives me such returns of precision and capability for such a
trifling effort of memorization and practice.

Bear
---
Inventions have long since reached their limit, and I see no
hope for further development.
-- Julius Sextus Frontinus
(Highly regarded engineer in Rome, 1st century A.D.)

Erik Naggum

unread,
Feb 1, 1997, 3:00:00 AM2/1/97
to

* Simon Brooke

| I also dislike the way that the reader (according to the standard)
| ignores comments, so that comments are not (according to the standard)
| available to an in-core editor; and the way the reader (according to the
| standard) ignores case. But these are details.

which in-core editor (according to the standard) are you talking about?

Robert Sanders

unread,
Feb 1, 1997, 3:00:00 AM2/1/97
to

"Ray S. Dillinger" <be...@sonic.net> writes:

> ((function-that-returns-another-function) argument1 argument2)
>
> is a legal expression in Scheme, because it follows the convention
> that *any* value in an expression may be replaced by an expression
> which evaluates to it. In CL, you have to write something like
>
> (define-function ?foo (function-that-returns-another-function))
> (funcall ?foo argument1 argument2)

That's not necessary.

(defun make-adder (x)
  #'(lambda (y) (+ x y)))
=> MAKE-ADDER

(funcall (make-adder 3) 4)
=> 7

As was stated, functions are first-class objects. Funcall takes a
function argument.

> I don't even use the "syntactic sugar" forms in scheme for
> defining functions, etc; I *want* to think of lambda as a
> function that returns a value, and define as a function
> that binds a value to a variable name -- that's what they
> do, that's what I want to do, so I use them. It's the same

Neither define nor lambda is a function.

Also, you can think of the lack of an explicit "funcall" in Scheme as
a kind of syntactic sugar :-)

regards,
-- Robert

Erik Naggum

unread,
Feb 1, 1997, 3:00:00 AM2/1/97
to

I find it symptomatic of Schemers' attacks on Common Lisp that they are
ignorant, arrogant, and so devoid of correctness and precision as to merit
no other badge than "prejudice".

* Ray S. Dillinger


| In scheme, a function is simply a value, like a number or a string. In
| CL, a function is in a separate class -- it has to be funcalled for
| example if you're evaluating to it.

Ray, you don't know what you're talking about. it also looks as if you
don't know what a namespace is. a namespace is simply a mechanism to refer
to objects by names taken from different sets of possible names. whether
an object is found by a name in one namespace or in another makes no
difference to the object. you seem to believe it does.

let me paraphrase your complaint with an example of a similar complaint
about Scheme:

in CL, a function is simply a value, like a number or string. to use
any value as a function, apply `funcall' to it. in Scheme, a value has
to be placed first in a form if you wish to call it as a function.
this is unlike the use of other variables, which do not need this
special position in a form when their values are used.

I guess you will automatically think the above is a specious argument, and
will ignore it, without having taken the care to examine the fact that it
is entirely valid, and that its main function is to show that your position
is not one of understanding.

occasionally, we hear that syntax is unimportant, that what matters is the
semantics of languages, indeed that arguments about syntax are _invalid_
because all languages are Turing equivalent. however, when some Scheme
users need to debunk Common Lisp, it is tragically common for them to
stumble in the syntax, and make such utterly false accusations as the above
complaints about namespaces.

let's look at one of your ignorant, invalid examples:

| ((function-that-returns-another-function) argument1 argument2)

vs

| (define-function ?foo (function-that-returns-another-function))
| (funcall ?foo argument1 argument2)

I find it interesting that most Common Lisp users who criticize Scheme do
so with syntactically correct Scheme programs or real examples, whereas
Scheme users who criticize Common Lisp generally don't have a clue. here's
how it would be done by somebody who had actually _seen_ Common Lisp:

(funcall (function-that-returns-another-function) argument1 argument2)

| I'm just guessing about the use of the define-function syntax; I must
| admit it's darn weird as far as I can tell to have different defines for
| everything when it's only one operation.

maybe if you didn't guess as much, you wouldn't make a fool of yourself.
and if you had paid attention, you would have known that it isn't only one
"operation" that is performed by differently-named functions, unless you
define "operation" so generally as to make all computers perform the same
operation, namely heat their immediate surroundings.

I wonder why so many people who love Scheme tend to be so unbelievably
arrogant in their belief that Scheme is superior that they don't even
bother to _learn_ anything about the languages they compare to. this
really makes me wonder what made them love Scheme in the first place. it
surely cannot be intelligent, rational, or informed balancing of features.
it cannot be based on a desire to study and learn languages. instead, Ray
explains to us, it's based on "a trifling effort of memorization and
practice", if I understood what he meant by that, alien as it is to me.

| But the point is that functions are treated differently than other values
| in CL.

that may be your point, but it is a _fact_ that functions are treated just
like other values in Common Lisp. the difference between Common Lisp and
Scheme is that Scheme evaluates the first element of a form to find which
function to call, whereas Common Lisp regards the first element of a form
as a constant symbol, and looks up the function definition of that symbol.

I'm rather surprised by the many Scheme users who fail to understand such a
simple _fact_, instead preferring to make various bogus "points". every
such "point" reduces the credibility of the entire Scheme community.

| You can't insert a subexpression evaluating to your function in place of
| the function the way you can with any other value.

yes, you can. the problem is that you're very confused, mostly because you
treat Common Lisp as if it were Scheme, and then complain when it isn't.
Common Lisp does not _evaluate_ symbols to a function. period.

| You can't use your standard assignment statement to assign a value to a
| variable if that value happens to be of type function.

Ray, you don't know what you're talking about.

| And numerous other warts and inconsistencies.

I'm sorry to say that you seem to produce the warts you wish to attack
straight out of thin air, and that you wouldn't find half as many if you
had bothered to study Common Lisp before your ignorant attacks on it.

| But Scheme has the cleanest, simplest, and most consistent rules of
| operation and evaluation of any language I've ever come across, and that
| enables me to write correct programs much more easily; I always know
| exactly what an operation means, because it always means exactly the same
| thing, and doesn't have weird exceptions like "you can always substitute
| an expression evaluating to a value for the value itself in any
| expression -- UNLESS it's a function..." and "the value of ?foo is 27 --
| UNLESS it's the function ?foo instead of the variable ?foo -- and so on.

I find this entertaining. you don't know diddlysquat about Common Lisp,
yet speak with a stunning degree of certainty in which you prove beyond
_any_ reasonable doubt that you don't even care to look things up, yet you
know _exactly_ what Scheme does. yeah, right. I think the latter is pure
hyperbole -- that the degree to which you "know exactly" what Scheme does
is the same degree to which you know vaguely how Common Lisp works.

perhaps the difference between Scheme and Common Lisp programmers is that
the Common Lisp programmers _know_ they need to look things up, whereas the
Scheme programmers always _think_ they never need to. I guess that's also
why Common Lisp has documentation strings and Scheme doesn't. the lack of
documentation strings has always bothered me in Scheme. (yes, it's been
rehashed a few times. I know all the arguments.)

| That, I suppose, is why I love scheme -- no other language gives me such
| returns of precision and capability for such a trifling effort of
| memorization and practice.

this is _very_ amusing. your utter lack of precision in your description
of Common Lisp really drives a stake through the heart of your arguments.
but I guess that's the _real_ return you get from a trifling effort of
memorization and practice.

why do so few Scheme users seem to care enough to be accurate? could it be
influence from the language? or could it be that people who _don't_ care
flock to Scheme? I'm led to wonder by the many Scheme proponents who like
to attack strawmen and problems only of their own imagination.

Lyman S. Taylor

unread,
Feb 1, 1997, 3:00:00 AM2/1/97
to

I had this at the bottom of my original draft of this. I thought I put
it at the top....

For programming "in the small" Scheme is cool. It doesn't scale up well,
though. Right tool for the right job...

Although the Common Lisp approach is perhaps a tad more "complicated",
having a separate namespace for function names isn't a totally bad thing.


In article <32F3A4...@sonic.net>, Ray S. Dillinger <be...@sonic.net> wrote:
>Seth Tisue wrote:
>
>> Earl & Daniella Harris <esha...@widomaker.com> wrote:
>> >Regarding the evaluation model, CL's doesn't treat
>> >functions as first class objects. I can't pass functions
>> >around like other values (numbers, lists, etc).
>>
>> This is totally incorrect. Functions are 100% first class objects in
>> Common Lisp, same as in Scheme.
>
>CL's insistence on a separate namespace obscures the issue. It
>requires various limitations and constraints to be imposed which
>are not needed in Scheme.
>

>In scheme, a function is simply a value, like a number or a
>string. In CL, a function is in a separate class -- it has to
>be funcalled for example if you're evaluating to it.

No, it is not a separate "class"... it is a value in a separate namespace.
What is different is the evaluation rule, not the "classification".
In Common Lisp, when looking for the function "value" of an s-expression
to be evaluated, you look in the function namespace for that value.

[ If you wished to stick non-function "values" into the function
namespace you could do the following:

(setf (symbol-function 'foo ) 3 )

However, it is not likely that setf will allow such an operation
since it is by no means "productive". More on how this makes
things simpler later...]


>((function-that-returns-another-function) argument1 argument2)
>
>is a legal expression in Scheme, because it follows the convention
>that *any* value in an expression may be replaced by an expression
>which evaluates to it.

You might then be perplexed by the following, which works in CL...

( (lambda ( x y ) (+ x y )) argument1 argument2 )

[ Other than in "function name position" of a function call if you
just put a #' in front of everywhere you see a lambda in Scheme
I think that works out ok. I tend to think of it as lambda placing
values into the function namespace by default. I think that is to
make the above expression "consistent". A call to LAMBDA is the
only "function call" allowed in this position. This does mean that
in other placements you'll need to preface the lambda expression
with #' to get the value back. ]

>
>(define-function ?foo (function-that-returns-another-function))
>(funcall ?foo argument1 argument2)

...

Perhaps you mean:

(setf foo (function-that-returns-another-function))
(funcall foo argument1 argument2 )

Or you could just drop the setting of the variable all together:

(funcall (function-that-returns-another-function) argument1 argument2 )

Which compared with the Scheme version involves only inserting a FUNCALL
just after the left paren... "Radical" huh? :-)


>I'm just guessing about the use of the define-function syntax; I
>must admit it's darn weird as far as I can tell to have different
>defines for everything when it's only one operation.

...
>value. You can't use your standard assignment statement to assign

>a value to a variable if that value happens to be of type function.

Buzz....

(setf baz #'(lambda (x ) x ))

(functionp baz ) ==> T

If I choose to "invoke" the function then I must follow CL evaluation
rules. By which:

( baz .... )

doesn't work because BAZ doesn't have a binding in the function
namespace... which is where I'm supposed to look for such a value.

If you really only wanted one definer you could use the "all powerful"
SETF everywhere. ;-)

(setf (symbol-function 'foo ) #'(lambda ( ... ) .... ) )

instead of

(defun foo ( ...) .... )


>And numerous other warts and inconsistencies.
>

...


>the value itself in any expression -- UNLESS it's a
>function..." and "the value of ?foo is 27 -- UNLESS it's
>the function ?foo instead of the variable ?foo -- and so
>on.

This is the wrong way to look at it. ?foo has multiple properties,
one of which is its "value" and one of which is its "function value" (i.e.,
its binding in the function namespace).

(setf ?foo '( 23 24 ) )

[ or if you prefer (setf (symbol-value '?foo) '( 23 24) ) ]

(setf (symbol-function '?foo) #'car )

?foo ==> ( 23 24)
(?foo '( a b )) ==> A
(?foo ?foo) ==> 23

The "complication" is recognizing which position in the function call
expression you are in to choose which "namespace" to look into for the value.
To some extent, in scheme you have to worry about position in the
expression too. After all

( (+ 1 3 ) .... )

isn't going to work. Any old expression cannot be in that first position.
I don't see the increase in "cognitive" workload from having to think
about both value and namespace as all that great. Especially when the semantics
of the setters are set up so that only function values can be found in the
function namespace.

Way too many languages use multiple namespaces to positive effect
to discount the technique's usefulness [ e.g. record element names ( or struct member
names in 'C'-land) are defined in the namespace of the record itself.
Imagine how much of a pain it would be if all these names had to be unique.]
When programming in a team of 3-5 people I would rather not have to reach
consensus as to what ALL of my function and module "globals" names have to
be in order not to have a name collision. I'd rather specify an interface
and say "this" is what I'm going to "provide" and "this" is what I need.
[ I don't find the practice of putting a unique prefix on all of my
module vars and functions very appealing either... ]


--

Lyman S. Taylor "Computers are too reliable to replace
(ly...@cc.gatech.edu) humans effectively."
Commander Nathan Spring, "Starcops"

Rainer Joswig

unread,
Feb 1, 1997, 3:00:00 AM2/1/97
to

In article <854739...@wildcard.demon.co.uk>,
cyber_...@wildcard.demon.co.uk wrote:

> when discussing Lisp. User defined aggregate data types have
> been available in functional languages since the early 70s,
> and perhaps even earlier.

Yeah, and Haskell added them in 1996.
I'll gladly forward you a mail, posted by Mark P Jones
on the Haskell mailing list in 1995, which summarized
some of the design problems/alternatives.

> > But some people have been
> > struggling with similar approaches years ago. It's
> > a bit like what people experienced with rule based
> > languages in real world software (like OPS5-based
> > configuration systems, or even the infamous sendmail).
>
> Huh? These have nothing to do with FP.

I was referring to pattern matching which is
common in both functional languages and rule based
languages. What do you think are common problems
with using patterns - can you image some? How
would you avoid them?

> Just consider this: I'm writing CGI code in Haskell. The fact

You could use Fortran, too. If you could not write CGIs
in Haskell, then I would worry.

Rainer Joswig

[To improve the style in this newsgroup, I have deleted and
omitted all personal attacks. Yeah, looks shorter now. :-) ]

--
http://www.lavielle.com/~joswig/

Cyber Surfer

unread,
Feb 1, 1997, 3:00:00 AM2/1/97
to

In article <30637524...@naggum.no> er...@naggum.no "Erik Naggum" writes:

> Scheme is the only language I have ever seen where people will actually
> argue in _favor_ of its flaws, explicitly or implicitly by some stupid

> non-argument about some other language. once upon a time, I used to think

Well, I've found that there's always someone who will argue in
favour of the flaws of a language (or OS, or editor, etc etc).
The saying, "one man's meat", refers to this tendency. <sigh>

> that a language (SGML) had such wondrous potential that I would ignore all
> present flaws and practical problems. I gradually came to understand that
> that potential would never be realized, precisely because nobody cared to
> fix the present flaws and practical problems -- those who saw the potential
> ignored them and talked about how SGML changed the idea of information and
> all that fine management-level nonsense, and those who had to deal with
> them just found ways to live with them, even arguing against changes!

Yes, I know what you mean. I first noticed this with CPUs like
the Z80 and 6502. Perhaps that was because at the time the Z80
vs 6502 question was felt to be an important one. Today, few
people using computers even know what a CPU is, nor should they.
Yet that didn't stop us from arguing over such choices!

If we can worry about such trivial issues, what hope is there
for tools that are likely to survive a little longer, like SGML?
(I'm using the word "survive" in a way that requires little
qualifier: computers with 8bit CPUs are no longer a big issue.
In fact the same is true for 16bit CPUs, and who knows, maybe
the same is true for the 32bit CPU issue. The hardware itself
is another matter.)

I'm not suggesting that SGML is a fad, of course. On the other
hand, how many people using HTML know of its origins? Alas,
too few. At the very least, it could be of historical interest.

It's useful to reflect on how far we've come, and how much
further we've yet to travel. It's a very personal journey,
which may be why people disagree so much. We can't follow in
each other's footsteps, but must instead find our own path to
enlightenment. It may, however, be fun to compare notes!

Cyber Surfer

unread,
Feb 1, 1997, 3:00:00 AM2/1/97
to

In article <32F3A4...@sonic.net> be...@sonic.net "Ray S. Dillinger" writes:

> In scheme, a function is simply a value, like a number or a
> string. In CL, a function is in a separate class -- it has to
> be funcalled for example if you're evaluating to it.

Well, the _names_ are treated differently. Function objects
are another matter. I think that this is the distinction that
you're making - please correct me if I'm mistaken.

Cyber Surfer

unread,
Feb 2, 1997, 3:00:00 AM2/2/97
to

In article <joswig-ya0231800...@news.lavielle.com>
jos...@lavielle.com "Rainer Joswig" writes:

> > when discussing Lisp. User defined aggregate data types have
> > been available in functional languages since the early 70s,
> > and perhaps even earlier.
>
> Yeah, and Haskell has added them in 1996.
> I'll gladly forward you a mail which summarized
> some of the design problems/alternatives, which was posted by
> Mark P Jones on the Haskell mailing list in 1995.

FP does not begin and end with Haskell. Did you miss my
comments about ML? (Note: not just SML - that's more recent.)

> I was referring to pattern matching which is
> common in both functional languages and rule based
> languages. What do you think are common problems
> with using patterns - can you image some? How
> would you avoid them?

Ah, more word games. There's more to pattern matching.

>
> > Just consider this: I'm writing CGI code in Haskell. The fact
>
> You could use Fortran, too. If you could not write CGIs
> in Haskell, then I would worry.

Exactly. You _can_ do it - end of story. Everything else is
childish politics, and I'm sure you can do better than that.

> [To improve the style in this newsgroup, I have deleted and
> omitted all personal attacks. Yeah, looks shorter now. :-) ]

What personal attacks? Are lies about languages permitted,
but a reference to the lack of truth _not_ permitted?

DOUBLE STANDARD ALERT!

William Clodius

unread,
Feb 2, 1997, 3:00:00 AM2/2/97
to

Generalizing and paraphrasing Erik's comment:

I find it symptomatic of attacks on almost any language by almost any
advocate of another language that they are ignorant, arrogant, and so
devoid of correctness and precision as to merit no other badge than
"prejudice".

--

William B. Clodius Phone: (505)-665-9370
Los Alamos Nat. Lab., NIS-2 FAX: (505)-667-3815
PO Box 1663, MS-C323 Group office: (505)-667-5776
Los Alamos, NM 87545 Email: wclo...@lanl.gov

Simon Brooke

unread,
Feb 2, 1997, 3:00:00 AM2/2/97
to

In article <5crort$hj4$1...@news1.sympatico.ca>,

This is a *highly* contentious issue, even a religious issue. Basically
people who believe in a single name-space (as I do) would argue for
its orthogonality and cleanliness; it makes treating code as data (and
data as code) far more straightforward, and let's face it, that's a very
LISPy programming style. People who believe in multiple name-spaces
will point out (correctly) that in a single name-space finding new
names for things can get awkward (but that's a problem largely solved
by the package system) and will claim that treating data as code (and
vice-versa) is a dangerous thing to do, and ought to be marked by
special rituals so you remember when you're doing it.

Simon

I'm fed up with Life 1.0. I never liked it much and now it's getting
me down. I think I'll upgrade to MSLife 97 -- you know, the one that
comes in a flash new box and within weeks you're crawling with bugs.

Simon Brooke

unread,
Feb 3, 1997, 3:00:00 AM2/3/97
to

In article <30638175...@naggum.no>,

Erik Naggum <er...@naggum.no> writes:
> * Simon Brooke
>| I also dislike the way that the reader (according to the standard)
>| ignores comments, so that comments are not (according to the standard)
>| available to an in-core editor; and the way the reader (according to the
>| standard) ignores case. But these are details.
>
> which in-core editor (according to the standard) are you talking about?

Well, that's just the point. Common LISP assumes that you will edit
the file, not the structure. But as a LISP programmer, I'm not that
interested in the textual representation of my code, I'm interested in
its structure. While the integration between Emacs and a Common LISP
system can be extremely good, and extremely quick, I still find it
much less intuitive to drop out of the LISP environment to a separate
editor which sees my code as just text than to use an in-core editor
(eg the InterLISP DEDIT, or the Cambridge LISP fedit/sedit) which
understands its structure and can ensure I don't make silly
bracketing or lexical errors (it is also, of course, immensely easier
to hack up your own structure editor than to write a text editor).

The issue of comments in LISP is of course very difficult, because if
they are part of the structure they have to evaluate to something, and
consequently putting comments in the wrong place can affect the
computation (cf, again, InterLISP, where comments evaluated to
NIL). Of course this should not happen. Richard Barbour's (Procyon
Common LISP) solution of holding the comment structure on the
property-list of the function symbol was an interesting
work-around. But I feel that the solution of treating the comments as
things which are entirely lost when a function is loaded into core is
a cop-out. It also has very unfortunate results for people who write
code which manipulates code, because all internal documentation gets
lost in the process.

Simon

There is no Kay but Kay, and Little is his profit.

Ray S. Dillinger

unread,
Feb 3, 1997, 3:00:00 AM2/3/97
to

William Clodius wrote:
>
> Generallizing and paraphrasing Eriks comment:
>
> I find it symptomatic of attacks on almost any language by almost any
> advocate of another language that they are ignorant, arrogant, and so
> devoid of correctness and precision as to merit no other badge than
> "prejudice".

This may be true. I have better things to do with my time than
participate in a pointless language flamewar or talk about CL
when, as has been rightly pointed out, I know little about it.

I'll just say briefly what I *like* about Scheme. I like having
one namespace instead of more than one. I like having one form
of (define) instead of more than one. I like never needing
funcall. I like being able to manipulate *any* variables and
values using *exactly* the same few forms and calls.

I like having call/cc instead of bunches of prefab control
structures. I like having very few rules and procedures that I
need to remember. I like that it has an absolute minimum number
of special forms. I like its simplicity. And most of all I like
the way it makes me think about programming and process -- I
put together the absolute fundamentals to a process, or a control
structure, and I get insight into it.

My primary other language is Pascal -- strongly typed, rigid,
with lots of syntactic rules and special forms. Scheme
changed utterly the way I thought about programs and process;
it's like one of those 'simple' strategy games where you learn
the rules -- ALL the rules -- in one minute and then discover
there's no top end to learning the strategy. Well, Scheme takes
less than two days to learn, and it will change your perspective
on programming utterly. Perhaps the same can be said of CL;
but never mind, I'm just saying it about scheme.

Any flames, attacks, or further invites into language wars, will
be duly ignored.

Bear

Erik Naggum

unread,
Feb 3, 1997, 3:00:00 AM2/3/97
to

* Steve Austin

| I'm very much a newcomer to Common Lisp, and I naively assumed that the
| originators of Scheme used a common namespace to simplify the syntax of
| higher order functions. What advantages do separate namespaces provide?

as others have observed, there are (at least) two schools of thought here.

however, I'd like to approach this issue from a natural language point of
view, instead of a formal language point of view. clearly, if you define a
formal language to have only one namespace, you can argue all sorts of
things from there, but the question is not ex post facto arguments, but
rather the genesis of the idea.

in natural languages, we are used to context. indeed, contextual meaning
is what makes natural languages natural. we have `list' as a verb, and we
have `list' as a noun. we have `listless' as an adjective describing
something (like a programming language) that does not have lists, and an
adjective describing someone who is sort of permanently tired. when we
need to disambiguate, we do so with more words.

in Common Lisp, I can call some temporary variable `list' without having
removed my ability to create new lists with the `list' function. like the
natural language equivalent, `list' is both a verb and a noun, both a
function and a variable. I find that this rhymes very well with me, and I
also find that I would have severe problems if I could not use a word in a
natural language just because it was "used up" by another part of speech.
English is more prone to this than many other languages, but I happen to
like English, too.
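
a trivial, made-up example of what the two namespaces permit:

  (let ((list (list 1 2 3)))          ; `list' the variable (noun) ...
    (list list list))                 ; ... and `list' the function (verb)
  => ((1 2 3) (1 2 3))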

why is just one namespace bad for you? first, name space management is
difficult. it is made more difficult by the lack of packages or other
means of creating new namespaces. it is made more difficult by any means
that artificially increase the collision rate of names. most languages
that try to scale have had namespace manipulators added to them. e.g., in
K&R C, struct members shared a single namespace, which nevertheless was
different from that of variables and functions. ANSI C made each struct a
separate namespace. C++ introduced the pervasive typedef, which not only
made class names a new type, but also a reserved word, which leads me to
the second reason. by having one namespace only, you effectively create a
new reserved word every time you name something globally. in Common Lisp,
you can't redefine the functional meaning of symbols imported from standard
packages, but you can use them in (almost) any other way, and you can
(must) declare that you intend to shadow symbols. in Scheme, you need to
be careful that you don't redefine symbols you will later need in their
global sense.
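
for instance (package and function names invented for the example only):

  (defpackage :my-strings
    (:use :common-lisp)
    (:shadow #:search))             ; declare the intent to shadow up front

  (in-package :my-strings)

  (defun search (pattern text)      ; our own `search'; `cl:search' remains
    (cl:search pattern text         ; one package qualifier away
               :test #'char-equal))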

various counter-measures are necessary if you have only one namespace.
e.g., in C, the standard prescribes certain prefixes as belonging to the
compiler and the rest are up for grabs among modules that you might want to
link with. of course, using lexical scope, you reduce the impact of this
problem. still, you can't use reserved words where they have no other use
than to make the compiler barf. `default' is a perfectly reasonable
variable name. then, some compilers will introduce new reserved words just
for fun, like `try' and `catch'. Scheme, lacking both a package system and
a useful number of namespaces, opens the door to the namespace management problems
that we know so well from large C programs (C being slightly better than
Scheme in the namespace division). a single namespace in the linker also
forced C++ to include a gross counter-measure appropriately called "name
mangling". lacking packages, lacking symbol types, lacking everything, a
C++ name as seen by the linker is some _implementation-specific_ junk that
makes life "interesting" for everything that wishes to talk with the C++
modules. as a counter-counter-measure against the collision-avoidance that
you need in one namespace, C++ has C linkage (extern "C") as an option to
make names visible in a predictable namespace.

now, C and C++ are languages we love to hate, and the more we know the more
we hate them, partly because of these problems, but my point is that Scheme
is even _less_ scalable because of its severe restriction on names, and
doubly so because Schemers, like most Lispers, like descriptive names, not
cryptic naming conventions in somewhat less than 8 characters, which means
that artificial naming in Scheme looks a lot worse than artificial naming
in C.

it is often said that small is beautiful. now, anything can be beautiful
when it is small. the ugliest person you can think of was probably a quite
pretty baby. it doesn't take much effort to find a beautiful 16-year-old
girl, either. in fact, our modern notions of beauty and elegance are
_defined_ in terms of size and maturity, so the chance of anything small
and immature being beautiful is vastly higher than anything big or mature.
now, despite all the marketing that seems to be aimed at telling me that I
should dump a girlfriend when she becomes 25 and get a new 16-year-old (or
even younger), I plan to stay with mine partly because of her ability to
grow older in a way I like. consequently, I take exception to the
pedophilic attitudes to beauty and elegance that our societies have adopted
over the years. this is why I don't like the "small is beautiful" model of
aesthetics. this is why I think that almost anybody could make something
small and beautiful, but only a few can create something that grows from
small to huge and still remains beautiful. but then again, look at
interior architecture -- with huge spaces come a need for size-reducing
ornamentation. the scaling process _itself_ adds "junk" to what was "clean
surfaces" in a small model. Schemers refer to Common Lisp's "warts", and
prefer to think of Scheme as "clean". now, I wonder, would Schemers prefer
to live in small houses with nothing on their walls? would they still
prefer this if the walls were 100 feet high and 200 feet long, or would
they, too, desire some ornamentation that would have looked _very_ bad if
it had been on a 10 by 20 feet wall?

Scheme's single namespace is a function of its size. Scheme with more than
one namespace _would_ have had bags on its side -- it would be very
inelegant. however, as applications grow and as Scheme environments grow,
the single namespace becomes disproportionately _small_. therefore, people
resist a growth path that would have been natural, because their notion of
beauty forbid it. Common Lisp with a single namespace would be confined
and forbidding, for the same reason. an analogy may be in order. in very
small towns, houses may have unique names. as the town grows in size, this
becomes too hard to even imagine working, and houses are instead numbered,
and the number space is managed by a street name. as the town grows more,
streets in neighboring towns it merges with may have the same name. but
towns have names, too, and states may have many towns. the United States
has lots of towns with the same name. there are even towns that bear the
name of countries in the global namespace. some people may still wish to
name their house, but it would be foolish to hope that that name would be
globally unique. all over the place, we invent namespaces to manage the
huge number of things we deal with. in Scheme, there are few things to
deal with, so few names are necessary. in Common Lisp, there are many
things to deal with, so means to keep names apart is _necessary_. in
consequence, Common Lisp has packages and symbol slots and namespaces.

why is a single name space bad for you? in addition to the reasons given
above, I'd like to add a problem as a conclusion: nothing restricts your
growth path more than a restricted ability to name your inventions or
creations. the psychological factor known as "cognitive load" imposes a
very heavy burden on our design, namely by having to avoid excesses in that
load. a single namespace is good if you have few names, and more than one
namespace would be bad. at some size of the set of names, however, a
single namespace becomes bad because what you once knew (namely, what a
symbol meant), _ceases_ to be rememberable. namespaces introduce context
to a language. I think communication without context is a contradiction in
terms, so naturally I applaud such introduction.

Erik Naggum

unread,
Feb 3, 1997, 3:00:00 AM2/3/97
to

* Simon Brooke

| Common LISP assumes that you will edit the file, not the structure.

this is of course untrue. there are no assumptions at all about how or what
you will edit if you want to edit Common Lisp programs. all the standard
says is that if you put things in a file, and use `read' to convert the
external representation into the internal representation, the semantics of
that operation is well-defined. editing is certainly outside the scope of
the standard.

| The issue of comments in LISP is of course very difficult, because if
| they are part of the structure they have to evaluate to something, and
| consequently putting comments in the wrong place can affect the
| computation (cf, again, InterLISP, where comments evaluated to NIL).

I find the issue of comments to be simple. if you need them, you bind the
semicolon and the sharp vertical bar to reader macro functions that return
an object of the appropriate comment type, which also prints as a comment
in the usual syntax. your codewalkers then need to learn to skip such
objects. shouldn't be too hard. if you need to load it as code, after you
have edited it, it should be no harder then to remove comment forms. the
way I see it, this can be done entirely outside the language.

| Of course this should not happen. Richard Barbour's (Procyon Common
| LISP) solution of holding the comment structure on the property-list of
| the function symbol was an interesting work-around. But I feel that the
| solution of treating the comments as things which are entirely lost when
| a function is loaded into core is a cop-out. It also has very
| unfortunate results for people who write code which manipulates code,
| because all internal documentation gets lost in the process.

I had this problem in SGML a few years back. it is not a problem as long
as you don't confuse code-as-data and code-as-code. the conversion is not
trivial to begin with, and a pre-pass to delete unwanted elements is not
really an issue. the issue is fundamentally the same as retaining white
space in processing many other languages. in Common Lisp, it's easy to
modify the behavior of the reader.

Barry Margolin

unread,
Feb 3, 1997, 3:00:00 AM2/3/97
to

In article <30639660...@naggum.no>, Erik Naggum <er...@naggum.no> wrote:
>I find the issue of comments to be simple. if you need them, you bind the
>semicolon and the sharp vertical bar to reader macro functions that return
>an object of the appropriate comment type, which also prints as a comment
>in the usual syntax. your codewalkers then need to learn to skip such
>objects. shouldn't be too hard.

If the comments aren't removed by the reader, how do you ensure that all
user-written and third-party macros don't see them? Common Lisp (and most
Lisp-family languages) doesn't have a standard interface to the code
walker, so you can't depend on that to remove them.
--
Barry Margolin
BBN Corporation, Cambridge, MA
bar...@bbnplanet.com
(BBN customers, call (800) 632-7638 option 1 for support)

Cyber Surfer

unread,
Feb 3, 1997, 3:00:00 AM2/3/97
to

In article <5d582t$g...@tools.bbnplanet.com>
bar...@tools.bbnplanet.com "Barry Margolin" writes:

> If the comments aren't removed by the reader, how do you ensure that all
> user-written and third-party macros don't see them? Common Lisp (and most
> Lisp-family languages) doesn't have a standard interface to the code
> walker, so you can't depend on that to remove them.

You could define a macro for comments, so that when it expands
the expression, it discards the comment and leaves just the code.

(defmacro rem (remark &rest body)
  `(progn ,@body))

This is crude, but it should allow Simon to use a structure
editor to edit his code, and yet maintain comments to document
what the code does etc.

Obviously, a better name could be chosen...
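
For instance, with COMMENT as the hypothetical better name and a
throwaway body:

  (defmacro comment (remark &rest body)
    (declare (ignore remark))
    `(progn ,@body))

  (comment "explain here why the numbers are added"
    (+ 1 2))
  => 3

  (macroexpand-1 '(comment "gone after expansion" (+ 1 2)))
  => (PROGN (+ 1 2))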

Rainer Joswig

unread,
Feb 3, 1997, 3:00:00 AM2/3/97
to

In article <854885...@wildcard.demon.co.uk>,
cyber_...@wildcard.demon.co.uk wrote:

> > Yeah, and Haskell has added them in 1996.
> > I'll gladly forward you a mail which summarized
> > some of the design problems/alternatives, which was posted by
> > Mark P Jones on the Haskell mailing list in 1995.
>
> FP does not begin and end with Haskell.

But it seems like a very prominent member.
Haskell tries to be a standard which unifies
non-strict, typed FP languages. The ongoing evolution
of Haskell gives a good example of the problems
the designers are facing and the solutions they
are choosing. I find it very telling that they added
field access very late in the game. If they felt
more comfortable with one of the various approaches
(and there have been some), they would have included
it in the Haskell standard earlier.

> Did you miss my
> comments about ML? (Note: not just SML - that's more recent.)

Have you used it? Tell us a bit about that.

> There's more to pattern matching.

What do you mean? Please give some examples. How do you
use pattern matching?

[To improve the style in this newsgroup, I have deleted and
omitted all personal attacks. Yeah, looks shorter now. :-) ]

--
http://www.lavielle.com/~joswig/

Erik Naggum

unread,
Feb 4, 1997, 3:00:00 AM2/4/97
to

* Erik Naggum

| I find the issue of comments to be simple. if you need them, you bind
| the semicolon and the sharp vertical bar to reader macro functions that
| return an object of the appropriate comment type, which also prints as a
| comment in the usual syntax. your codewalkers then need to learn to skip
| such objects. shouldn't be too hard.

* Barry Margolin


| If the comments aren't removed by the reader, how do you ensure that all
| user-written and third-party macros don't see them? Common Lisp (and most
| Lisp-family languages) doesn't have a standard interface to the code
| walker, so you can't depend on that to remove them.

I think macroexpansion sees the code qua code, so if you submit something
for macroexpansion, you must remove the comments. I also need to clarify
what I meant by "codewalker". I assume that in an editing setting, a
different kind of code walker is needed than in a compilation setting, and
that never the twain shall meet. in essence, I see editing and compiling
code as very different tasks. e.g., during editing whitespace means a lot,
during compiling it means nothing. in fact, during editing, a whole lot of
issues come up that don't in compiling. another example is the #+ and #-
reader macros. they must be retained in an edited function. I think
comments are just a special case of the many differing needs, and that we
delude ourselves if we think that executing directly from an editable form
of the code is going to be much simpler than reading forms from a file.

Simon Brooke

unread,
Feb 4, 1997, 3:00:00 AM2/4/97
to

In article <855007...@wildcard.demon.co.uk>,

cyber_...@nospam.wildcard.demon.co.uk (Cyber Surfer) writes:
> In article <5d582t$g...@tools.bbnplanet.com>
> bar...@tools.bbnplanet.com "Barry Margolin" writes:
>
>> If the comments aren't removed by the reader, how do you ensure that all
>> user-written and third-party macros don't see them? Common Lisp (and most
>> Lisp-family languages) doesn't have a standard interface to the code
>> walker, so you can't depend on that to remove them.
>
> You could declare a macro for comments, so that when it expands
> the expression, it discards the comment and leaves just code.
>
> (defmacro rem (remark &rest body)
> `(progn ,@body))
>
> This is crude, but it should allow Simon to use a structure
> editor to edit his code, and yet maintain comments to document
> what the code does etc.

It's actually not as simple as this, because while the code-walker and
the compiler can trivially be programmed to ignore specially marked
forms, EVAL cannot (at least not trivially). Much cleverer people than
I have given a lot of thought to this and not come up with a real
solution (although as I said earlier Richard Barbour's was a
reasonable work-around -- does anyone know what Richard is up to these
days?).

-- mens vacua in medio vacuo --

Cyber Surfer

unread,
Feb 4, 1997, 3:00:00 AM2/4/97
to

In article <joswig-ya0231800...@news.lavielle.com>
jos...@lavielle.com "Rainer Joswig" writes:

> But it seems like a very prominent member.
> Haskell tries to be a standard which unifies
> non strict, typed FP languages. The ongoing evolution
> of Haskell gives a good example about the problems
> the designers are facing and which solutions they
> are choosing. I find it very telling that they added
> field access very late in the game. If they felt
> more comfortable with one of the varius approaches
> (and there have been some), they would have included
> it earlier into the Haskell standard.

I don't see this as a problem. If I did, then I'd be more
interested in using SML. There are plenty of alternatives!



> > Did you miss my
> > comments about ML? (Note: not just SML - that's more recent.)
>
> Have you used it? Tell us a bit about that.

I've used the evaluation version of MLWorks. The only
thing that bothers me about this implementation is the
lack of an integrated editor, but this could be fixed
by customising a programmable editor. For Unix, Harlequin
recommend Emacs, which should do nicely.



> > There's more to pattern matching.
>
> What do you mean? Please give some examples. How do you
> use pattern matching?

I use it for simple branching. I've found that the compilers
I've used can branch very efficiently using little more than
matching. It's essentially just a "case" control structure,
but with a nicer syntax. So far, I've never had to use "case"
in Haskell, and I hardly ever use "if".

File handling is where Haskell is weakest, IMHO. If a compiler
has support for PackedString and I/O for this data type, then
it could be as efficient as any other language.



> [To improve the style in this newsgroup, I have deleted and
> omitted all personal attacks. Yeah, looks shorter now. :-) ]

I deny that they were personal. You've made some claims that I
believe are wrong, or at best, misleading. I'm merely challenging
your statements. If you wish to dump on FP, please do it in an
FP newsgroup, where you'll find some people far better informed
than myself, who can answer you.

Followup-To: comp.lang.functional

Thant Tessman

unread,
Feb 4, 1997, 3:00:00 AM2/4/97
to

Tim Bradshaw wrote:
>
> Reini Urban wrote:
[...]

> > In my eyes Common Lisp is quite hard to learn
> > (compared to standard lisp or scheme)
>
> If it's possible to ask this question without provoking endless futile
> discussion, could you say why? I've taught courses on Common Lisp,
> and it would be interesting to know what people find hard about basic
> CL, especially compared to scheme.

My problem with Common Lisp is that predicates end in "p" instead
of "?". This drives me up the wall.

Will Hartung

unread,
Feb 4, 1997, 3:00:00 AM2/4/97
to

Please Erik, Don't kill me...but :-)...

BW:"Hello, and Welcome to the Barbara Walters Special. Today my guest is
Noted Technologist, Erik Naggum.

Welcome Erik"

EN: "Happy to be here."

BW: "Our viewers are interested on your views about language
complexity, expressiveness and power. We were hoping you could
enlighten us about Lisp and Scheme, and perhaps C/C++"

EN: "Of course..."

Erik Naggum <er...@naggum.no> writes:

[ stuff about complexity, namespaces, big and small deleted ]

>it is often said that small is beautiful. now, anything can be beautiful
>when it is small. the ugliest person you can think of was probably a quite
>pretty baby. it doesn't take much effort to find a beautiful 16-year-old
>girl, either. in fact, our modern notions of beauty and elegance are
>_defined_ in terms of size and maturity, so the chance of anything small
>and immature being beautiful is vastly higher than anything big or mature.
>now, despite all the marketing that seems to be aimed at telling me that I
>should dump a girlfriend when she becomes 25 and get a new 16-year-old (or
>even younger), I plan to stay with mine partly because of her ability to
>grow older in a way I like. consequently, I take exceptions to the
>pedophilic attitudes to beauty and elegance that our societies have adopted
>over the years.

Barbara Walters: "So...Are you saying that Schemers are pedophiles?"

I can just see this as her response to Eriks post.

:-) :-)...<- It's a JOKE!

Sorry Erik, it just struck me as rather funny.

It's amazing what directions a language design discussion can take.

:-) :-)

No offense to anyone intended...

--
Will Hartung - Rancho Santa Margarita. It's a dry heat. vfr...@netcom.com
1990 VFR750 - VFR=Very Red "Ho, HaHa, Dodge, Parry, Spin, HA! THRUST!"
1993 Explorer - Cage? Hell, it's a prison. -D. Duck

Martin Rodgers

unread,
Feb 4, 1997, 3:00:00 AM2/4/97
to

In article <5d71bn$q...@caleddon.intelligent.co.uk>,
si...@caleddon.intelligent.co.uk says...

> It's actually not as simple as this, because while the code-walker and
> the compiler can trivially be programmed to ignore specially marked
> forms, EVAL cannot (at least not trivially). Much cleverer people than
> I have given this a lot of thought and not come up with a real
> solution (although as I said earlier Richard Barbour's was a
> reasonable work-around -- does anyone know what Richard is up to these
> days?).

I guess I've forgotten what it's like to use EVAL! I could probably make
this work in my own Lisp interpreter, but there I have the luxury of
being able to hack the C source code whenever I like, at the deepest
levels. (Not every Lisp will permit that, of course.) I could add REM as
a special form, and in fact I'd need to, as the macro processor in my
Lisp would remove the REM expr!

So, I agree that my solution isn't a perfect one. However, if you stored
the expressions in a table, representing the "source file", the editor
could structure edit the "source" expressions, and then redefine a
function or whatever by feeding the expr to EVAL.

Hmm - that's not so elegant, after all. You're right, it's not simple. I
don't think it's impossible, however.

Brian Rogoff

unread,
Feb 4, 1997, 3:00:00 AM2/4/97
to

On Tue, 4 Feb 1997, Will Hartung wrote:
> Erik Naggum <er...@naggum.no> writes:
>
> [ stuff about complexity, namespaces, big and small deleted ]
>
> >it is often said that small is beautiful. now, anything can be beautiful
> >when it is small. the ugliest person you can think of was probably a quite
> >pretty baby. it doesn't take much effort to find a beautiful 16-year-old
> >girl, either. in fact, our modern notions of beauty and elegance are
> >_defined_ in terms of size and maturity, so the chance of anything small
> >and immature being beautiful is vastly higher than anything big or mature.
> >now, despite all the marketing that seems to be aimed at telling me that I
> >should dump a girlfriend when she becomes 25 and get a new 16-year-old (or
> >even younger), I plan to stay with mine partly because of her ability to
> >grow older in a way I like. consequently, I take exceptions to the
> >pedophilic attitudes to beauty and elegance that our societies have adopted
> >over the years.

This is really outside of the charter of these newsgroups, but the
notion that "youth = beauty" is neither modern, nor Western, nor the
product of Madison Avenue marketeers. It is quite old, and exists in
practically every culture (to be precise, males favoring younger brides is
universal; females frequently favor older mates). If you are interested in
all of the details, I suggest you start with a book by David Buss called
"The Evolution of Desire" and proceed to Helen Fischer's "Anatomy of Love".
Similarly, using the discredited Sapir-Whorf hypothesis to support ones
computer language preferences is also absurd.

That said, for day to day work, I generally prefer larger languages
to their smaller cousins (Common Lisp over Scheme, Ada 95 over Pascal, C++
over..., nahh, maybe not) for a number of reasons, although there are
certainly good reasons for using Scheme in certain niches, like teaching,
research into PL design, as an extension language, etc.

-- Brian


Kelly Murray

unread,
Feb 4, 1997, 3:00:00 AM2/4/97
to

A hacky way of adding comments which are not discarded by parsing of
the source code is to use simple strings:

(progn "we are incrementing variable x to add 1 to its value -kem"
(incf x)
)

Any reasonable compiler will eliminate the string from the compiled code,
but it will remain after macro expansion.
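
For what it's worth, the same trick can be wrapped up in a macro; here is a
minimal sketch (the name COMMENTED-PROGN is invented for illustration):

(defmacro commented-progn (remark &body body)
  ;; The remark stays in the expansion as a literal string, so
  ;; MACROEXPAND and code walkers still see it, while any reasonable
  ;; compiler discards the unused constant.
  `(progn ,remark ,@body))

;; (macroexpand-1 '(commented-progn "add 1 to x -kem" (incf x)))
;; => (PROGN "add 1 to x -kem" (INCF X))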

-kelly edward murray k...@franz.com
Yes, it's true, Franz has made ACL/Linux available Free! http://www.franz.com


Mike McDonald

unread,
Feb 4, 1997, 3:00:00 AM2/4/97
to

In article <32F758...@acm.org>,
Thant Tessman <th...@acm.org> writes:

> My problem with Common Lisp is that predicates end in "p" instead
> of "?". This drives me up the wall.

Just the opposite with me. All those bleepin ?'s and !'s in scheme. :-)

Mike McDonald
mik...@engr.sgi.com

Kevin Russell

unread,
Feb 5, 1997, 3:00:00 AM2/5/97
to

In trying to juxtapose Scheme users and dirty old men who lust after
teenage girls, Erik may have accidentally hit on an even better analogy.
Scheme appeals to me as a teenager, not because I'm the kind of dirty old
man you'll meet in Nabokov, but because I'm the kind of heartless
orphanage-owning slave-driver you'll meet in Dickens. A programming
object isn't the object of my sexual desire. It's a butler/maid/secretary/
houseboy/stockbroker/valet. If Scheme is a teenager, then it's a
competent teenager who knows how to do its job without whining and I don't
have to pay it minimum wage. If it's as anorexic as a supermodel, at
least it doesn't eat up all my food while it's scrubbing out my fridge.

Some people seem to be mystified by phrases like "small cognitive
footprint". I'll try to explain why it makes perfect sense to me.

The languages I use with any degree of frequency are Perl (for massive
text processing), XLISP-STAT (for statistics), Java and JavaScript (in
cute web-pages for students in my courses), Prolog (honest!), and Scheme.
I'll touch C only under duress and only with a ten-foot pole. The only
languages which I find I don't have to "relearn" every time I use them are
Prolog and Scheme.

It boils down to this: I'm not a professional programmer; programming is
only useful if it makes the things I have to do in my real job easier;
programming is a means, not an end (scary thought, I know). I don't have
enough brain-space for the live human languages I study, I certainly don't
have enough left over for the syntactic minutiae of artificial languages.
Arguments like "But that becomes second-nature after the first couple of
months of doing it eight hours a day" don't wash with me, since I'm never
going to give it those couple of months. If you believe people like me
have no business programming computers without professional help and that
programming languages have no obligation to make my life easier, that's a
flamewar we can take somewhere else.

Given these circumstances, I haven't found the same weaknesses in working
with Scheme that a few others have mentioned. I find MzScheme and MrEd
more than sufficient to meet my usual needs, whether it's building a quick
and dirty GUI editor to do some specialized SGML tagging or trying to
decipher the file headers created by that archaic lab instrument we can't
find the manual for (interactively poking around with its bits, seeing if
those two bytes there make sense as an unsigned integer, etc.). I don't
have to build huge systems. I doubt anything I write will ever be longer
than NanoCAD or the e-lisp code for PSGML mode. If you need to build
systems bigger than that, I'll gladly make sympathetic noises and take
your word for it that present Scheme implementations are insufficient for
you.

Scheme being "small" certainly helps, but I think what makes me like it
even more is an apparently very trivial fact: it's easy to type. So easy
I can usually forget it's there.

Two mantras that my ideal computer language would live by are:

1) 95% of what I type is ordinary English. The other 5% should be too.
2) The shift key is the work of the devil.

Scheme function names, for example, are almost plain English! I've typed
"string" hundreds of times in my life. I've typed "copy" hundreds of
times. If I want to copy a string in Scheme, I type "string-copy". I'm
not sure it's fair to say that C has its head permanently stuck in the
60s, but it's hard to think of other reasons for its apparent paranoid
belief that the universe would come to a cataclysmic end if a standard
function name were ever longer than six characters.

Maybe C programmers really are atrocious typists. Maybe dropping randomly
chosen letters actually helps them enter their code more quickly. I find
it slows me down. I can never remember which letters were randomly
dropped. I want to copy a string in C -- now was that function "strcpy"
or "strngcopy" or "strcopy" or maybe "stringcpy" or "str_copy" or what? I
either stop everything and look it up or, more likely, guess and let the
compiler bitch about it later (and look it up then). Either way, I spend
several orders of magnitude more time getting the name right than I ever
save through fewer keystrokes. Even the fewer keystrokes are a mirage. As
a middling good typist, "string" comes flowing at a decent speed off my
fingers, as does "copy", since I've typed both often enough for it to be a
habit. With "strcpy" I have to slow down and consciously hunt and peck
for the right keys. Even for brute-force typing speed, "string-copy"
beats "strcpy" hands down.

On this count, I find Common Lisp to be far closer to C than I'd like.
"rplaca"? Give me a break. (Or was that "replca"? I'd have to look it
up.) "progn"? "princ"? In the time it takes me to type "defsetf" (let
alone look it up to make sure I've got it right), I'd be all the way
through "call-with-current-continuation" and well into the next
S-expression. (Not, I hasten to add, that I'd have any idea what to *do*
with "call-with-current-continuation".)

The difference here between Scheme and CL may lie not so much in
legislated standards as in the common practice of the communities. Given a
context where both Scheme and CL have both an English word and a
gobbledygook symbol that could be used (if, cond; third, caddr; etc.), you
wouldn't go broke in the long run if you laid money on the Scheme
programmer using the English symbol and the CL programmer using the
gobbledygook. R4RS certainly doesn't force you to use English-like
symbols, though perhaps it makes it easier to use them more consistently
if you want to. (Erik's example of the various Scheme counterparts to
"member" is one of the very rare exceptions.)

Real English words are useful for debugging too. If 95% of what I type is
ordinary English, 99.99% of what I *read* is (undergraduate term papers
excepted). "strng-copy" leaps off the screen as being wrong in a way that
"strngcpy" can never hope to. With most languages, I have to spend
irritating ages cleaning up bonehead typing errors before I can even think
about finding the logic flaws. In Scheme, this initial phase almost
completely disappears.

(Uh-oh, I'm starting to get scared now. Before I end up convincing myself
that Commodore Grace Hopper was a visionary saint, I think I'd better move
on to the second mantra.)

Almost as bad as wretched fnctn-nams (sometimes worse) are obscure
combinations of punctuation symbols, guaranteed to induce carpal-tunnel
syndrome in anyone who doesn't immediately sprain their wrist. I
appreciate many things about Perl. The sheer joy of typing code in it is
not one of them. (I'm not against shifted characters per se -- after all,
the first computer language I ever learned was APL, which may have spoiled
me. If those five strange characters in a row don't constitute a complete
program that can balance your chequebook and during spare CPU cycles come
up with the Grand Unified Theory of physics, then honestly what's the
point?)

Again, Common Lisp is closer to C than I'd like it to be. The second most
common habit that drives me crazy when I switch back to XLISP-STAT after
Scheming for a while (right after typing "define" for "def") is forgetting
to sharp-quote a function name. I suppose CLers need to save as much time
as they can by leaving "ine" off the end of "def" in order to make up for
having to type "(map 'list #'sin a-list)" instead of "(map sin a-list)".
And don't get me going on those **GLOBAL-VARIABLES**.

[[Yes, I realise "(map 'list #'sin a-list)" is probably wrong in CL,
probably wrong in XLISP-STAT too. You might see it as evidence that
Schemers are lazy bastards who can't be bothered getting their CL facts
straight. I'd see it as support for my main point: I don't *have* to look
up the Scheme version.]]

In Scheme, I can almost avoid the shift key altogether -- apart from the
occasional ? or !, whose usefulness for readability is so obvious that
I'll gladly move my pinky for them. When I'm in one of my shift-o-phobic
moods, I'll even use [] for () and rely on MzScheme's standard-bending
treatment of them as synonymous until I feel guilty enough to do a global
search and replace -- or else remap the keyboard. (Now if I could just
convince the Rice people that percent signs at the end of class names
really aren't necessary...)

As I begin to pull on my asbestos-lined overclothes, I should say this:
I realise none of these cosmetic reasons for preferring Scheme are inherently
impossible in CL. I know I could go and write a "define" macro. I know
I could write a "map" macro that did without the 'list flag and automatically
sharp-quoted the function symbol. (Or at least I could if I felt like
spending a couple of weeks deciphering macros.) I know that I could make
CL definitions of every last English word I ever use in Scheme in terms
of the gobbledygook primitives. Or at least I could if I didn't have
real work to do. How do the Common Lisp advocates put it? Why bother
writing your own library if the other language already has built-in
everything you need?

I realise that someday CLtL3 might even do all this for me without
changing the fundamental nature of the language in the slightest. If that
day ever comes, and if anybody ever implements this CLtL3 on both Unix and
Windows so well that I can run the same programs without even thinking
which one I'm using at the moment, and if I can install it on both
platforms without having to get any closer to C than my ten-foot-pole will
allow, and if they hand it out for free, and if they courteously answer my
bonehead questions despite having given it to me for free -- when that day
comes, I'll think about maybe defecting to Common Lisp. Until then, I'm
sticking with MzScheme/MrEd, the closest thing I've seen yet to an
environment that does what I want it to, when I want it to, and keeps the
hell out of my mind the rest of the time.

Yeah, small is beautiful. Children should be seen and not heard. If
they're small enough, you don't even have to see them.

-- Kevin

-------------
Kevin Russell
Linguistics, University of Manitoba
kru...@cc.umanitoba.ca


Tim Bradshaw

unread,
Feb 5, 1997, 3:00:00 AM2/5/97
to

* Simon Brooke wrote:
> The issue of comments in LISP is of course very difficult, because if
> they are part of the structure they have to evaluate to something, and
> consequently putting comments in the wrong place can affect the
> computation (cf, again, InterLISP, where comments evaluated to
> NIL). Of course this should not happen. Richard Barbour's (Procyon

> Common LISP) solution of holding the comment structure on the
> property-list of the function symbol was an interesting
> work-around. But I feel that the solution of treating the comments as
> things which are entirely lost when a function is loaded into core is
> a cop-out. It also has very unfortunate results for people who write
> code which manipulates code, because all internal documentation gets
> lost in the process.

It's relatively easy to do the following in CL:

make comments read as special comment structures to allow
structure manipulating programs access to them.

strip these structures from the code before it is evaluated.

I have code that does the former, the latter is pretty trivial.
Actually it's not quite trivial, because you can get bad cases like:

#(1 2 ; comment
3)

where arrays might change length during stripping, so I think that if
you want to be really efficient you need to change other things in the
readtable so things like arrays & structures get constructed only
during the comment-stripping process, not at the initial read time.

I don't think it's very hard if you're willing to put some effort into
it. The thing that made me give up was that I couldn't get XP to
print comment structures in a reasonable way.
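
A rough sketch of that reader-macro approach, under obvious assumptions
(the structure and function names below are invented, and the array
problem mentioned above is ignored):

(defstruct comment text)

(defun read-comment (stream char)
  (declare (ignore char))
  ;; Return the rest of the line as a COMMENT object instead of
  ;; discarding it, as the standard semicolon reader does.
  (make-comment :text (read-line stream nil "" t)))

(defvar *comment-readtable*
  (let ((rt (copy-readtable nil)))
    (set-macro-character #\; #'read-comment nil rt)
    rt))

(defun strip-comments (form)
  ;; Remove COMMENT objects from list structure before handing the
  ;; form to EVAL or COMPILE; a bare top-level comment is left for
  ;; the caller to skip.
  (if (consp form)
      (mapcar #'strip-comments (remove-if #'comment-p form))
      form))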

--tim

Espen Vestre

unread,
Feb 5, 1997, 3:00:00 AM2/5/97
to

kru...@cc.umanitoba.ca (Kevin Russell) writes:

> "rplaca"? Give me a break.

Hmm.... let me cite from my handout for a course I used to give
in Common Lisp at the Comp. Linguistics department at the university
of Saarbrücken (on-the-fly translated from German):

"it's good [Common] Lisp Programmer Behaviour always to use setf
instead of the specialized modification functions. For instance, you
should always use (setf (first a) b) instead of (rplaca a b)"

--

regards,
Espen Vestre
Telenor Online AS
Norway

Tim Bradshaw

unread,
Feb 5, 1997, 3:00:00 AM2/5/97
to

* Michael Sperber [Mr Preprocessor] wrote:
>>>>>> "Erik" == Erik Naggum <er...@naggum.no> writes:
Erik> lack of a standard package system, for starters.

> Admitted, but also possible to build yourself. The code is out there
> Erik, just download it.

No no no! Read his lips: STANDARD package system. Standard as in `in
the language spec', and preferably `in the official standard language
spec', so you can absolutely rely on it being there and working
properly, and complain at your vendor if it doesn't. People who write
and sell large software systems like that kind of thing!

--tim


Michael Sperber [Mr. Preprocessor]

unread,
Feb 5, 1997, 3:00:00 AM2/5/97
to

>>>>> "Tim" == Tim Bradshaw <t...@aiai.ed.ac.uk> writes:

Tim> * Michael Sperber [Mr Preprocessor] wrote:
>>>>>>> "Erik" == Erik Naggum <er...@naggum.no> writes:
Erik> lack of a standard package system, for starters.

>> Admitted, but also possible to build yourself. The code is out there
>> Erik, just download it.

Tim> No no no! Read his lips: STANDARD package system. Standard as in `in
Tim> the language spec', and preferably `in the official standard language
Tim> spec', so you can absolutely rely on it being there and working
Tim> properly, and complain at your vendor if it doesn't. People who write
Tim> and sell large software systems like that kind of thing!

I didn't dispute that at all. I just stated that it may not be
important for everyone. There's a lot of other STANDARD things CL
doesn't have, but no one makes a fuss about them. It doesn't have
STANDARD first-class continuations, it doesn't have STANDARD windowing
primitives, it doesn't have a STANDARD code browser with more than 17
colors simultaneously (which is what people REALLY like), it doesn't
...

Let me re-iterate: There's a lot of things in CL that Scheme doesn't
have. Most are easy to add. There is at least one important
(important for some people ...) thing in Scheme it's *impossible* to
add to CL if it isn't there natively: call-with-current-continuation.
Different people consider different things important. Most people
make do with call/cc. Some can't. Some have good (and valid) reasons
to prefer CL. Some have good (and valid) reasons to prefer Scheme.
None is a superset of the other.

Cheers =8-} Mike

Marco Antoniotti

unread,
Feb 5, 1997, 3:00:00 AM2/5/97
to

jos...@lavielle.com (Rainer Joswig) writes:

>
> In article <854885...@wildcard.demon.co.uk>,
> cyber_...@wildcard.demon.co.uk wrote:
>
> > > Yeah, and Haskell has added them in 1996.
> > > I'll gladly forward you a mail which summarized
> > > some of the design problems/alternatives, which was posted by
> > > Mark P Jones on the Haskell mailing list in 1995.
> >
> > FP does not begin and end with Haskell.
>

> But it seems like a very prominent member.
> Haskell tries to be a standard which unifies
> non strict, typed FP languages. The ongoing evolution
> of Haskell gives a good example about the problems
> the designers are facing and which solutions they
> are choosing. I find it very telling that they added
> field access very late in the game. If they felt
> more comfortable with one of the various approaches
> (and there have been some), they would have included
> it earlier into the Haskell standard.
>

The situation may have changed, but last I remember (about a year and
a half ago) Haskell was actually delivered as a....

hear, hear

...CMU Common Lisp image. :)

--
Marco Antoniotti - Resistente Umano
===============================================================================
International Computer Science Institute | mar...@icsi.berkeley.edu
1947 Center STR, Suite 600 | tel. +1 (510) 643 9153
Berkeley, CA, 94704-1198, USA | +1 (510) 642 4274 x149
===============================================================================
...it is simplicity that is difficult to make.
...e` la semplicita` che e` difficile a farsi.
Bertholdt Brecht

Cyber Surfer

unread,
Feb 5, 1997, 3:00:00 AM2/5/97
to

In article <5d886f$p...@sparky.franz.com>, k...@math.ufl.edu says...

> A hacky way of adding comments which are not discarded by parsing of
> the source code is to use simple strings:
>
> (progn "we are incrementing variable x to add 1 to its value -kem"
> (incf x)
> )
>
> Any reasonable compiler will eliminate the string from the compiled code,
> but it will remain after macro expansion.

If I'd spent a little more time writing my macro, I might've done
it that way myself. I briefly considered it, of course. ;-) Tools like
this sometimes get written in a hurry, and then slowly refined over
years of use. This one should take less than 5 mins...

Thanks.

Robert Sanders

unread,
Feb 5, 1997, 3:00:00 AM2/5/97
to

Marco Antoniotti <mar...@crawdad.icsi.berkeley.edu> writes:

> The situation may have changed, but last I remember (about a year and
> a half ago) Haskell was actually delivered as a....
>
> hear, hear
>
> ...CMU Common Lisp image. :)

I believe there was a Haskell compiler written in Common Lisp. The
first one has to be written in some language other than itself :-)
Today Glasgow Haskell, Chalmers Haskell, and Hugs all exist without
the help of Common Lisp. At least the first two are partially or
wholly written in Haskell.

regards,
-- Robert

Joe English

unread,
Feb 5, 1997, 3:00:00 AM2/5/97
to

Michael Sperber [Mr. Preprocessor] <spe...@informatik.uni-tuebingen.de> wrote:
>>>>>> "Tim" == Tim Bradshaw <t...@aiai.ed.ac.uk> writes:
>>>>>>>> "Erik" == Erik Naggum <er...@naggum.no> writes:
>>>
>Erik> lack of a standard package system, for starters.
>>>
>>> Admitted, but also possible to build yourself. The code is out there
>>> Erik, just download it.
>>>
>Tim> No no no! Read his lips: STANDARD package system. Standard as in `in
>Tim> the language spec', and preferably `in the official standard language
>Tim> spec', so you can absolutely rely on it being there and working
>Tim> properly, and complain at your vendor if it doesn't. People who write
>Tim> and sell large software systems like that kind of thing!
>
>I didn't dispute that at all. I just stated that it may not be
>important for everyone. There's a lot of other STANDARD things CL
>doesn't have, but noone makes a fuzz about them. It doesn't have
>STANDARD first-class continuations, it doesn't have STANDARD windowing
>primitives, it doesn't have a STANDARD code browser with more than 17
>colors simultaneously (which is what people REALLY like), it doesn't
>...
>
>Let me re-iterate: There's a lot of things in CL that Scheme doesn't
>have. Most are easy to add.


But a standard package system is not one of them.

True: anybody can build *a* package system in Scheme,
but only the R5RS committee can define *the* package system
(and they don't seem interested in doing so).

The whole point of a package system is so that code
written by different people can easily be glued together
in a larger program. If everybody builds their
own package system, that's worse than having none at all.

Granted, not everybody needs a package system, only those who
are trying to build large software systems. Since Scheme doesn't
have one (it has, depending on how you look at it, either
zero or several, but not *one*, which is the number required),
that's a significant point in Lisp's favor when it
comes to large-scale programs.
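
For contrast, this is roughly what the single standard facility buys you
in Common Lisp; a minimal sketch (package and function names are invented
for illustration):

(defpackage "MY-UTILS"
  (:use "COMMON-LISP")
  (:export "STRING-JOIN"))

(in-package "MY-UTILS")

(defun string-join (strings &optional (separator " "))
  ;; Concatenate STRINGS with SEPARATOR between successive elements.
  (with-output-to-string (out)
    (loop for (s . rest) on strings
          do (write-string s out)
          when rest do (write-string separator out))))

Any conforming implementation reads these forms the same way, which is
exactly the "glue code together" property being argued for here.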


--Joe English

jeng...@crl.com

Simon Brooke

unread,
Feb 5, 1997, 3:00:00 AM2/5/97
to

In article <5d92h2$12u$1...@canopus.cc.umanitoba.ca>,
kru...@cc.umanitoba.ca (Kevin Russell) writes:

<about functions having sensible names>

> On this count, I find Common Lisp to be far closer to C than I'd like.
> "rplaca"? Give me a break. (Or was that "replca"? I'd have to look it
> up.) "progn"? "princ"? In the time it takes me to type "defsetf" (let
> alone look it up to make sure I've got it right), I'd be all the way
> through "call-with-current-continuation" and well into the next
> S-expression. (Not, I hasten to add, that I'd have any idea what to *do*
> with "call-with-current-continuation".)

Look, enough heat has been generated in this thread already, but this
last point is totally bogus. Firstly, anyone who has trouble
remembering how to type RPLACA and RPLACD ought not to be using them;
things which smash structure are not for beginners or casual
users. Second, Common LISP provides an (in my opinion) all too user
friendly way of overwriting J Random Bit of Structure, and it's called
SETF. Applied to a pointer, SETF translates to RPLACA or RPLACD as
appropriate. Of course, 'SETF' isn't English... (what was your
criticism again? :-) )
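
To make the point concrete (the variable name is invented):

(defvar *cell* (list 1 2 3))
(rplaca *cell* 'a)        ; the older destructive primitive
(setf (car *cell*) 'a)    ; the SETF spelling; smashes the same cons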



> The difference here between Scheme and CL may lay not so much in
> legislated standards as in the common practice of the communities. Given a
> context where both Scheme and CL have both an English word and a
> gobbledygook symbol that could be used (if, cond; third, caddr; etc.), you
> wouldn't go broke in the long run if you laid money on the Scheme
> programmer using the English symbol and the CL programmer using the
> gobbledygook.

I write a number of languages, and LISP is one of them. LISP isn't
English; it's a completely different language. It has a completely
different purpose. I use English to communicate with other human
beings, to socialise, to politic, to unwind; I use LISP to describe
computations. The two languages each have their aesthetic, but it is a
completely different aesthetic; they each have their vocabulary,
equally.

If a language is one of the main tools of your daily trade, you will
learn its vocabulary. If that vocabulary is more closely geared to
the expression of algorithms than to social grooming rituals, the
language is likely to be more effective - at least for the expression
of algorithms. COND means something usefully different from (and more
general than) the English word 'if'; CADDR means something usefully
different from (and more specific than) the English word 'third'.

But what I take as the core of your argument is that Scheme is more
suitable for casual users than Common LISP. This may be so; if so it
may consequently be more suitable for teaching purposes. It doesn't
mean that it's better for developing and expressing algorithms, which
in my view is the purpose of a programming language. It may be or may
not be so; I confess I don't know enough about Scheme to express a
view. But for the regular, serious developer of code, the
English-like-ness of a programming language is a completely bogus
issue. In my opinion, of course.

Simon

;; When your hammer is C++, everything begins to look like a thumb.

Marco Antoniotti

unread,
Feb 6, 1997, 3:00:00 AM2/6/97
to

Thant Tessman <th...@acm.org> writes:

>
> Tim Bradshaw wrote:
> >
> > Reini Urban wrote:
> [...]
> > > In my eyes Common Lisp is quite hard to learn
> > > (compared to standard lisp or scheme)
> >
> > If it's possible to ask this question without provoking endless futile
> > discussion, could you say why? I've taught courses on Common Lisp,
> > and it would be interesting to know what people find hard about basic
> > CL, especially compared to scheme.
>

> My problem with Common Lisp is that predicates end in "p" instead
> of "?". This drives me up the wall.

YES! This is a real problem. I agree.... :)

But then again, there was a time when Logic Programming systems were
built with MacLisp (or, similarly, Common Lisp) and the "logic
variables" were implemented by a '?' reader macro.

Cheers

Reini Urban

unread,
Feb 6, 1997, 3:00:00 AM2/6/97
to

On Tue, 4 Feb 1997 12:15:19 -0800, Brian Rogoff wrote:
> ... I generally prefer larger languages

>to their smaller cousins (Common Lisp over Scheme, Ada 95 over Pascal, C++
>over..., nahh, maybe not) for a number of reasons, although there are
>certainly good reasons for using Scheme in certain niches, like teaching,
>research into PL design, as an extension language, etc.

I only started with:

>In my eyes Common Lisp is quite hard to learn (compared to standard lisp

>or scheme) ^^^^^

Of course once (time will be my witness) i have mastered CL, i will certainly prefer
it over scheme or my small lisps. i do mainly autolisp and beta test a new
lisp. cl is a full blown horse, even harder to learn than the ms windowing
api (which i refused to study :)
but good books about cl are very hard to get. i only got cltl2, the ansi
specs, winston/horn and sonya keene so far, but would die for norvig, both
graham's and the touretzky.
for scheme i dont need any books (even a book armada) as well as for autolisp
or xlisp. but cl will be certainly worth the effort.

my weak points in cl are:
1) I generally prefer recursion over iteration, but with cl sources on my
desktop i only see "for" alikes (yes, allegro) and tons of "do".

just a few examples from my (simple) autolisp:
;;; recursive version, my favourite
;;; works in plain AutoLISP only up to 100 elements (stack overflow)
(defun remove_rec (what data)
(cond
((null data) nil) ;of course it should say endp but we dont have it
((equal what (car data)) (remove_rec what (cdr data)))
(t (cons (car data) (remove_rec what (cdr data))))
)
)

(defun remove_iter (what data / item ndata)
(foreach item data
(if (not (equal item what))
(setq ndata (cons item ndata))
)
)
(reverse ndata)
)

;;; mapped version
(defun serge:remove (what from)
(apply 'append (subst nil (list what) (mapcar 'list from)))
)

or this one:
;;; REMOVE-IF - Conditional remove from flat list
(defun remove-if (pred from)
(cond
((atom from) from)
((apply pred (list (car from))) (remove-if pred (cdr from)))
(t (cons (car from) (remove-if pred (cdr from))))
)
)

against:
;;; REMOVE-IF-NOT
;;; it need not be defined recursively, also this way:
(defun remove-if-not (pred lst)
(apply 'append
(mapcar '(lambda(e) (if (apply pred (list e)) (list e)))
lst)))

In real world cl sources i never saw such code so far. (besides the matcher)
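
For comparison, ANSI CL already provides these non-destructively, which is
one reason hand-rolled versions are rare in real sources:

(remove 'x '(a x b x c))             ; => (A B C)
(remove-if #'evenp '(1 2 3 4 5))     ; => (1 3 5)
(remove-if-not #'evenp '(1 2 3 4 5)) ; => (2 4)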

2) the fact that functions are no longer lists i have to accept for
performance reasons, but it's quite hard to understand self-modifying code
now. and all these types: functionp, closurep,

BTW: what is the exact definition of a special form?
my guess:
a builtin function, which takes either a variable number of arguments, or
doesn't evaluate at least one of its arguments.
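
A small illustration of the "doesn't evaluate at least one of its arguments"
part, using IF (which ANSI CL classifies as a special operator rather than
a function):

(if t
    'taken
    (error "this branch is never evaluated"))   ; => TAKEN, no error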


3) I really miss a better introduction to common lisp like winston/horn or
similar books which create their own lisps out of a few basic functions (as i
did with autolisp to be able to work with it). The sheer mass of functionality
overwhelms me every single day. (same as with msvc++)
I prefer simple small languages like perl or python which you could learn in
one week and then collect libs over the time to enhance your needs.
Then it's easy to know what is useful and what is sugar only for some specific
tasks or types. of course I appreciate the standardization as much as anyone, but
it's simply quite hard to learn with so much functionality built in.


--
To prevent spam replies, my newsreader has misspelled my email address. To
send me email, remove the final "!". Sorry for the inconvenience!

Reini -- Always searching...

Rainer Joswig

unread,
Feb 6, 1997, 3:00:00 AM2/6/97
to
(Kevin Russell) wrote:

I'm doing development in Common Lisp and have used Scheme
quite a lot.


> Two mantras that my ideal computer language would live by are:
>
> 1) 95% of what I type is ordinary English. The other 5% should be too.
> 2) The shift key is the work of the devil.

Basically I think you are right. The development environment
also should support this style. I simply don't believe
in cryptic character combinations.

> On this count, I find Common Lisp to be far closer to C than I'd like.
> "rplaca"? Give me a break. (Or was that "replca"? I'd have to look it
> up.) "progn"? "princ"? In the time it takes me to type "defsetf" (let
> alone look it up to make sure I've got it right), I'd be all the way
> through "call-with-current-continuation" and well into the next
> S-expression. (Not, I hasten to add, that I'd have any idea what to *do*
> with "call-with-current-continuation".)

Well, we don't live in the ideal world. Common Lisp is a compromise
and it has some baggage. But most of the time I find it quite
acceptable. One of Common Lisp design goals were backward
compatibility with older Lisp dialects. Some millions
of lines of code already existed. So they decided to
make the transition less painful by incorporating old
features and names. But you don't need to use RPLACA,
CAR and CDR. If you use Common Lisp, you will need
to learn the newer stuff that is in there and you
will need to learn, that you should avoid older features
(few people are actually using TAGBODY).

> Real English words are useful for debugging too. If 95% of what I type is
> ordinary English, 99.99% of what I *read* is (undergraduate term papers
> excepted). "strng-copy" leaps off the screen as being wrong in a way that
> "strngcpy" can never hope to. With most languages, I have to spend
> irritating ages cleaning up bonehead typing errors before I can even think
> about finding the logic flaws. In Scheme, this initial phase almost
> completely disappears.

A lot of real software is running on top of CL. CL then
is only one (even small) part of your software.
I find it quite acceptable to type "PRINT-OBJECT" or "WRITE-STRING"
in Common Lisp. I also find it quite acceptable to use
"HTML:DECLARE-TITLE" in my CL-HTTP (a web server written
in Common Lisp by John Mallery from MIT AI Lab) code.

So you are really beating a strawman. Newer code most of the time
uses conventions similar to the ones Scheme programmers use.
In Common Lisp this is very important, since there
are some very large libraries out there and nobody
would understand cryptic names. Remember, the
Lisp machine OS from Symbolics has some 30000
(no typing mistake, it's thirty thousand) functions built in.
Imagine what would have happened if they had
named FORMAT-GRAPH-FROM-ROOT something like
"frmgrprt". Shudder.

> having to type "(map 'list #'sin a-list)" instead of "(map sin a-list)".
> And don't get me going on those **GLOBAL-VARIABLES**.

In CL you can use MAPCAR.

> impossible in CL. I know I could go and write a "define" macro.

Some people actually have done this. There is
a lot of code that uses DEFINE-GENERIC or DEFINE-METHOD
or DEFINE.


> I could write a "map" macro that did without the 'list flag

This already is MAPCAR.
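
Side by side, for the record (the sharp-quote is still required):

(map 'list #'sin '(0 1 2))   ; general sequence version, needs a result type
(mapcar #'sin '(0 1 2))      ; list-only version, no result-type argument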

> real work to do. How do the Common Lisp advocates put it? Why bother
> writing your own library if the other language already has built-in
> everything you need?

The single biggest advantage is that there is a *standard*
for a lot of the functionality you'd need to add in Scheme.
Even ill-named but documented standard functions are better than
nonexistent or nonstandard ones.

> Yeah, small is beautiful. Children should be seen and not heard. If
> they're small enough, you don't even have to see them.

I like Scheme. But for my professional tasks CL is much more
useful.

So what does a CL programmer do to cope with typing code?
The Lisp machine has hundreds of cute features in this
area, but I'd like to mention Macintosh Common Lisp.

So when I'm typing code, like 'M A P C A R SPACE'
after the space MCL automagically displays the argument
list in a small status area at the bottom of an editor
window. You never have to look that up. I have written
my own extension for the editor, so that you can
press control-i on a word and the editor looks
and finds matching symbols in your Lisp system. So
you never have to type real long words, you can look them
up. Another very useful utility is "mouse copy":
Just point with the cursor at some Lisp code and
press the command key while clicking -> the code
will be entered at the current text cursor position.
So once you have typed the really long words, you
can reuse them by just clicking on them. This
also works for complete expressions. It greatly
reduces typing. Then I have written a small
utility, where typing "control-x f" looks
up the ANSI CL documentation in a Netscape window.
So you have the reference for the complete
ANSI CL just two keystrokes away. No more searching.
Then you have tools like the APROPOS window,
look up symbols, etc.

To give you an example, I just list the exported
symbols of a Common Lisp package (written by me) that extends
the CL-HTTP webserver:

? (loop for symbol being the external-symbols
        in (find-package "WWW-PAGES")
        do (print symbol))

WWW-PAGES:DO-UPDATE-ONE-PAGE-PAGE
WWW-PAGES:PAGE-HEADER-FILE
WWW-PAGES:SECURITY-MIXIN
WWW-PAGES:READ-HTML-FILE-AS-STRING
WWW-PAGES:BASIC-TEXT-FILE-PAGE
WWW-PAGES:PAGE
WWW-PAGES:BASIC-COMPUTED-FORM-PAGE
WWW-PAGES:CONVERT-FILEMAKER-STRING-TO-HTML
WWW-PAGES:IMPORT-DATA-FOR-PAGE
WWW-PAGES:CONVERT-MAC-STRING-TO-HTML
WWW-PAGES:UPDATE-ONE-PAGE-PAGE
WWW-PAGES:SHOW-ALL-PAGES-PAGE
WWW-PAGES:GENERATE-NAVIGATION-HTML
WWW-PAGES:HEADER-PAGE-MIXIN
WWW-PAGES:OUTPUT-PAGE
WWW-PAGES:DEREGISTER-PAGE
WWW-PAGES:ALL-REGISTERED-PAGES
WWW-PAGES:UPDATE-MIXIN
WWW-PAGES:SUBPAGE-SHARES-LOCK-MIXIN
WWW-PAGES:MAKE-FORM-FUNCTION
WWW-PAGES:PAGE-TITLE
WWW-PAGES:REGISTER-PAGE
WWW-PAGES:GENERATE-HTML
WWW-PAGES:SPLIT-STRING
WWW-PAGES:PAGE-UPDATE-LOCK
WWW-PAGES:FOOTER-PAGE-MIXIN
WWW-PAGES:PAGE-UPDATEABLE-P
WWW-PAGES:PAGE-FOOTER-FILE
WWW-PAGES:PAGE-SOURCES
WWW-PAGES:GENERATE-BODY-HTML
WWW-PAGES:BASIC-SEARCHABLE-PAGE
WWW-PAGES:UPDATE-ALL-PAGES-PAGE
WWW-PAGES:ADMINISTRATION-PAGE
WWW-PAGES:FIND-PAGE
WWW-PAGES:UPDATE-PAGE
WWW-PAGES:GENERATE-PREAMBLE-HTML
WWW-PAGES:PAGE-URL
WWW-PAGES:BASIC-COMPUTED-PAGE
WWW-PAGES:BASIC-HTML-FILE-PAGE
WWW-PAGES:BASIC-HTML-RAW-FILE-PAGE
WWW-PAGES:MAKE-RESPONSE-FUNCTION


You won't find a single really cryptic name there. ;-)

Greetings from Hamburg,

Rainer Joswig

--
http://www.lavielle.com/~joswig/
