
How Common Lisp sucks


Ron Garret

Apr 17, 2006, 3:44:03 PM

Two things to get out of the way at the outset:

1. Note that the title of this post is *HOW* CL sucks, not *WHY* it
sucks. The difference is significant. Please take the time to
understand it before you flame me.

2. I'm not going to say anything I haven't said a thousand times
before, so those of you who know me (that means you, Kenny) will not
find anything new here so you may as well not even bother.

I am writing this because of the debate surrounding Steve Yegge's recent
blog entries on Lisp. It is unfortunate that he made so many technical
mistakes in his posts because they distract people from the fact that
underneath all the errors he is actually making a valid point, that
being that CL has very significant problems that are barriers to its
adoption. (Some people think this is a feature, that having a few
obstacles to overcome keeps out the riff-raff. I suppose this is a
defensible position, but I don't subscribe to it.)

I'm going to point out just three problems with CL. There are more.
None of these are original observations.

1. CL lacks standardized support for many operations that are
necessities in today's world (e.g. sockets, database connectivity,
foreign functions). Moreover, it lacks any mechanism by which these
features could be standardized. It is claimed that there are portable
libraries that work across implementations that provide de facto
standards, e.g. UFFI, but these claims are false. I don't have time to
get into details at the moment, but the fact of the matter is that
trying to use Lisp for e.g. writing a Web server is an incredibly
painful experience compared to doing the same thing in e.g. Python.

The Balkanization of the CL implementation space also has the
consequence that one must choose between using implementation-specific
features and thus limiting the potential audience for one's code to a
niche within a niche, or writing to the least common denominator, which
generally means writing an awful lot of #+ reader macros.
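
To make this concrete, here is roughly what even something as trivial as
reading an environment variable ends up looking like (a sketch only; the
implementation-specific entry points are from memory, so treat them as
approximate):

(defun getenv (name)
  ;; One reader conditional per implementation, plus a fallback error.
  #+sbcl (sb-ext:posix-getenv name)
  #+clisp (ext:getenv name)
  #+allegro (sys:getenv name)
  #-(or sbcl clisp allegro)
  (error "Don't know how to read environment variables in this Lisp."))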

2. Even for the one thing that CL claims to be particularly good at --
as a platform for embedding domain-specific languages -- it has
significant limits. To embed languages that differ from CL's semantics
in certain ways requires significant effort. To cite but one example: I
would like to embed a language that is very similar to Common Lisp, but
which differs in how it handles global variable references (to use
global lexical environments) and ((function-returning-a-function)
arguments) syntax. The former can be done using symbol macros, but only
if the top-level definitions precede their first use. If you reference
a global before defining it then you're screwed. The latter cannot be
done at all within CL unless you write a full code walker. But adding
this capability is utterly trivial within an implementation. In MCL it
takes two lines of code. And if it were done it would result in
strictly greater expressive power. Furthermore, it is not even
necessary to agree on the semantics of ((...) ...). One could simply
add a new macro defining form (or even a global variable) to set a
user-definable hook for transforming expressions whose CARs are lists
that do not begin with LAMBDA. All that would need to be agreed upon is
the name of this form. Furthermore, this would result in strictly
greater expressive power. It would be strictly backwards-compatible.
And it would serve the needs of a number of users who are not currently
being served (e.g. those who prefer to do functional-style programming
without having to type FUNCALL all the time.)
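
For concreteness, here is a sketch of the symbol-macro trick for globals
(DEFLEX and the storage-symbol naming convention are made up for the
example):

(defmacro deflex (name &optional value)
  ;; Hide the value in a special variable and make NAME a symbol macro
  ;; that expands into it.  This only helps code compiled after the
  ;; DEFLEX form has been seen, which is exactly the limitation above.
  (let ((storage (intern (format nil "*STORAGE-FOR-~A*" (symbol-name name)))))
    `(progn
       (defvar ,storage ,value)
       (define-symbol-macro ,name ,storage))))

(deflex x 42)
(defun f () x)   ; reads the global through the symbol macro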

But despite the fact that this change is easy and only good could come
of it, it does not happen because there is no process by which this
change can be effected (which is, I believe, a direct consequence of the
fact that the realities of CL politics are that CL is utterly resistant
to all change, though I would dearly love to be proven wrong on that).

(Oh, and anyone who wishes to prove me wrong, please note that there is a
big big difference between effecting change in CL and effecting change
in an implementation of CL.)

3. Much of CL's core is badly designed. For example, consider NTH and
ELT. The functionality of ELT is a strict superset of NTH, so why have
NTH cluttering up the language? (To say nothing of the fact that the
order of the arguments in these two functions is gratuitously
reversed.) Why is the function that computes the difference of two sets
called SET-DIFFERENCE, but the function that computes the intersection
of two sets called simply INTERSECTION? And why do all of these
functions operate on lists, not sets? It's because there are no sets in
CL, which means that CL leads one to prematurely "optimize" sets as
lists. (I put optimize in scare quotes because in fact this is rarely
an optimization, especially when your sets get big, and most of the time
you have to go back and rip out huge chunks of code to replace your
lists with hash tables or binary trees.) I could go on and on.
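
In case the NTH/ELT complaint sounds like nitpicking, here it is in four
lines (the element order returned by the set functions is unspecified,
so the values shown are just one possibility):

(nth 2 '(a b c d))                  ; => C   (index first)
(elt '(a b c d) 2)                  ; => C   (sequence first)
(set-difference '(1 2 3 4) '(2 4))  ; => (3 1)
(intersection '(1 2 3 4) '(2 4))    ; => (4 2)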

Now, for those of you who wish to respond I ask you to keep in mind the
following:

1. The details of my criticisms are mostly irrelevant. What matters is
that CL is far from perfect, and that it has no mechanism for change.
So don't bother picking a nit about one of my specific criticisms unless
you wish to argue that CL is perfect and doesn't need to change.

2. I know a lot more about Lisp than Steve Yegge. I spent twenty years
programming in Lisp for a living. I have authored some highly
referenced papers on Lisp. I am far from the world's foremost expert,
but I'm no newbie. If you think I'm wrong about a technical point you
should think twice.

3. I do not hate Lisp. It is and has always been my favorite
programming language. My love for Lisp pretty much destroyed my career
as a programmer. My motivation for criticising Lisp is not to convince
people not to use it. It is to effect changes that I believe are
necessary to get more people to use it. To quote Paul Graham, "It's not
Lisp that sucks, it's Common Lisp that sucks." And actually, I would
soften that somewhat: it's not Common Lisp that sucks, it's some parts
of Common Lisp that suck. But make no mistake, some parts of Common
Lisp really do suck, and unless they are fixed a lot of people -- myself
included -- won't be able to use it even though they may want to really
badly.

rg

Stefan Scholl

Apr 17, 2006, 4:11:39 PM

Ron Garret <rNOS...@flownet.com> wrote:
> trying to use Lisp for e.g. writing a Web server is an incredibly
> painful experience compared to doing the same thing in e.g. Python.

I could list some web servers written in Common Lisp. And one of
them was the first HTTP 1.1 compliant server and used by the W3C
to debug the HTTP 1.1 reference implementation.

And the youngest of the Common Lisp web servers was first
released on 2005-12-31.

Stefan Scholl

Apr 17, 2006, 4:13:41 PM

Ron Garret <rNOS...@flownet.com> wrote:
> niche within a niche, or writing to the least common denominator, which
> generally means writing an awful lot of #+ reader macros.

You'll receive the #+ reader macros from interested users.

bradb

Apr 17, 2006, 4:24:52 PM

Well written. I'm almost a complete newbie at Lisp, so please take
what I say with big grains of salt.
It appears to me that the CL community has a lot of inertia: the
language and the spec are old (read mature, if you prefer), but few
implementations actually have full ANSI compliance. Isn't there
something wrong when there are no ANSI-compliant free implementations
20 years after the spec was written? Most implementations are probably
very close, but there are probably still weird little cracks where
stuff falls out.
I don't think that CL will ever change; there is too much inertia. I
personally think that the only way to get a better Lisp is to start
fresh, like Paul Graham is supposedly doing with Arc. But he should
have released it rather than just talking about it. I recently read
about ISLISP, which looks like it was trying to shed some of CL's
history, get a smaller spec, and become a modern Lisp - but it appears
never to have gotten off the ground.

In my very humble opinion, the only way to get a better CL is to take
the good bits of CL, Scheme, ML, etc and write a new Lisp
implementation. Make the goals of the language sexy and cool so you
get nerds interested in the language and in helping implement it. Make
interfacing to existing C/C++ and CL libraries as easy as possible to
leverage existing code. Make performance a goal, not a side effect -
some people want to write code fast, lots of others want to write fast
code. Many newer languages like Python don't get traction in places
where they would work well, not because the actual performance would
suck too badly, but because of the perception that Python is slow.

Anyhow, my thoughts are that pushing for change in CL is like trying to
push water uphill, and the only way to make a better Lisp is to not use
the name Common Lisp.

Cheers
Brad

Bill Atkins

Apr 17, 2006, 4:30:16 PM

Ron Garret <rNOS...@flownet.com> writes:

Add a compat.lisp file to your source tree. Keep all the code that
relies on conditional read macros in there. In the one project I've
done where cross-Lisp compatibility mattered, this was sufficient.
You're too busy to go into the details, so I'm not sure where the
trivial-* packages, etc. fall short for you, but they've worked pretty
well in my own experience.

Yes, this is an issue that often comes up when programming. I want to
get an element in a sequence by index. I start to use ELT, but then I
realize that NTH is in the language, too, and I spend a couple of
hours considering the profound philosophical implications of this.

> order of the arguments in these two functions are gratuitously
> reversed.) Why is the function that computes the difference of two sets

The reversal of arguments _is_ truly annoying.

> called SET-DIFFERENCE, but the function that computes the intersection
> of two sets called simply INTERSECTION? And why do all of these
> functions operate on lists, not sets? It's because there are no sets in
> CL, which means that CL leads one to prematurely "optimize" sets as
> lists. (I put optimize in scare quotes because in fact this is rarely
> an optimization, especially when your sets get big, and most of the time
> you have to go back and rip out huge chunks of code to replace your
> lists with hash tables or binary trees.) I could go on and on.

This is not a fault in the language, as far as I'm concerned. The
operations you mention are often useful for lists. Is it really a
problem that set terminology is used to describe these functions? A
programmer ought to understand that if he or she wants the behavior of
an actual set data structure, then these functions will not do. But
they are often convenient because these operations are fairly commonly
performed on lists.

This point seems comically pedantic. What is your proposal? Take
out the set functions because their names are confusing? Add a
full-fledged set data structure? How does this "problem" demonstrate
the bad design of CL's core?

Pascal Costanza

Apr 17, 2006, 4:35:34 PM

Ron Garret wrote:
> Two things to get out of the way at the outset:
>
> 1. Note that the title of this post is *HOW* CL sucks, not *WHY* it
> sucks. The difference is significant. Please take the time to
> understand it before you flame me.

The title "How some aspects of Common Lisp suck" would have been even
more appropriate, especially considering your own final remarks.

> I am writing this because of the debate surrounding Steve Yegge's recent
> blog entries on Lisp. It is unfortunate that he made so many technical
> mistakes in his posts because they distract people from the fact that
> underneath all the errors he is actually making a valid point, that
> being that CL has very significant problems that are barriers to its
> adoption. (Some people think this is a feature, that having a few
> obstacles to overcome keeps out the rif raf. I suppose this is a
> defensible position, but I don't subscribe to it.)

There must be a bottom line somewhere. I don't think you would want the
authors of the entries in the Daily WTF as a target audience.

> I'm going to point out just three problems with CL. There are more.
> None of these are original observations.
>
> 1. CL lacks standardized support for many operations that are
> necessities in today's world (e.g. sockets, database connectivity,
> foreign functions). Moreover, it lacks any mechanism by which these
> features could be standardized.

It lacks any _sanctioned_ mechanism for standardization. There is
certainly a mechanism for creating de facto standards.

> It is claimed that there are portable
> libraries that work across implementations that provide de facto
> standards, e.g. UFFI, but these claims are false.

You probably don't mean what you say here. There are definitely portable
libraries out there. Maybe not for foreign function interfaces (I can't
judge this), but certainly for other things.

> The Balkanization of the CL implementation space also has the
> consequence that one must choose between using implementation-specific
> features and thus limiting the potential audience for one's code to a
> niche within a niche, or writing to the least common denominator, which
> generally means writing an awful lot of #+ reader macros.

...or writing compatibility layers.

> 2. Even for the one thing that CL claims to be particularly good at --
> as a platform for embedding domain-specific languages -- it has
> significant limits. To embed languages that differ from CL's semantics
> in certain ways requires significant effort. To cite but one example: I
> would like to embed a language that is very similar to Common Lisp, but
> which differs in how it handles global variable references (to use
> global lexical environments) and ((function-returning-a-function)
> arguments) syntax. The former can be done using symbol macros, but only
> if the top-level definitions precede their first use.

The same holds for global dynamic variables: If you use them before you
have defined them, you are invoking undefined behavior. So there is no
difference between global dynamic and global lexical variables here.
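
To illustrate: most compilers will merely warn about the free reference
below and treat it as special, but the standard promises nothing about it.

(defun f () *counter*)   ; warning: undefined variable *COUNTER*
(defvar *counter* 0)
(f)                      ; => 0 in practice, but that first reference
                         ;    was technically undefined behavior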

> If you reference
> a global before defining it then you're screwed. The latter cannot be
> done at all within CL unless you write a full code walker. But adding
> this capability is utterly trivial within an implementation. In MCL it
> takes two lines of code. And if it were done it would result in
> strictly greater expressive power.

The difference between (funcall (some-expression)) and
((some-expression)) is not that of fundamentally different expressive
power. You get an increase in expressiveness when you can avoid having
to touch various places in your source code by using a single construct.
The switch from (funcall (some-expression)) to ((some-expression)) is a
strictly local change.
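
Concretely (MAKE-ADDER is defined here purely for illustration):

(defun make-adder (n) (lambda (x) (+ x n)))

(funcall (make-adder 3) 4)   ; => 7, what CL requires you to write
;; ((make-adder 3) 4)        ; what the proposal would make legal
((lambda (x) (+ x 3)) 4)     ; => 7, already legal: the CAR is a LAMBDA expression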

For example, the addition of call/cc would indeed be an increase in
expressive power.

(Note that increased expressive power is not necessarily a good thing.
Consider the "come from" statement in Intercal as a counter example,
which also mean an increase in expressive power when added to Common Lisp.)

> Furthermore, it is not even
> necessary to agree on the semantics of ((...) ...). One could simply
> add a new macro defining form (or even a global variable) to set a
> user-definable hook for transforming expressions whose CARs are lists
> that do not begin with LAMDBA. All that would need to be agreed upon is
> the name of this form. Furthermore, this would result in strictly
> greater expressive power. It would be strictly backwards-compatible.
> And It would serve the needs of a number of users who are not currently
> being served (e.g. those who prefer to do functional-style programming
> without having to type FUNCALL all the time.)

You would have to define a way to delimit the scope of the different
possible hooks, otherwise it becomes a nightmare to try to mix and match
different third-party libraries.

> But despite the fact that this change is easy and only good could come
> of it, it does not happen because there is no process by which this
> change can be effected (which is, I believe, a direct consequence of the
> fact that the realities of CL politics are that CL is utterly resistant
> to all change, though I would dearly love to be proven wrong on that).

The change you propose is easy to make, but the consequences of it are
not necessarily easy to deal with.

I don't have the impression that the CL community is resistant to
change. See the various projects in various places that are quite
healthy, as far as I can tell. (If I remember correctly,
http://cl-user.net counts more than 600 entries.)

> (Oh, and anyone who wishes to prove me wrong, please not that there is a
> big big difference between effecting change in CL and effecting change
> in an implementation of CL.)

Sure. But do note that the language constructs that are part of Common
Lisp have been tried in other Lisp dialects before. I think that picking
a single Common Lisp implementation and experimenting with language
constructs there to see whether they pay off before proposing them as
official features is the healthier approach.

> 3. Much of CL's core is badly designed. For example, consider NTH and
> ELT. The functionality of ELT is a strict superset of NTH, so why have
> NTH cluttering up the language? (To say nothing of the fact that the
> order of the arguments in these two functions are gratuitously
> reversed.) Why is the function that computes the difference of two sets
> called SET-DIFFERENCE, but the function that computes the intersection
> of two sets called simply INTERSECTION? And why do all of these
> functions operate on lists, not sets? It's because there are no sets in
> CL, which means that CL leads one to prematurely "optimize" sets as
> lists. (I put optimize in scare quotes because in fact this is rarely
> an optimization, especially when your sets get big, and most of the time
> you have to go back and rip out huge chunks of code to replace your
> lists with hash tables or binary trees.) I could go on and on.

These features all exist for backwards compatibility. They could at most
be deprecated, otherwise you would break a lot of existing code. I am
certain that this would do more harm than bring any benefits because the
community is, I think, not large enough to rewrite the large amount of
useful code that does exist.

It's trivial to define your own package with the name and argument
conventions that you prefer. There is no need to force anyone else to
use the same conventions.
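
For example, something along these lines (the package name is
arbitrary):

(defpackage #:my-cl
  (:use #:common-lisp)
  (:shadow #:set-difference)
  (:export #:set-difference))

(in-package #:my-cl)

;; Behaves like CL:SET-DIFFERENCE; rename it or reorder the arguments
;; to taste without touching anyone else's code.
(defun set-difference (set1 set2 &rest keyword-args)
  (apply #'cl:set-difference set1 set2 keyword-args))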

> Now, for those of you who wish to respond I ask you to keep in mind the
> following:
>
> 1. The details of my criticisms are mostly irrelevant. What matters is
> that CL is far from perfect, and that it has no mechanism for change.
> So don't bother picking a nit about one of my specific criticisms unless
> you wish to argue that CL is perfect and doesn't need to change.

These nits would have to be picked if we had an official mechanism
for change. So it's a good exercise to do this already, in order to be
able to estimate whether the installation of an official mechanism would
be worthwhile.

I think there are better examples than the ones you propose. See for
example
http://www.cliki.net/Proposed%20ANSI%20Revisions%20and%20Clarifications

> 2. I know a lot more about Lisp that Steve Yegge. I spent twenty years
> programming in Lisp for a living. I have authored some highly
> referenced papers on Lisp. I am far from the world's foremost expert,
> but I'm no newbie. If you think I'm wrong about a technical point you
> should think twice.
>
> 3. I do not hate Lisp. It is and has always been my favorite
> programming languages. My love for Lisp pretty much destroyed my career
> as a programmer. My motivation for criticising Lisp is not to convince
> people not to use it. It is to effect changes that I believe are
> necessary to get more people to use it. To quote Paul Graham, "It's not
> Lisp that sucks, it's Common Lisp that sucks." And actually, I would
> soften that somewhat: it's not Common Lisp that sucks, it's some parts
> of Common Lisp that suck. But make no mistake, some parts of Common
> Lisp really do suck, and unless they are fixed a lot of people -- myself
> included -- won't be able to use it even though they may want to really
> badly.

Maybe.


Pascal

--
3rd European Lisp Workshop
July 3-4 - Nantes, France - co-located with ECOOP 2006
http://lisp-ecoop06.bknr.net/

Erik Enge

Apr 17, 2006, 4:36:05 PM

Ron,

Where's your proposed solution?

Pissing on people's legs to get their attention is rarely a great way
of convincing them to change. Stop alienating yourself if you want to
bring some value into the community.

Erik.

Pascal Costanza

Apr 17, 2006, 4:41:33 PM

bradb wrote:

> In my very humble opinion, the only way to get a better CL is to take
> the good bits of CL, Scheme, ML, etc and write a new Lisp
> implementation.

You mean like lush, Stella, Dylan, EuLisp, L Sharp, Le Lisp, ISLISP,
NewLisp, etc.?

karsten

Apr 17, 2006, 4:44:30 PM

Ron Garret wrote:
> 3. I do not hate Lisp. It is and has always been my favorite
> programming languages. My love for Lisp pretty much destroyed my career
> as a programmer. My motivation for criticising Lisp is not to convince
> people not to use it. It is to effect changes that I believe are
> necessary to get more people to use it.
Since you wrote this to effect changes, do you have any proposals on
how to achieve these changes?

When I first looked at clrfi I thought this would be the way to do
additions to the ANSI standard (processes, sockets, ...), but it seems
that it is simply not working for whatever (legal?) reason.

I spent this afternoon hacking cl-http on a platform it was not
developed for, so I can confirm that the situation is improvable but
not hopeless.

salud2

Karsten

bradb

Apr 17, 2006, 4:49:45 PM

Yes, like those ones. But good enough that it is so much better than
CL that the CL community wants to use it. Maybe the very nature of the
CL community would make it impossible to convert them to something
better - who knows?

Brad

Dmitry Gorbatovsky

Apr 17, 2006, 5:05:57 PM

Ron Garret wrote:
>>>....

Sorry to point out the obvious, but there is no such thing as a
"perfect language" in existence. Each and every language sucks in its
own way.

From my perspective, I find CL a very good complement
to my toolbox, which includes Fortran for numerics and
C for system tasks and interfacing.

PS. From my experience, cross-compiler portability for
any significant project is an urban legend.

dg

--
*It is easy to lie with statistics, but it's a lot easier to lie without
them.

Rainer Joswig

Apr 17, 2006, 5:14:01 PM

In article <1145306669.9...@i39g2000cwa.googlegroups.com>,
"karsten" <karste...@gmail.com> wrote:

Heh, I was also hacking CL-HTTP-related code today (web log analysis)
on a platform CL-HTTP is (!) developed
for: LispWorks on Mac OS X. I was reviving some code
I wrote maybe eight years ago on my Symbolics MacIvory.
There were almost zero porting issues. I spent
the day enhancing the code and writing web log reports.
That's also the beauty of Common Lisp: I can come
back to old code, easily understand it after years, and
enhance it.

I would expect that much of my code runs mostly
unchanged on, say, five different operating systems
(Windows, Mac OS X, Genera, Linux, ...)
and, say, five different Common Lisp implementations
(CMUCL, Genera, LispWorks, Allegro CL, MCL, ...).
Well, for me this is enough.

For me CL-HTTP provides the networking library on these
platforms.

Other than that, I was just checking an OODB database
written in more than ten thousand lines of Common Lisp
into my home Subversion repository. The software sits
on top of a very extensive layer of compatibility
code (locks, processes, data types, byte-order,
defsystem, once-only, queues, semaphores, ...).

That's the way a lot of larger Lisp software tends to be written:
based on some portability layer. With that it is not
too difficult to move Common Lisp code around between
platforms. It takes just a bit of Lisp software engineering.

Currently I have zero interest in new Lisp dialects.

What currently interests and fascinates me: how to make
use of the tremendous power of current Common Lisp
implementations.

--
http://lispm.dyndns.org/

Ken Tilton

Apr 17, 2006, 5:14:33 PM

Pascal Costanza wrote:
> Ron Garret wrote:
>
>> Two things to get out of the way at the outset:
>>
>> 1. Note that the title of this post is *HOW* CL sucks, not *WHY* it
>> sucks. The difference is significant. Please take the time to
>> understand it before you flame me.
>
>
> The title "How some aspects of Common Lisp suck" would have been even
> more appropriate, especially considering your own final remarks.

Or try the title Yegge seems to have in the end come round to: "Lisp
Rocks! Now if only there were standard sockets, threads, and GUI!"

Well, sure. It seems we all agree, except for thread titles, and it will
be hard to get people seeking attention to eschew "Lisp Sucks!".

btw, I think LTk is a nice start on the GUI bit, because Tcl/Tk offers
other platform-independent goodies and nothing about it is
Lisp-implementation specific.

My question is, why is it such a big deal for the other stuff to be
standardized? I mean, how many app developers are bouncing from Lisp
implementation to Lisp implementation? As for library authors, shucks,
we work pretty hard anyway, a little featurization is not going to kill
us, esp. since, as someone already noted, those come from others on
other platforms if one has built a fun library.

I say again, the good news is that Lisp has so much mindshare now that
people can use it to draw attention to themselves. Hmmm...

ken

--
Cells: http://common-lisp.net/project/cells/

"Have you ever been in a relationship?"
Attorney for Mary Winkler, confessed killer of her
minister husband, when asked if the couple had
marital problems.

Pascal Costanza

Apr 17, 2006, 5:22:46 PM

It's hard to develop a Lisp dialect with the feature set that Common
Lisp already provides. In spite of its quirks, Common Lisp integrates
all these features very well. See http://www.lisp.org/table/objects.htm
for a characterization that hits the nail on the head.

Sure, some languages are better at functional programming. Sure, some
languages provide objects from the ground up. Sure, some programming
languages make it easier to write efficient code. Sure, some programming
languages make it safer to provide your own language extensions. But I
am not aware of another language that provides all of these things and
is so complete and so well integrated.

This is very hard to beat and probably not worth the effort. It's likely
that the end result of any new attempt will have similar messy odds and
ends as Common Lisp currently has. I think this is an inevitable side
effect of the goal to integrate a large number of features, and an
inevitable side effect of a language that has been organically grown
over several decades. Not even Scheme seems to be able to avoid these
issues, if you look closely enough. The idea that choosing a minimalist
approach is a remedy here doesn't seem to work either. [1]

Maybe it's a hard idea to grasp, but there are people who choose Common
Lisp _because_ it is the way it is.


Pascal

[1] Maybe Smalltalk is an exception here, but the idea of Smalltalk
seems to be that there should be no language. So instead of adding new
language constructs, they add new tools to their development
environments, which poses similar issues.

Dmitry Gorbatovsky

Apr 17, 2006, 5:28:13 PM

Pascal Costanza wrote:


> Maybe it's a hard idea to grasp, but there are people who choose Common
> Lisp _because_ it is the way it is.
>
>
> Pascal
>

Thank you.

It is Exactly like that.

bradb

Apr 17, 2006, 5:32:09 PM

Thank you for the very well written and reasoned post. I suspect you
may be right - for all the warts and bumps that CL has, it is pretty
darned good, and any new language will probably get its own ugliness at
around the same time it becomes really useful.

Cheers
Brad

Peter Seibel

Apr 17, 2006, 6:02:34 PM

Ron Garret <rNOS...@flownet.com> writes:

> 2. I'm not going to say anything I haven't said a thousand times
> before, so those of you who know me (that means you, Kenny) will not
> find anything new here so you may as well not even bother.

Which raises the question: why? Last time I saw you, you were running
around looking for somewhere to invest your Googlebucks. So presumably
you have both free time and some financial resources to play with.
Given that, and assuming you really do care about increasing the
adoption of Common Lisp, is posting the same argument to
comp.lang.lisp every six months really the best strategy you can come
up with for effecting some sort of change? Or to put it another way,
what, in the best case scenario, would you like to see happen as a
result of this most-recent post?

-Peter

--
Peter Seibel * pe...@gigamonkeys.com
Gigamonkeys Consulting * http://www.gigamonkeys.com/
Practical Common Lisp * http://www.gigamonkeys.com/book/

Ken Tilton

Apr 17, 2006, 6:24:07 PM

Peter Seibel wrote:
> Ron Garret <rNOS...@flownet.com> writes:
>
>
>>2. I'm not going to say anything I haven't said a thousand times
>>before, so those of you who know me (that means you, Kenny) will not
>>find anything new here so you may as well not even bother.
>
>

> Which raises the question: why? <snip>....is posting the same argument to


> comp.lang.lisp every six months really the best strategy you can come
> up with for effecting some sort of change?

He must be looking for support from noobs. Trouble is they are either
code-nothing groupies (and what a fun army that would be to lead) or
they have work to do and have brushed off any trivial annoyances having
discovered what a joy and powerhouse is Lisp.

I am reminded of doing my first FFI bindings. A week of misery even with
UFFI, which was a great help, but drawing me on was the prospect of a
portable event-loop (FreeGlut). Now I can bind up a library in a couple
of days, and Swig or Vzn could make that even faster some day.

Anyway, if someone is not actually trying to do anything, bindings seem
like a problem. Not really, just part of the job. And of course Lisp
makes it easier -- a little macrology and I was able to do a poor-man's
FFI autogen.

Frank Buss

Apr 17, 2006, 6:40:32 PM

Ken Tilton wrote:

> I am reminded of doing my first FFI bindings. A week of misery even with
> UFFI, which was a great help, but drawing me on was the prospect of a
> portable event-loop (FreeGlut). Now I can bind up a library in a couple
> of days, and Swig or Vzn could make that even faster some day.

BTW: with callbacks there are interesting issues with stdcall versus cdecl.
GLUT uses cdecl, while the GLU tessellation callback functions use stdcall.
I've talked to Luís Oliveira on #lisp about this and maybe there will be
the ability to specify the calling convention in a new CFFI release.

With tessellation for concave polygons and my own line join implementation
(OpenGL doesn't do line joins at all for thick lines) now my 2D OpenGL code
is much more useful:

http://www.frank-buss.de/tmp/opengl2d3.png
http://www.frank-buss.de/tmp/flags.gif
http://www.frank-buss.de/tmp/opengl2d3.lisp.txt

--
Frank Buss, f...@frank-buss.de
http://www.frank-buss.de, http://www.it4-systems.de

Barry Margolin

Apr 17, 2006, 9:13:33 PM

In article <rNOSPAMon-179F6...@news.gha.chartermi.net>,
Ron Garret <rNOS...@flownet.com> wrote:

> 1. CL lacks standardized support for many operations that are
> necessities in today's world (e.g. sockets, database connectivity,
> foreign functions).

So do C, C++, and Perl. To get these things in most languages you need
to make use of libraries or standards beyond the core language. For
instance, sockets are not part of C, they're part of POSIX.

--
Barry Margolin, bar...@alum.mit.edu
Arlington, MA
*** PLEASE post questions in newsgroups, not directly to me ***
*** PLEASE don't copy me on replies, I'll read them in the group ***

Peter Seibel

Apr 17, 2006, 10:32:11 PM

"bradb" <brad.be...@gmail.com> writes:

> Well written. I'm am almost a complete newbie at Lisp, so please
> take what I say with big grains of salt. It appears to me that the
> CL community has a lot of inertia, the language and the spec are old
> (read mature if you prefer), but few implementations actually have
> full ANSI compliance. Isn't there something wrong when there are no
> ANSI compliant free implementations 20 years after the spec was
> written? Most implementations are probably very close, but there are
> probably still weird little cracks where stuff falls out.

So what areas of non-compliance in what implementations are causing
you pain? As far as I can tell, all the serious commercial and free
implementations work quite hard to conform to the spec and I can't
think, offhand, of any areas where implementations intentionally
deviate from the spec with no intention of ever fixing the deviation.

bradb

Apr 17, 2006, 10:53:50 PM

There are none that bother me, and none I can specifically put my
finger on. But most implementation docs that I have read use phrases
like "we aim to adhere to the ANSI spec". My real point was that the
hyperspec is a fairly lengthy document - does it really need to be
quite so big? But, as I say - I'm a noob :) I've never read the C
ANSI documents, maybe they are huge too.

Don't get me wrong, I really enjoy learning about and using CL - it's
just that in a few places I've been left with the impression that life
could be better. For the curious, my personal pain is in not knowing
how the underlying code is generated. I know C pretty well, I can tell
from C code roughly how the assembly output will look, and from C
structs exactly how the memory will be laid out. I can't tell this
with Lisp. Most of the time it doesn't matter, but sometimes I'd
really like to be able to use statically typed code within Lisp, so
that the assembler output is predictable.

Probably I am just new to Lisp though and it will come with time :)

Brad

Ken Tilton

Apr 17, 2006, 11:05:41 PM

bradb wrote:
> There are none that bother me, and none I can specifically put my
> finger on. But most implementation docs that I have read use phrases
> like "we aim to adhere to the ANSI spec". My real point was that the
> hyperspec is a fairly lengthy document - does it really need to be
> quite so big?

Well this is the funny part. First they complain the spec is too big,
then they complain that it does not cover FFI, sockets, DB access, threads.

Paging Yogi Berra....

As a salesman once told me, "If you are not being shot at, then you are
not over the target."

Peter Seibel

Apr 17, 2006, 11:10:24 PM

"bradb" <brad.be...@gmail.com> writes:

> There are none that bother me, and none I can specifically put my
> finger on. But most implementation docs that I have read use phrases
> like "we aim to adhere to the ANSI spec".

Since there's no test suite as part of the standard, there's not much
else they *can* say. To then run around saying "there are no ANSI
compliant free implementations 20 years after the spec was written" is
a bit inflammatory. The ANSI CL spec is a lot more useful than, say,
the ANSI C99 spec which essentially can't be used for writing portable
code because Microsoft's C compilers don't, and apparently, never will
even try to conform to it.

> My real point was that the hyperspec is a fairly lengthy document -
> does it really need to be quite so big? But, as I say - I'm a noob
> :) I've never read the C ANSI documents, maybe they are huge too.

Well, most folks seem to be complaining that it's too short--that they
didn't find room to standardize threading, networking, etc, etc.
Anyway, speaking as someone who tends to read specifications when
they're available, the Common Lisp spec is a remarkably high-quality
piece of work.

> Don't get me wrong, I really enjoy learning about and using CL -
> it's just that in a few places I've been left with the impression
> that life could be better.

Life can always be better. The question is what can one do about it.

> For the curious, my personal pain is in not knowing how the
> underlying code is generated. I know C pretty well, I can tell from
> C code roughly how the assembly output will look, and from C structs
> exactly how the memory will be laid out. I can't tell this with
> Lisp.

Well, Lisp can tell you:

CL-USER> (defun foo (x y) (+ x y))
FOO
CL-USER> (disassemble 'foo)
;; disassembly of #<Function (:ANONYMOUS-LAMBDA 12) @ #x107aecca>
;; formals: X Y

;; code start: #x107aec5c:
0: 7c0802a6 [mfspr] mflr r0
4: 9421ffc0 stwu r1,-64(r1)
8: 90010048 stw r0,72(r1)
12: 91a1000c stw r13,12(r1)
16: 0f100002 twnei r16,2 "number of args"
20: 82520007 lwz r18,7(r18)
24: 7c732378 or r19,r3,r4
28: 72600003 andi. r0,r19,3
32: 4f800000 mcrf 7,0
36: 7e632415 addco. r19,r3,r4
40: 409e0024 bne cr7,76 lb2
44: 41830020 bso 76 lb2
48: 62630000 [ori] lr r3,r19
52: 3a000001 [addi] lil r16,1
lb1:
56: 80010048 lwz r0,72(r1)
60: 30210040 addic r1,r1,64
64: 7c0803a6 [mtspr] mtlr r0
68: 81a1000c lwz r13,12(r1)
72: 4e800020 blr
lb2:
76: 7ea903a6 [mtspr] mtctr r21 "symbol_trampoline"
80: 828fff8f lwz r20,-113(r15) EXCL::+_2OP
84: 3a000002 [addi] lil r16,2
88: 4e800421 bctrl
92: 4bffffdc b 56 lb1
; No value

Now let's see what effect some declarations have:

CL-USER> (defun foo (x y)
(declare (optimize (speed 3) (safety 0)))
(declare (fixnum x y))
(+ x y))
FOO
CL-USER> (disassemble 'foo)
;; disassembly of #<Function (:ANONYMOUS-LAMBDA 13) @ #x107c84c2>
;; formals: X Y

;; code start: #x107c84a4:
0: 7c632014 addc r3,r3,r4
4: 3a000001 [addi] lil r16,1
8: 81a1000c lwz r13,12(r1)
12: 4e800020 blr

; No value

bradb

Apr 17, 2006, 11:24:20 PM

Peter Seibel wrote:

> "bradb" <brad.be...@gmail.com> writes:
> else they *can* say. To then run around saying "there are no ANSI
> compliant free implementations 20 years after the spec was written" is
> a bit inflamatory. The ANSI CL spec is a lot more useful than, say,
You're right - I didn't mean for it to be inflammatory. As I've said
before, I'm new here - though I may express my naive opinion a little
freely :)

> the ANSI C99 spec which essentially can't be used for writing portable
> code because Microsoft's C compilers don't, and apparently, never will
> even try to conform to it.

I guess that the reason the C world has fewer implementation issues is
that 95%+ of the market uses either MSVC or GCC; Lisp has many more
choices.

> Well, most folks seem to be complaining that it's too short--that they
> didn't find room to standardize threading, networking, etc, etc.
> Anyway, speaking as someone who tends to read specifications when
> they're available, the Common Lisp spec is a remarkably high-quality
> piece of work.

Good point. But you have to admit there is at least _some_ legacy and
redundant code, NTH vs ELT.

> > exactly how the memory will be laid out. I can't tell this with
> > Lisp.
>
> Well, Lisp can tell you:

Yes, but I don't think this is a real solution. Very rarely do I
examine the disassembly output of C code, because the output is
generally pretty predictable.
I don't want to have to mess with the machine code to confirm that a
statically typed section of code has compiled to more or less what I
expected - that doesn't scale up. Though this could just be a learning
curve thing, I _am_ starting to understand the optimisations a little
more.

Cheers
Brad

Ron Garret

Apr 17, 2006, 11:53:50 PM

In article <m2irp76...@gigamonkeys.com>,
Peter Seibel <pe...@gigamonkeys.com> wrote:

> Ron Garret <rNOS...@flownet.com> writes:
>
> > 2. I'm not going to say anything I haven't said a thousand times
> > before, so those of you who know me (that means you, Kenny) will not
> > find anything new here so you may as well not even bother.
>
> Which raises the question: why?

To annoy Kenny of course. That is my new purpose in life.

Seriously, it's because I thought SteveY raised a valid point, and I
didn't want that to get overshadowed by his technical errors.

> Last time I saw you, you were running
> around looking for somewhere to invest your Googlebucks. So presumably
> you have both free time and some financial resources to play with.

Less and less with every passing day.

> Given that, and assuming you really do care about the increasing the
> adoption of Common Lisp, is posting the same argument to
> comp.lang.lisp every six months really the best strategy you can come
> up with for effecting some sort of change?

Yep. And believe me, it's not for lack of trying that I can't come up
with anything better.

> Or to put it another way,
> what, in the best case scenario, would you like to see happen as a
> result of this most-recent post?

I don't have a single best-case scenario in mind. I would like to see
*some* process for managing change in CL emerge from somewhere and be
adopted by a big enough chunk of the community to matter. But for that
to happen I think the first step is to convince a critical mass of
people that this is even desirable. At the moment I feel like a voice
in the wilderness.

rg

Ron Garret

Apr 17, 2006, 11:56:19 PM

In article <e20vrd$vu1$1...@emma.aioe.org>,
Dmitry Gorbatovsky <finc...@yahoo.com> wrote:

> Ron Garret wrote:
> >>>....
>
> Sorry to point on obvious, but there is no
> such thing like "Perfect Language" in existence.
> So all and every language is sucks on their own
> way.

Of course. Nothing is ever perfect. But that's no excuse for not
trying to improve things.

> PS.from my experience cross compiler portability for
> any significant project is an urban legend.

Cross-compiler portability for C and C++ is pretty good. It's even
pretty good for CL as long as you don't have to interface with the
outside world, which is exactly the problem.

rg

Ron Garret

Apr 17, 2006, 11:57:47 PM

> Ron Garret wrote:
> > 3. I do not hate Lisp. It is and has always been my favorite
> > programming languages. My love for Lisp pretty much destroyed my career
> > as a programmer. My motivation for criticising Lisp is not to convince
> > people not to use it. It is to effect changes that I believe are
> > necessary to get more people to use it.
> Since you wrote this to effect changes, do you have any proposals on
> how to achieve these changes?

Nope. CLRFI seems like a step in the right direction, but this is not
my call. My mission is just to convince people that this is a goal
worth striving for.

> When I first looked at clrfi I thought this would be the way to do
> additions to the ansi-standard (processes, sockets, ...) , but it seems
> that it is simply not working for whatever (legal ?) reason.

Yep, I don't understand this either.

> I spent this afternoon hacking cl-http on a platform it was not
> developed for, so I can confirm, that the situation is improvable but
> not hopeless.

That's the spirit!

rg

Ron Garret

Apr 18, 2006, 12:00:26 AM

In article <joswig-AB889B....@news-europe.giganews.com>,
Rainer Joswig <jos...@lisp.de> wrote:

> That's the way many larger Lisp software tends to be written:
> based on some portability layer.

Why "some" portability layer? Why not just one? Maintaining multiple
portability layers seems like a waste of effort.

BTW, your portability layer sounds pretty wizzy. Is it available?

rg

Pascal Bourguignon

Apr 18, 2006, 12:02:44 AM

Ron Garret <rNOS...@flownet.com> writes:

I don't think so. Try to link a library compiled for the Darwin/ppc
virtual machine with your program compiled for the Linux/x86 virtual
machine.

By the same token, it's hard to interface a library compiled for the
SBCL/x86 virtual machine with a program compiled for the clisp virtual
machine. Or a library compiled for the Linux/x86 virtual machine with
a program compiled for the clisp virtual machine, even if it's easier
(and possible) than linking a library compiled for the SBCL/x86
virtual machine.

I'd love to see them C programmers struggling in a world of Lisp
Machines...

--
__Pascal Bourguignon__ http://www.informatimago.com/

This is a signature virus. Add me to your signature and help me to live.

Ron Garret

Apr 18, 2006, 12:05:20 AM

In article <1145306165.9...@e56g2000cwe.googlegroups.com>,
"Erik Enge" <erik...@gmail.com> wrote:

> Ron,
>
> Where's your proposed solution?

I don't have one. I'm still at the stage of trying to convince people
that there is a problem.

> Pissing on people's legs to get their attention is rarely a great way
> of convincing them to change. Stop alienating yourself if you want to
> bring some value into the community.

Sorry, I don't accept the proposition that pointing out problems equates
to "pissing on people's legs." That's one of the problems with CL. The
community has this
you're-either-with-us-or-you're-with-the-Perl-terrorists point of view.
This is an impediment to progress (and not just in the realm of
programming languages).

rg

matth...@gmail.com

Apr 18, 2006, 12:24:21 AM

Look, it seems that you want to improve, rather than replace, CL. If
that's correct, then you can split your proposed changes into 3
categories: those that can be done in portable CL, those that can be
done in CL but must use implementation-specific stuff within each
individual Lisp, and those that must be done within implementations. The
first category you can probably knock out in a few months/weeks, maybe
less. The second is libraries, which we are going to work on in SOC.
Help us! The third seems to me to be CLRFI territory. It would be
helpful, both to people who want to improve CL and to those who want to
design something better, if you wrote up the whole list of things that
are broken about CL, why it matters, and how they could be fixed. Then
there could be a huge discussion, historical context for the various
brokennesses would be brought forth, and maybe CLRFI would get started.
If nothing else, getting all the brokennesses in one spot would mean
that there is a central place to go to find things to fix.

Ron Garret

Apr 18, 2006, 12:28:39 AM

In article <4aicgnF...@individual.net>,
Pascal Costanza <p...@p-cos.net> wrote:

> > I'm going to point out just three problems with CL. There are more.
> > None of these are original observations.


> >
> > 1. CL lacks standardized support for many operations that are
> > necessities in today's world (e.g. sockets, database connectivity,

> > foreign functions). Moreover, it lacks any mechanism by which these
> > features could be standardized.
>
> It lacks any _sanctioned_ mechanism for standardization. There is
> certainly a mechanism for creating defacto standards.

That's news to me. What is it?

> > It is claimed that there are portable
> > libraries that work across implementations that provide de facto
> > standards, e.g. UFFI, but these claims are false.
>
> You probably don't mean what you say here. There are definitely portable
> libraries out there. Maybe not for foreign function interfaces (I can't
> judge this), but certainly for other things.

Of course there are portable libraries, just not for the things that
matter in today's world.

> The same holds for global dynamic variables: If you use them before you
> have defined them, you are invoking undefined behavior. So there is no
> difference between global dynamic and global lexical variables here.

Aw, geez, let's not descend into quibbling over these details. You know
perfectly well that every implementation handles an undeclared free
variable reference by assuming it's a dynamic reference.

> > If you reference
> > a global before defining it then you're screwed. The latter cannot be
> > done at all within CL unless you write a full code walker. But adding
> > this capability is utterly trivial within an implementation. In MCL it
> > takes two lines of code. And if it were done it would result in
> > strictly greater expressive power.
>
> The difference between (funcall (some-expression)) and
> ((some-expression)) is not that of fundamentally different expressive
> power.

You're right, I misspoke. It's not a matter of expressiveness, it's a
matter of verbosity. ((...) ...) is spelled (funcall (...) ...). But
my point is that if you don't like all that extra verbiage there's no
way to change it, and no good reason why you shouldn't be able to.

> (Note that increased expressive power is not necessarily a good thing.
> Consider the "come from" statement in Intercal as a counter example,
> which also mean an increase in expressive power when added to Common Lisp.)

Yes, a good point.

> > Furthermore, it is not even
> > necessary to agree on the semantics of ((...) ...). One could simply
> > add a new macro defining form (or even a global variable) to set a
> > user-definable hook for transforming expressions whose CARs are lists
> > that do not begin with LAMDBA. All that would need to be agreed upon is
> > the name of this form. Furthermore, this would result in strictly
> > greater expressive power. It would be strictly backwards-compatible.
> > And It would serve the needs of a number of users who are not currently
> > being served (e.g. those who prefer to do functional-style programming
> > without having to type FUNCALL all the time.)
>
> You would have to define a way to delimit the scope of the different
> possible hooks, otherwise it becomes a nightmare to try to mix and match
> different third-party libraries.

Fine, make it a hook function then. Now all we need to agree on is the
name of the variable.
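
Something like this, say -- the variable name and the protocol are of
course just a strawman, and no implementation has this today:

(defvar *list-car-hook* nil
  "If non-NIL, a function called with any form whose CAR is a list that
does not begin with LAMBDA; it returns the form to be used instead.")

;; Someone who wants ((f x) y) to mean (funcall (f x) y) would then write:
(setf *list-car-hook*
      (lambda (form)
        `(funcall ,(first form) ,@(rest form))))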

> > But despite the fact that this change is easy and only good could come
> > of it, it does not happen because there is no process by which this
> > change can be effected (which is, I believe, a direct consequence of the
> > fact that the realities of CL politics are that CL is utterly resistant
> > to all change, though I would dearly love to be proven wrong on that).
>
> The change you propose is easy to make, but the consequences of it are
> not necessarily easy to deal with.
>
> I don't have the impression that the CL community is resistant to
> change.

Really? Then why has there been no change in CL in a decade?

> See the various projects in various places that are quite
> healthy, as far as I can tell. (If I remember corretly,
> http://cl-user.net counts more than 600 entries.)

Non-sequitur. There is a big difference between writing code in the
language and making changes to the language.

> > (Oh, and anyone who wishes to prove me wrong, please not that there is a
> > big big difference between effecting change in CL and effecting change
> > in an implementation of CL.)
>
> Sure. But do note that the language constructs that are part of Common
> Lisp have been tried in other Lisp dialects before. I think that picking
> a single Common Lisp implementation and experimenting with language
> constructs there to see whether they pay off before proposing them as
> official features is the healthier approach.

Indeed. I'm not saying that we should dive wholesale into a redesign.
All I'm saying is that there ought to be an end-game for the process.

> > 3. Much of CL's core is badly designed. For example, consider NTH and
> > ELT. The functionality of ELT is a strict superset of NTH, so why have
> > NTH cluttering up the language? (To say nothing of the fact that the
> > order of the arguments in these two functions are gratuitously
> > reversed.) Why is the function that computes the difference of two sets
> > called SET-DIFFERENCE, but the function that computes the intersection
> > of two sets called simply INTERSECTION? And why do all of these
> > functions operate on lists, not sets? It's because there are no sets in
> > CL, which means that CL leads one to prematurely "optimize" sets as
> > lists. (I put optimize in scare quotes because in fact this is rarely
> > an optimization, especially when your sets get big, and most of the time
> > you have to go back and rip out huge chunks of code to replace your
> > lists with hash tables or binary trees.) I could go on and on.
>
> These features exist all for backwards compatibility. They could at most
> be deprecated, otherwise you would break a lot of existing code. I am
> certain that this would do more harm than bring any benefits because the
> community is, I think, not large enough to rewrite the large amount of
> useful code that does exist.

IMO, just having the ability to (semi-)officially deprecate cruft, even
if it never actually goes away, would have enormous payoffs.
But we don't currently have that ability.

> It's trivial to define your own package with the name and argument
> conventions that you prefer. There is no need to force anyone else to
> use the same conventions.

Yes there is: it makes life VASTLY easier for newcomers if the actual
language (as opposed to someone's private library) is not full of random
crap.

This, by the way, is the crux of SteveY's point. It often gets
overlooked so it bears repeating: it's all about how accessible the
language is to newcomers. If the language is full of random crap then
it becomes less accessible to newcomers. A library that covers up the
crap doesn't help nearly as much as actually cleaning the crap up.

> > Now, for those of you who wish to respond I ask you to keep in mind the
> > following:
> >
> > 1. The details of my criticisms are mostly irrelevant. What matters is
> > that CL is far from perfect, and that it has no mechanism for change.
> > So don't bother picking a nit about one of my specific criticisms unless
> > you wish to argue that CL is perfect and doesn't need to change.
>
> These nits would have to be picked in case we had an official mechanism
> for change. So it's a good exercise to do this already, in order to be
> able to estimate whether the installation of an official mechanism would
> be worthwhile.

No, because in the absence of a process the argument can be sustained
indefinitely with no resolution. Those who oppose change will falsely
claim that this is evidence that adopting a process for managing change
is hopeless (or useless, or some such thing).

> I think there are better examples than the ones you propose.

Of course there are. If you think that is relevant then you have
completely missed the point.

rg


Ron Garret

unread,
Apr 18, 2006, 12:33:22 AM4/18/06
to
In article <87odz0x...@rpi.edu>, Bill Atkins <NOatki...@rpi.edu>
wrote:

> > order of the arguments in these two functions are gratuitously
> > reversed.) Why is the function that computes the difference of two sets
>

> The reversal of arguments _is_ truly annoying.


>
> > called SET-DIFFERENCE, but the function that computes the intersection
> > of two sets called simply INTERSECTION? And why do all of these
> > functions operate on lists, not sets? It's because there are no sets in
> > CL, which means that CL leads one to prematurely "optimize" sets as
> > lists. (I put optimize in scare quotes because in fact this is rarely
> > an optimization, especially when your sets get big, and most of the time
> > you have to go back and rip out huge chunks of code to replace your
> > lists with hash tables or binary trees.) I could go on and on.
>

> This is not a fault in the language, as far as I'm concerned. The
> operations you mention are often useful for lists. Is it really a
> problem that set terminology is used to describe these functions?

Yes. It confuses newcomers. It encourages them to prematurely
"optimize" set operations by implementing them using lists. And it
consumes valuable namespace real estate. SET-DIFFERENCE ought to be a
function that computes the difference of sets, nothing else. And
forcing it to be something else IN THE STANDARD is an abomination.

> A programmer ought to understand that if he or she wants the behavior of
> an actual set data structure, then these functions will not do.

Of course. Programmers ought to understand all sorts of things. But
they don't, especially not when they are new.

rg

Jack Unrue

unread,
Apr 18, 2006, 12:40:12 AM4/18/06
to
On Mon, 17 Apr 2006 21:28:39 -0700, Ron Garret <rNOS...@flownet.com> wrote:
>
> In article <4aicgnF...@individual.net>,
> Pascal Costanza <p...@p-cos.net> wrote:
>
> >
> > It lacks any _sanctioned_ mechanism for standardization. There is
> > certainly a mechanism for creating defacto standards.
>
> That's news to me. What is it?

De facto standards come into being; they are not created.

--
Jack Unrue

Ron Garret

unread,
Apr 18, 2006, 12:42:52 AM4/18/06
to
In article <0T333hfsIo3nNv8%ste...@parsec.no-spoon.de>,
Stefan Scholl <ste...@no-spoon.de> wrote:

> Ron Garret <rNOS...@flownet.com> wrote:
> > trying to use Lisp for e.g. writing a Web server is an incredibly
> > painful experience compared to doing the same thing in e.g. Python.
>
> I could list some web servers written in Common Lisp.

Yes, and it's probably a very long list. (And one of the items on that
list would be http://www.cliki.net/HTTP%20dot%20LSP.)

Now, compare any of those to the effort required to write a web server
in Python, where you can do:

import BaseHTTPServer

rg

Bill Atkins

unread,
Apr 18, 2006, 1:29:29 AM4/18/06
to
Ron Garret <rNOS...@flownet.com> writes:

> In article <87odz0x...@rpi.edu>, Bill Atkins <NOatki...@rpi.edu>
> wrote:
>
>> > order of the arguments in these two functions are gratuitously
>> > reversed.) Why is the function that computes the difference of two sets
>>
>> The reversal of arguments _is_ truly annoying.
>>
>> > called SET-DIFFERENCE, but the function that computes the intersection
>> > of two sets called simply INTERSECTION? And why do all of these
>> > functions operate on lists, not sets? It's because there are no sets in
>> > CL, which means that CL leads one to prematurely "optimize" sets as
>> > lists. (I put optimize in scare quotes because in fact this is rarely
>> > an optimization, especially when your sets get big, and most of the time
>> > you have to go back and rip out huge chunks of code to replace your
>> > lists with hash tables or binary trees.) I could go on and on.
>>
>> This is not a fault in the language, as far as I'm concerned. The
>> operations you mention are often useful for lists. Is it really a
>> problem that set terminology is used to describe these functions?
>
> Yes. It confuses newcomers. It encourages them to prematurely
> "optimize" set operations by implementing them using lists. And it
> consumes valuable namespace real estate. SET-DIFFERENCE ought to be a
> function that computes the difference of sets, nothing else. And
> forcing it to be something else IN THE STANDARD is an abomination.

Many things confuse newcomers - that's hardly sufficient grounds to
remove them from the language. I don't think anyone is trying to pass
off the set functions as "optimizations" nor do I think anyone could
reasonably come to the conclusion that treating lists as sets should
somehow be more "optimized" than using an actual set data structure.
I'm not sure where you see optimization fitting into this.

I think having these functions in the standard is as far from an
abomination as possible. These are useful functions. If you have
problems with the naming, then maybe you can recommend some new names.
But I think naming these after the set operations makes it immediately
clear what these functions will do to their arguments. I do not think
they give any impression that they will provide greater performance.

>> A programmer ought to understand that if he or she wants the behavior of
>> an actual set data structure, then these functions will not do.
>
> Of course. Programmers ought to understand all sorts of things. But
> they don't, especially not when they are new.
>
> rg

--
Bill

Bill Atkins

unread,
Apr 18, 2006, 1:34:27 AM4/18/06
to
Ron Garret <rNOS...@flownet.com> writes:

That has nothing to do with the flaws or merits of Common Lisp, and by
no stretch of the imagination can that be considered "writing" a web
server.

--
Bill

Friedrich Dominicus

unread,
Apr 18, 2006, 1:54:13 AM4/18/06
to
Ron Garret <rNOS...@flownet.com> writes:

>
> 1. CL lacks standardized support for many operations that are
> necessities in today's world (e.g. sockets, database connectivity,
> foreign functions). Moreover, it lacks any mechanism by which these
> features could be standardized. It is claimed that there are portable
> libraries that work across implementations that provide de facto
> standards, e.g. UFFI, but these claims are false. I don't have time to
> get into details at the moment, but the fact of the matter is that
> trying to use Lisp for e.g. writing a Web server is an incredibly
> painful experience compared to doing the same thing in e.g. Python.

Very funny. I don't know of any standardization in C for sockets,
database connectivity, and probably a few other things; does that harm
the success of C? It's nice that the modern scripting languages just
happen to have this stuff, but none of them has anything but a
reference implementation.


>
> The Balkanization of the CL implementation space also has the
> consequence that one must choose between using implementation-specific
> features and thus limiting the potential audience for one's code to a
> niche within a niche, or writing to the least common denominator, which
> generally means writing an awful lot of #+ reader macros.
So what? If you don't care about that, what stops you from choosing one
implementation and "using" it?

Regards
Friedrich

--
Please remove just-for-news- to reply via e-mail.

Christophe Rhodes

unread,
Apr 18, 2006, 2:13:01 AM4/18/06
to
Peter Seibel <pe...@gigamonkeys.com> writes:

> "bradb" <brad.be...@gmail.com> writes:
>
>> Well written. I'm almost a complete newbie at Lisp, so please
>> take what I say with big grains of salt. It appears to me that the
>> CL community has a lot of inertia, the language and the spec are old
>> (read mature if you prefer), but few implementations actually have
>> full ANSI compliance. Isn't there something wrong when there are no
>> ANSI compliant free implementations 20 years after the spec was
>> written? Most implementations are probably very close, but there are
>> probably still weird little cracks where stuff falls out.
>
> So what areas of non-compliance in what implementations are causing
> you pain? As far as I can tell, all the serious commercial and free
> implementations work quite hard to conform to the spec and I can't
> think, offhand, of any areas where implementations intentionally
> deviate from the spec with no intention of fixing it when they get
> around to it.

PROG2.
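
For context, since the one-word answer is easy to miss: PROG2 is
specified to evaluate all of its forms and return the primary value of
the second one. The snippet below is only a reminder of that specified
behavior; exactly which implementations have deviated from it, and how,
is not spelled out in this thread.

;; PROG2 evaluates first-form, second-form, and any remaining forms,
;; then returns the primary value of second-form:
(prog2 (print 'first) (print 'second) (print 'third))
;; prints FIRST, SECOND and THIRD, and returns SECOND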

Christophe

Christophe Rhodes

unread,
Apr 18, 2006, 2:16:37 AM4/18/06
to
Peter Seibel <pe...@gigamonkeys.com> writes:

> Now let's see what effect some declarations have:
>
> CL-USER> (defun foo (x y)
> (declare (optimize (speed 3) (safety 0)))
> (declare (fixnum x y))
> (+ x y))
> FOO
> CL-USER> (disassemble 'foo)
> ;; disassembly of #<Function (:ANONYMOUS-LAMBDA 13) @ #x107c84c2>
> ;; formals: X Y
>
> ;; code start: #x107c84a4:
> 0: 7c632014 addc r3,r3,r4
> 4: 3a000001 [addi] lil r16,1
> 8: 81a1000c lwz r13,12(r1)
> 12: 4e800020 blr

Isn't this quite a bad example to use when demonstrating things for a
newbie, given that it is nonconforming output? [ I believe Allegro
documents this deviation, but try (foo most-positive-fixnum
most-positive-fixnum) ]
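
To spell out what the bracketed remark is getting at, here is a minimal
sketch. It is editorial and not output from any particular Lisp; the
comments describe typical behavior, not guaranteed results. Undeclared
addition must return a bignum when the sum exceeds the fixnum range,
whereas the declared, safety-0 version from the quoted post may wrap
around at the machine-word level, which is the nonconforming result
being referred to.

;; Without declarations, + simply returns a bignum on fixnum overflow:
(+ most-positive-fixnum most-positive-fixnum)
;; => twice MOST-POSITIVE-FIXNUM, a bignum

;; The declared version from the quoted post:
(defun foo (x y)
  (declare (optimize (speed 3) (safety 0)))
  (declare (fixnum x y))
  (+ x y))

;; The arguments really are fixnums, so the declarations are satisfied,
;; yet the open-coded machine add may wrap instead of producing the
;; mathematically correct sum:
(foo most-positive-fixnum most-positive-fixnum)
;; => often a negative fixnum, depending on the implementation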

Christophe

Bill Atkins

unread,
Apr 18, 2006, 2:17:58 AM4/18/06
to
Christophe Rhodes <cs...@cam.ac.uk> writes:

>> So what areas of non-compliance in what implementations are causing
>> you pain? As far as I can tell, all the serious commercial and free
>> implementations work quite hard to conform to the spec and I can't
>> think, offhand, of any areas where implementations intentionally
>> deviate from the spec with no intention of fixing it when they get
>> around to it.
>
> PROG2.
>
> Christophe

What does this mean?

Friedrich Dominicus

unread,
Apr 18, 2006, 2:33:32 AM4/18/06
to
Rainer Joswig <jos...@lisp.de> writes:

>
> Heh, I was also hacking CL-HTTP-related code today (web log analysis)
> on a platform CL-HTTP is (!) developed
> for: LispWorks on Mac OS X. I was reviving some code
> I wrote maybe eight years ago on my Symbolics MacIvory.
> There were almost zero porting issues. I spent
> the day enhancing the code and writing web log reports.
> That's also the beauty of Common Lisp: I can come
> back to old code, easily understand it after years, and
> enhance it.
Eek, how old-fashioned you are ;-). You dare to come back to your
code?

Ah yes, that's something different with "modern" languages. Write
once, run maybe, maybe not, maybe ....


Happy lisping

Ron Garret

unread,
Apr 18, 2006, 2:52:29 AM4/18/06
to
In article <87d5ffc...@rpi.edu>, Bill Atkins <NOatki...@rpi.edu>
wrote:

> Ron Garret <rNOS...@flownet.com> writes:
>
> > In article <0T333hfsIo3nNv8%ste...@parsec.no-spoon.de>,
> > Stefan Scholl <ste...@no-spoon.de> wrote:
> >
> >> Ron Garret <rNOS...@flownet.com> wrote:
> >> > trying to use Lisp for e.g. writing a Web server is an incredibly
> >> > painful experience compared to doing the same thing in e.g. Python.
> >>
> >> I could list some web servers written in Common Lisp.
> >
> > Yes, and it's probably a very long list. (And one of the items on that
> > list would be http://www.cliki.net/HTTP%20dot%20LSP.)
> >
> > Now, compare any of those to the effort required to write a web server
> > in Python, where you can do:
> >
> > import BaseHTTPServer
> >
> > rg
>
> That has nothing to do with the flaws or merits of Common Lisp,

Of course it does.

There are a lot of web servers in CL because CL doesn't come with one,
and none of the ones people have written are sufficient; otherwise
people would not keep writing new ones.

> and by
> no stretch of the imagination can that be considered "writing" a web
> server.

What difference does it make? The indisputable point is that to get a
web server up and running using Python takes substantially less effort
than to get one up and running using CL.

rg

Ron Garret

unread,
Apr 18, 2006, 2:55:49 AM4/18/06
to
In article <871wvve...@rpi.edu>, Bill Atkins <NOatki...@rpi.edu>
wrote:

> Many things confuse newcomers - that's hardly sufficient grounds to
> remove them from the language.

It is if you care about attracting new users.

rg

Ron Garret

unread,
Apr 18, 2006, 2:56:38 AM4/18/06