> Perhaps all languages have their zealots. Lisp certainly has a few.
> I am trying to figure out why Lisp has a cult like following without
> making it into the "mainstream" as a computer language. That is, Lisp
> doesn't seem, from my parochial point of view, to have the market
> penetration of C/C++, VB (yuck!), or even the controversial Java
> language.
There are lots of people that can give a good answer to this, and in fact
have. Try searching through DejaNews. You will probably find extensive
discussions of lisp vs the rest of the world.
There is also a great essay you ought to read through to understand the
nature of lisp (and some history) a little better.
http://cbl.leeds.ac.uk/nikos/tex2html/examples/good-bad-win/good-bad-win.html
Another good site to obtain materials for lisp is
http://www.elwoodcorp.com/alu/
Finally, Paradigms of AI Programming contains an example of a partial
scheme interpreter in lisp. It might prove to be a useful starting
point. You can find out more about PAIP from
> I am hoping that some of the Lisp devotees here can explain to me why
> Lisp is such a great language and why the rest of the world hasn't
> caught on to it yet, after thirty or forty years of existence.
It did, and it let go. Read the essay to get a sense for why. I forgot to
mention Richard Gabriel's book, something like Patterns of Software Design,
which has a lot more about his personal experience with Lucid, one of the
first common lisp vendors. It tells a whole lot about what happened to
lisp.
> Why would an outsider such as myself be interested? I am not a
> troll. I am seriously considering using Lisp. I have Paul Graham's
> "ANSI Common Lisp" book, and one from Guy Steele. As an exercise, I
> am planning to implement a Lisp interpreter. If I can wrap my brain
> around Lisp and find out what is so great about it, I would like to
> embed it in another application as a control and extension language,
> not unlike Emacs's usage. My project, however, is much smaller than
> Emacs. I also don't expect to be able to implement the entire ANSI
> standard, let alone CLOS. That is unless the interpreter doesn't have
> to provide too much primitive functionality and the rest can be
> implemented in Lisp itself.
>
> For the record, I am from a C++ background. C++ is the language I am
> most comfortable using, and I can think in it reasonably well. Lisp
> is quite different. I understand the syntax, but that doesn't mean I
> can write poetry or appreciate it.
>
> With luck, I will grock this thing soon. It will give me a new way to
> think which is always a good thing.
It most certainly is a new way to think. Not that I would compare *any*
experienced programmers with the undergraduates I have to TA for, but the
general tendency is to think in C/C++ and translate to lisp, once you give
them the chance to. Lisp is indeed significantly different, and I doubt you
would gain a real appreciation for it unless you have worked with it for a
somewhat extended period.
Good luck!
Sunil
I am hoping that some of the Lisp devotees here can explain to me why
Lisp is such a great language and why the rest of the world hasn't
caught on to it yet, after thirty or forty years of existence.
Why would an outsider such as myself be interested? I am not a
troll. I am seriously considering using Lisp. I have Paul Graham's
"ANSI Common Lisp" book, and one from Guy Steele. As an exercise, I
am planning to implement a Lisp interpreter. If I can wrap my brain
around Lisp and find out what is so great about it, I would like to
embed it in another application as a control and extension language,
not unlike Emacs's usage. My project, however, is much smaller than
Emacs. I also don't expect to be able to implement the entire ANSI
standard, let alone CLOS. That is unless the interpreter doesn't have
to provide too much primitive functionality and the rest can be
implemented in Lisp itself.
For the record, I am from a C++ background. C++ is the language I am
most comfortable using, and I can think in it reasonably well. Lisp
is quite different. I understand the syntax, but that doesn't mean I
can write poetry or appreciate it.
With luck, I will grock this thing soon. It will give me a new way to
think which is always a good thing.
--
David Steuber
http://www.david-steuber.com
To reply by e-mail, replace trashcan with david.
If you can't trust an anonymous person on the Internet, who can you trust?
>tras...@david-steuber.com (David Steuber "The Interloper") writes:
[snip]
>> I am hoping that some of the Lisp devotees here can explain to me why
>> Lisp is such a great language and why the rest of the world hasn't
>> caught on to it yet, after thirty or forty years of existence.
>
>It did, and it let go. Read the essay to get a sense for why. I forgot to
>mention Richard Gabriel's book, something like Patterns of Software Design,
>which has a lot more about his personal experience with Lucid, one of the
>first common lisp vendors. It tells a whole lot about what happened to
>lisp.
"Patterns of Software", Richard Gabriel, Oxford University Press, 1996,
ISBN 0-19-510269-X
[snip]
--
David B. Lamkins <http://www.teleport.com/~dlamkins/>
> I am trying to figure out why Lisp has a cult like following without
> making it into the "mainstream" as a computer language. That is, Lisp
> doesn't seem, from my parochial point of view, to have the market
> penetration of C/C++, VB (yuck!), or even the controversial Java
> language.
For the purposes of this message, I'll accept your branding of Lisp
as cult even though I don't agree 100%. While a cult movie might
serve only a certain audience, Lisp is Turing powerful and highly
expressive and would suffice for any programmer unable to find
one of those other languages. You might similarly ask what's so
great about the Macintosh if PC's are beating them everywhere, but
the answer isn't found in a technical analysis of the PC's design,
for the most part. VHS beat out Beta Max in video tape format, too,
in spite of many saying Beta was a superior technology. What wins
the amrket wars in the marketplace is not always technical superiority.
But ok, "cult" you say and let's go with that.
Cult is about serving a need that is not served by the mainstream.
Exactly because of the prevalence of C/C++, etc. people are driven to
Lisp all the more strongly. Not all, but those who are cling
strongly. And not to the language, I think, but to the ideas. The
greater Lisp community sometimes uses other languages, but I think
never really forgets its roots. Lisp was born of the AI research
culture supporting specific features it needed particularly. Slowly
some of its features are adopted into other languages, and that's good.
But I think if you want one central reason Lisp is still not "killed"
yet it's that the other languages still don't have it right and,
sometimes "like it or not", people come back to Lisp to take refuge
in a certain kind of sanity. Cult sanity, if you insist. But
sanity nevertheless.
Features like dynamic redefinition--the obvious and yet startlingly
uncopied idea that static languages seem to hold at bay--which is that
it's useful in mid-application to be able to change a data structure
definition and still run the same image, the same code, without
restarting. Features like GC which make painstaking headway into
other languages but are still clumsy by comparison. Features like
multiple inheritance which aren't absolutely needed for simple
applications and so tend to cause people to simplistically say they
can be done without, but which are a safe haven for people who are
building applications big and complex enough that they can't be easily
shoehorned into a single-inheritance system. Features like
unwind-protect which in spite of their incredible simplicity and
elegance are missing in other languages. Features like making
integers be really integers (arbitrary precision) if they want them.
Features like optional, not required, type declarations so that
programs can be production fast if they need to be, but in prototyping
you don't have to waste time declaring things that may never be used
in production. Features like a program-callable compiler so that code
can be not only generated on the fly but compiled and used in the same
environment without loss of efficiency. On and on. Lisp is full of
features still not adopted by other languages.
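To make a couple of those concrete, here is a small illustrative sketch
(a toy of my own, not drawn from any particular system): UNWIND-PROTECT
guarding cleanup, optional declarations, and the compiler called at run
time.

;; Toy sketch only.
(defun read-first-line (pathname)
  (let ((stream (open pathname)))
    (unwind-protect (read-line stream)  ; the cleanup below runs even on error
      (close stream))))

(defun fast-sum (xs)
  (declare (optimize (speed 3)) (type list xs))  ; optional declarations
  (reduce #'+ xs))

;; Build a function at run time and compile it in the same image:
(defparameter *adder* (compile nil '(lambda (x) (+ x 1))))
;; (funcall *adder* 41) => 42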
And then some fortuitous accidents of chance that are often, I think,
misunderstood, but are no less powerful for the doing: The famous
"program is data" thing is an example. Many languages can represent
themselves. BASIC could represent itself with an array of strings.
But what Lisp does that is critically important in power is to choose
a representation--maybe not even the best representation--but one that
is standard. The willingness to decide not to fight over program
representation and simply to define it simply and understandably means
a whole army of automatic programming and program manipulation tools
(not the least of which is Lisp's rich macro packages and its long
tradition of research in compiler design) have grown up simply because
everyone understands and shares a common interchange format for
programs in a structured form that requires no special parsing and
no weird ad-hoc quotation rules. It's not enough for another language
to tell you you CAN define a program representation; before they match
Lisp, they must do it and do it once and for all. Most other
languages, trying to be "small", insist on optional add-on libraries
and then have endless disputes over whose add-on library is the one
and only official standard one. This is pain for programmers because
they have to know whose C they are using. C claims to be portable
because it is ubiquitous, but Lisp (by which I mean Common Lisp
particularly) was designed to be and I think truly is way more
portable than C will ever dream of being. In the modern world where
the PC dominates, one may think portable is less and less important,
but portable is REALLY about not making dependencies on the environment,
or about capturing those dependencies in an abstract way. So CL code
speaks to the future not just to other machines. If I have an integer
of a certain range, I can declare it (INTEGER 37 416) meaning
"integer in the range 37-416, inclusive". How that is optimized
may depend on the machine. But I do not have to say "the size of
a machine word" and just hope to win. That's only one tiny example.
Pathnames, too, are structured objects that have been designed so
that the same portable conception can work on Unix, PC, MAC, VMS,
Linux, CDC 7600, Lisp Machines, ISO 9660 file system, etc.
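Two of those points in miniature (an illustrative sketch of mine, using
nothing beyond the standard operators it names):

;; Programs are just lists that the whole language can manipulate:
(defparameter *form* (read-from-string "(+ 1 2 3)"))
(first *form*)   ; => +        (a symbol, ordinary data)
(rest *form*)    ; => (1 2 3)
(eval *form*)    ; => 6

;; A portable range declaration; how it is optimized is up to the
;; implementation and the machine:
(defun scale (n)
  (declare (type (integer 37 416) n))
  (* n 10))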
> I am hoping that some of the Lisp devotees here can explain to me why
> Lisp is such a great language and why the rest of the world hasn't
> caught on to it yet, after thirty or forty years of existence.
Lisp has on and off been more popular than it is now. I don't see
popularity as its only measure of success. Some amount of market is
needed to keep Lisp alive and growing, and we're always happy for more
than the baseline minimum but the practical fact is that just about
any language can suffice for the low-end, and what Lisp excels at is
the harder problems. As you yourself note, VB is very popular but no
one speaks kindly of it. I wouldn't trade Lisp's place for VB's--I'm
proud of where Lisp is. Lisp tends to attract people with problems
that they've given up ever solving in other languages. At Symbolics,
where I used to work before I came to Harlequin, there was a sign on
the wall (half in jest, but there was some truth to it) saying "At
Symbolics we make hard problems easy -- and vice versa". Maybe that's
the trade, I don't know. I think it's a fundamental fact about the
universe that a single solution can't be good for every possible
situation. But that Lisp is the thing that does the hard problems
is enough. Consider that there are probably more ten dollar pocket
cameras in the world than fancy theatrical movie cameras, but it's
not a failure for the theatrical movie guys that they make what
they do.
One reason Lisp solves the hard problems is that initial workout
the AI community gave it. It required a lot of memory and disk
in those days and it was expensive compared to other alternatives.
So people turned to other things. But memory and disk get
cheaper, other alternatives get bigger, and meanwhile Lisp has held
the line on size so that it is becoming respectable again just by
waiting for others to catch up in "apparent bloat"--only Lisp still
has power packed into that size, and many other tools have grown
without offering all the power that Lisp has, so don't be surprised
if Lisp has more life in it in the future. I think the possibilities
are really just opening up for Lisp, frankly. It's a bit of a chicken
and egg problem because it needs market belief to do that, but
I think it can and should be done.
> Why would an outsider such as myself be interested? I am not a
> troll. I am seriously considering using Lisp. I have Paul Graham's
> "ANSI Common Lisp" book, and one from Guy Steele. As an exercise, I
> am planning to implement a Lisp interpreter.
Sounds like a good start. Might wanna download a copy of the
Common Lisp HyperSpec(TM). See
http://www.harlequin.com/education/books/HyperSpec/
for (free) download info and
http://www.harlequin.com/education/books/HyperSpec/FrontMatter/
to browse it online at Harlequin.
See also my articles
http://world.std.com/~pitman/PS/Hindsight.html
where I discuss the benefits of Lisp in rapid prototyping and
http://world.std.com/~pitman/PS/Lambda.html
where I discuss the choice of language as a political rather than
technical issue. (The remarks about status of certain standards are
now somewhat dated, but the general thrust of the piece is still
sound, I think.)
> If I can wrap my brain
> around Lisp and find out what is so great about it, I would like to
> embed it in another application as a control and extension language,
> not unlike Emacs's usage. My project, however, is much smaller than
> Emacs. I also don't expect to be able to implement the entire ANSI
> standard, let alone CLOS. That is unless the interpreter doesn't have
> to provide too much primitive functionality and the rest can be
> implemented in Lisp itself.
Nothing wrong with subsets. And yes, by all accounts you're best
using as small a kernel as you can and writing the rest in Lisp.
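For instance (a hypothetical sketch of the idea, with made-up names): once
the kernel supplies CONS, CAR, CDR, EQ, NULL, and COND, a good deal of the
"library" can be written in Lisp itself.

;; Hypothetical example: library functions bootstrapped from primitives.
(defun my-assoc (key alist)
  (cond ((null alist) nil)
        ((eq key (car (car alist))) (car alist))
        (t (my-assoc key (cdr alist)))))

(defun my-append (a b)
  (cond ((null a) b)
        (t (cons (car a) (my-append (cdr a) b)))))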
> For the record, I am from a C++ background. C++ is the language I am
> most comfortable using, and I can think in it reasonably well. Lisp
> is quite different. I understand the syntax, but that doesn't mean I
> can write poetry or appreciate it.
Just being willing to try is a good start. If you have trouble
figuring out how to think about something naturally, this is a good
forum to discuss that. I think you'll find people quite helpful,
and we're always interested to hear honest accounts of people's
experiences.
> With luck, I will grock this thing soon. It will give me a new way to
> think which is always a good thing.
I saw a talk by Gerry Sussman in which he suggested that indeed the most key
contribution of computer science to the 20th century is not computation but
the terminology for describing process. That is, the ability to talk about
algorithms. That it stretches the mind, he said, and gives us ways to talk
about how that has happened rather than just repeating things over and over
until other people can parrot our actions is truly important. So I agree
with your goal and wish you luck.
--Kent
p.s. My recollection from the 1970's when the word seemed popular was that
"grok" has no "c" in it. I could be wrong, and went to check, but
(alas) it's not in my dictionary under either spelling. Anyone
got a pointer to an authority on this? It IS just the right
word, and it's a shame it's not better accepted in the language.
>Perhaps all languages have their zealots. Lisp certainly has a few.
>I am trying to figure out why Lisp has a cult like following without
>making it into the "mainstream" as a computer language. That is, Lisp
>doesn't seem, from my parochial point of view, to have the market
>penetration of C/C++, VB (yuck!), or even the controversial Java
>language.
>
>I am hoping that some of the Lisp devotees here can explain to me why
>Lisp is such a great language and why the rest of the world hasn't
>caught on to it yet, after thirty or forty years of existence.
>
I'll second Sunil's suggestion to read comp.lang.lisp on DejaNews. There
have been a lot of discussions of this nature in the past. I personally
have nothing new to add, being a relative newbie to Lisp (I got interested
in the mid-80s, after having been underwhelmed by my first exposure -- in
a college programming language survey course, of course -- in the
mid-70s). FWIW, I'd prefer not to see yet another resurrection of the
perennial "why hasn't Lisp supplanted <insert your favorite language
here>?" thread...
>Why would an outsider such as myself be interested? I am not a
>troll. I am seriously considering using Lisp. I have Paul Graham's
>"ANSI Common Lisp" book, and one from Guy Steele. As an exercise, I
>am planning to implement a Lisp interpreter. If I can wrap my brain
>around Lisp and find out what is so great about it, I would like to
>embed it in another application as a control and extension language,
>not unlike Emacs's usage. My project, however, is much smaller than
>Emacs. I also don't expect to be able to implement the entire ANSI
>standard, let alone CLOS. That is unless the interpreter doesn't have
>to provide too much primitive functionality and the rest can be
>implemented in Lisp itself.
Those are both good books. Graham's is a concise but thorough
introduction. Steele's is a reasonable reference, but you should see the
HyperSpec <http://www.harlequin.com/books/HyperSpec/> for the definitive
reference to ANSI CL. I'd also recommend Wilensky's "Common LISPcraft",
which offers more explanation, in general, than Graham's book; it is
closer to being a beginner's book, but without the patronizing
simplifications found in so many beginner's texts.
If you want to build an interpreter for the intellectual exercise, that's
great. However, if your ultimate goal is to introduce a Lisp-ish
extension language into another project, I'd suggest that you first take a
look at freely-available source for such interpreters; you'll even find
some reasonably good compilers for both Scheme and CL.
The best modern reference for Lisp implementation, IMO, is "Lisp in Small
Pieces" by Christian Quiennec (Cambridge University Press, 1996, ISBN
0-521-56247-3). This covers interpreters and compilers. It has a Scheme
orientation, but makes appropriate and informative digressions to include
techniques appropriate to CL as needed.
Be aware that a lot of your implementation effort will go into runtime
support. One of the most difficult areas of runtime and implementation
tradeoffs involves garbage collection. There is one excellent text on
this subject: "Garbage Collection" by Jones and Lins (Wiley, 1996, ISBN
0-471-94148-4).
AFAIK, Lisp is one of the few languages that, in a typical implementation,
allows implementation of "precise" garbage collection. This is so because
there are typically a small number of well-known pointers (in the
underlying implementation, not available to the Lisp programmer) from
which all other accessible storage can be reached. This allows the GC to
reclaim "precisely" everything that is unused by the running program. The
alternative "conservative" approach uses hueristics to decide which
storage is still live; the hueristics must guess "conservatively" to keep
live storage from being reclaimed -- typically these GCs overestimate by a
small percentage.
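The idea in a toy sketch (illustrative only, nothing like a production
collector): mark everything reachable from the known roots; whatever is
left unmarked is garbage, exactly.

;; Toy "precise" marking over cons cells (not a real GC):
(defun mark (object live)
  (unless (gethash object live)
    (setf (gethash object live) t)
    (when (consp object)
      (mark (car object) live)
      (mark (cdr object) live)))
  live)

(defun live-set (roots)
  (let ((live (make-hash-table :test #'eq)))
    (dolist (root roots live)
      (mark root live))))
;; Anything the allocator knows about but absent from LIVE-SET could be
;; reclaimed -- precisely, with no conservative guessing.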
>
>For the record, I am from a C++ background. C++ is the language I am
>most comfortable using, and I can think in it reasonably well. Lisp
>is quite different. I understand the syntax, but that doesn't mean I
>can write poetry or appreciate it.
>
>With luck, I will grock this thing soon. It will give me a new way to
>think which is always a good thing.
I believe that Samuel Kamin (?) has published a book (sorry, no reference
available) of interpreters implemented in C++. I'm pretty sure that there
is a Lisp among them.
One thing I think you'll find as you learn to think and program in Lisp is
that it provides a remarkable economy of expression, which will help you
to solve real problems in less time. (Disclaimer: If you have a problem
that's solved by simple, repetitive load-operate-store sequences, then
you'll probably solve it more concisely in C or FORTRAN. But systems
programs -- to which Lisp is well-suited -- are generally not as tidy as
benchmarks, image processing, and numerical programs.) Having been
through the exercise quite a few times, I'm _still_ pleasantly surprised
when a Common Lisp program is a fraction (typically 1/5 or smaller) of the
size of an equivalent C/C++ program.
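As one tiny, hedged illustration of that economy (my own toy, nothing to
do with the programs measured above): tallying word occurrences is a
handful of forms.

(defun count-words (words)
  "Return a hash table mapping each word to how often it occurs."
  (let ((counts (make-hash-table :test #'equal)))
    (dolist (w words counts)
      (incf (gethash w counts 0)))))
;; (gethash "lisp" (count-words '("lisp" "c" "lisp"))) => 2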
:grok: /grok/, var. /grohk/ /vt./ [from the novel
"Stranger in a Strange Land", by Robert A. Heinlein, where it
is a Martian word meaning literally `to drink' and metaphorically
`to be one with'] The emphatic form is `grok in
fullness'. 1. To understand, usually in a global sense. Connotes
intimate and exhaustive knowledge. Contrast {zen}, which is
similar supernal understanding experienced as a single brief flash.
See also {glark}. 2. Used of programs, may connote merely
sufficient understanding. "Almost all C compilers grok the
`void' type these days."
[JARGON 4.0.0]
Hope i could help :)
-forcer
--
;; Build a system that even a fool can use and only a fool will use it.
;; email: for...@mindless.com -><- www: http://webserver.de/forcer/
;; IRC: forcer@#StarWars (IRCnet) -><- PGP/GPG: available on my website
<snip>
> As an exercise, I am planning to implement a Lisp interpreter.
I assume you mean to implement one in C or C++, in which case, don't
do that. You'll end up with an understanding of how a Lisp
interpreter works & how to implement one in C or C++, but you won't
end up with an understanding of programming in Lisp.
To understand Lisp, you should program in it. As you repeatedly
become amazed at how easy & simple it is to do things that'd be hard &
complex in C or C++ then you'll start understanding why Lisp has its
following. When you start doing meta-programming in Lisp then
you'll start to really understand it.
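A first taste of what that meta-programming looks like (a made-up example,
not from any of the books mentioned here): a macro that writes the
bookkeeping code around whatever body you hand it.

;; Hypothetical macro: wrap any body of code with simple timing output.
(defmacro with-timing (&body body)
  `(let ((start (get-internal-real-time)))
     (multiple-value-prog1 (progn ,@body)
       (format t "~&elapsed: ~D ticks~%"
               (- (get-internal-real-time) start)))))
;; (with-timing (expensive-computation)) expands into the LET above, so
;; the pattern lives in one place instead of being retyped everywhere.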
Most lisp books are pretty decent, and others have recommended some.
But, I'd recommend Structure and Interpretation of Computer Programs
by Abelson and Sussman. It's scheme & not common lisp, but it does a
good job of illustrating a variety of lisp programming paradigms.
--
Harvey J. Stein
BFM Financial Research
hjs...@bfr.co.il
>> If I can wrap my brain
>> around Lisp and find out what is so great about it, I would like to
>> embed it in another application as a control and extension language,
>> not unlike Emacs's usage. My project, however, is much smaller than
>> Emacs. I also don't expect to be able to implement the entire ANSI
>> standard, let alone CLOS. That is unless the interpreter doesn't have
>> to provide too much primitive functionality and the rest can be
>> implemented in Lisp itself.
>
>Nothing wrong with subsets. And yes, by all accounts you're best
>using as small a kernel as you can and writing the rest in Lisp.
>
You may also wish to look at Guile, the GNU extension language. It is
an implementation of Scheme built for the kind of problem you are
discussing.
Common Lisp would be nice, but it may not be worth the overhead in a
small application.
--
Kenneth P. Turvey <ktu...@pug1.SprocketShop.com>
An atheist is a man with no invisible means of support.
-- John Buchan
>p.s. My recollection from the 1970's when the word seemed popular was that
> "grok" has no "c" in it. I could be wrong, and went to check, but
> (alas) it's not in my dictionary under either spelling...
American Heritage (electronic) Dictionary v.4 says:
grok (grŏk) v. tr. grok·ked grok·king groks
Slang 1. To understand profoundly through intuition or empathy.
[ Coined by Robert A. Heinlein in his Stranger in a Strange Land]
(I don't know whether AHD is on-line.)
An on-line meta-dictionary is at http://www.onelook.com/
Merriam-Webster is at http://www.m-w.com/cgi-bin/dictionary
(which does not contain the word grok)
----
Many have explained why lisp isn't in the mainstream. A self-serving
meta-answer is that this is the result of economic theory. (Basically, it
just means that lisp failed to become mainstream because of social forces.)
One significant reason is that supreme ideas take time for the average
person to realize. Or, cynically: "people are stupid". In an ideal world,
where every programmer has deep interest and knowledge in computer science,
mathematics, and philosophy, it is likely that lisp ideas would be the
mainstream. In the real world, not all programmers are properly trained,
and of the many degree holders (a dime a dozen these days), lots are
unthinking products of mass education, whose interest in the sciences is
mostly career based.
It is worth noting that it is a fact that lisp is not in mainstream, and
this is so because of people. (your Providence does not play favoritism on
ideas good or bad.)
The section "the rise of worse-is-better" of Richard Gabriel's essay at
http://cbl.leeds.ac.uk/nikos/tex2html/examples/good-bad-win/node9.html
explains some aspects of why lisp's good ideas may be less popular. A more
detailed analysis along the same idea is the article "Big Ball of Mud" at
http://www-cat.ncsa.uiuc.edu/~yoder/papers/patterns/BBOM/mud.html
This long article analyzes the survival characteristics of the code we
normally call hacking (the big ball of mud), and in general condones such
practices. The article fails to indicate a significant reason (the social
element) why "big balls of mud" are popular: most programmers are average.
The Average lacks knowledge of advanced mathematics, imposing philosophy,
or the rigors of discipline. It is easy to see that the Average will
sloppily code away without giving a fart to theories, especially with
respect to many of the technical reasons cited in the article (in essence,
the reward of discipline is elusive, especially so in the infant stage of
computer science).
With hardware getting faster and faster and with advances in communication
(esp. the internet), I think the technical reasons that hinder lisp's
popularity are decreasing, and the superiority of lisp (or lisp-like modern
languages, e.g. functional languages) will _eventually_ become mainstream.
(Java is not one of them.)
(It is folklore that "good ideas will eventually win", but it is worth
noting that even if true, the process is never automatic. Mass education
takes time and effort: Lispers cannot just sit there and expect gospels to
knock on the Average's head. Evangelize, fight, do something!)
Xah, x...@best.com
http://www.best.com/~xah/PageTwo_dir/more.html
Perl: all unix's stupidity in one.
1. Lisp incorporates many (all?) of the good ideas in programming, in the
areas of:
+ programming style: object-oriented, functional, etc.
+ environment: libraries, interactive/dynamic development, etc.
+ computation: control structure, memory management, lambda calculus,
etc.
etc.
2. Lisp incorporates all these ideas not in an ad-hoc way, but in a
coordinated, integrated fashion.
3. The above would be enough to make Lisp a "darn good language", but
there are others to choose from. Each Lisp devotee has further found at
least one "killer feature" of the language which other languages cannot
compete with. (For many, myself included, this feature is the ability
to use the full language to create program-transformation programs.
This is enabled by the combination of macros, program-is-data, and the
dynamic compilation environment. Perhaps any lack of market pervasiveness
comes ultimately from the failure of most programmers to find their own
"killer feature" in Lisp.)
Some of these issues are mentioned at
http://www.elwood.com/alu/table/lisp.htm, and in particular, in the essays
referenced from http://www.elwood.com/alu/table/compare.htm and other places
on the site. You may find the C++ comparisons particularly useful.
With respect to using Lisp as an extension language: I think this is a
great idea to learn more about Lisp once you have some background, but may
not be the best way to start. I also believe that if you want something for
serious or commercial use, you might consider one of the existing systems.
See the ALU website, referenced above. Of course, I also recommend my
company's Eclipse product, which was designed for this purpose
(http://www.elwood.com/eclipse-info).
> p.s. My recollection from the 1970's when the word seemed popular was that
> "grok" has no "c" in it. I could be wrong, and went to check, but
> (alas) it's not in my dictionary under either spelling. Anyone
> got a pointer to an authority on this? It IS just the right
> word, and it's a shame it's not better accepted in the language.
Grok comes from Robert Heinlein's novel "Stranger in a Strange Land" and
it indeed has no C.
Erann Gat
g...@jpl.nasa.gov
% Grok comes from Robert Heinlein's novel "Stranger in a Strange Land" and
% it indeed has no C.
This is where my usage comes from. Unfortunately, I lack conventional
spelling skills. Sometimes words get past the spell checker.
--
David Steuber
http://www.david-steuber.com
To reply by e-mail, replace trashcan with david.
When the long night comes, return to the end of the beginning.
--- Kosh (???? - 2261 AD) Babylon-5
% tras...@david-steuber.com (David Steuber "The Interloper") writes:
%
% > I am trying to figure out why Lisp has a cult like following without
% > making it into the "mainstream" as a computer language.
%
% For the purposes of this message, I'll accept your branding of Lisp
% as cult even though I don't agree 100%.
I hope I didn't bring offense with the word cult. I meant it in the
sense of a small, enthusiastic following, not the religious sense.
Minority opinions are not always wrong. Heck, I wouldn't be surprised
if the minority views on most any issue were better thought out.
% Cult is about serving a need that is not served by the mainstream.
% Exactly because of the prevalence of C/C++, etc. people are driven to
% Lisp all the more strongly. Not all, but those who are cling
% strongly. And not to the language, I think, but to the ideas. The
% greater Lisp community sometimes uses other languages, but I think
% never really forgets its roots. Lisp was born of the AI research
% culture supporting specific features it needed particularly.
The AI connection is what attracts me to the language. Also the
simplicity of the syntax for parsing and evaluating purposes. My
interest is to implement an interpreter of Lisp in Java, leveraging
Java's garbage collection. I would hate to write my own garbage
collector with my meager talents. Also, Lisp appears to be a language
that would be exceptionally good for describing complex data
structures. That is, I could leverage the Lisp interpreter by making
Lisp my file format. Additional possibilities are also in my mind.
It may be useful to attach Lisp functions (or objects, I haven't seen
CLOS yet) to my data objects as properties. I haven't got a fully
formed design in my mind yet, but Lisp should be able to completely
describe a model and its behavior. This I may be able to do in Java
(it's like a third language to me), but I have an inkling that Lisp
may do the job better and in a more portable fashion.
% And then some fortuitous accidents of chance that are often, I think,
% misunderstood, but are no less powerful for the doing: The famous
% "program is data" thing is an example. Many languages can represent
% themselves. BASIC could represent itself with an array of strings.
I've done this in Perl. Well, sort of. I used Perl's eval function
to evaluate dynamically loaded code. I also have dynamically created
regular expressions in Perl. But Perl is not a language I wish to use
for my project.
% But what Lisp does that is critically important in power is to choose
% a representation--maybe not even the best representation--but one that
% is standard. The willingness to decide not to fight over program
% representation and simply to define it simply and understandably means
% a whole army of automatic programming and program manipulation tools
% (not the least of which is Lisp's rich macro packages and its long
% tradition of research in compiler design) have grown up simply because
% everyone understands and shares a common interchange format for
% programs in a structured form that requires no special parsing and
% no weird ad-hoc quotation rules. It's not enough for another language
% to tell you you CAN define a program representation; before they match
% Lisp, they must do it and do it once and for all. Most other
This is one of the features that attracts me to Lisp.
% languages, trying to be "small", insist on optional add-on libraries
% and then have endless disputes over whose add-on library is the one
% and only official standard one. This is pain for programmers because
% they have to know whose C they are using. C claims to be portable
% because it is ubiquitous, but Lisp (by which I mean Common Lisp
% particularly) was designed to be and I think truly is way more
Me too (Common Lisp).
% portable than C will ever dream of being. In the modern world where
% the PC dominates, one may think portable is less and less important,
% but portable is REALLY about not making dependencies on the environment,
% or about capturing those dependencies in an abstract way. So CL code
% speaks to the future not just to other machines. If I have an integer
% of a certain range, I can declare it (INTEGER 37 416) meaning
% "integer in the range 37-416, inclusive". How that is optimized
% may depend on the machine. But I do not have to say "the size of
% a machine word" and just hope to win. That's only one tiny example.
% Pathnames, too, are structured objects that have been designed so
% that the same portable conception can work on Unix, PC, MAC, VMS,
% Linux, CDC 7600, Lisp Machines, ISO 9660 file system, etc.
I don't get the need for an integer type in a certain range. But that
is my inexperience. However, portability is important for me. I want
people to have sufficient reason to move away from Microsoft OS.
Therefore, I don't want to target Windows specifically. At the same
time, I only want to do one build. For now, Java offers me that
capability. However, I am not certain of the success of Java. If I
can incorporate portions of my application in Lisp, there is less to
port if Java fails. The c.l.j.a group would hate me for saying that.
That brings up another point. I must say that this is about the
friendliest news group I've read. I haven't read more than about four
dozen postings. In that small sample, all the talk has been at a high
level of politeness. I'm not sure how else to put it. In any of the
other news groups I've been in, I would have read several flames.
If I was in comp.lang.java.advocacy asking the same questions about
Java's following and lack of real market success, I would have had to
wear an asbestos suit! This to me speaks very highly of the Lispers.
% Lisp has on and off been more popular than it is now. I don't see
% popularity as its only measure of success. Some amount of market is
% needed to keep Lisp alive and growing, and we're always happy for more
% than the baseline minimum but the practical fact is that just about
% any language can suffice for the low-end, and what Lisp excels at is
% the harder problems. As you yourself note, VB is very popular but no
But if Lisp was a mainstream language, it would be much easier to
distribute software written with it. Even more so if there was a
standard p-code form that it could be compiled to (like Java) so that
it could be distributed in binary form.
% one speaks kindly of it. I wouldn't trade Lisp's place for VB's--I'm
% proud of where Lisp is. Lisp tends to attract people with problems
% that they've given up ever solving in other languages. At Symbolics,
And VB, IMNSHO, is probably one of the biggest causes (if not the
biggest) of bad software. I'm all for languages that are easy to
learn. I hope Lisp falls into that category. There is just one
syntax. I just need to hear the song (Vorlon reference).
% the trade, I don't know. I think it's a fundamental fact about the
% universe that a single solution can't be good for every possible
% situation. But that Lisp is the thing that does the hard problems
% is enough. Consider that there are probably more ten dollar pocket
% cameras in the world than fancy theatrical movie cameras, but it's
% not a failure for the theatrical movie guys that they make what
% they do.
I wouldn't expect Lisp to perform well in an embedded system like the
Engine Control Unit in my car. But for general computing (even
serious number crunching), why shouldn't Lisp or some other language
be the best at everything? If you can say everything in the language,
and work with all the data types and representations, then all that is
left is to convert the human readable representation into the optimum
machine readable representation. Or am I missing something? I don't
have a CS degree, so it is entirely possible. Actually, I am a
fallible human. That is why I can't see everything.
% One reason Lisp solves the hard problems is that initial workout
% the AI community gave it. It required a lot of memory and disk
% in those days and it was expensive compared to other alternatives.
% So people turned to other things. But memory and disk get
% cheaper, other alternatives get bigger, and meanwhile Lisp has held
% the line on size so that it is becoming respectable again just by
% waiting for others to catch up in "apparent bloat"--only Lisp still
% has power packed into that size, and many other tools have grown
% without offering all the power that Lisp has, so don't be surprised
% if Lisp has more life in it in the future. I think the possibilities
% are really just opening up for Lisp, frankly. It's a bit of a chicken
% and egg problem because it needs market belief to do that, but
% I think it can and should be done.
Emacs is written in Lisp. Unfortunately, PC people don't get to play
with it except for the NT Emacs at Volker's site. I saw XEmacs on a
Sun box today. Very nice.
AutoCad has AutoLisp for doing things. I haven't messed with that
monstrosity in some years now.
It is as if Lisp is lurking around the corner, but just won't show
itself. I think you are probably right except for one thing. I have
a rule that I used to just apply to the stock market. But I have
found that it is almost universal. Never bet against stupidity. I
can't help but think how much smaller and simpler Microsoft Word would
be if it was written in Lisp with a built in interpreter so that a
user could write Lisp extensions to customize Word. I guess that
would be too much like a graphical version of Emacs! Anyway, who
would want CaptiveLisp?
% Sounds like a good start. Might wanna download a copy of the
% Common Lisp HyperSpec(TM). See
% http://www.harlequin.com/education/books/HyperSpec/
% for (free) download info and
% http://www.harlequin.com/education/books/HyperSpec/FrontMatter/
% to browse it online at Harlequin.
%
% See also my articles
% http://world.std.com/~pitman/PS/Hindsight.html
% where I discuss the benefits of Lisp in rapid prototyping and
% http://world.std.com/~pitman/PS/Lambda.html
% where I discuss the choice of language as a political rather than
% technical issue. (The remarks about status of certain standards are
% now somewhat dated, but the general thrust of the piece is still
% sound, I think.)
Thanks for the links. I'll check them out as soon as I can.
% I saw a talk by Gerry Sussman in which he suggested that indeed the most key
% contribution of computer science to the 20th century is not computation but
% the terminology for describing process. That is, the ability to talk about
% algorithms. That it stretches the mind, he said, and gives us ways to talk
% about how that has happened rather than just repeating things over and over
% until other people can parrot our actions is truly important. So I agree
% with your goal and wish you luck.
I have found that verbal skills seem to be more important than math
skills in computing. Sure, math is important. But both math and
communication require abstract thinking. We think about a process in
the abstract and then try to communicate it in a way that the receiver
forms the same abstraction. A modern programming language is nothing
more than an interface between the human and the machine. It does
have to pull off a neat trick though. It has to be both machine
readable and human readable. Natural language is full of ambiguity
that just can't exist in a computer language. Also, it seems that the
computer language we think in shapes the way we approach a problem
much like the natural language we speak shapes the way we communicate
our ideas. I'm sure this could be stated better.
In the English language, the phrase "the answer, my friend, is blowing
in the wind" can be interpreted at least three different ways.
Thanks to everyone else who replied to my post, both in this news
group and by e-mail. You have bolstered my determination to learn Lisp.
Some of you have recommended Scheme. I was considering Scheme until I
was able to find a source for the Common Lisp standard. It may be
myopic and parochial of me, but I want to deal with just one Lisp.
Even Emacs Lisp is an additional burden my finite mind doesn't wish to
cope with at this time. Then again, the Scheme spec is much smaller.
Laziness might make me reconsider.
What is it that makes an engineer? Laziness, impatience, and hubris?
Or is that why Unix and Perl are the way they are?
Well, I'll be squatting here for a while.
* g...@jpl.nasa.gov (Erann Gat)
| Grok comes from Robert Heinlein's novel "Stranger in a Strange Land" and
| it indeed has no C.
FWIW, it was included in Merriam-Webster's Ninth Collegiate Dictionary,
but had been removed in the Tenth edition.
#:Erik
--
http://www.naggum.no/spam.html is about my spam protection scheme and how
to guarantee that you reach me. in brief: if you reply to a news article
of mine, be sure to include an In-Reply-To or References header with the
message-ID of that message in it. otherwise, you need to read that page.
> * Kent M Pitman
> | p.s. My recollection from the 1970's when the word seemed popular was that
> | "grok" has no "c" in it. I could be wrong, and went to check, but
> | (alas) it's not in my dictionary under either spelling. Anyone
> | got a pointer to an authority on this? It IS just the right
> | word, and it's a shame it's not better accepted in the language.
>
> * g...@jpl.nasa.gov (Erann Gat)
> | Grok comes from Robert Heinlein's novel "Stranger in a Strange Land" and
> | it indeed has no C.
>
> FWIW, it was included in Merriam-Webster's Ninth Collegiate Dictionary,
> but had been removed in the Tenth edition.
>
> #:Erik
FTR it's also in "The New Hacker's Dictionary" (2nd Edition) without the "c".
According to this source the (Martian) word literally means "to drink" and
metaphorically means "to be one with".
__Jason
If you're willing to sacrifice raw speed for size (typical for some
scripting uses - mine at least) you might also want to look at
TinyScheme
at my homepage below. It's a single C module around 4000 kloc,
but near-R5RS Scheme nevertheless.
--
Dimitrios Souflis dsou...@altera.gr
Altera Ltd. http://www.altera.gr/dsouflis
*** Reality is what refuses to disappear when you stop believing
*** in it (VALIS, Philip K. Dick)
> The AI connection is what attracts me to the language. Also the
> simplicity of the syntax for parsing and evaluating purposes. My
> interest is to implement an interpreter of Lisp in Java, leveraging
> Java's garbage collection.
Check out http://www.norvig.com/SILK.html
Scheme in Fifty KB. (in Java).
You're welcome.
--
Bradford W. Miller
(I have a job, but you're talking to me)
Disclaimer: There are no states, corporations, or other pseudo-individuals.
There are only people. Justify your actions without resorting to your
orders.
"Unlike me, many of you have accepted the situation of your imprisonment
and will die here like rotten cabbages."
- Number Six's speech from "Free for All" _The Prisoner_
% Check out http://www.norvig.com/SILK.html
%
% Scheme in Fifty KB. (in Java).
%
% You're welcome.
Come up with a demented idea, and someone else has already implemented
it.
% If you're willing to sacrifice raw speed for size (typical for some
% scripting uses - mine at least) you might also want to look at
% TinyScheme
% at my homepage below. It's a single C module around 4000 kloc,
% but near-R5RS Scheme nevertheless.
Thanks. I'll take a look. I am really more interested in Common Lisp
though. Still, the scanning should be the same. As far as I know, no
one has been dumb enough to do the interpreter in Java like I want to.
Do you mean to say 4 MLOC, or 4 KLOC or what? Just wondering.
--
Greg Pfeil --- Software Engineer --- (pfeilgm@|http://)technomadic.org
"I was a victim of a series of accidents, as are we all."
--Malachi Constant in _The Sirens of Titan_
I take this to mean that you think the lexical analysis of Scheme and
Common Lisp are similar or the same. this is not so. Common Lisp has a
programmable reader that is used to read Lisp forms, while Scheme uses a
very static syntax that is even outside of Scheme itself. (i.e., Common
Lisp uses READ and uses the value of *READTABLE* even for code, while
Scheme cannot use READ because the syntax is specified as sequences of
characters, not as forms.)
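for a concrete taste of that programmability, here is the classic kind of
readtable sketch (the bracket syntax is invented for this example; it is
not standard Common Lisp):

;; teach READ to turn [a b c] into (list a b c):
(set-macro-character #\] (get-macro-character #\)))
(set-macro-character #\[
  (lambda (stream char)
    (declare (ignore char))
    (cons 'list (read-delimited-list #\] stream t))))
;; after this, READ on the text "[1 2 3]" yields (LIST 1 2 3).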
I also don't think you really want an interpreter for Common Lisp. it
seems like less work to compile Common Lisp to the JVM elsewhere, and
then compile enough Common Lisp code to sustain a development environment
on the JVM itself. I haven't seen any reports from the several people
who have expressed interest in such a stunt over the past couple years,
but you will certainly be a hero if you do it.
Erik, I assume you're wearing some sort of "standards purist" hat while
saying this...? ;-} Yes, of course, in principle you are absolutely
correct, but in practice I don't know of a single Scheme implementation
that does *NOT* use the same lexer for the user-visible READ procedure and
the reader buried in the LOAD procedure and the top-level REPL. In fact,
now that EVAL is in R5RS, one could even make Scheme's LOAD an optional
procedure, which could be coded by the user as:
(define (load fname)
(with-input-from-file fname
(lambda ()
(do ((exp (read) (read)))
((eof-object? exp)) ; note: returns unspec. value
(eval exp (interaction-environment))))))
And, except for the lack of error-handling, this:
(let loop ()
(display "my-repl> ")
(write (eval (read) (interaction-environment)))
(newline)
(loop))
is a perfectly functional REPL in every Scheme I've ever used. [O.k.,
so you have to leave off the 2nd arg to "eval" in the older ones...]
In fact, I can't see any way to *avoid* this, since R5RS explictly says
that READ "...is a parser for the nonterminal <datum>" [6.6.2], and
also, "Note that any string that parses as an <expression> also parses
as a <datum>." [7.1.2] So it should be perfectly correct to use READ
for Scheme expressions (well-formed ones, that is).
On the other hand... There is no defined way in Scheme to extend or modify
the syntax accepted by READ. With only one or two ad-hoc exceptions (dare I
say "hacks"?), I haven't seen anything in the Scheme world that comes close
to the Common Lisp notion of *READTABLE*, and certainly nothing that works
the same way in more than one implementation. And that's quite unfortunate,
given the wide differences in lexical "extensions" supported by various
Schemes. Some allow |CaseSensitiveSymbols With Spaces InVerticalBars| and
some don't. Some allow #|multi-line comments|# and some don't. A very few
support "#."; most don't. Were there a widely-accepted readtable-equivalent,
one could gloss over these differences. But as it is, if one is trying to
write Scheme code which will be even barely portable, one must simply avoid
*everything* that's not explicitly mentioned in R5RS [or even R4RS]. (*sigh*)
Hmmmm... But on the third hand [as Niven & Pournelle say], in R5RS "READ"
is listed as a "library" procedure, which means that while conforming
implementations must provide one [it's not an "optional" library procedure]
it need not be primitive -- it can be implemented in terms of other, more
primitive features. Just start with "READ-CHAR", and parse it yourself.
(I've done that before, actually, as an exercise.) There are enough required
type-conversion routines [I'm thinking specifically of "string->symbol"
and "char->integer"] to allow building any Scheme object up out of pieces.
Maybe *that's* supposed to be "the Scheme way" to get reader extensibility...?
-Rob
-----
Rob Warnock, 8L-855 rp...@sgi.com
Applied Networking http://reality.sgi.com/rpw3/
Silicon Graphics, Inc. Phone: 650-933-1673
2011 N. Shoreline Blvd. FAX: 650-964-0811
Mountain View, CA 94043 PP-ASEL-IA
Stalin uses different code to implement READ and to read in the source program.
When the source program is read in, the reader keeps track of source file
name, line, and character positions to include in error messages. READ does
not do this. In fact, the source program reader doesn't produce a typical
S expression. Rather it produces a structure of type S-EXPRESSION which
includes the encapsulated file, line, and character position.
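Roughly the shape of that, as a sketch with hypothetical field names (the
real Stalin sources are Scheme and will differ; this is Common Lisp purely
for illustration):

;; hypothetical sketch of an annotated source datum:
(defstruct s-expression
  datum   ; the ordinary object that was read
  file    ; source file name
  line    ; line number
  char)   ; character position within the line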
% * tras...@david-steuber.com (David Steuber "The Interloper")
% | I am really more interested in Common Lisp though. Still, the scanning
% | should be the same. As far as I know, no one has been dumb enough to do
% | the interpreter in Java like I want to.
%
% I take this to mean that you think the lexical analysis of Scheme and
% Common Lisp are similar or the same. this is not so. Common Lisp has a
% programmable reader that is used to read Lisp forms, while Scheme uses a
% very static syntax that is even outside of Scheme itself. (i.e., Common
% Lisp uses READ and uses the value of *READTABLE* even for code, while
% Scheme cannot use READ because the syntax is specified as sequences of
% characters, not as forms.)
Does this mean that while CL can read CL and eval it, Scheme can not
read Scheme and eval it? I am running into confusion here. I was
under the impression that Scheme was a dialect of Lisp. From what you
say, Scheme is a whole other language (like JavaScript isn't Java).
% I also don't think you really want an interpreter for Common Lisp. it
% seems like less work to compile Common Lisp to the JVM elsewhere, and
% then compile enough Common Lisp code to sustain a development environment
% on the JVM itself. I haven't seen any reports from the several people
% who have expressed interest in such a stunt over the past couple years,
% but you will certainly be a hero if you do it.
I still don't know if the Java byte code is flexible enough to allow
CL to be compiled to it. Also, the JVM only accepts .class files that
have to pass the byte code verifier. I have certainly thought of
doing that. It seems the more I learn about Lisp, the more it makes
sense for a Lisp interpreter to be written in Lisp. In either case,
there would have to be support classes written in Java for presenting
a runtime for Lisp in the JVM. It also makes sense to allow the Lisp
to use the other Java APIs via the Class class and reflect package.
This figment of my imagination has been labeled jLisp and is about as
vaporware as you can get.
This brings up another question. It seems that Lisp is not case
sensitive. How many people would be upset if a Lisp implementation
came out that was case sensitive? It sure makes string comparison
easier. This is just my view from a C++ background.
> This brings up another question. It seems that Lisp is not case
> sensitive. How many people would be upset if a Lisp implementation
> came out that was case sensitive? It sure makes string comparison
> easier. This is just my view from a C++ background.
Common Lisp IS case-sensitive.
The Lisp reader, however, is case-translating.
Internally, these symbols are all indistinguishable (i.e., EQ) and
all-uppercase:
foo Foo FOO |FOO|
The following symbols are indistinguishable and are all-lowercase:
|foo| \f\o\o \f|oo|
These symbol pairs are different (i.e., not EQ):
Foo |Foo| ; the former is all-uppercase, the latter is uppercase initial
foo |foo| ; the former is all-uppercase, the latter is all-lowercase
FOO |foo| ; the former is all-uppercase, the latter is all-lowercase
Further, there is no reason to make a Lisp implementation that is
case-sensitive. It would break enormous amounts of code to do so.
The language already provides these features:
*print-case* - special variable
readtable-case - function on readtables
Look them up in the Common Lisp HyperSpec(TM) for details. They are
plenty powerful enough to obviate the need for a non-conforming Lisp.
Further: string comparison has nothing to do with symbols. It's true
that FOO and |foo| are different symbols but not because they have
different names; rather because they are different objects (implicit
pointer compare). Of course, they have different pointers because
they have different names; but that's just coincidence. One shouldn't
confuse felicity with causality. For example, #:FOO and FOO and :FOO
have the same name and that doesn't make them the same symbol.
They are different pointers because one has no package, one is
in the prevailing package, and one is in the keyword package.
Then once they have different pointers, you compare them with EQ
and string compares are irrelevant.
But if you were using strings, you can either choose to compare them
as objects (in which case "Foo" and "FOO" are different under EQ
since they can't have the same pointer if they have different
elements) or you can compare them as strings, in which case you
have to choose STRING-EQUAL (case-insensitive) or STRING=
(case-sensitive). I don't see that either is easier than the
other. They are different. And you should use the right one for
the right purpose.
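A few REPL-style examples of the above, with results as comments (an
illustrative sketch; exact printing depends on *PRINT-CASE*):

(eq 'foo 'FOO)              ; => T   (the reader upcased both names to "FOO")
(eq 'foo '|foo|)            ; => NIL (different symbols)
(symbol-name 'foo)          ; => "FOO"
(string=      "Foo" "FOO")  ; => NIL (case-sensitive comparison)
(string-equal "Foo" "FOO")  ; => T   (case-insensitive comparison)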
A problem caused by case-insensitivity mentioned in R*RS is the string-symbol
conversion.
Guile scheme is case sensitive by default (it can be turned off optionally.).
Klaus Schilling
Yes, well, o.k., I misspoke (or perhaps wasn't specific enough). Yes, I
know of other Scheme readers (well, at least one) which annotate the source
similarly (e.g., the one in the Rice PLT group's "MrSpidey"). But what I was
referring to is that the user-visible READ *does* accept all legal source
programs, so that Erik's distinction between "prgrams defined by lexical
rules" and "programs defined by S-exprs" is, for all practical purposes, moot.
Now it is certainly the case that one could make a *mistake* in defining a
Lisp (or Scheme) by its lexical representation in such a way that there was
*not* a guarantee of equivalency, but I also suspect that most participants
in this group would consider such a definition badly broken (if only because
of the negative implications on macros).
+---------------
| READ does not do this. In fact, the source program reader doesn't produce
| a typical S expression. Rather it produces a structure of type S-EXPRESSION
| which includes the encapsulated file, line, and character position.
+---------------
Call me naive, but I don't see how this affects the issue... unless there
are files which Stalin [or some other Lisp or Scheme] will accept as programs
which its READ *won't* accept when a user program tries to read them (e.g.,
say, allowing "#+<sym>" or "#.<exp>" in "programs" but not in files
parsed with READ). And if there are, I'd call it broken.
-Rob
[p.s. Apologies in advance: Replies to this via email may get a
"vacation" bounce message while I'm on sabbatical from SGI...]
You are right in wondering... It is 4 kloc, or else it would be
MegaScheme ;-)
More precisely it is 4110 loc (.c+.h), which is either 4.1 kloc
or 4.0 kloc, depending on whether you use "kilo" for "thousand"
or 2^10=1024.
well, no, it means that you need a whole different machinery to read
Common Lisp than you need to read Scheme. Scheme was designed in the
fashion of static grammars, and people frequently implement the reader in
some other language. Common Lisp's reader is best written in Common Lisp
-- it dispatches on characters according to a table, the READTABLE, and
call out to functions that assemble the various types of objects. this
is overkill per excellence for Scheme. also, Scheme's grammar is defined
so that (foo 1 2 . ()) is different from (foo 1 2), although the Scheme
reader cannot detect this difference. therefore, you will not catch very
important errors if you use the naive approach to Scheme. (no, I'm not
talking in the "standards purist" mode -- Scheme is deliberately designed
this way, and second-guessing designers is a Bad Thing. of course, it
was a gross and very annoying mistake, but the designers should fix that,
not users.)
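as a small illustration of the table-driven dispatch mentioned above --
essentially the READ-DELIMITED-LIST example from the standard, adapted so that
[a b c] reads as (LIST a b c); the bracket syntax itself is just an invention
for the example:
  ;; Make #\] behave like #\) so it terminates tokens, then make #\[
  ;; read everything up to the matching #\] and wrap it in LIST.
  (set-macro-character #\] (get-macro-character #\)))
  (set-macro-character #\[
    (lambda (stream char)
      (declare (ignore char))
      (cons 'list (read-delimited-list #\] stream t))))
  ;; afterwards, [1 2 (+ 1 2)] reads as (LIST 1 2 (+ 1 2)).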
| I was under the impression that Scheme was a dialect of Lisp. From what
| you say, Scheme is a whole other language (like JavaScript isn't Java).
Scheme is not a dialect of Common Lisp. the common heritage, if any, is
so old and so far removed from today's reality that we're almost talking
about a missing link. Scheme owes more to Algol than to Lisp, in my
view, just like Java owes more to Smalltalk than to C++. the syntactic
similarity is irrelevant for meaningful comparisons. Scheme adherents
want to tell you Scheme is a dialect of Lisp, but I don't get it. what
do _they_ get from confusing people like that? is it a hope for a piece
of the market that Common Lisp has created? some Lispers (well, at least
one, but he's also a Schemer) will insist very strongly that Common Lisp
is not the only Lisp to consider. the only merit of this latter attitude
is that it makes the Scheme wish come true, but there aren't as many
Lisps as there used to be 10-15 years ago. if you spend your time
learning any other Lisp these days, I'd say you're wasting your time
unless, of course, the Lisp is an embedded Lisp in some product you need
to use. (this includes the sadly deficient Emacs Lisp.)
| This brings up another question. It seems that Lisp is not case
| sensitive.
well, I can't speak for "Lisp" in the sense that it is something that
Scheme is (also) a dialect of, but Common Lisp is certainly case
sensitive. symbols in the COMMON-LISP package are all uppercase, but you
would normally set *PRINT-CASE* to :DOWNCASE and (readtable-case
*readtable*) to :UPCASE, so you could read and write them in lowercase.
both |lowercase| and \l\o\w\e\r\c\a\s\e have lowercase symbol names,
so it's not like you cannot get it. also, to maintain backward
compatibility, you can set the readtable-case to :INVERT, which upcases
all-lowercase symbol names and downcases all-uppercase symbol names (not
counting characters escaped with | or \).
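roughly, the knobs and their effects look like this (a sketch assuming a
default readtable; the :INVERT example copies the readtable so the global one
is left alone):
  (readtable-case *readtable*)  ; => :UPCASE  (default: names are upcased on read)
  *print-case*                  ; => :UPCASE  (set to :DOWNCASE to print lowercase)
  (symbol-name 'foo)            ; => "FOO"
  (symbol-name '|lowercase|)    ; => "lowercase"
  (symbol-name '\f\o\o)         ; => "foo"
  ;; :INVERT upcases all-lowercase names and downcases all-uppercase ones:
  (let ((*readtable* (copy-readtable)))
    (setf (readtable-case *readtable*) :invert)
    (symbol-name (read-from-string "foo")))   ; => "FOO"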
| How many people would be upset if a Lisp implementation came out that was
| case sensitive?
if you make the reader preserve case, many would object to having to
write symbol names in all uppercase. if you invert the case of all the
symbol names, you create a new dialect that doesn't talk to other Common
Lisp implementations. amazingly, there is no standard function to do
case conversion the way the symbol reader and printer does it, so you
have to call out to the reader or printer to get that behavior.
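i.e., something along these lines (a sketch of the workaround, not a standard
idiom):
  ;; no portable case-conversion function matches the reader/printer rules,
  ;; so go through READ and PRINT themselves:
  (symbol-name (read-from-string "foo"))   ; => "FOO"  (reader-style conversion)
  (let ((*print-case* :downcase))
    (prin1-to-string 'foo))                ; => "foo"  (printer-style conversion)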
case is a personal preference in my view, and Common Lisp has done the
right thing in allowing each programmer to choose his own
style. and it's not like it's a hard thing to change.
| It sure makes string comparison easier.
what does? case sensitivity or case insensitivity? I think the latter
is strongly preferable when dealing with protocol or user input. btw,
STRING= is case sensitive, while STRING-EQUAL is case insensitive.
Others have corrected your misunderstanding about Common Lisp's case
sensitivity (it's case sensitive internally, but canonicalizes case of
symbols on input by default). But to answer your second question, the
difference in how the reader processed the case of input never seemed to
hurt the popularity of Franz Lisp or Emacs Lisp, which both have
case-preserving readers (Franz's may have been configurable as well, but
if so the default was to preserve).
--
Barry Margolin, bar...@bbnplanet.com
GTE Internetworking, Powered by BBN, Burlington, MA
*** DON'T SEND TECHNICAL QUESTIONS DIRECTLY TO ME, post them to newsgroups.
> But to answer your second question, the
> difference in how the reader processed the case of input never seemed to
> hurt the popularity of Franz Lisp or Emacs Lisp, which both have
> case-preserving readers (Franz's may have been configurable as well, but
> if so the default was to preserve).
First, let's be clear: a case-preserving reader is nonconforming.
Programs which don't do anything tricky don't notice this
nonconformance, but programs that do will notice this nonconformance.
As with the Clinton situation (where some are blaming the rulemakers
and some are blaming the rulebreakers), so goes the situation with
this readtable issue. Some will blame the language designers and some
will blame the implementors who don't implement the designed language.
Hmm. I guess I've accidentally compared Clinton to Franz. That wasn't
really intentional, but now I'm left with an ethical dilemma between
apologizing for it and asking Harlequin if that's bonus-worthy.
Maybe I should take a poll. (No, just kidding. My mailbox is still
not clear of all the dozen or two e-mails that people sent telling me
the proper citation for "grok". Please no followup on my bad use of
metaphor.)
Anyway, what I started out to say was that (as anyone who's done
prioritization of bug reports knows), "frequency" of problems (and
consequently of pain felt and reported by users) is a
multi-dimensional quantity. Saying that something happens "a lot"
doesn't make clear what axis of this multi-dimensional quantity you're
saying has a high magnitude. For example, I was once at Symbolics
when just before a major release, I found a bug in special binding of
lambda variables in a lambda combination (not just a lambda
expression; in an actual ((lambda ...)...). "It's all packaged and
ready to ship," someone protested. "How frequently does the bug
happen?" What was I to answer? Every time anyone runs my code, I
told them. But then, I don't know how often they do that. Nor do I
know how many developers are like me. So at minimum frequency is "how
often is code developed which exercises a given bug" along one axis vs
"how often is code containing the buggy code executed" along another
axis vs "how often in the buggy code that's being executed is the
path reached which tickles the bug". Maybe there are even more axes.
In the case in question, it just seemed embarrassing to have lambda
broken when you're a lisp machine company, so Symbolics fixed it. But
oddly enough, hardly anyone then or now used lambda combinations at
all, even though they're retained in the language. Most people use
LET instead. And of those, VERY few use specials in the binding
list because it's mostly classical programmers and scheme programmers
who like lambda combinations, and they all hate specials. So really
the bug was not likely to occur at all. Except when it did. And
then always. So it goes.
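(For anyone who hasn't seen one lately, the construct in question has roughly
this shape -- a sketch, not the original test case:)
  (defvar *depth* 0)            ; *DEPTH* is globally special via DEFVAR

  ;; A "lambda combination": a lambda expression in the operator position.
  ;; Binding *DEPTH* in its lambda list must establish a special binding,
  ;; which is exactly the sort of thing the bug got wrong.
  ((lambda (*depth*)
     (print *depth*))
   (1+ *depth*))                ; prints 1; *DEPTH* is 0 again afterwards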
Why does all this matter? Well, I guess I'm just trying to point out
that while what Barry said is probably true--that it probably hasn't
materially hurt Franz's implementation in the market--it doesn't follow
that it doesn't cause individuals grief. I bet it probably has, though
I have no stats to back me up. And I just feel a little bad for
the people who do get hit by this that they ALSO find people implying
it's a matter of no consequence. To the people it happens to, it is
of consequence. Whether it's of consequence to the world that it's of
consequence to these people--well, there's an open question.
Emacs on the other hand is another story. Emacs is not part of any
standard and since the design choice is entirely arbitrary, I
absolutely agree with Barry that it can't hurt really either way.
The important thing is just that it's defined.
The problem in the CL case is that it's defined a certain way and
(to my understanding) Franz doesn't follow the standard.
I hope Franz doesn't feel I'm picking on them here. I don't try to
play any kind of marketing games here on comp.lang.lisp. At
Harlequin, we're basically content to just beat them in the
marketplace. In this forum I prefer a more collegial atmosphere. So
my choice of them as an example is just because Barry raised the
issue and because it's one I felt commentworthy. Don't think I'm
telling you not to buy their products for the reasons I've cited
here. (There are plenty of other reasons I'd rather you use.
... heh ... sorry, couldn't resist.)
> Barry Margolin <bar...@bbnplanet.com> writes:
>
> > But to answer your second question, the
> > difference in how the reader processed the case of input never seemed to
> > hurt the popularity of Franz Lisp or Emacs Lisp, which both have
> > case-preserving readers (Franz's may have been configurable as well, but
> > if so the default was to preserve).
>
> First, let's be clear: a case-preserving reader is nonconforming.
> Programs which don't do anything tricky don't notice this
> nonconformance, but programs that do will notice this nonconformance.
Conforming to what standard? Common Lisp? Neither of the lisps
mentioned above are Common Lisps.
--
Duane Rettig Franz Inc. http://www.franz.com/ (www)
1995 University Ave Suite 275 Berkeley, CA 94704
Phone: (510) 548-3600; FAX: (510) 548-8253 du...@Franz.COM (internet)
well, they do now. although it doesn't appear they have forgiven me
completely for all the noise I made about it, ACL 5.0 does come with a
very significantly improved symbol reader and printer -- they both now
respect every relevant printer variable and print-read consistency is
maintained in all cases, which was not the case previously. I rolled my
own symbol reader and printer for ACL 4.3 because I got sufficiently
annoyed by the broken implementation then, but I'm happy to say that
that's history.
src.naggum.no/lisp/symbol-printer.cl (HTTP or FTP) has the code for ACL
4.3+, if anyone's interested. it's _very_ heavily optimized.
> Kent M Pitman <pit...@world.std.com> writes:
>
> > Barry Margolin <bar...@bbnplanet.com> writes:
> >
> > > But to answer your second question, the
> > > difference in how the reader processed the case of input never seemed to
> > > hurt the popularity of Franz Lisp or Emacs Lisp, which both have
> > > case-preserving readers (Franz's may have been configurable as well, but
> > > if so the default was to preserve).
> >
> > First, let's be clear: a case-preserving reader is nonconforming.
> > Programs which don't do anything tricky don't notice this
> > nonconformance, but programs that do will notice this nonconformance.
>
> Conforming to what standard? Common Lisp? Neither of the lisps
> mentioned above are Common Lisps.
Oh, he meant THAT Franz lisp. I know the dialect, but I assumed
he was referring to Allegro. Yes, I agree. non-CLs don't have
a conformance issue.
As to Allegro, I've heard tell it uses lowercase symbol names, which
would be a potential cause of portability problems, but I have no way
of confirming this because I don't have access to your/Franz's
products. If you'd like to correct my notion (and set the record
generally straight for others), I'd certainly welcome that. I'd
rather work with accurate info.
Of course, as I thought I (perhaps clumsily) mentioned, my prior
remarks were not aimed at Franz at all even though it was them I was
using as an example. It was just the example I (perhaps mistakenly)
thought was on the table when I went to react to and extend the
other comments on the subject.
(Sorry about all the Clinton comparisons, too, btw. I compared
several other unrelated things to Clinton at dinner tonight and
realized this whole US government political nightmare, about which
I've been commenting in other online [non-newsgroup] forums is really
getting to me...)
> Oh, he meant THAT Franz lisp. I know the dialect, but I assumed
> he was referring to Allegro. Yes, I agree. non-CL's dont have
> a conformance issue.
>
> As to Allegro, I've heard tell it uses lowercase symbol names, which
> would be a potential cause of portability problems,
No, Allegro has two modes of operation that can be switched on
the fly. In the Common Lisp mode, symbol names are read and stored
as required in the Common Lisp spec (which may include lower case
symbols if they are escaped). Also, as Erik mentioned, we have had
some bugs in some of the readtable operations regarding downcasing,
*print-escape*, etc, and have finally fixed them as of 5.0.
> but I have no way
> of confirming this because I don't have access to your/Franz's
> products. If you'd like to correct my notion (and set the record
> generally straight for others), I'd certainly welcome that. I'd
> rather work with accurate info.
Sure; the linux version is freely available at our website and is
representative of (i.e. the same as) what you would get if you
were a customer (without the support) ...
> Of course, as I thought I (perhaps clumsily) mentioned, my prior
> remarks were not aimed at Franz at all even though it was them I was
> using as an example. It was just the example I (perhaps mistakenly)
> thought was on the table when I went to react to and extend the
> other comments on the subject.
>
> (Sorry about all the Clinton comparisons, too, btw. I compared
> several other unrelated things to Clinton at dinner tonight and
> realized this whole US government political nightmare, about which
> I've been commenting in other online [non-newsgroup] forums is really
> getting to me...)
OK, no problem; perhaps we all have a little Clinton in us (did I just
say that?....)
> Sure; the linux version is freely available at our website and is
> representative of (i.e. the same as) what you would get if you
> were a customer (without the support) ...
Ah, you want me to run a 5th operating system on the computers in my
office. I agree that's -possible-....
> OK, no problem; perhaps we all have a little Clinton in us (did I just
> say that?....)
No, no. You're supposed to say: "The apology wasn't good enough.
Try again." ;-) But maybe you shouldn't. It could start a very
dark spiral that spends $40M the industry could better use on other
things.
I was answering a question that I thought was about Lisp, not Common Lisp.
So I gave some examples of well-known Lisp dialects that are
case-preserving. I had initially started to include Multics Maclisp, which
perhaps would have made it really clear to you what I was doing, but I felt
that would be too obscure for most readers.
Others have already clarified that I was not referring to Allegro CL when I
said Franz Lisp (Kent, did you really think that *I* would be so
imprecise?).
>The problem in the CL case is that it's defined a certain way and
>(to my understanding) Franz doesn't follow the standard.
The case control features of the CL reader were added at the request of
Franz, Inc. IIRC, they were case-preserving by default (probably as a
result of their Franz Lisp heritage), but this was under control of
proprietary variables that could be changed to conform to the CL way. By
the time the ANSI standard came out I expect they changed the default and
switched over to the standard configuration mechanism (*PRINT-CASE* and
READTABLE-CASE).
> Scheme is not a dialect of Common Lisp. the common heritage, if any, is
> so old and so far removed from today's reality that we're almost talking
> about a missing link. Scheme owes more to Algol than to Lisp, in my
> view, just like Java owes more to Smalltalk than to C++. the syntactic
> similarity is irrelevant for meaningful comparisons. Scheme adherents
> want to tell you Scheme is a dialect of Lisp, but I don't get it.
Did you mean to say "Scheme is not a dialect of Lisp" rather
than "not a dialect of Common Lisp"? In any case, it's unlikely
that anyone has ever claimed Scheme was a dialect of Common Lisp.
> what
> do _they_ get from confusing people like that? is it a hope for a piece
> of the market that Common Lisp has created? some Lispers (well, at least
> one, but he's also a Schemer) will insist very strongly that Common Lisp
> is not the only Lisp to consider.
Well, I'll certainly insist that Common Lisp is not the only Lisp.
Even if Common Lisp were the only important Lisp today, there would
still be Lisp's history. Lisp, like Madonna, turns 40 this year.
(I'm assuming the anniversary conference is in the right year
rather than properly checking, though).
The phrase "dialect of Lisp" has been used for many years, and
"dialect of Lisp" was taken to be a fairly broad category.
It was natural to speak of Scheme as a dialect of Lisp, and
one even finds sub-varieties of Scheme (such as T) referred
to as "dialects of Lisp".
That's the way people in the Lisp world talked (and they still do,
though not quite so much as before). What did they mean by "dialect
of Lisp"? One way to find out is to look at how that phrase was used
and hence at what things were said to be dialects of Lisp. If we do
that, then it seems pretty clear that, as a matter of usage, Scheme is
a "dialect of Lisp".
But that way of talking about Lisp can mislead, and it even
became somewhat dangerous once standardisation began, because
some people claimed Lisp was just one language (albeit with
dialects) and so should have only one standard.
The US position in the ISO WG was that Lisp was a family of
languages, and I believe that position was correct. And so
(the argument went) there should not be a standard for "Lisp"
but only for specific Lisp-family languages. John McCarthy
said he would denounce any standard for "Lisp".
An observation that goes back to at least the 70s is that
so-called dialects of Lisp could be as different from each other
as different languages in (say) the Algol family. That's one
reason why "dialect of Lisp" can mislead. Another problem
with "dialect" is that dialects tend not to be taken as
seriously as languages, as in the quote about a language
being a dialect with an army.
So, all things considered, the "dialect of Lisp" way of talking
is probably more trouble than it's worth.
-- jd
but what _is_ the Lisp that both Scheme and Common Lisp are dialects of?
if there is no such thing (and I don't think there is, anymore, although
there might well have been in the distant past), then it is not useful to
talk about it. "dialect" is not a mark of independent quality in my view
and there's a weird interaction between "dialect of Lisp" and the effect
the mark has on various languages that claim it: one language may try to
capitalize on the work of another ("we're _all_ dialects"), yet it may be
used to reduce the value of the work of another at the same time ("we're
all _merely_ dialects").
* Erik Naggum <er...@naggum.no>
| some Lispers (well, at least one, but he's also a Schemer) will insist
| very strongly that Common Lisp is not the only Lisp to consider.
* Jeff Dalton <je...@gairsay.aiai.ed.ac.uk>
| Well, I'll certainly insist that Common Lisp is not the only Lisp.
the two words "to consider" are essential to what I wrote above. I know
that you go ballistic every time somebody forgets to mention the myriads
of other Lisps that you want us to remember, and that's why I attempted
to be very specific. I'm annoyed that you appear to continue unimpeded
with your standard rhetoric.
| So, all things considered, the "dialect of Lisp" way of talking is
| probably more trouble than it's worth.
I agree, including your argumentation.
> The phrase "dialect of Lisp" has been used for many years, and
> "dialect of Lisp" was taken to be a fairly broad category.
> It was natural to speak of Scheme as a dialect of Lisp, and
> one even finds sub-varieties of Scheme (such as T) referred
> to as "dialects of Lisp".
I was reading about Lisp for years before I discovered Common Lisp (in
the mid 80s, FWIW), so I associate the name "Lisp" with a language or set
of languages that predates Common Lisp. The first Lisp book I read used
MacLisp, but the first Lisp code I remember reading (in Byte) was for a
very small dialect that ran on a 16K TRS-80.
So I'm rather bemused by all this fuss.
> So, all things considered, the "dialect of Lisp" way of talking
> is probably more trouble than it's worth.
A good point well put.
--
Remove insect from address to email me | You can never browse enough
"Ahh, aren't they cute" -- Anne Diamond describing drowning dolphins
It may be like what a US Supreme Court justice said about obscenity: I
can't define it, but I know it when I see it.
I remember that there was a debate at one of the first Lisp Users and
Vendors conferences (perhaps the first one after the Symbolics Lisp Users
Group morphed into the Association of Lisp Users) about whether Dylan
should be considered a dialect of Lisp, even though they were abandoning
parentheses.
Not that anyone likely cares, but Perl (Perl 5, in particular, since almost
everything is first-class) comes pretty close to being a dialect of Lisp,
IMHO.