Perhaps I am completely wrong, but it seems to me that programs such as
EMACS need a special Lisp. I don't know ELISP well enough to give actual
examples, but I would think that ELISP uses only a subset of CL, optimizes
the functions differently, and perhaps has some other extensions or
reductions. If they chose to use a subset of CL, they would still have to
somehow differentiate it from the actual CL.
Regards.
--
Richard Krushelnitskiy "I know not with what weapons World War III will
rich...@gmx.net be fought, but World War IV will be fought with
http://rkrush.cjb.net sticks and stones." -- Albert Einstein
You HAVE to throw into this a whopping huge chunk of caveat to the
effect that:
"When ELISP was created, Common Lisp didn't exist yet, Scheme didn't
exist either, and so none of the modern Lisp dialects actually
_existed_ to give any guidance..."
Most pointedly, ELISP appears to predate the introduction of lexical
scoping, which was something that Scheme experimented with, and Common
Lisp subsequently adopted.
I see little reason why CL could not have been used instead of ELISP,
had it existed at the time.
If someone were to reimplement Emacs from scratch, it seems likely to
me that it would be a good candidate for _massive_ amounts of CLOS
definitions, and it would likely be very different under the covers
than it is.
But the fact is that ELISP was designed before the new college
students were _born_, and now, 20 years later, it's rather late to try
to turn it into either CL or Scheme, as there's vastly too much
existing code written in ELISP.
--
(reverse (concatenate 'string "ac.notelrac.teneerf@" "454aa"))
http://www.cbbrowne.com/info/nonrdbms.html
"...once can imagine the government's problem. This is all pretty
magical stuff to them. If I were trying to terminate the operations
of a witch coven, I'd probably seize everything in sight. How would I
tell the ordinary household brooms from the getaway vehicles?"
-- John Perry Barlow
cbbrowne> Richard Krush <rich...@gmx.net> writes:
>> Perhaps I am completely wrong, but it seems to me that programs such
>> as EMACS need a special Lisp. I don't know ELISP well enough to give
>> actual examples, but I would think that ELISP uses only a subset of
>> CL, optimizes the functions differently, and perhaps has some other
>> extensions or reductions. If they chose to use a subset of CL, they
>> would still have to somehow differentiate it from the actual CL.
cbbrowne> You HAVE to throw into this a whopping huge chunk of caveat to the
cbbrowne> effect that:
cbbrowne> "When ELISP was created, Common Lisp didn't exist yet, Scheme didn't
cbbrowne> exist either, and so none of the modern Lisp dialects actually
cbbrowne> _existed_ to give any guidance..."
cbbrowne> Most pointedly, ELISP appears to predate the introduction of lexical
cbbrowne> scoping, which was something that Scheme experimented with, and Common
cbbrowne> Lisp subsequently adopted.
cbbrowne> I see little reason why CL could not have been used
cbbrowne> instead of ELISP, had it existed at the time.
Except that none of that is true!
GNU Emacs (and its Lisp extension language) were created around 1984.
Scheme was 1978, and Common Lisp (CLtL) was published in 1984 (based
on the existing implementations prior to that). And don't forget that
RMS was at the AI lab and was fully aware of (and participated to some
degree in) all the work for the preceding 10 years leading up
to those languages.
RMS didn't think that lexical scoping was a good idea for Emacs,
because dynamic scoping makes it easier to rebind ("customize")
all the variables in the system.
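Concretely (a rough sketch in CL rather than Elisp, with invented
names): because *FILL-WIDTH* is special, any caller can temporarily
override it for everything downstream, without passing it around as an
argument.
  (defvar *fill-width* 70)   ; dynamically scoped ("special")
  (defun fill-paragraph-sketch (text)
    (format nil "~A (filled to ~D columns)" text *fill-width*))
  (let ((*fill-width* 132))
    (fill-paragraph-sketch "wide text"))
  ;; => "wide text (filled to 132 columns)"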
The Emacs that was written for the Lisp Machine (years before GNU
Emacs was started), called ZWEI, was written in Lisp Machine Lisp
(sometimes called "ZetaLisp") which was one of the main precursors
to and a superset of Common Lisp. It's a somewhat different approach
internally than GNU Emacs (for example, buffer pointers rather than
cursor-oriented) and uses lexical scoping.
> The main example that doesn't make sense to me is Festival.. it comes with
> a LISP interpreter that has an additional function SayText, rather than
> implementing a SayText function as a LISP library..
Well, I think in the case of Festival, it's some huge mass of C++ code
and at some point they felt they needed an ability to script it, so
they used (I think) SIOD to do that, for which it's well suited. I
don't think there was an intention of implementing Festival in Lisp
for whatever reason, and they almost certainly did not want to spend
ages sorting out foreign calls into n different Lisp implementations
so they just picked one.
--tim
> >>>>> On Sat, 22 Sep 2001 03:29:42 GMT, cbbrowne ("cbbrowne") writes:
[...]
> cbbrowne> I see little reason why CL could not have been used
> cbbrowne> instead of ELISP, had it existed at the time.
>
> Except that none of that is true!
>
> GNU Emacs (and its Lisp extension language) were created around 1984.
> Scheme was 1978, and Common Lisp (CLtL) was published in 1984 (based
> on the existing implementations prior to that). And don't forget that
> RMS was at the AI lab and was fully aware of (and participated to some
> degree in) all the work for the preceding 10 years leading up
> to those languages.
>
> RMS didn't think that lexical scoping was a good idea for Emacs,
> because dynamic scoping makes it easier to rebind ("customize")
> all the variables in the system.
The meme that Elisp predates CL has been spreading for a number of
years since CL was fully standardized and people began to wonder why
their Emacs wasn't compatible with their Lisp system.
RMS isn't exactly the _source_ of this meme, but suffice to say that
he hasn't said much to prevent its spread. It's now widely understood
that dynamic scoping, particularly in Elisp, is a *Bad Thing* because
of its surprising effects in certain situations. Since Emacs can Do
No Wrong, RMS isn't exactly willing to mention that he made a dumb
decision back in 1984.
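The sort of surprise in question, sketched in CL with a special
variable (the names are invented): a caller's supposedly private LET
binding silently changes the behaviour of everything it calls.
  (defvar *items* '(a b c))
  (defun item-count () (length *items*))
  (defun broken-caller ()
    (let ((*items* nil))   ; meant as a private temporary...
      (item-count)))       ; ...but ITEM-COUNT now sees NIL, and returns 0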
One annoyance about the continued persistence of dynamic scoping in
Elisp is that it has encouraged others to implement dynamic scoping in
their own Lisps. The 'librepl' that is used in (I think) the Sawmill
X window manager is also dynamically scoped, and lots of code written
in it unfortunately relies on the peculiarities of dynamic scoping.
The XEmacs people have discussed reimplementing the Lisp engine in
XEmacs over and over. The major target languages are Scheme and CL.
Using Scheme would conveniently allow the use of Guile as the
implementation for XEmacs, but some people (myself included) think
that Guile needs some major cleanup before it gets used for something
as large as XEmacs (its use in SCWM is a good example of some of its
current deficiencies).
There are a number of other *NASTY* problems with Elisp, particularly
the way it interfaces with its garbage collector. This requires hairy
macro placement (using GCPRO()) to keep pointers from going missing in
the divide between C and Elisp.
Everyone pretty much agrees that the current guts of Elisp in XEmacs
suck, but nobody really wants to start down the road of rewriting the
whole thing *and then* porting the megs of software written in Elisp.
Compounded with that is that RMS will probably never allow GNU Emacs
to be rewritten to use a different Lisp, and hence the software that
is shared between the two most popular Emacsen (eg, Gnus, W3, PCL-CVS,
AUCTeX, etc) would no longer be portable between the two. Maintainers
would be forced to decide between one or the other Emacs, or to
maintain two distinct versions of their code.
XEmacs developers came up with the idea of an Elisp compatibility
package, but the details of this never materialized, IIRC.
> The Emacs that was written for the Lisp Machine (years before GNU
> Emacs was started), called ZWEI, was written in Lisp Machine Lisp
> (sometimes called "ZetaLisp") which was one of the main precursors
> to and a superset of Common Lisp. It's a somewhat different approach
> internally than GNU Emacs (for example, buffer pointers rather than
> cursor-oriented) and uses lexical scoping.
ZWEI actually makes more sense in its internals than Emacs does. But
both of them still leave me running away. :-)
After that rant I should say that I still respect RMS for the huge
amount of work he's done, but he does get a bit annoying at times with
his recalcitrance... Can't be a figurehead without people disliking
you, anyway. :-P
'james
--
James A. Crippen <ja...@unlambda.com> ,-./-. Anchorage, Alaska,
Lambda Unlimited: Recursion 'R' Us | |/ | USA, 61.2069 N, 149.766 W,
Y = \f.(\x.f(xx)) (\x.f(xx)) | |\ | Earth, Sol System,
Y(F) = F(Y(F)) \_,-_/ Milky Way.
> The meme that Elisp predates CL has been spreading for a number of
> years since CL was fully standardized and people began to wonder why
> their Emacs wasn't compatible with their Lisp system.
>
> RMS isn't exactly the _source_ of this meme, but suffice to say that
> he hasn't said much to prevent its spread.
He seems to have some serious beef against CL. I say "seems", because
I'm not sure that he's the source, but GNU Emacs doesn't allow the use
of the cl extension package in its distributed code. Which is crazy,
because there's some good stuff in there. Also, the whole choice of
Scheme as the official GNU extension language.
> It's now widely understood
> that dynamic scoping, particularly in Elisp, is a *Bad Thing* because
^^^^^^^^^^^^^^^^^^^^^
> of its surprising effects in certain situations.
I hardly think it's "widely understood" that dynamic scoping in Elisp
is a bad thing. Certainly some people think it is, but then there are
also a lot of proponents. My personal guess is that maintainers of
large packages are the least fond of it, and those trying to customize
others' large packages are the most fond of it. It definitely has its
good and bad sides. IMO the only way to get rid of dynamic scoping
without making Elisp a lot less useful for its intended purpose
(customizing Emacs), would be to rewrite the whole thing to use CLOS
and be very disciplined about using OO style. I guess that given the
way advice is implemented, around advice could be used in a
lexically-scoped Emacs to emulate dynamic scoping.
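For instance, around advice is roughly what CLOS gives you with
:around methods -- a wrapper layered over the original definition
without touching it (a hypothetical sketch; the generic function and
its behaviour are made up):
  (defgeneric render-line (line))
  (defmethod render-line ((line string))
    (format t "~A~%" line))
  ;; A user "customization" wrapped around the original, much as
  ;; around-advice wraps a function in Emacs:
  (defmethod render-line :around ((line string))
    (format t ";; before~%")
    (call-next-method)
    (format t ";; after~%"))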
> One annoyance about the continued persistence of dynamic scoping in
> Elisp is that it has encouraged others to implement dynamic scoping in
> their own Lisps. The 'librepl' that is used in (I think) the Sawmill
> X window manager is also dynamically scoped, and lots of code written
> in it unfortunately relies on the peculiarities of dynamic scoping.
I absolutely agree here. Emacs' status has given an impression of
credibility to its design decisions, outside of any context. I think
a lot of the way Elisp works is the right decision for Emacs now, not
because I couldn't think of a better way to do it, but because any
better way would involve rewriting Gnus, VM, AucTeX, etc, etc, making
the cure worse than the disease. Without understanding the context of
the goodness of these design decisions, though, people apparently will
just copy those decisions.
> There are a number of other *NASTY* problems with Elisp, particularly
> the way it interfaces with its garbage collector. This requires hairy
> macro placement (using GCPRO()) to keep pointers from going missing in
> the divide between C and Elisp.
I admit it's nit-picking a bit to differentiate between the language
and its implementation when the language isn't standardized and it
only has two nearly-identical implementations, but this is definitely
an implementation issue, and not one with the language. This is only
visible from within the implementation, and could be fixed without
breaking any existing lisp code. So it's a problem with the Elisp
interpreter, not with Elisp itself. If I'm not mistaken, GNU Emacs is
planning on replacing the current GC with a portable conservative GC
(maybe Boehm's), eventually making the whole, awful GCPRO mess a
non-issue.
> Everyone pretty much agrees that the current guts of Elisp in XEmacs
> suck, but nobody really wants to start down the road of rewriting the
> whole thing *and then* porting the megs of software written in Elisp.
> Compounded with that is that RMS will probably never allow GNU Emacs
> to be rewritten to use a different Lisp, and hence the software that
> is shared between the two most popular Emacsen (eg, Gnus, W3, PCL-CVS,
> AUCTeX, etc) would no longer be portable between the two. Maintainers
> would be forced to decide between one or the other Emacs, or to
> maintain two distinct versions of their code.
See above about the cure being worse than the disease :). I think at
some point, crufty old Elisp will go away. I don't think it will
happen because of the will of the XEmacs maintainers, though. It's
gonna require a consensus on the part of the maintainers of the major
packages that things have gotten to the point where they'd rather
rewrite their packages than continue on in Elisp. Otherwise, no one
would use the New Improved Got-No-AUCTeX-No-W3-No-ILISP-No-Gnus Emacs.
> ja...@unlambda.com (James A. Crippen) writes:
>
> > The meme that Elisp predates CL has been spreading for a number of
> > years since CL was fully standardized and people began to wonder why
> > their Emacs wasn't compatible with their Lisp system.
> >
> > RMS isn't exactly the _source_ of this meme, but suffice to say that
> > he hasn't said much to prevent its spread.
>
> He seems to have some serious beef against CL. I say "seems", because
> I'm not sure that he's the source, but GNU Emacs doesn't allow the use
> of the cl extension package in its distributed code. Which is crazy,
> because there's some good stuff in there. Also, the whole choice of
> Scheme as the official GNU extension language.
It's probably because lots of people he no longer likes were involved
with the CL standardization effort. RMS is kinda snippety about
holding on to grudges. He's *still* got it in for Symbolics, even
though they're twice dead and resurrected already, 20 years later.
I think he just has some psychological problems that need some good
therapy. But that won't happen, because if we lost the free-software
loudmouth who would we replace him with? :-)
> I hardly think it's "widely understood" that dynamic scoping in Elisp
> is a bad thing. Certainly some people think it is, but then there are
> also a lot of proponents. My personal guess is that maintainers of
> large packages are the least fond of it, and those trying to customize
> others' large packages are the most fond of it. It definitely has its
> good and bad sides. IMO the only way to get rid of dynamic scoping
> without making Elisp a lot less useful for its intended purpose
> (customizing Emacs), would be to rewrite the whole thing to use CLOS
> and be very disciplined about using OO style. I guess that given the
> way advice is implemented, around advice could be used in a
> lexically-scoped Emacs to emulate dynamic scoping.
CL has specials. Also, much of the effects of customizability could
be better done using something more rigorous than random variables
that the user setqs. The whole idea of using huge piles of variables
for customization is a bad idea because they proliferate all too
easily, particularly without a coherent namespace management system,
like CL's packages.
I'd prefer to see feature alists or something. One alist per mode, or
the like. Whatever. There are lots of better ways to do it than the
way it's done in Emacs right now.
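A minimal sketch of the one-alist-per-mode idea (everything below is
invented for illustration), kept in its own package so the names don't
leak into everyone else's namespace:
  (defpackage #:text-mode-config
    (:use #:cl)
    (:export #:*mode-features* #:mode-feature))
  (in-package #:text-mode-config)
  ;; Keyword keys, so lookups work the same from any package.
  (defvar *mode-features*
    '((:fill-column . 72)
      (:auto-fill   . t)))
  (defun mode-feature (key &optional default)
    (let ((entry (assoc key *mode-features*)))
      (if entry (cdr entry) default)))
  ;; (text-mode-config:mode-feature :fill-column)  => 72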
> > in it unfortunately relies on the peculiarities of dynamic scoping.
>
> I absolutely agree here. Emacs' status has given an impression of
> credibility to its design decisions, outside of any context. I think
> a lot of the way Elisp works is the right decision for Emacs now, not
> because I couldn't think of a better way to do it, but because any
> better way would involve rewriting Gnus, VM, AucTeX, etc, etc, making
> the cure worse than the disease. Without understanding the context of
> the goodness of these design decisions, though, people apparently will
> just copy those decisions.
And unfortunately, copying those decisions in the form of *completely
new*, *nonstandard* Lisps just makes the proliferation and
compatibility problems worse.
It's so easy for someone to write a new Lisp that they never bother to
read up on implementation details, and perpetuate some of the more
ugly kluges and hacks without knowing that they're ugly.
> I admit it's nit-picking a bit to differentiate between the language
> and its implementation when the language isn't standardized and it
> only has two nearly-identical implementations, but this is definitely
> an implementation issue, and not one with the language. This is only
> visible from within the implementation, and could be fixed without
> breaking any existing lisp code. So it's a problem with the Elisp
> interpreter, not with Elisp itself. If I'm not mistaken, GNU Emacs is
> planning on replacing the current GC with a portable conservative GC
> (maybe Boehm's), eventually making the whole, awful GCPRO mess a
> non-issue.
That's the hope, but it's been the hope to get rid of the GCPRO mess
for a number of years now. The mess is really entrenched and poorly
understood.
> > Everyone pretty much agrees that the current guts of Elisp in XEmacs
> > suck, but nobody really wants to start down the road of rewriting the
> > whole thing *and then* porting the megs of software written in Elisp.
> > Compounded with that is that RMS will probably never allow GNU Emacs
> > to be rewritten to use a different Lisp, and hence the software that
> > is shared between the two most popular Emacsen (eg, Gnus, W3, PCL-CVS,
> > AUCTeX, etc) would no longer be portable between the two. Maintainers
> > would be forced to decide between one or the other Emacs, or to
> > maintain two distinct versions of their code.
>
> See above about the cure being worse than the disease :). I think at
> some point, crufty old Elisp will go away. I don't think it will
> happen because of the will of the XEmacs maintainers, though. It's
> gonna require a consensus on the part of the maintainers of the major
> packages that things have gotten to the point where they'd rather
> rewrite their packages than continue on in Elisp. Otherwise, no one
> would use the New Improved Got-No-AUCTeX-No-W3-No-ILISP-No-Gnus Emacs.
It'll never get to this point because, well, it's Lisp, innit? It's a
lot better than C, right? So we'll just live with its problems!
Which is okay, but annoying.
The real solution is to borrow the work done on automatic translation
of Elisp to Scheme that the Edwin people did for MIT Scheme. Snarf
that stuff and figure out how far you'd have to go to get it to do CL.
Then snatch a CL implementation (or roll your own partial-CL) and port
the front end of Emacs to it. A lot of work, which is why nobody's
doing it.
This debate goes on forever, BTW. I'm just saying the same things
people have said before. And yes, I'm too lazy to do the work
myself. If I wasn't as lazy, I'd be a C programmer, not a Lisp
hacker. But if I was any more lazy I'd be using Lazy ML. :-D
> t...@hurricane.OCF.Berkeley.EDU (Thomas F. Burdick) writes:
>
> > I hardly think it's "widely understood" that dynamic scoping in Elisp
> > is a bad thing. Certainly some people think it is, but then there are
> > also a lot of proponents. My personal guess is that maintainers of
> > large packages are the least fond of it, and those trying to customize
> > others' large packages are the most fond of it. It definitely has its
> > good and bad sides. IMO the only way to get rid of dynamic scoping
> > without making Elisp a lot less useful for its intended purpose
> > (customizing Emacs), would be to rewrite the whole thing to use CLOS
> > and be very disciplined about using OO style. I guess that given the
> > way advice is implemented, around advice could be used in a
> > lexically-scoped Emacs to emulate dynamic scoping.
>
> CL has specials. Also, much of the effects of customizability could
> be better done using something more rigorous than random variables
> that the user setqs. The whole idea of using huge piles of variables
> for customization is a bad idea because they proliferate all too
> easily, particularly without a coherent namespace management system,
> like CL's packages.
>
> I'd prefer to see feature alists or something. One alist per mode, or
> the like. Whatever. There are lots of better ways to do it than the
> way it's done in Emacs right now.
I agree with all of the above, but it's all orthogonal to
lexical/dynamic scoping (except for the mention of specials,
obviously).
> And unfortunately, copying those decisions in the form of *completely
> new*, *nonstandard* Lisps just makes the proliferation and
> compatibility problems worse.
>
> It's so easy for someone to write a new Lisp that they never bother to
> read up on implementation details, and perpetuate some of the more
> ugly kluges and hacks without knowing that they're ugly.
Yep, apparently it's easier than reading the CLISP or ECLS
documentation :-/
> > See above about the cure being worse than the disease :). I think at
> > some point, crufty old Elisp will go away. I don't think it will
> > happen because of the will of the XEmacs maintainers, though. It's
> > gonna require a consensus on the part of the maintainers of the major
> > packages that things have gotten to the point where they'd rather
> > rewrite their packages than continue on in Elisp. Otherwise, no one
> > would use the New Improved Got-No-AUCTeX-No-W3-No-ILISP-No-Gnus Emacs.
>
> It'll never get to this point because, well, it's Lisp, innit? It's a
> lot better than C, right? So we'll just live with its problems!
Actually, you're probably right. I'd rather write a
`with-lexical-scoping' macro that would make all the binding
primitives fake lexical scoping unless I (declare (special ...))'ed
'em. I guess that makes me part of the problem :)
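(For contrast, the semantics being faked there is just what CL does
already: LET is lexical unless the variable is special. A trivial,
made-up illustration:)
  (defvar *option* :global)       ; special, i.e. dynamically scoped
  (defun peek-option () *option*)
  (defun demo ()
    (let ((hidden :lexical))      ; lexical: PEEK-OPTION can never see it
      (let ((*option* :rebound))  ; special: visible to everything called here
        (list hidden (peek-option)))))
  ;; (demo) => (:LEXICAL :REBOUND)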
> The real solution is to borrow the work done on automatic translation
> of Elisp to Scheme that the Edwin people did for MIT Scheme. Snarf
> that stuff and figure out how far you'd have to go to get it to do CL.
> Then snatch a CL implementation (or roll your own partial-CL) and port
> the front end of Emacs to it. A lot of work, which is why nobody's
> doing it.
The problem is that this would force the new Emacs to make a lot of
the same misdecisions as the old Elisp-based one. If I had 50-hour
days, I'd think it would be interesting to design a new Emacs from the
ground up, but if I had to maintain compatibility with Elisp-based
Emacs, even with 50-hour days I wouldn't want to.
> This debate goes on forever, BTW.
Oh, yeah, but it seemed interesting, and this is Usenet, after all,
with no expectation of actually accomplishing anything :)
> The meme that Elisp predates CL has been spreading for a number of
> years since CL was fully standardized and people began to wonder why
> their Emacs wasn't compatible with their Lisp system.
I guess I have to confess a mistake on this one. It still seems not
outrageous to consider that there was considerable "flux" going on
then, and that dynamic scoping wasn't necessarily a backwards and dumb
choice at the time.
> RMS isn't exactly the _source_ of this meme, but suffice to say that
> he hasn't said much to prevent its spread. It's now widely
> understood that dynamic scoping, particularly in Elisp, is a *Bad
> Thing* because of its surprising effects in certain situations.
> Since Emacs can Do No Wrong, RMS isn't exactly willing to mention
> that he made a dumb decision back in 1984.
Mind you, what we'd do today, using CL (with CLOS!), is quite
different from what would have been likely back in 1984. So while I
may have been emitting the "usual, mistaken meme," it would be equally
wrong to look at the CLHS as the description of what "ought" to have
been used...
> One annoyance about the continued persistence of dynamic scoping in
> Elisp is that it has encouraged others to implement dynamic scoping
> in their own Lisps. The 'librepl' that is used in (I think) the
> Sawmill X window manager is also dynamically scoped, and lots of
> code written in it unfortunately relies on the peculiarities of
> dynamic scoping.
> The XEmacs people have discussed reimplementing the Lisp engine in
> XEmacs over and over. The major target languages are Scheme and CL.
> Using Scheme would conveniently allow the use of Guile as the
> implementation for XEmacs, but some people (myself included) think
> that Guile needs some major cleanup before it gets used for
> something as large as XEmacs (its use in SCWM is a good example of
> some of its current deficiencies).
Could you elaborate on the SCWM comment?
> There are a number of other *NASTY* problems with Elisp,
> particularly the way it interfaces with its garbage collector. This
> requires hairy macro placement (using GCPRO()) to keep pointers from
> going missing in the divide between C and Elisp.
It's entirely possible that any other implementation would have been
just about as bad...
> Everyone pretty much agrees that the current guts of Elisp in XEmacs
> suck, but nobody really wants to start down the road of rewriting
> the whole thing *and then* porting the megs of software written in
> Elisp. Compounded with that is that RMS will probably never allow
> GNU Emacs to be rewritten to use a different Lisp, and hence the
> software that is shared between the two most popular Emacsen (eg,
> Gnus, W3, PCL-CVS, AUCTeX, etc) would no longer be portable between
> the two. Maintainers would be forced to decide between one or the
> other Emacs, or to maintain two distinct versions of their code.
> XEmacs developers came up with the idea of an Elisp compatibility
> package, but the details of this never materialized, IIRC.
Making it cover the simultaneous needs of:
- compatibility
- speed
- cleanliness in the 'new syntax'
makes it hard even in concept to fill in details.
The Guile folk had a different but similarly ambitious vision to make
it simultaneously support several syntaxes, notably a C-like one
called "C-Tax," and then fell afoul of the intent to try to support
Perl syntax. The notion of supporting _that_ seemed to daunt any
consideration of trying to implement _any_ of this stuff...
> > The Emacs that was written for the Lisp Machine (years before GNU
> > Emacs was started), called ZWEI, was written in Lisp Machine Lisp
> > (sometimes called "ZetaLisp") which was one of the main precursors
> > to and a superset of Common Lisp. It's a somewhat different approach
> > internally than GNU Emacs (for example, buffer pointers rather than
> > cursor-oriented) and uses lexical scoping.
>
> ZWEI actually makes more sense in its internals than Emacs does. But
> both of them still leave me running away. :-)
> After that rant I should say that I still respect RMS for the huge
> amount of work he's done, but he does get a bit annoying at times
> with his recalcitrance... Can't be a figurehead without people
> disliking you, anyway. :-P
Well, if he wasn't so "recalcitrant," he'd probably not have stayed
unswervingly behind the GPL for such a long time. Sometimes that's
for the worse, sometimes for the better.
The utter lack of compromise on so many issues certainly shows him off
as "not the most diplomatic person in the world;" that's bad if the
goal is to make treaties, but it's kind of useful to have someone so
steadfast around irrespective of whether or not you agree with them...
--
(concatenate 'string "aa454" "@freenet.carleton.ca")
http://www.cbbrowne.com/info/sap.html
"People are more vocally opposed to fur than leather because it's
easier to harass rich women than motorcycle gangs." [bumper sticker]
> Christopher Stacy <cst...@spacy.Boston.MA.US> writes:
> One annoyance about the continued persistence of dynamic scoping in
> Elisp is that it has encouraged others to implement dynamic scoping in
> their own Lisps. The 'librepl' that is used in (I think) the Sawmill
> X window manager is also dynamically scoped, and lots of code written
> in it unfortunately relies on the peculiarities of dynamic scoping.
This is incorrect. rep (librep is the name of the library that is
linked to for applications that would like to use rep) is lexically
scoped. I don't know if it even supports dynamic scope at all. It's
sort of halfway between elisp and scheme for the most part, which
makes me terribly confused most of the time when I'm trying to use
it. :)
Oh and Sawmill is a web log analyzer. The window manager's name had to
be changed to Sawfish due to a trademark conflict.
--
-> -/- - Rahul Jain - -\- <-
-> -\- http://linux.rice.edu/~rahul -=- mailto:rahul...@usa.net -/- <-
-> -/- "I never could get the hang of Thursdays." - HHGTTG by DNA -\- <-
|--|--------|--------------|----|-------------|------|---------|-----|-|
Version 11.423.999.220020101.23.50110101.042
(c)1996-2000, All rights reserved. Disclaimer available upon request.
I think there must have been something very political that happened
some time in the mists of the past. GCL was out there, and largely
ignored by all _sorts_ of folk for the longest time.
And it seems to me that there _should_ have been some really vastly
serious push to have some form of "Schemely" code generator associated
with GCC in order to encourage wider use of Scheme, and not merely for
"scripting."
> > It's now widely understood
> > that dynamic scoping, particularly in Elisp, is a *Bad Thing* because
> ^^^^^^^^^^^^^^^^^^^^^
> > of its surprising effects in certain situations.
> I hardly think it's "widely understood" that dynamic scoping in
> Elisp is a bad thing. Certainly some people think it is, but then
> there are also a lot of proponents. My personal guess is that
> maintainers of large packages are the least fond of it, and those
> trying to customize others' large packages are the most fond of it.
> It definitely has its good and bad sides.
That seems logical.
<http://www.gnu.org/software/emacs/emacs-paper.html#SEC18> describes
this, albeit in limited detail.
> IMO the only way to get rid of dynamic scoping without making Elisp
> a lot less useful for its intended purpose (customizing Emacs),
> would be to rewrite the whole thing to use CLOS and be very
> disciplined about using OO style. I guess that given the way advice
> is implemented, around advice could be used in a lexically-scoped
> Emacs to emulate dynamic scoping.
Attaching extra information to objects would be doable... _With Modern
CLOS_, and the various metaobject protocol stuff that isn't quite
formally standardized.
As you say, the need would be to "rewrite the whole thing to use
CLOS;" if we step back a little closer to 1984, CLOS+MOP weren't
nearly as standardized as they are now, and weren't realistically an
option.
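Today, though, plain CLOS covers the simple cases without the MOP -- a
minimal sketch (the class names are invented):
  ;; Mix an "annotations" slot into any class that wants user-attached
  ;; information.
  (defclass annotated-mixin ()
    ((annotations :initform '() :accessor annotations)))
  (defclass buffer () ())   ; stand-in
  (defclass annotated-buffer (annotated-mixin buffer) ())
  ;; (push (cons :read-only t) (annotations buf)) on an ANNOTATED-BUFFER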
>> One annoyance about the continued persistence of dynamic scoping in
>> Elisp is that it has encouraged others to implement dynamic scoping
>> in their own Lisps. The 'librepl' that is used in (I think) the
>> Sawmill X window manager is also dynamically scoped, and lots of
>> code written in it unfortunately relies on the peculiarities of
>> dynamic scoping.
> I absolutely agree here. Emacs' status has given an impression of
> credibility to its design decisions, outside of any context. I
> think a lot of the way Elisp works is the right decision for Emacs
> now, not because I couldn't think of a better way to do it, but
> because any better way would involve rewriting Gnus, VM, AucTeX,
> etc, etc, making the cure worse than the disease. Without
> understanding the context of the goodness of these design decisions,
> though, people apparently will just copy those decisions.
.. And if the point is to build something that's long-term
extensible, there _are_ properties of dynamic scope that have some
value, so that the "cure" might, along with being worse than the
disease, make some extensions harder to do later.
>> There are a number of other *NASTY* problems with Elisp,
>> particularly the way it interfaces with its garbage collector.
>> This requires hairy macro placement (using GCPRO()) to keep
>> pointers from going missing in the divide between C and Elisp.
> I admit it's nit-picking a bit to differentiate between the language
> and its implementation when the language isn't standardized and it
> only has two nearly-identical implementations, but this is
> definitely an implementation issue, and not one with the language.
> This is only visible from within the implementation, and could be
> fixed without breaking any existing lisp code. So it's a problem
> with the Elisp interpreter, not with Elisp itself. If I'm not
> mistaken, GNU Emacs is planning on replacing the current GC with a
> portable conservative GC (maybe Boehm's), eventually making the
> whole, awful GCPRO mess a non-issue.
You should see Tom Lord's discussion of conservative GC; _greatly_
scathing... He was one of the guys that used to work on Guile, and is
more recently responsible for Systas Scheme.
>> Everyone pretty much agrees that the current guts of Elisp in
>> XEmacs suck, but nobody really wants to start down the road of
>> rewriting the whole thing *and then* porting the megs of software
>> written in Elisp. Compounded with that is that RMS will probably
>> never allow GNU Emacs to be rewritten to use a different Lisp, and
>> hence the software that is shared between the two most popular
>> Emacsen (eg, Gnus, W3, PCL-CVS, AUCTeX, etc) would no longer be
>> portable between the two. Maintainers would be forced to decide
>> between one or the other Emacs, or to maintain two distinct
>> versions of their code.
> See above about the cure being worse than the disease :). I think
> at some point, crufty old Elisp will go away. I don't think it will
> happen because of the will of the XEmacs maintainers, though. It's
> gonna require a consensus on the part of the maintainers of the
> major packages that things have gotten to the point where they'd
> rather rewrite their packages than continue on in Elisp. Otherwise,
> no one would use the New Improved
> Got-No-AUCTeX-No-W3-No-ILISP-No-Gnus Emacs.
I'm not sure when that point in time will be...
--
(concatenate 'string "cbbrowne" "@cbbrowne.com")
http://www.cbbrowne.com/info/emacs.html
``God decided to take the devil to court and settle their differences
once and for all. When Satan heard of this, he grinned and said, "And
just where do you think you're going to find a lawyer?"''
> One annoyance about the continued persistence of dynamic scoping in Elisp
> is that it has encouraged others to implement dynamic scoping in their own
> Lisps. The 'librepl' that is used in (I think) the Sawmill X window
> manager is also dynamically scoped, and lots of code written in it
> unfortunately relies on the peculiarities of dynamic scoping.
That changed a long time ago (a year or more?). From librep's news:
,----
| 0.8
| ===
|
| * Default scoping is now lexical, only variables declared using
| `defvar' are dynamically scoped.
`----
sawmill is known as sawfish these days.
--
Dave Pearson: | lbdb.el - LBDB interface.
http://www.davep.org/ | sawfish.el - Sawfish mode.
Emacs: | uptimes.el - Record emacs uptimes.
http://www.davep.org/emacs/ | quickurl.el - Recall lists of URLs.
Thomas> He seems to have some serious beef against CL. I say
Thomas> "seems", because I'm not sure that he's the source, but
Thomas> GNU Emacs doesn't allow the use of the cl extension
Thomas> package in its distributed code. Which is crazy, because
Thomas> there's some good stuff in there. Also, the whole choice
Thomas> of Scheme as the official GNU extension language.
cd /usr/share/emacs/21.0.104/lisp/
grep -n "(require 'cl)" *.el /dev/null
winner.el:44: (require 'cl))
wid-browse.el:36:(eval-when-compile (require 'cl))
vc.el:363: (require 'cl)
vc-rcs.el:36: (require 'cl)
vc-hooks.el:37: (require 'cl))
uniquify.el:90:(eval-when-compile (require 'cl))
tooltip.el:37: (require 'cl)
sun-curs.el:34:(eval-when-compile (require 'cl))
strokes.el:192:(eval-when-compile (require 'cl))
smerge-mode.el:49:(eval-when-compile (require 'cl))
simple.el:33: (require 'cl))
pcvs.el:125:(eval-when-compile (require 'cl))
pcvs-util.el:32:(eval-when-compile (require 'cl))
pcvs-parse.el:37:(eval-when-compile (require 'cl))
pcvs-info.el:36:(eval-when-compile (require 'cl))
pcvs-defs.el:31:(eval-when-compile (require 'cl))
msb.el:83:(eval-when-compile (require 'cl))
mouse-sel.el:144: (require 'cl))
midnight.el:42: (require 'cl))
log-view.el:36:(eval-when-compile (require 'cl))
log-edit.el:35:(eval-when-compile (require 'cl))
lazy-lock.el:275: (require 'cl)
iswitchb.el:212: (require 'cl))
imenu.el:64:(eval-when-compile (require 'cl))
generic.el:121: (require 'cl))
font-lock.el:673: (require 'cl)
fast-lock.el:192: (require 'cl)
faces.el:28: (require 'cl)
edmacro.el:74: (require 'cl))
diff-mode.el:63:(eval-when-compile (require 'cl))
cvs-status.el:34:(eval-when-compile (require 'cl))
cus-dep.el:27:(eval-when-compile (require 'cl))
autorevert.el:73:(eval-when-compile (require 'cl))
That is perhaps overstated. It is true, I believe, that any packages
that are automatically loaded into Emacs and then dumped with the
Emacs image (cc-mode comes to mind) cannot use the CL library, and I
don't know why that is, but certainly (require 'cl) is not actively
discouraged.
--
Graham Hughes <gra...@sigwinch.org>
(defun whee (n e) (subseq (let ((c (cons e e))) (nconc c c)) 0 n))
Huh? Anybody is "free" to do so! However, RMS might or might not
want to help with such a port/rewrite: you'd have to ask him.
Let us not start YAGFL (Yet Another GPL Flame War) here, though.
That's what gnu.misc.discuss is for!
James> XEmacs developers came up with the idea of an Elisp compatibility
James> package, but the details of this never materialized, IIRC.
Someone did a paper (maybe it was an undergraduate thesis) at MIT
which included an Emacs Lisp compatibility library written in Scheme.
I think they then ported GNUS or something big, to demonstrate that it could
be done. If someone wants to do something similar (in either CL or Scheme)
they should take a look at that...........aha..here it is:
AITR-1451
Emacs Lisp in Edwin Scheme
Author[s]: Matthew Birkholz
Date: September 1993
Abstract: The MIT-Scheme program development environment includes a
general-purpose text editor, Edwin, that has an extension language,
Edwin Scheme. Edwin is very similar to another general-purpose text
editor, GNU Emacs, which also has an extension language, Emacs Lisp.
The popularity of GNU Emacs has led to a large library of tools
written in Emacs Lisp. The goal of this thesis is to implement
a useful subset of Emacs Lisp in Edwin Scheme. This subset was chosen
to be sufficient for simple operation of the GNUS news reading program.
PS Download: ftp://publications.ai.mit.edu/ai-publications/1000-1499/AITR-1451.ps.Z
PDF Download: ftp://publications.ai.mit.edu/ai-publications/pdf/AITR-1451.pdf
I'd be more interested in a CL version. One issue is: what compiler/interpreter
and development tools would be distributed with the editor so that people could
load extension libraries, and write their own extensions.
Huh?
Looking at the references in my online copy of CLTL2, the CLOS
references are all dated 1989.
Keene's book indicates that in '86, when CL object models started
coalescing into _something_, there was a diverse set of
different object models.
The point? In 2001 (and, for that matter, 1991 :-() we can point to
CLOS as an appropriate object model to mandate. In 1984, it certainly
wasn't an option.
Similarly (though probably less crucial!), the LOOP facility in CL in
1984 differed considerably from the much-extended design voted on in
1989.
Differing object models, differing loop models, all would lead to an
Emacs design based on "CL as of 1984" being a quite different animal
from one based on "CL as of 1989." (Regardless of whether SERIES is
considered at _all_ standardized :-).)
--
(reverse (concatenate 'string "ac.notelrac.teneerf@" "454aa"))
http://www.ntlug.org/~cbbrowne/nonrdbms.html
There is a theory that states: "If anyone finds out what the universe
is for, it will disappear and be replaced by something more bizarrely
inexplicable." There is another theory that states: "This has already
happened..." -Douglas Adams, "Hitch-Hikers Guide to the Galaxy"
Ah, yes, thank you for the reality check. :-)
Do I endorse it on the back?
> Christopher Stacy <cst...@spacy.Boston.MA.US> writes:
> > >>>>> On Wed, 26 Sep 2001 00:54:18 GMT, cbbrowne ("cbbrowne") writes:
> > cbbrowne> Mind you, what we'd do today, using CL (with CLOS!), is quite
> > cbbrowne> different from what would have been likely back in 1984.
> >
> > Huh?
>
> Looking at the references in my online copy of CLTL2, the CLOS
> references are all dated 1989.
The first edition of CLTL was *published* in 1984. It was considered
a good enough standard for Symbolics to implement a Common Lisp for
Genera, for LMI (TI?) to implement a Common Lisp for their LispM
offering (what the heck did they call their OS anyway? Or did they
call it anything?), and for Spice Lisp (later CMU CL) to try to
implement it as closely as possible *before* the book was even through
publication. And Kyoto Common Lisp was developed straight from this
book, outside the US Lisp community, IIRC.
> Keene's book indicates that in '86, when CL object models started
> coalescing into _something_, there was a diverse set of
> different object models.
Keep in mind that this book is _focusing_ on OO, and hence would see
things from an exaggerated perspective. There weren't *that* many
different object models.
> The point? In 2001 (and, for that matter, 1991 :-() we can point to
> CLOS as an appropriate object model to mandate. In 1984, it certainly
> wasn't an option.
Flavors was already in use in 1984. Flavors was an object system
based on ideas in Smalltalk, developed by (I think) Howard Cannon.
Lisp OO systems were not unknown in 1984 by any means. OO systems
were implemented in Lisp as research toys in the late 70s.
> Similarly (though probably less crucial!), the LOOP facility in CL in
> 1984 differed considerably from the much-extended design voted on in
> 1989.
From AI:.INFO.;LISP LOOP:
LOOP is a Lisp macro which provides a programmable iteration
facility. The same LOOP module operates compatibly in both Lisp
Machine Lisp and Maclisp (PDP-10 and Multics).
This Maclisp implementation of LOOP would serve as the model for LOOP
in the CL standard. Its surface syntax is similar enough in a number
of respects that any modern CL programmer would understand its use.
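For instance, a form like the following (a trivial, made-up example)
reads the same to a Maclisp-era LOOP user and to a modern ANSI CL
programmer:
  (loop for x in '(1 2 3 4 5)
        when (oddp x)
          collect (* x x))   ; => (1 9 25)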
From CMUCL 18c src/code/loop.lisp:
;;;> Portions of LOOP are Copyright (c) 1986 by the Massachusetts Institute of Technology.
;;;> All Rights Reserved.
The LOOP used in CMUCL started life as Maclisp's LOOP. Seeing as
there is a comment from Symbolics in there as well, I'd be surprised
if Genera didn't use a LOOP derived from the same source (my LispM
isn't powered up, otherwise I'd say for certain).
> Differing object models, differing loop models, all would lead to an
> Emacs design based on "CL as of 1984" being a quite different animal
> from one based on "CL as of 1989." (Regardless of whether SERIES is
> considered at _all_ standardized :-).)
It is also good to keep in mind that a good chunk of code written for
CLTL1 will work fine in CLTL2 *and* in ANSI CL as well. There are a
number of differences between these, but they have more in common with
each other than any one of them does with say Scheme or InterLisp.
Indeed, look to any large Lisp implementation for code that even
predates CLTL1 which still runs well or with only minor modifications
in a modern CL.
Really, CL isn't that different now from what it was then, just more
stuff has been added to it, and a lot of software libraries that were
floating around in one form or another were standardized so
programmers could depend upon specific behaviors.
If you want a Lisp that's really *different*, try Lisp 1.5. Heck, try
L Peter Deutsch's Lisp for the PDP-1. It runs on Bob Supnik's PDP-1
emulator at some insane multiple of its original speed.
> ja...@unlambda.com (James A. Crippen) writes:
>
> > Christopher Stacy <cst...@spacy.Boston.MA.US> writes:
> > One annoyance about the continued persistence of dynamic scoping in
> > Elisp is that it has encouraged others to implement dynamic scoping in
> > their own Lisps. The 'librepl' that is used in (I think) the Sawmill
> > X window manager is also dynamically scoped, and lots of code written
> > in it unfortunately relies on the peculiarities of dynamic scoping.
>
> This is incorrect. rep (librep is the name of the library that is
> linked to for applications that would like to use rep) is lexically
> scoped. I don't know if it even supports dynamic scope at all. It's
> sort of halfway between elisp and scheme for the most part, which
> makes me terribly confused most of the time when I'm trying to use
> it. :)
>
> Oh and Sawmill is a web log analyzer. The window manager's name had to
> be changed to Sawfish due to a trademark conflict.
Yes, I hadn't looked at it in some time. I see now that librep is
lexically scoped. When I had looked at it earlier (two years ago?) it
was dynamically scoped, because the author borrowed its design from
Emacs Lisp.
> t...@hurricane.OCF.Berkeley.EDU (Thomas F. Burdick) writes:
> > He seems to have some serious beef against CL. I say "seems",
> > because I'm not sure that he's the source, but GNU Emacs doesn't
> > allow the use of the cl extension package in its distributed code.
> > Which is crazy, because there's some good stuff in there. Also, the
> > whole choice of Scheme as the official GNU extension language.
>
> I think there must have been something very political that happened
> some time in the mists of the past. GCL was out there, and largely
> ignored by all _sorts_ of folk for the longest time.
GCL was based on Austin Kyoto Common Lisp (AKCL), which derives from
Kyoto Common Lisp (KCL), written by Taiichi Yuasa and Masami Hagiya
based on CLTL1.
> And it seems to me that there _should_ have been some really vastly
> serious push to have some form of "Schemely" code generator associated
> with GCC in order to encourage wider use of Scheme, and not merely for
> "scripting."
The RTL (register transfer language) used in GCC bears some
resemblance to Lisp, but its primitives are not symbols -- they are
hardcoded in meaning. Thus it has a Lisp syntax, but isn't a Lisp.
RMS used the Lisp syntax because it was easy to parse, so he said.
And probably because it was easily edited with Emacs.
> You should see Tom Lord's discussion of conservative GC; _greatly_
> scathing... He was one of the guys that used to work on Guile, and
Guile has a lame GC. It's mark-sweep and the interpreter spends a lot
of time in it. It needs something with a write barrier or treadmill,
I think. This has been an outstanding problem of Guile's for some
time now, several years.
> is more recently responsible for Systas Scheme.
Yah, conservative GC is only good for languages that need
conservatism, which are typically languages that don't have
intrinsic support for GC, like C or C++. Languages which are designed
around the idea of a GC typically benefit more from some form of
generational collector or perhaps a combination of reference counter,
incremental mark-sweep, and a slower ephemeral generational collector.
The hybrid approach works best in large systems with large spaces. In
small systems a small, simple GC like mark-sweep works better. Some
research has shown that compactifying collectors win, other research
shows the opposite (I think it depends on how long you cook the
numbers, and at what temperature).
Hairy GC questions should be taken to the gc-list. Those people
*live* for garbage collection. It's somewhat frightening, really.
And definitely read Paul Wilson's surveys of GC and dynamic
allocation. Those are required reading. And read Jones and Lins's
book on GC as well, which summarizes much of the current research in
the last decade, research that is missing from the more common
treatments of the subject.
I lived and breathed garbage collection for a while. I suppose I will
again when I get my little Lisp system finished so I can play with GC
implementations again.
> >>>>> On 25 Sep 2001 13:07:45 -0800, James A Crippen ("James") writes:
> James> Compounded with that is that RMS will probably never allow GNU Emacs
> James> to be rewritten to use a different Lisp
>
> Huh? Anybody is "free" to do so! However, RMS might or might not
> want to help with such a port/rewrite: you'd have to ask him.
I meant that he wouldn't go recommending it as his 'official' Emacs,
which he does with GNU Emacs (as opposed to its red-headed stepchild,
XEmacs). RMS is a bit protective of his baby, which is why XEmacs
exists in the first place.
> Let us not start YAGFL (Yet Another GPL Flame War) here, though.
> That's what gnu.misc.discuss is for!
YAG(PL)FW, heh, yeah. :-)
> James> XEmacs developers came up with the idea of an Elisp compatibility
> James> package, but the details of this never materialized, IIRC.
>
> Someone did a paper (maybe it was an undergraduate thesis) at MIT
> which included a Scheme compatability library for GNU Elisp.
I think I mentioned this elsewhere as well.
> I'd be more interested in a CL version. One issue is: what
> compiler/interpreter and development tools would be distributed with
> the editor so that people could load extension libraries, and write
> their own extensions.
Well, there'd only be a few to choose from: CMUCL, CLisp, and GCL.
And GCL isn't as well maintained as the other two, so that's a strike
against it. CMUCL has a large memory footprint (but then, so does
Emacs), and it's slow to start (but then, so is Emacs). CLisp doesn't
perform as well as CMUCL, and the compiler is external (don't know
about CLisp's byte compiler or how that's going lately). Both CLisp
and CMUCL have FFIs, so loading extension libraries is a matter of
writing the Lisp interfaces. As for other development tools (pretty
printer, cross-indexer/who-calls, class browser, etc.), they should
all be in the editor! :-)
It's a tossup between CLisp and CMUCL I think. The other alternative,
rolling a new CL, is also a possibility. The 'white pages' part that
goes into a Lisp kernel isn't that difficult, but implementing all the
rest of the standard takes time.
Might this be a start? :
http://groups.google.com/groups?q=elisp+cllib&hl=en&rnum=2&selm=u66uc5d82.fsf%40ksp.com
(I have to admit that I haven't looked at it myself)
cheers,
Erik
--
"Have faith in Darwin... By the looks of it, this guy couldn't reproduce
himself if he had an installation wizard."
-- Andreas Skau in the Monastery
There were enough that it was a pain to write large chunks of code and
port them. At least two kinds of Flavors (old and new) in several
variants, as well as things like LOOPS (no, not the iteration thing),
which were a whole other ball game.
Of course the differences were just a small matter of syntax, except
they weren't, because some systems had MI (multiple inheritance), some
didn't, and those that did computed precedence lists in various
differing ways, which could be quite exciting. And I don't think
anyone had multiple dispatch (maybe LOOPS did).
And even post-CLOS it was not clear to a lot of people that CLOS
implementations would ever be fast enough to really use in anger
because of all the frightening stuff like redefinition and so on. In
practice PCL (which was all that most people had for a fair time) was
just *glacial* at times. Genera didn't have native CLOS till 8.x, ACL
till I'm not sure when (4.x?), and until people saw those
implementations - mostly for us ACL's, because I don't think we really
saw a future in special HW - it really was not obvious that CLOS was
going to be usable.
It all seems clear now in retrospect, and maybe it was even clear then
if you were one of the right 20 people, but it really was not clear to
a random Lisp hacker.
--tim
Well, GCL was KCL and then AKCL before it was GCL and it was
encumbered in various curious ways which probably prevented its free
use. I forget the details, but I think the deal may have been that it
was free for educational use only, and even then you had to physically
sign something. This is trying to remember more than 12 years back,
so I may have the licensing details wrong.
> He seems to have some serious beef against CL. I say "seems", because
> I'm not sure that he's the source, but GNU Emacs doesn't allow the use
> of the cl extension package in its distributed code. Which is crazy,
> because there's some good stuff in there. Also, the whole choice of
> Scheme as the official GNU extension language.
It's not clear to me that not allowing the cl package is because of
anti-CL sentiment: it may be because of history. There have been at
least two versions of cl.el, and the older one was *seriously* awful.
I'm a CL programmer, and *I* have elisp packages I've written which
contain implementations of bits of CL because I absolutely did not
want cl.el loaded into my emacs if I could possibly avoid it because
it broke so much stuff.
The new cl.el is much better - I'm not sure when the transition was
but it's not that long ago (maybe 5-7 years). I still have to remind
myself that (require 'cl) is an OK thing to do, though.
--tim
* Christopher Stacy
> Huh? Anybody is "free" to do so! However, RMS might or might not want
> to help with such a port/rewrite: you'd have to ask him.
Well... the freedom to fork is illusory at best. The animosity from RMS
over the fork between Classic Emacs and Lucid Emacs (later XEmacs) is
legendary. When I tried to help people get access to the new features of
Emacs 20 but get rid of the still seriously braindamaged MULE crap, I
carefully created a backward-compatible "Multi-Byte Survival Kit", which
was picked up by RedHat and had many thousands of users. RMS' response
to this was to introduce, in the very next release, an incompatibility
in the byte-compiled file format (it now required MULE crap to load,
which really is unnecessary) in order to keep people away from it.
> I'd be more interested in a CL version.
I think it would have been the greatest thing for the spread of Common
Lisp to produce the next generation Emacs based in a real Lisp with a
much better design of the whole application and user-visible language.
Emacs Lisp is hopelessly ancient, and has even gone the way of Scheme
with non-general and type-specific functions.
> One issue is: what compiler/interpreter and development tools would be
> distributed with the editor so that people could load extension
> libraries, and write their own extensions.
Since this is one of those thing that really would move the Common Lisp
community forward, the ideal situation would be for the vendors to gang
up with a funded, but voluntary team and provide the necessary support to
get this going as a demonstration project for the power of Common Lisp.
However, I think it might be as much as a 100-man-year job, the funding
for which might have to come from Osama bin Laden's frozen funds because
there simply is not enough resources to do this for free elsewhere now.
///
--
Why did that stupid George W. Bush turn to Christian fundamentalism to
fight Islamic fundamentalism? Why use terms like "crusade", which only
invokes fear of a repetition of that disgraceful period of Christianity
with its _sustained_ terrorist attacks on Islam? He is _such_ an idiot.
* James A. Crippen
> Well, there'd only be a couple to choose from. CMUCL, CLisp, and GCL.
This is a _very_ counter-productive position. Excluding the commercial
vendors from this project will be the best way ever to destroy Common
Lisp. Instead, write the Emacs on top of sufficiently powerful Common
Lisp and cause the free Common Lisps to become powerful enough to deal
with it. If you start with an insufficiently powerful Common Lisp, you
will get the same kind of rushed disasters that Emacs Lisp is full of.
We have to realize that the reason that Emacs Lisp is the way it is, is
that the core language support is hopelessly insufficient.
cbbrowne> Looking at the references in my online copy of CLTL2,
cbbrowne> the CLOS references are all dated 1989.
I think that if the editor had been written in 1984, it would have
been very much like what a CLOS programmer would do today.
Everything else was!
Flavors, the original first-class object-oriented Lisp language extension
on the Lisp Machine, is more than 10 years before your date. Flavors was
single-dispatch, and originally used the (SEND obj :MESSAGE args) calling
syntax but much later switched to normal function calling syntax.
You said DEFFLAVOR instead of DEFCLASS, you said DEFMETHOD, but there was
no DEFGENERIC and method lambda-list congruency was not enforced, and
Flavors had some features that CLOS lacks. Flavors had no MOP.
CLOS is very much like Flavors in many ways, and it is easy to
mechanically convert programs from Flavors into CLOS.
Why does everyone act like nothing existed before CLTL1?!?
Common Lisp is mostly descended from the Lisp Machine (and MACLISP).
The Lisp Machine itself began around 1975 - the CADR came out around
1978, and Flavors was part of the system by at least 1980. Flavors
was based on some ideas about certain generic operations in MACLISP
(SFAs) and influenced by Smalltalk, and by Hewitt's Actors, and by
a nearby ice cream store (Steve's) where they "mixed in" additional
behaviours (well, specific nuts and candies) with the primary flavor
that you selected.
The original Lisp Machine editor circa 1977 was called EINE, and ZWEI
was around before the end of 1979. My guess for the reason that ZWEI
used DEFSTRUCT (and the "array-leader" hack) instead of Flavors, is that
Flavors may not have been ready for prime-time when DLW began writing
the program. The Lisp Machine window system did not use Flavors until
a little after ZWEI.
> It was considered good enough of a standard for Symbolics to
> implement a Common Lisp for Genera, and LMI (TI?) to implement a
> Common Lisp for their LispM offering (what the heck did they call
> their OS anyway? Or did they call it anything?)
LMI didn't call it anything. They sort of thought naming the OS (such
as it was) was pompous.
> Flavors was already in use in 1984. Flavors was an object system
> based on ideas in Smalltalk, developed by (I think) Howard Cannon.
> Lisp OO systems were not unknown in 1984 by any means. OO systems
> were implemented in Lisp as research toys in the late 70s.
I recall that `new flavors' was a major contender. No one has
mentioned Drescher's `Object Lisp', but that had a few proponents as
well (mostly at LMI).
People skeptical of CLOS thought multi-dispatch would be `too slow'
and that perhaps functions ought to belong to a class. The
self-referential MOP was also considered to be `overkill' (or mental
masturbation). There is quite a bit of work in bootstrapping CLOS in
order to present the illusion that CLOS is implemented in CLOS, and
this seemed to be a rather academic issue.
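(A minimal sketch of that circularity, using nothing beyond standard CLOS
introspection -- printed forms vary by implementation:

  (defclass point () ((x :initarg :x) (y :initarg :y)))

  (class-of (make-instance 'point))             ; => the class POINT
  (class-of (class-of (make-instance 'point)))  ; => STANDARD-CLASS
  (class-of (find-class 'standard-class))       ; => STANDARD-CLASS itself

STANDARD-CLASS turns out to be an instance of itself, which is exactly the
self-reference the bootstrap has to arrange before CLOS exists to define it.)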
There was also the issue that there was virtually no interesting code
written in CLOS, but the entire LispM window system was written in
Flavors (I think new flavors held the promise of an easier migration
route).
The original KCL had a not-so-restrictive license. The license was
not DFSG compliant AFAIR, and, to actually get it, you had to write
to Kyoto. I still have mine somewhere. The same applied to AKCL.
Before AKCL, there were two commercial products based on KCL: Ibuki
CL (from California, a Stanford spin-off, headed by Professor
Weyrauch - barring spelling mistakes) and Delphi CL, from Pisa,
originating from Prof. Attardi's startup (incidentally, the
first importer of Symbolics and Sun workstations in Italy).
Delphi CL evolved into EcoLisp, which is now being evolved again as
ECL (is that the name?)
Cheers
--
Marco Antoniotti ========================================================
NYU Courant Bioinformatics Group tel. +1 - 212 - 998 3488
719 Broadway 12th Floor fax +1 - 212 - 995 4122
New York, NY 10003, USA http://bioinformatics.cat.nyu.edu
"Hello New York! We'll do what we can!"
Bill Murray in `Ghostbusters'.
It is called ECLS now:
Erik.
> I think that if the editor had been written in 1984, it would have
> been very much like what a CLOS programmer would do today.
> Everything else was!
Indeed.
> Flavors, the original first-class object-oriented Lisp language extension
> on the Lisp Machine, is more than 10 years before your date. Flavors was
> single-dispatch, and originally used the (SEND obj :MESSAGE args) calling
> syntax but much later switched to normal function calling syntax.
> You said DEFFLAVOR instead of DEFCLASS, you said DEFMETHOD, but there was
> no DEFGENERIC and method lambda-list congruency was not enforced, and
> Flavors had some features that CLOS lacks. Flavors had no MOP.
> CLOS is very much like Flavors in many ways, and it is easy to
> mechanically convert programs from Flavors into CLOS.
New Flavors IIRC discarded the (SEND obj :MESSAGE args) stuff in favor
of something more resembling generic functions. Right? Haven't
looked at any Flavors code in a while though.
> Why does everyone act like nothing existed before CLTL1?!?
Too many things existed before CLTL1. That's why CLTL1 was created.
> Common Lisp is mostly descended from the Lisp Machine (and MACLISP).
> The Lisp Machine itself began around 1975 - the CADR came out around
> 1978, and Flavors was part of the system by at least 1980. Flavors
> was based on some ideas about certain generic operations in MACLISP
> (SFAs)
^^^^
Can you explain?
> and influenced by Smalltalk. and by Hewitt's Actors, and by
> a nearby ice cream store (Steve's) where they "mixed in" additional
> behaviours (well, specific nuts and candies) with the primary flavor
> that you selected.
Coldstone is now well known for the same process.
> The original Lisp Machine editor circa 1977 was called EINE, and ZWEI
> was around before the end of 1979. My guess for the reason that ZWEI
> used DEFSTRUCT (and the "array-leader" hack) instead of Flavors, is that
> Flavors may not have been ready for prime-time when DLW began writing
> the program. The Lisp Machine window system did not use Flavors until
> a little after ZWEI.
Was EINE related to Multics Emacs which (as I recall) was written in
Maclisp? Or was EINE created from whole cloth?
> t...@hurricane.OCF.Berkeley.EDU (Thomas F. Burdick) writes:
>
> > I absolutely agree here. Emacs' status has given an impression of
> > credibility to its design decisions, outside of any context. I
> > think a lot of the way Elisp works is the right decision for Emacs
> > now, not because I couldn't think of a better way to do it, but
> > because any better way would involve rewriting Gnus, VM, AucTeX,
> > etc, etc, making the cure worse than the disease. Without
> > understanding the context of the goodness of these design decisions,
> > though, people apparently will just copy those decisions.
>
> .. And if the point is to build something that's long-term
> extensible, there _are_ properties of dynamic scope that have some
> value, so that the "cure" might, along with being worse than the
> disease, make some extensions harder to do later.
I think CLOS with *tasteful* use of special variables would provide
the same power to extend, and those extensions would be easier to
follow. Leaving Elisp as it is, except making it lexically scoped
*would* be terrible. Every function would have 20 pieces of around
advice (*shudder*).
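A minimal sketch of what I mean by *tasteful* -- the *fill-width* and
wrap-at names below are made up for illustration:

  ;; A special variable the core consults; extensions rebind it
  ;; dynamically instead of advising every function that uses it.
  (defvar *fill-width* 70)

  (defun wrap-at (line-start)
    ;; Sees whatever value of *FILL-WIDTH* is current at call time.
    (+ line-start *fill-width*))

  ;; An extension customizes behaviour for the extent of one call,
  ;; with no advice and no change to WRAP-AT itself:
  (let ((*fill-width* 132))
    (wrap-at 0))                        ; => 132

That is the dynamic-scope property worth keeping, confined to variables
that declare themselves special instead of every binding in the system.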
> >> There are a number of other *NASTY* problems with Elisp,
> >> particularly the way it interfaces with its garbage collector.
> >> This requires hairy macro placement (using GCPRO()) to keep
> >> pointers from going missing in the divide between C and Elisp.
>
> > I admit it's nit-picking a bit to differentiate between the language
> > and its implementation when the language isn't standardized and it
> > only has two nearly-identical implementations, but this is
> > definitely an implementation issue, and not one with the language.
> > This is only visible from within the implementation, and could be
> > fixed without breaking any existing lisp code. So it's a problem
> > with the Elisp interpreter, not with Elisp itself. If I'm not
> > mistaken, GNU Emacs is planning on replacing the current GC with a
> > portable conservative GC (maybe Boehm's), eventually making the
> > whole, awful GCPRO mess a non-issue.
>
> You should see Tom Lord's discussion of conservative GC; _greatly_
> scathing... He was one of the guys that used to work on Guile, and is
> more recently responsible for Systas Scheme.
I think that's only relevant when you have a choice. I don't think
it's possible to have a precise GC for portable C code (although I'd
be happy to be corrected, preferably on the gc-list :). Certainly for
something as messy as the Emacs source code, a conservative GC is
really the only option.
> >>>>> "Thomas" == Thomas F Burdick <t...@hurricane.OCF.Berkeley.EDU> writes:
>
> Thomas> He seems to have some serious beef against CL. I say
> Thomas> "seems", because I'm not sure that he's the source, but
> Thomas> GNU Emacs doesn't allow the use of the cl extension
> Thomas> package in its distributed code. Which is crazy, because
> Thomas> there's some good stuff in there. Also, the whole choice
> Thomas> of Scheme as the official GNU extension language.
>
> cd /usr/share/emacs/21.0.104/lisp/
> grep -n "(require 'cl)" *.el /dev/null
> winner.el:44: (require 'cl))
Well what'd'y'know.
[ more references ... ]
> That is perhaps overstated. It is true, I believe, that any packages
> that are automatically loaded into Emacs and then dumped with the
> Emacs image (cc-mode comes to mind) cannot use the CL library, and I
> don't know why that is
Hmm, that must be it. I'm actually more confused now, because if so
many packages use `cl', it would be nice to have it in pure space, so
it could be shared. It's kind of big, after all...
> * Christopher Stacy
> > I'd be more interested in a CL version. One issue is: what
> > compiler/interpreter and development tools would be distributed with the
> > editor so that people could load extension libraries, and write their own
> > extensions.
>
> * James A. Crippen
> > Well, there'd only be a couple to choose from. CMUCL, CLisp, and GCL.
>
> This is a _very_ counter-productive position. Excluding the commercial
> vendors from this project will be the best way ever to destroy Common
> Lisp. Instead, write the Emacs on top of sufficiently powerful Common
> Lisp and cause the free Common Lisps to become powerful enough to deal
> with it.
Okay, but you'd still need it to ship with a free CL, because I can't
imagine any C++ programmer wanting to buy a commercial CL just to run
Emacs, especially if s/he could just use the old Elisp-based one.
Plus, as far as I know, CLISP is the only CL that's approximately as
portable as GNU/X Emacs.
> If you start with an insufficiently powerful Common Lisp, you will
> get the same kind of rushed disasters that Emacs Lisp is full of.
> We have to realize that the reason that Emacs Lisp is the way it
> is, is that the core language support is hopelessly insufficient.
That's probably true. So the first step would be to beef up CLISP to
around commercial quality. Hell, if you threw all the development
hours going to GNU Emacs and X Emacs into the task of writing a Really
Good CL-Based Emacs, you could probably do it. Alas, I'll probably be
having a snowball fight with the devil by the time this happens,
though...
> * James A Crippen
> > Compounded with that is that RMS will probably never allow GNU Emacs to
> > be rewritten to use a different Lisp
>
> * Christopher Stacy
> > Huh? Anybody is "free" to do so! However, RMS might or might not want
> > to help with such a port/rewrite: you'd have to ask him.
>
> Well... the freedom to fork is illusory at best. The animosity from RMS
> over the fork between Classic Emacs and Lucid Emacs (later XEmacs) is
> legendary. When I tried to help people get access to the new features of
> Emacs 20 but get rid of the still seriously braindamaged MULE crap, I
> carefully created a backward-compatible "Multi-Byte Survival Kit", which
> was picked up by RedHat and had many thousands of users. RMS' response
> to this was to introduce an incompatibility in the byte-compiled file
> format (it now required MULE crap to load, which really is unnecesasry)
> the very next release in order to keep people away from it.
Interesting, I'd wondered why the format had changed. I kind of wish
I didn't know, because that's a really stupid reason and it's really
annoying on a system with several versions of Emacs installed.
At various times I've considered spending some time on Emacs
development, but I always end up deciding that the source is too
nasty, and my skin's not thick enough. Periodically I get that second
point reinforced again.
> > Flavors
> > was based on some ideas about certain generic operations in MACLISP
> > (SFAs)
> ^^^^
> Can you explain?
Software File Arrays. Howard Cannon designed both SFA's in Maclisp and,
later, Flavors for the LispM. Chris is right to point out SFA's as the
conceptual parent of Flavors. They were a special-purpose kind of object
for use in defining I/O (sort of like a class system whose only purpose
was writing streams, though technically nothing forced you to do I/O through
them and some of us perverted them for other purposes once in a while).
They basically had the ability to make an object with n indexed slots for
data and then to do SFA-CALL on the object to access one of several methods
that were defined by the programmer. The system had some pre-defined methods
it looked for, but you could add your own. The method got a pointer to the
object and could do an indexed slot access. It was really just a crude
flavor system where the storage mechanism was all too prominent and needed
to be abstracted away.
> Christopher Stacy <cst...@spacy.Boston.MA.US> writes:
>
> > [Flavors]...You said DEFFLAVOR instead of DEFCLASS, you said
> > DEFMETHOD, but there was no DEFGENERIC and method lambda-list
> > congruency was not enforced, and...
> New Flavors IIRC discarded the (SEND obj :MESSAGE args) stuff in favor
> of something more resembling generic functions. Right? Haven't
> looked at any Flavors code in a while though.
Most of what Chris said is right, but James' correction is also essentially
right. The New Flavors DEFGENERIC was still single-dispatch and didn't have
the arglist congruency, so it allowed some awful overloading.
> > The original Lisp Machine editor circa 1977 was called EINE, and ZWEI
> > was around before the end of 1979. My guess for the reason that ZWEI
> > used DEFSTRUCT (and the "array-leader" hack) instead of Flavors, is that
> > Flavors may not have been ready for prime-time when DLW began writing
> > the program. The Lisp Machine window system did not use Flavors until
> > a little after ZWEI.
>
> Was EINE related to Multics Emacs which (as I recall) was written in
> Maclisp? Or was EINE created from whole cloth?
I'm pretty sure Multics Emacs was written by Bernie Greenberg in
Multics Maclisp.
I'm pretty sure EINE was a completely unrelated project. Having looked at
the code, I'd guess by Mike McMahon. But maybe others, too.
> ja...@unlambda.com (James A. Crippen) writes:
>
> > It was considered good enough of a standard for Symbolics to
> > implement a Common Lisp for Genera, and LMI (TI?) to implement a
> > Common Lisp for their LispM offering (what the heck did they call
> > their OS anyway? Or did they call it anything?)
>
> LMI didn't call it anything. They sort of thought naming the OS (such
> as it was) was pompous.
>
> > Flavors was already in use in 1984. Flavors was an object system
> > based on ideas in Smalltalk, developed by (I think) Howard Cannon.
> > Lisp OO systems were not unknown in 1984 by any means. OO systems
> > were implemented in Lisp as research toys in the late 70s.
>
> I recall that `new flavors' was a major contender. No one has
> mentioned Drescher's `Object Lisp', but that had a few proponents as
> well (mostly at LMI).
There were four contenders: Flavors, Object Lisp, LOOPS (from Xerox), and
another one by Russ Atkinson that I can't remember the name of. Maybe
Common Objects? It had some nice notions of encapsulation that I was
sad to see lost.
The chief problem with Object Lisp was that it didn't have a class/object
distinction. If I recall, it worked like the MOO language of today, and
perhaps like Logo?, where any instance can itself be instantiated. I think
a lot of people feel this isn't as clean and leads to problems in describing
behaviors of objects because one must always speak in terms of pedigrees
instead of classes. Classes are really a natural concept,
and one is handicapped without a corresponding implementational object.
I've used MOO and I can appreciate that there are some cool things about
this paradigm, but sort of like the Mac user interface, it feels more
optimized for newbies than for serious use. The more complex things get, the
more you wish for class structure.
> People skeptical of CLOS thought multi-dispatch would be `too slow'
> and that perhaps functions ought to belong to a class. The
> self-referential MOP was also considered to be `overkill' (or mental
> masturbation). There is quite a bit of work in bootstraping CLOS in
> order to present the illusion that CLOS is implemented in CLOS, and
> this seemed to be a rather academic issue.
Offering a freely available PCL (Portable Common LOOPS) that was a
proto-CLOS was essential to gaining community consensus to the idea that
this could be done efficiently.
The condition system had the same problem (and the same solution-- a free
reference implementation).
> There was also the issue that there was virtually no interesting code
> written in CLOS, but the entire LispM window system was written in
> Flavors (I think new flavors held the promise of an easier migration
> route).
The Xerox crowd had been largely omitted from the design of Common
Lisp, primarily because CL's design addressed the issue that Interlisp
almost won ARPA's heart for having more installed base. The Maclisp
community had lots of users, but every installation used a slightly
variant dialect and we had to unify to make CL in order to prove to
ARPA that we had the bigger installed base. So we were at war with
Interlisp, and we left them out. When ANSI CL happened, Xerox showed
up to join. I've heard it said that the price of repatriating a
left-out party is that you have to take some of their ideas and
integrate them, and I heard some people say that letting Xerox have
its way a lot with CLOS was the price of getting the Xerox/Interlisp
community folded back into the Lisp community as a whole. To some extent
I think people didn't believe in the ideas but just weren't up to
fighting. But the ideas were sound and it's good we took them, I
think... except it did create some rift between Lisp and the rest of
the "encapsulation" community of programming languages. But c'est la
vie. This is all just my subjective impression and personal opinion;
not the official opinion of anything or anyone I've worked for or
belonged to.
It would probably not be a commercial product based on the commercial
Common Lisps. I would argue that there is so much marketing value in an
Emacs running on a Commercial Lisp that is downloadable over the Net that
it would far outweigh the usefulness of, say, a free Linux trial edition.
> Plus, as far as I know, CLISP is the only CL that's approximately as
> portable as GNU/X Emacs.
I do not think it is useful to aim for maximal portability from day 1.
> So the first step would be to beef up CLISP to around commercial quality.
Well, you cannot do that without the demand and serious competition to
catch up with. Large free projects die when they have no competitor, even
though most of the propaganda for Open Source and the like is that people
share their efforts. Linux succeeds so well because many very good and
very smart people hate Microsoft's hegemony so much they want to beat it
into a pulp. Take away Microsoft, and you take away so much of the "fuel"
for Linux and Open Source development that people will realize that it
was not for anything else that they _actually_ did it. Good thing there will
be yet a few years before they croak.
> Erik Naggum <er...@naggum.net> writes:
>
> > * Christopher Stacy
> > > I'd be more interested in a CL version. One issue is: what
> > > compiler/interpreter and development tools would be distributed with the
> > > editor so that people could load extension libraries, and write their own
> > > extensions.
> >
> > * James A. Crippen
> > > Well, there'd only be a couple to choose from. CMUCL, CLisp, and GCL.
> >
> > This is a _very_ counter-productive position. Excluding the commercial
> > vendors from this project will be the best way ever to destroy Common
> > Lisp. Instead, write the Emacs on top of sufficiently powerful Common
> > Lisp and cause the free Common Lisps to become powerful enough to deal
> > with it.
>
> Okay, but you'd still need it to ship with a free CL, because I can't
> imagine any C++ programmer wanting to buy a commercial CL just to run
> Emacs, especially if s/he could just use the old Elisp-based one.
>
> Plus, as far as I know, CLISP is the only CL that's approximately as
> portable as GNU/X Emacs.
Ah, good point. Hadn't thought of that. It would be. CMUCL doesn't
run on certain platforms. And it has some funny build requirements.
Dunno what the state of SBCL is as far as porting it to platforms that
CMUCL hasn't already been ported to at some point.
I think that bundling CLisp with the CL-based Emacs wouldn't be a bad
idea, but it would be important to design the Emacs such that it could
be easily recompiled and used in some other CL. Thus the interfaces
to things like files, network protocols, graphics displays, etc,
should all be done through clean interfaces so that the internals of
each could be reimplemented without having to untangle the Emacs from
its support systems.
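A sketch of the kind of seam I mean -- the names are hypothetical, and it
is only meant to show the shape: the editor core calls a small protocol of
generic functions, and each platform supplies a class behind it.

  ;; Hypothetical display protocol: the core only ever calls these.
  (defgeneric open-display (backend &key name))
  (defgeneric draw-text (backend row column string))
  (defgeneric flush-display (backend))

  ;; One backend among several; an X11 or Windows backend would
  ;; implement the same three generic functions differently.
  (defclass tty-backend ()
    ((stream :initarg :stream :reader backend-stream)))

  (defmethod draw-text ((backend tty-backend) row column string)
    (declare (ignore row column))   ; a dumb tty ignores positioning here
    (write-line string (backend-stream backend)))

Swapping the internals then means writing another backend class, not
untangling the core.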
This is only good design, but that's part of the problem -- the two
popular Emacsen suffer from a lack of design forethought.
> > If you start with an insufficiently powerful Common Lisp, you will
> > get the same kind of rushed disasters that Emacs Lisp is full of.
> > We have to realize that the reason that Emacs Lisp is the way it
> > is, is that the core language support is hopelessly insufficient.
>
> That's probably true. So the first step would be to beef up CLISP to
> around commercial quality. Hell, if you threw all the development
> hours going to GNU Emacs and X Emacs into the task of writing a Really
> Good CL-Based Emacs, you could probably do it. Alas, I'll probably be
> having a snowball fight with the devil by the time this happens,
> though...
Nah, it's not important to take on the task of improving CLisp. You'd
be better off leaving that to the people who hack on CLisp regularly.
They'd be quite happy to help get an Emacs running faster in their
system, I'm sure. Don't fall into the trap of 'one person should do
everything' or 'everything should be implemented from scratch'.
That's what kills a lot of public software development projects.
The whole idea of implementing a CL Emacs isn't that terribly
difficult. Someone just needs to start planning it. If the plans and
design are public then anyone can pick up from there. The hard parts
are design and infrastructure. Building the Emacs is just a matter of
borrowing the best Emacs design ideas from the past and implementing
them efficiently, then gluing them all together according to the
design specs. And if it's all done in CL it'd be pretty fast to
develop, as compared to something like C.
What parts already exist?
CLX for the X interface. Design with the idea of using CLIM someday
when a free implementation is available. Perhaps extending CLX's back
end to use the Lucid Emacs widgets that have already been built.
Those are completely free and hackable, and are well ported. Or else
some popular widget set could be used (I don't encourage the use of
the Athena widgets, really) but most of those widget libraries are
more complicated, or carry licensing restrictions.
Sam Steingold appears to have taken the initial baby steps in getting
ELisp code usable under CL. See http://cvs.sourceforge.net/cgi-bin/viewcvs.cgi/~checkout~/clocc/clocc/src/cllib/elisp.lisp
Networking seems to have been started with net.lisp, another work by
Sam Steingold: http://cvs.sourceforge.net/cgi-bin/viewcvs.cgi/~checkout~/clocc/clocc/src/port/net.lisp
The various implementations of networking protocols already done in
ELisp should be coalesced into a general sort of networking layer.
File system handling is fairly well taken care of already in CL. It's
not pretty, and perhaps some sort of Emacs-specific file handling
library should be built on top of it (since the major targets for it
will be Unix, WinDOS, and perhaps MacOS <=9).
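For example, the portable pathname machinery already covers a lot of what
such a library would wrap -- it is just not pretty. An ANSI-only sketch:

  ;; Standard operators only; no implementation extensions.
  (let ((file (merge-pathnames "init.lisp" (user-homedir-pathname))))
    (list (pathname-name file)          ; => "init"
          (pathname-type file)          ; => "lisp"
          (probe-file file)))           ; => a truename, or NIL if absent

  ;; A directory listing, spelled portably:
  (directory (make-pathname :name :wild :type "lisp"
                            :defaults (user-homedir-pathname)))

The Emacs-specific layer would mostly be about hiding the namestring warts
behind something friendlier.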
The existing ELisp software would need porting, but I don't think that
it would be terribly hard to implement an ELisp REPL under CL, based
on the work already done by Steingold, and perhaps using some of the
ideas from that MIT-AI memo on translating ELisp to Scheme. Then
software could be easily converted over time, or just used in the
emulated ELisp.
However, I think that the users of CL Emacs would soon end up
developing their own software for use with CL Emacs, separate from
that used in RMS Emacs. Things would just be better done differently
in a CL Emacs than they would simply translating ELisp.
All in all, the parts are already out there to build a CL Emacs. All
it needs is some careful work put into design. Then it just takes
time and commitment.
> That is perhaps overstated. It is true, I believe, that any
> packages that are automatically loaded into Emacs and then dumped
> with the Emacs image (cc-mode comes to mind) cannot use the CL
> library, and I don't know why that is, but certainly (require 'cl)
> is not actively discouraged.
An examination of your `grep' hits shows that most, if not all, of
them are wrapped in `eval-when-compile', which means that they are
careful not to require cl*.el at run-time.
Things might have changed recently, but it used to be the case that
Stallman allowed the use of CL macros like `setf' and `push', but
discouraged the use of functions. He requested that often-used Emacs
packages in the Emacs distribution not require cl*.el to be loaded at
run-time.
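Concretely, the sanctioned idiom looks something like this (Emacs Lisp;
the function name is made up):

  (eval-when-compile
    (require 'cl))                ; loaded only while byte-compiling

  (defun count-matching (predicate items)
    ;; `incf' is a cl macro; it expands at compile time, so the
    ;; resulting .elc has no run-time dependency on cl*.el.
    (let ((n 0))
      (dolist (item items n)
        (when (funcall predicate item)
          (incf n)))))

Functions from the package, as opposed to macros, cannot be compiled away
like this, which is why they were the part Stallman objected to.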
Unfortunately, this prevents some very useful parts of the interface
from being used.
> Dunno what the state of SBCL is as far as porting it to platforms that
> CMUCL hasn't already been ported to at some point.
SBCL is fundamentally about as hard to port as CMUCL is: i.e. not much
work at all for new OSes on an existing architecture, provided they're
POSIXy; substantially more work to introduce a new backend, much of
which is doing things like transcribing the entire instruction set and
assembler mnemonics thereof.
It's not a candidate for software that you hope to see on Windows any
time soon, unless you'd like to work on porting it to Windows.
> All in all, the parts are already out there to build a CL Emacs. All
> it needs is some careful work put into design. Then it just takes
> time and commitment.
I love the way you say "just"
-dan
--
http://ww.telent.net/cliki/ - Link farm for free CL-on-Unix resources
I think the first major system implemented in CLOS (aside from CLOS
itself :-) was CLIM. If memory serves, there was even some shared code
in early versions of each, since Gregor and Ramana were colocated. It
was also the only object system that satisfied the portability
requirements of CLIM.
On Wed, 26 Sep 2001 22:46:43 GMT, Kent M Pitman <pit...@world.std.com>
wrote:
>
>The Xerox crowd had been largely omitted from the design of Common
>Lisp, primarily because CL's design addressed the issue that Interlisp
>almost won ARPA's heart for having more installed base. The Maclisp
>community had lots of users, but every installation used a slightly
>variant dialect and we had to unify to make CL in order to prove to
>ARPA that we had the bigger installed base. So we were at war with
>Interlisp, and we left them out. When ANSI CL happened, Xerox showed
>up to join.
The early history has been pretty well documented already, but I think
Xerox shot their own foot off by thinking they could go it alone.
>I've heard it said that the price of repatriating a
>left-out party is that you have to take some of their ideas and
>integrate them, and I heard some people say that letting Xerox have
>its way a lot with CLOS was the price of getting the Xerox/Interlisp
>community folded back into Lisp community as a whole.
Actually, CLOS is radically different from LOOPS, and shares almost
nothing with it other than the early name of CommonLoops. Gregor did
most of the implementation on a Symbolics, and even porting PCL to
Xerox Common Lisp was non-trivial. I think the analogy to SERIES is
closer: PCL was a portable object system that ran on most
implementations, and the bulk of the work on it was done by an
independent third party (Gregor worked for PARC, and by that time
Xerox had rid themselves of the Lisp business).
I think the biggest objection to CLOS was the perception that it was
going to be slow and expensive. I've got a paper around here that
theorised that the fastest slot lookup would take at least 3 cycles,
method dispatch 5, or something like that. PCL of that era was much
slower.
The first native CLOS, if memory serves, was that for the TI Explorer.
In fact TICLOS was faster than Flavors. I think Jonl was next to
depart from PCL as a base for CLOS, and Lucid's native version was a
lot faster than the PCL version. Franz did the same thing, if memory
serves, with similar results. This is also why the MOP never got
standardised, since everyone had started to diverge from the base
code. I seem to remember some sort of "ignore the MOP and you can run
faster CLOS" switches in the early native versions (I remember
something like this in Lucid, I'm not so certain about Franz'
version).
There's a bunch of CLOS history in Andreas Paepcke's book, "Object
Oriented Programming - The CLOS Perspective".
...arun
> This is only good design, but that's part of the problem -- the two
> popular Emacsen suffer from a lack of design forethought.
Don't you think this is a bit arrogant? Consider when the Emacsen were
implemented. And haven't they grown considerably since then? Have you
considered that all the additions and patches to support new ideas
have messed up the initial design?
>
> The whole idea of implementing a CL Emacs isn't that terribly
> difficult. Someone just needs to start planning it.
Well, I encourage you to start the planning. And of course I hope you
can show a "proof-of-concept" soon, since it isn't "that terribly difficult"
(your words).
>If the plans and
> design are public then anyone can pick up from there. The hard parts
> are design and infrastructure.
Is design really the problem? I can't see how a design can be so good
right from the start that it can grow into such strange areas as the Emacs
stuff has. Well, according to Peter? Gabriel in software patterns it seems
it's more important to plan for what he calls "piecemeal growth" than
for anything else. The promoters of XP have a similar opinion I guess,
and they have some good reasons on their side.
>Building the Emacs is just a matter of
> borrowing the best Emacs design ideas from the past and implementing
> them efficiently, then gluing them all together according to the
> design specs.
Well, I really would like to see one of those design specs. And I bet
the end result will not have much in common with it.
> And if it's all done in CL it'd be pretty fast to
> develop, as compared to something like C.
Again, I encourage you to go for it. You can bash C as long as you
like, but you will find orders of magnitude more code and libraries
written in C than in Common Lisp.
>
> What parts already exist?
>
> CLX for the X interface. Design with the idea of using CLIM someday
> when a free implementation is available.
What a lot of work: plan for one implementation, but be sure that you can
change it on the fly during later stages without sacrificing
anything.
I bet that this is hardly possible. And I too think one had probably
better start with CLIM right from the start, even if there isn't a
free implementation available at the moment. Why CLX? As you said,
it runs on X. What about Macs? What about Windows? AFAIK all
the vendors offer CLIM, and it's available on all platforms. Well, the
decision of course should be whether one should care right from the start
about portability or not. Well, you mentioned design forethought; I
can't tell what "the right thing" would be.
>
> The various implementations of networking protocols already done in
> ELisp should be coalesced into a general sort of networking layer.
Oh fine, I assume you volunteer?
>
> The existing ELisp software would need porting, but I don't think that
> it would be terribly hard to implement an ELisp REPL under CL, based
> on the work already done by Steingold, and perhaps using some of the
> ideas from that MIT-AI memo on translating ELisp to Scheme. Then
> software could be easily converted over time, or just used in the
> emulated ELisp.
That might be a feasible way. I can't tell. It could be that a sort of
compiler may be a better option. Well, I do not think that the
MIT memo about Elisp and Scheme is all that helpful; it deals with a bunch
of problems which simply do not exist in Common Lisp...
>
> However, I think that the users of CL Emacs would soon end up
> developing their own software for use with CL Emacs, separate from
> that used in RMS Emacs.
I agree; anyway, the question will be whether a CL Emacs provides as much
as the "old" Emacsen do at the moment, or at least provides what
most people expect from an "Emacs-based" editor.
Regards
Friedrich
Are you trying to say that the most popular programming language is the
one that will allow you to most quickly develop robust complex applications?
It's Richard P. Gabriel. Or at a pinch Nickieben Bourbaki :)
Peter Gabriel is a musician (Sledgehammer, Don't Give Up, etc).
--
John Paul Wallington
> This is only good design, but that's part of the problem -- the two
> popular Emacsen suffer from a lack of design forethought.
That's an interesting statement. My take on this is rather different;
I think it's _because_ emacs wasn't over-designed to start with that
it had all this wonderful life and extensibility. You know, the
"big ball of mud" thing.
I'd take CL over elisp anyday, but I wonder why people think
a re-write would be a good thing? Just speed of execution of
the lisp stuff?
What _important_ changes would YOU like to see to emacs?
Curious minds want to know.
--
It would be difficult to construe this as a feature.
-- Larry Wall, in article <1995May29....@netlabs.com>
Curiously enough I have this thing lying around. I don't know its
pedigree, except that I typed this in from a printed version I made
with TEdit, so it may have bugs. I finally threw away the tapes with
my dmachine stuff (including this, probably) only quite recently; I
suspect I will regret this one day.
--tim
--cut--
* Lisp Hacker
(Tony Simons, LEPS '87)
(Sung to the tune of *Sledge Hammer*, by Peter Gabriel)
... dedicated to the memory of DWIM, the programmer's assistant
You could have a workstation
If you'd just lay down the Vax.
You could have a floating point, floating
If you'd send your Symbolics back.
All you do is call me...
I'll do anything you need.
You could have spaghetti stacks,
Going up and down, all around the bends.
You could have a break package, breaking...
This amusement never ends!!
I want to be ... your lisp hacker.
Why don't you call me by name?
Oh let me be ... your lisp hacker.
I'll do anything you mean (Yeah! ... Yeah!)
Show me round your program,
'Cause I will be your honey bee
Buzzing round your bonnet,
Fixing all your code where it seems right to me.
Don't mind if you can't spell it...
I'll do anything you mean.
You could have CONDitionals for CONNect;
I could change your variable names.
I could lock you in the default directory,
Catch me if you can, I just love these games!
I want to be ... your lisp hacker.
Why don't you call me by name?
You better call ... the lisp hackler.
Do what I mean for a change.
I want to be ... your lisp hacker.
Put your mind at rest.
I'm going to be ... the lisp hacker,
Let there be no doubt about it.
Lisp ... lisp ... lisp hacker.
I've kicked the habit (kicked the habit),
Shed my skin (shed my skin).
This is the new stuff (this is the new stuff),
I go dancing in (we go dancing in).
Oh, won't you call for me
And I will call for you;
Oh, won't you call for me
And I will call for you.
I've been loading the Lyric,
Been loading the Lyric.
Going to build that power,
Build, build up that power, hey! ...
> ja...@unlambda.com (James A. Crippen) writes:
>
> > This is only good design, but that's part of the problem -- the two
> > popular Emacsen suffer from a lack of design forethought.
>
> That's an interesting statement. My take on this is rather different;
> I think its _because_ emacs wasn't over-designed to start with that
> it had all this wonderful life and extensibility. You know, the
> "big ball of mud" thing.
I'm not smart enough to talk about the design of either GNU Emacs or
XEmacs.
> What _important_ changes would YOU like to see to emacs?
> Curious minds want to know.
The two things I would love to see picked up in Emacs are multi-threaded
support and FFI. Oh, add a third thing: the ability to write directly to
devices (think Palm syncing).
--
(__) Doug Alcorn (mailto:do...@lathi.net http://www.lathi.net)
oo / PGP 02B3 1E26 BCF2 9AAF 93F1 61D7 450C B264 3E63 D543
|_/ If you're a capitalist and you have the best goods and they're
free, you don't have to proselytize, you just have to wait.
>
> Are you trying to say that the most popular programming language is the
> one that will allow you to most quickly develop robust complex
>applications?
No, that's way beyond what I want to say. Anyway, I can get a lot of
libraries for C, and I expect that they have been in use for a while and
are "stable". But I have the libraries, and if I'm unhappy with one of
them I can surely find another implementation. So I think one can write
extremely robust code even in C by using the appropriate libraries.
Regards
Friedrich
My primary concern is the educational value of a Common Lisp application
that could replace the bogus Emacs Lisp as a demonstration of what a good
Lisp can do. There is also the speed and convenience of many sorts, but
that is way below the effect of enhancing the Common Lisp systems because
they suddenly have to face up to a much larger user community who will
have all the needs that are not met by the current systems, and hopefully
will be enthusiastic about a "better Emacs" and want to do stuff in it,
which will then be a real Common Lisp environment that can compile their
files and such things, not just an application that is fairly closed.
> Friedrich Dominicus wrote:
> >
> > Is design really the problem? I can't see how a design can be so
> > good right from the start to grow in such strange areas as the Emacs
> > stuff has. Well according to Peter? Gabriel in software patterns it
> > seems it's more important to plan for what he calls "piecemeal
> > growth" than for anything else.
>
> It's Richard P. Gabriel.
Well, sorry, I wasn't sure and didn't have the book at hand. But I checked,
and yes, I was wrong. But the idea was to "design" for piecemeal growth,
and that's the point. So Emacs was extended extremely by its users,
and so it seems that the design allowed that. So let's assume the
"original" design is more than 20 years old. Well, I think that's quite a
long time for software.
Regards
Friedrich
> That's an interesting statement. My take on this is rather different;
> I think its _because_ emacs wasn't over-designed to start with that
> it had all this wonderful life and extensibility. You know, the
> "big ball of mud" thing.
Yep. Stallman says it's the reprogrammability, not the command set,
that is the essence of Emacs. So one might say that if you have a
keyboard comtab, you have Emacs.
(Btw, I was told long ago that this "detail" of design was not
Stallman's original idea but Guy Steele's. That is, Steele suggested
and Stallman implemented the idea of extending Teco to have a bunch of
q-registers corresponding to each character you could type and users
could associate whatever macro definitions they wanted with each key.)
If you use ILISP, then you are saying (thanks to me) that RMS is wrong
in this respect. :)
(require 'cl)
is not wrapped in ILISP code (and, as far as I am concerned, it will
never be).
So the question is: have you noticed it?
I'd bet no. Hence, the question is moot. RMS got this one wrong. (He
also got Guile wrong, and that is another story too).
Cheers
--
Marco Antoniotti ========================================================
NYU Courant Bioinformatics Group tel. +1 - 212 - 998 3488
719 Broadway 12th Floor fax +1 - 212 - 995 4122
New York, NY 10003, USA http://bioinformatics.cat.nyu.edu
"Hello New York! We'll do what we can!"
Bill Murray in `Ghostbusters'.
The one really crucial change: Adding in the ability to multithread.
Certain operations take over the Lisp interpreter, potentially for
significant periods of time. [GNUS is probably the best example of
this...]
Having the ability to throw processing into separate threads that run
concurrently would be Quite Valuable.
--
(reverse (concatenate 'string "ac.notelrac.teneerf@" "454aa"))
http://www.cbbrowne.com/info/
"While the Melissa license is a bit unclear, Melissa aggressively
encourages free distribution of its source code." -- Kevin Dalley
<ke...@seti.org>
> I'd bet no. Hence, the question is moot. RMS got this one wrong. (He
> also got Guile wrong, and that is another story too).
Do tell about Guile. What's your opinion?
> Alain Picard <api...@optushome.com.au> writes:
> > What _important_ changes would YOU like to see to emacs? Curious
> > minds want to know.
>
> The one really crucial change: Adding in the ability to multithread.
>
> Certain operations take over the Lisp interpreter, potentially for
> significant periods of time. [GNUS is probably the best example of
> this...]
The garbage collector is another.
> Having the ability to throw processing into separate threads that run
> concurrently would be Quite Valuable.
And most major implementations of CL support threadedness, or
something like it.
Essentially when I talk about design, I mean that the back end, the
internals of the Lisp system, the interfaces to the various external
libraries and OS support, needs serious redesign.
What this entails is in essence very careful consideration of
modularity and separation between the low-level control areas of the
program and the high-level interaction areas of the program.
Some of the major problems in extending RMS's Emacs today stem from the
fact that the Lisp system implementation is not cleanly separated from
the other low-level areas of the system, such as the graphics display,
the character representation, the network interfaces, etc. The ELisp
system uses lots of knowledge of implementation details to squeeze
performance out of the various OS-level interfaces. Its design is
fairly non-portable because the ELisp system knows too much about its
environment, and so much of the interaction with a different
environment must be rewritten to match ELisp's conception of how the
environment should behave. As an example, filenames and paths are
explicitly Unix-like in Emacs. This has been one reason why Emacs has
persistently been difficult to port to the Macintosh, for instance.
The redisplay mechanism is far too smart about the graphics device it
displays on (an X window), and this has made it fairly difficult to
port to both Windows and Mac. Now that's not to say that these ports
don't exist, but they've entailed major overhaul of the internals of
Emacs, because Emacs *internally* wasn't ever designed to be modular.
The whole ELisp system was designed to be one giant lump that you
would stick interfaces to libraries into on the back side, then extend
the Lisp on the front side. Unfortunately the interfaces to the
external libraries were so slow and difficult to deal with because of
the continual conversions back and forth between ELisp and C
datatypes, and the necessity of protecting these from the garbage
collector, that the ELisp implementation needed to know more and more
about the external libraries that it was using, and at this point the
interfaces to the various libraries have become one with the ELisp
implementation. Changing ELisp requires changing the interfaces,
which requires understanding exactly what the hell the interfaces are
doing. None of this is modular.
Since the front end is mostly coded in Lisp it's very easy to hack.
This is Emacs's Big Win. But its back end is *not* easy to hack at
all. It is my firm belief that if the ugliness of the ELisp
implementation was discarded in favor of a robust and well-developed
(and portable) Common Lisp environment, and then the various external
library interfaces were done through clean and independent interfaces
(whose internals could vary depending on things such as the local FFI,
the library involved, etc), then you'd have a back end to Emacs that
would be as easy to hack as the front end.
Basically what I am arguing for is more abstraction in the back end of
Emacs. I've looked at the guts of XEmacs on more than one occasion,
and each time I've been frightened by what I see. The whole thing is
huge piles of hairy C macros and global state that comprises a large
knotted ball of code that is very difficult to untangle. Suppose you
want to address a major problem such as internationalization. You
can't just go replace the character implementation because every
library interface that deals with characters knows about how
characters are implemented and utilizes this knowledge in ways that
aren't portable to a different implementation of characters. Every
single place where characters are used in the back end knows something
internal about the implementation of characters, and uses this
knowledge in often unusual ways.
The idea of developing with a big ball of mud is excellent. It's good
to present everything to the programmer and allow the programmer to
wrap new functionality around old, to extend old functionality in new
directions, and to provide for ample program growth. My argument is
that RMS's Emacs doesn't provide for such growth in its back end. Its
back end isn't a ball of mud, it's a tangled mess of spaghetti --
change one thing and you break others you didn't even know about.
It's too cross-connected. When you want to sculpt a ball of mud into
a different shape that's not topologically equivalent you need to
break the ball into pieces. When the ball is made of string and is
knotted together in complicated ways it's very difficult to break
apart. If however the ball is made up of lots of smaller independent
balls, like clay is made of grains of silt, then you've got something
you can easily mold into new forms.
Basically the back end of Emacs is very difficult to extend or
change. It needs a new design, and a language that supports
extensible design. The back end of Emacs is written in C, a language
that does not encourage extension, and is written in a way that also
does not encourage extension. I am arguing that a new Emacs be
written in a language that is easy to extend, and in ways that
encourage extension in new and unforeseen directions. I think that
Common Lisp is the ideal language for this implementation.
Those that argue that Emacs is fine the way it is are doomed to spend
the rest of their lives waiting for an NNTP connection to die in Gnus,
and are doomed to wait for the GC to finish collecting the 1MB buffer
that they just flushed. They are doomed to live with characters
implemented as integers, and are doomed to live with GCPRO.
> Alain Picard <api...@optushome.com.au> writes:
>
> > ja...@unlambda.com (James A. Crippen) writes:
> >
> > > This is only good design, but that's part of the problem -- the two
> > > popular Emacsen suffer from a lack of design forethought.
> >
> > That's an interesting statement. My take on this is rather different;
> > I think its _because_ emacs wasn't over-designed to start with that
> > it had all this wonderful life and extensibility. You know, the
> > "big ball of mud" thing.
>
> I'm not smart enough to talk about the design of either GNU Emacs or
> XEmacs.
I first read this as: "I'm smart enough not to talk about the design
of either GNU Emacs or XEmacs." Twice.
I'm not smart enough to know whether this means I've read too much
Emacs source or too little of it.
> > What _important_ changes would YOU like to see to emacs?
> > Curious minds want to know.
>
> The two things I would love to pick up in emacs is multi-threaded
> support and FFI.
[snip]
I guess these two would make the rest of the desired
changes/enhancements/features far easier to implement than they are
now. I'd like to add a wish for a better (in some way which I'm
unqualified to exactly define) garbage collector. The one or ones now
used do not usually cause trouble, but I always have the feeling that
some day the stop-collect will end up having a larger working set than
I have memory available or some other undesirable event will take
place. Luckily typical memory amounts have been rising more than the
size of my Emacs processes.
--
Teemu Kalvas
> Those that argue that Emacs is fine the way it is are doomed to spend
> the rest of their lives waiting for an NNTP connection to die in Gnus,
> and are doomed to wait for the GC to finish collecting the 1MB buffer
> that they just flushed. They are doomed to live with characters
> implemented as integers, and are doomed to live with GCPRO.
>
I think you're right. Sooner or later Emacs needs to be re-written.
Why not sooner? In this spirit I've setup a page on Cliki for
discussing the design. I'd love to just go off and make some
proof-of-concept and then say, "Come hack on this". But I'm not
nearly that smart (yet). I encourage you to make comments relating to
specific design issues on the Cliki page.
http://ww.telent.net/cliki/CL-Emacs
> Those that argue that Emacs is fine the way it is are doomed to spend
> the rest of their lives waiting for an NNTP connection to die in Gnus,
> and are doomed to wait for the GC to finish collecting the 1MB buffer
> that they just flushed. They are doomed to live with characters
> implemented as integers, and are doomed to live with GCPRO.
Those who argue that emacs needs to be redesigned/rewritten using CL
are doomed to spend the rest of their lives *talking* about it on CLL.
;-)))
Peter
Or they wisely shut up about it.
> > Certain operations take over the Lisp interpreter, potentially for
> > significant periods of time. [GNUS is probably the best example of
> > this...]
>
> The garbage collector is another.
which is no different from Common Lisp implementations :-(
my dream is that someone would manage to bring the work of Takeuchi (*)
on "hard" real time lisp into a common lisp garbage collector...
(*) see the proceedings from JLUGM 2000:
http://jp.franz.com/jlug/en/jlugm2000/
--
(espen)
> If you use ILISP, then you are saying (thanks to me) that RMS is
> wrong in this respect. :)
He's not wrong, he just never pressed the issue. I remember he was
pretty adamant about Gnus, which is used by many more people than
ILISP. It is also possible that he changed his mind.
> Alain Picard <api...@optushome.com.au> writes:
> > What _important_ changes would YOU like to see to emacs? Curious
> > minds want to know.
>
> The one really crucial change: Adding in the ability to multithread.
This is far from the *only* thing that can benefit from using CL. There
are quite a few things.
Here's how I'd think about it. Look at your favorite Emacs or KDE
apps. sure, they differ a lot, but there are certain principles and
extensions that they follow. Especially Emacs. Emacs has the
following general properties:
1) most of the neat programs run inside an xterm (e.g. Gnus, VM,
w3...)
2) all good Emacs apps are pretty easily controlled by key bindings
Then, you dive into some of the individual programs, like Gnus, or
ilisp (using comint), and see that even within Gnus, there is a notion
of a standardized way of extending the application (e.g. Gnus and its
various backends).
Elisp and CL are generally nice for extending behavior because of
macros, and other language features. Emacs Lisp mostly lacks a true
CLOS/MOP (eieio is heavily lacking) which makes it a bit hard to do
some of the nice things that you'll see in something like LispWorks
CAPI, which has a simple Emacs-style editor pane that you can create
and customize very easily.
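(If I remember the CAPI names right, the whole thing is about one form --
treat this as a sketch from memory:

  ;; LispWorks CAPI: pop up a window holding an Emacs-style editor pane.
  (capi:contain (make-instance 'capi:editor-pane))

and from there you subclass the pane to customize it.)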
So, getting back to how this all relates to why XEmacs could do much
better with CL...It could make use of packages written in CL, and not
written in elisp to do serious things, e.g. parsing HTML and XML (of
course, the HTML parser used in w3 is neat, but it's not the nicest
code to read, and part of that is because it was written in elisp).
There have been some great CL contributions in the past few years, and
most of the interesting ones couldn't be ported to elisp without lots
of work.
Here's another interesting aside. I wrote some CL code in grad school
that implemented HMMs in CL (along with all of the matrix stuff). It
worked well, and it was quick, and I used ACL back then (since my
university had a site license). Well, it was great. But guess what?
(X)Emacs would not be able to handle that stuff easily, and if it were
ported, then the elisp compiler *still* wouldn't be able to do it
justice.
Bottom line: elisp is cute, and I heavily enjoy programming in it.
And it's been able to help make many problems go away. But it still
sucks hard. I prefer it to Scheme in a heartbeat, but it's still not
a realistic programming language, and IMO is what's keeping the emacs
environments back.
The reason I've given up on Gnu Emacs is that it's tied to someone's
ego, and I can't stand that. I think that it's worthwhile to start
all over again, with a fresh CL, and a fresh mind...and start writing
the editor which will emulate Emacs, but which will be done right.
As for micro opinions. I would absolutely use CLISP, as it is a fine
Lisp, and its portability *is* critical, since the purpose of this
project will be that programmers *don't* program in non-portable
behavior, and also to maximize the contributing community. If it's
done in CLISP, then Mac, Unix, and Windows people can all get in (as
well as several others, not to mention how many Unixes there are).
dave
> * I wrote:
And now -- how did your smart newsreader make up this one?
--
Janis Dzerins
If a million people say a stupid thing, it's still a stupid thing.
> Tim Bradshaw <t...@cley.com> writes:
>
> > * I wrote:
>
> And now -- how did your smart newsreader make up this one?
Regexp syndrome. I have a clever regexp that tries to spot people who
are me (because I have various accounts, and historically have had
more than I do now). This regexp has been sufficiently hairy for some
time that I actually have an (e)lisp function which writes it for me in
its various forms (for emacs, procmail, with or without various
quotifications &c). But because regexps are broken it occasionally
blows up like this. Fortunately in this case I have some control over
the code, so this shouldn't happen any more, due to the wonders of
boolean logic and yet another regexp, this one defining a set of
people who look like me but actually aren't. But there are other
places this thing is used where I can't do this without maintaining
patched versions of things like VM and GNUS which I am loth to do.
Soon, my life will be entirely taken over by the endless struggle to
keep these insane regexps from eating all my mail, and I will, after
several nervous breakdowns, end my days as a sad, embittered old man
in some home for geriatric systems programmers.
Let this be a lesson to people who think that regexps are the answer
to anything.
--tim
> For historical edification, in case it wasn't obvious from the
> previous messages: Flavors was Multiple Inheritance.
I know New Flavors was. Was this true for 'Old' (or dare I say
'Vanilla') Flavors as well?
ISTR that I read it was, particularly because the Smalltalk flavor of
OO that the Flavors author(s?) had experience with was not. Wrongp?
> Christopher Stacy <cst...@spacy.Boston.MA.US> writes:
>
> > For historical edification, in case it wasn't obvious from the
> > previous messages: Flavors was Multiple Inheritance.
>
> I know New Flavors was. Was this true for 'Old' (or dare I say
> 'Vanilla') Flavors as well?
Quite definitely. The original big application for Flavors was
implementing the window system, which used it heavily, even in
Old Flavors. The whole idea of mixins (part of the original "flavors"
metaphor, based on Steve's Ice Cream terminology) was multiple
inheritance. Flavors used mixins heavily to add borders, etc.
> ISTR that I read it was, particularly because the Smalltalk flavor of
> OO that the Flavors author(s?) had experience with was not. Wrongp?
No idea about the relation to Smalltalk. I never tracked that.
Regards
Friedrich
I know it's something of a flip comment, but I don't get your meaning.
Having single dispatch (which both Old and New flavors did) made methods
a little more awkward to write individually, and might have had a different
arg order sometimes than one would write with multi-methods, but it didn't
really affect the number or nature of mixins, as far as I know.
Can you suggest how you think maybe they did so we can discuss it concretely?
I'm not totally sure, but think of having a window with two different
kinds of "border"; then I could write
(defmethod show-window ((win window) (border borderstyle))
  ...)
So I do not have to have mixins like
window-with-plain-border-mixin
window-with-motif-styled-border-mixin ...
I'm not sure which approach is the better one. But it seems that you
have a choice in Common Lisp but not in Flavors. Looking through some
of the docs and sources, it seems that these mixins were probably even
overused...
I think this show-window method is easily extensible, but with mixins,
any time you add another border style you have to create a new mixin...
It seems to me as if things are tied together too much, and to some
extent it reminds me of a pattern (I think it is called Decorator in
the GOF book). There are surely a lot of ways to solve this problem,
but it seems that the multimethod approach is the most straightforward
and easiest to understand. Of course YMMV.
Regards
Friedrich
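A minimal CLOS sketch of the multimethod arrangement Friedrich
describes might look like this; all class and function names here are
invented for illustration:

  ;; Dispatch on both the window and the border style: adding a new
  ;; border style means adding one class and one method, with no
  ;; window-with-foo-border-mixin classes.
  (defclass window () ())
  (defclass border-style () ())
  (defclass plain-border (border-style) ())
  (defclass motif-border (border-style) ())

  (defgeneric show-window (window border)
    (:documentation "Draw WINDOW, using BORDER to pick the border style."))

  (defmethod show-window ((win window) (border plain-border))
    (format t "~&Drawing ~S with a plain border.~%" win))

  (defmethod show-window ((win window) (border motif-border))
    (format t "~&Drawing ~S with a Motif-style border.~%" win))

  ;; e.g. (show-window (make-instance 'window) (make-instance 'motif-border))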
> Kent M Pitman <pit...@world.std.com> writes:
>
> > I know it's something of a flip comment, but I don't get your meaning.
> > Having single dispatch (which both Old and New flavors did) made methods
> > a little more awkward to write individually, and might have had a different
> > arg order sometimes than one would write with multi-methods, but it didn't
> > really affect the number or nature of mixins, as far as I know.
> > Can you suggest how you think maybe they did so we can discuss it
> > concretely?
> I'm not totally sure, but think of having a window with two different
> kinds of "border"; then I could write
>
> (defmethod show-window ((win window) (border borderstyle))
>   ...)
>
But if anyone wanted this kind of thing in Flavors, it was *very* routine
to write something that did
(defmethod (:show-window window) (border)
  (typecase border ...))
or, even more likely:
(defmethod (:show-window window) (border)
  (send border :draw-yourself self))
The thing you're thinking of is not a capability but a syntax, and people
absent the syntax are very used to getting around it.
> So I do not have to have mixins like
> window-with-plain-border-mixin
> window-with-motif-styled-border-mixin ...
Well, at the time we were still debugging the use of class definitions at
all. People generally thought window border was an intrinsic type of the
window. Later, under dynamic windows, border objects were a slot value,
and this worked fine in single-dispatch. It wasn't the syntax that did
this, it was the realization of what might change (or want to be composed)
and what might not.
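To make the "border objects as a slot value" arrangement concrete, a
single-dispatch sketch in CLOS might look like this; the names are
invented, and this is only an illustration, not the Dynamic Windows
code:

  ;; The window holds a border object in a slot and simply asks it to
  ;; draw itself, so neither a multi-method nor a per-border mixin is
  ;; needed.
  (defclass border () ())
  (defclass fancy-border (border) ())

  (defclass simple-window ()
    ((border :initarg :border
             :accessor simple-window-border
             :initform (make-instance 'border))))

  (defgeneric draw-yourself (border window)
    (:documentation "Ask BORDER to draw itself around WINDOW."))

  (defmethod draw-yourself ((b border) window)
    (format t "~&Plain border around ~S.~%" window))

  (defmethod draw-yourself ((b fancy-border) window)
    (format t "~&Fancy border around ~S.~%" window))

  (defgeneric display-window (window))

  (defmethod display-window ((win simple-window))
    ;; Single dispatch: the window delegates to whatever border object
    ;; happens to live in its slot.
    (draw-yourself (simple-window-border win) win))

  ;; e.g. (display-window (make-instance 'simple-window
  ;;                                     :border (make-instance 'fancy-border)))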
> I'm not sure which approach is the better one. But it seems that you
> have a choice in Common Lisp but not in Flavors. Looking through some
> of the docs and sources, it seems that these mixins were probably even
> overused...
Hopefully I've debunked this.
> I think this show-window method is easily extensible, but with mixins,
> any time you add another border style you have to create a new mixin...
You're really assuming we let the syntax drive us and that we had no brains.
Syntax doesn't change very fast for practical reasons, but people learn
very quickly and any relatively general syntax can cope. That's why Smalltalk
has lasted as long as it has--single inheritance and single dispatch have
their known limitations, but they also have their elegance. And saying one
is crippled by using them or assuming that the syntax will drive a particular
way of connecting things is selling the user-base short.
> It seems to me as if things are tied together too much, and to some
> extent it reminds me of a pattern (I think it is called Decorator in
> the GOF book). There are surely a lot of ways to solve this problem,
> but it seems that the multimethod approach is the most straightforward
> and easiest to understand. Of course YMMV.
Much though I like CLOS, I think most of the world's mileage varies, or
there would not be such a passion for what people call object-oriented
(and I call object-centric) programming. CLOS uses what I call gf-centric
programming, which I think is just as object-oriented as object-centric
programming. But the Java folks have a model *way* closer to Flavors,
especially to new flavors, than to CLOS, and they're not exactly crippled
either--at least not by *that* decision. ;-)
>
> Well, at the time we were still debugging the use of class definitions at
> all. People generally thought window border was an intrinsic type of the
> window. Later, under dynamic windows, border objects were a slot value,
> and this worked fine in single-dispatch. It wasn't the syntax that did
> this, it was the realization of what might change (or want to be composed)
> and what might not.
Yes, I think that's the way it goes. But a lot of code was written
before that realization, and it still "survives", and therefore I do
think a lot of mixins are in there which wouldn't be otherwise.
>
> > I'm not sure which approach is the better one. But it seems that you
> > have a choice in Common Lisp but not in Flavors. Looking through some
> > of the docs and sources, it seems that these mixins were probably even
> > overused...
>
> Hopefully I've debunked this.
Well, it was a "fast" thing which has come to my mind. And I'm aware
of alternatives, in fact I would have to use simular solutions in
Eiffel, but I've to think "harder" and just if you know that there are
other solutions you can judge which approach seems to be better, if
you do not have some facilities you won't miss them, because you know
how to get around it. This "get around" is anyway not always desirable
or even good, but if you can't do it in another way ... you mut use
it..
As I said I'm not sure which approach is the "right" one.
>
> > I think this show-window method is easily extensible, but with mixins,
> > any time you add another border style you have to create a new mixin...
>
> You're really assuming we let the syntax drive us and that we had no
> brains.
Well, I did not mean to say that, and I am sorry that it sounded like
that. The fact is that I'm impressed by the programmer-friendliness of
OpenGenera; of course it is all totally new for me. (Anyone else out
there who has just started using Lisp Machines with OpenGenera, like
me?) After climbing up a bit and getting "basic" things to work, I
really like working with it. What I especially like is the interactivity.
>
> > It seems to me as if things are tied together too much, and to some
> > extent it reminds me of a pattern (I think it is called Decorator in
> > the GOF book). There are surely a lot of ways to solve this problem,
> > but it seems that the multimethod approach is the most straightforward
> > and easiest to understand. Of course YMMV.
>
> Much though I like CLOS, I think most of the world's mileage varies, or
> there would not be such a passion for what people call object-oriented
> (and I call object-centric) programming. CLOS uses what I call gf-centric
> programming, which I think is just as object-oriented as object-centric
> programming. But the Java folks have a model *way* closer to Flavors,
> especially to new flavors, than to CLOS, and they're not exactly crippled
> either--at least not by *that* decision. ;-)
Well, I agree, and of course I won't say they are "crippled", but it
seems clear to me that at least one degree of flexibility is simply
not there, and to get it back you "have" to think harder. Obviously it
can be done, but I see it quite a bit differently since I've been
using Common Lisp. And I get an especially uneasy ("ungut") feeling
when I look through the GOF book. A lot of patterns have been
developed (IMHO) to work around design decisions in C++, for example.
I do not think that patterns are bad per se, but they are overrated
and overused. I think if you "see" patterns everywhere, it's time to
think about how to make them an "element" of your language.
Regards
Friedrich
> And I get an especially uneasy ("ungut") feeling when I look through
> the GOF book. A lot of patterns have been developed (IMHO) to work
> around design decisions in C++, for example. I do not think that
> patterns are bad per se, but they are overrated and overused. I think
> if you "see" patterns everywhere, it's time to think about how to
> make them an "element" of your language.
Yes, there's a web page around somewhere stating exactly that, and
showing how most of the patterns in GOF aren't needed in Dylan.
Ah, yes, here it is ... it's by Peter Norvig:
http://www.norvig.com/design-patterns
The examples are in Dylan but of course they apply to CL as well, and
often also to Smalltalk.
-- Bruce
But Java's MI is only about protocols, while Flavors' MI was about both
method (implementation) combination and protocols (:REQUIRED-METHODS etc).
Flavors actually allowed code reuse: no need to manually copy code or
write delegation trampolines all over the place, and the ability
to write new methods for someone else's classes.
CLOS lacks the protocol management part from Java or Flavors.
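For what it's worth, the "write new methods for someone else's classes"
part does carry straight over to CLOS. A tiny made-up example, where
the generic function is invented and the standard PATHNAME and STRING
classes stand in for "someone else's code":

  ;; A new generic function with methods specialized on classes we do
  ;; not own and cannot edit; no access to their source is needed.
  (defgeneric brief-description (thing)
    (:documentation "Return a short, human-readable description of THING."))

  (defmethod brief-description ((p pathname))
    (format nil "a pathname with name ~S and type ~S"
            (pathname-name p) (pathname-type p)))

  (defmethod brief-description ((s string))
    (format nil "a string of length ~D" (length s)))

  ;; e.g. (brief-description #p"/tmp/foo.lisp")
  ;;      => "a pathname with name \"foo\" and type \"lisp\""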
>
> Ah, yes, here it is ... it's by Peter Norvig:
>
> http://www.norvig.com/design-patterns
Quite interesting, but IMHO much too short. Well, it's a presentation
;-)
Regards
Friedrich
RMS has mentioned a number of times over the years that he wants Emacs
to switch over to Scheme someday. I think the idea is for Guile to
become the Emacs extension language. Its design lets it handle Emacs
Lisp code with minimal changes--e.g. unlike Scheme, it allows
case-sensitive symbol names and so forth, and I think it may support
separate value and function cells on symbols.
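For reference, the "separate value and function cells" business is the
usual Lisp-2 arrangement that Emacs Lisp shares with CL (and that most
Schemes lack). A quick CL illustration, with approximate results shown
in comments:

  ;; FOO names both a variable and a function at the same time,
  ;; because the symbol has distinct value and function cells.
  ;; (DEFVAR makes FOO globally special; fine for a toy example.)
  (defvar foo 'value-cell)
  (defun foo () 'function-cell)

  (list foo                      ; => VALUE-CELL
        (foo)                    ; => FUNCTION-CELL
        (symbol-value 'foo)      ; => VALUE-CELL
        (symbol-function 'foo))  ; => the function object for FOO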
Here's some old documentation (SFA.DOC)
that I happened to come across today:
******************************
I) Introduction.
There are many instances when a MacLISP programmer wishes to
simulate an arbitrary I/O device. That is, he wishes to use the primitives
provided in LISP for doing I/O (such as READ and PRINT) yet wishes the
target/source of the characters to be completely arbitrary. I propose a
mechanism called an "SFA" (Software File Array) to handle this situation.
An SFA consists of a function and some associated local storage
bound up in an "SFA-object". In order to satisfy the above goals, an
SFA-object may be used in almost all places that a file-object can be used.
Note that the existence of SFAs will not obviate the need for the current
NEWIO implementation of files. SFAs are strictly a user entity, and the
files are still needed in order to communicate with the operating system.
The SFA-function is a user-defined EXPR/SUBR which accepts 3
arguments. Its first argument is the SFA-object on which it is to act, and
the second argument is a symbol indicating the operation to be performed.
The third argument is operation specific. The SFA-object needs to be
supplied to the function because one SFA-function may be associated with
many different SFA-objects. There are some operations used by the system
(TYI, TYO, etc...). The user is free to define new operations and to use
them by calling the SFA directly.
As there are predefined uses for many of the system I/O functions
that currently exist, when these functions are translated to SFA operations
they must retain the semantics they have when applied to file-objects.
Since this is the case, any SFA that expects to interface directly to the
standard I/O routines must know about the difference between binary and
character-oriented operations. Generally, an SFA will only handle one type
of I/O (i.e. most SFAs will not handle both the IN and TYI operation), yet
the structure of SFAs will not preclude them handling both.
II) System defined operations
1) Operations applicable to both input and output SFAs
A) WHICH-OPERATIONS
The SFA must return a list of operations which it can perform.
Every SFA *must* handle the WHICH-OPERATIONS operation.
B) OPEN
This operation is invoked if an OPEN is done with a SFA as its first
argument. The SFA should return either a file-object or a
SFA-object. The third argument to the SFA is the second argument to
the OPEN, and it defaults to NIL.
C) CLOSE
This operation is invoked when a CLOSE is done with a SFA as its
argument. There is never a third arg, and the SFA should return either
T if the close was successful and NIL if not (e.g. if the SFA was already
"closed".)
D) DELETEF
No arguments will be passed to the SFA. There is no clear global
semantic meaning of this.
E) FILEPOS
If an argument is given to FILEPOS, it is NCONS'ed and handed to the
SFA, else the SFA is given NIL. For a NIL argument, a fixnum should be
returned. For one argument, the SFA should interpret the argument as a
"position" to set the "file" to, and perform appropriate actions.
F) (STATUS FILEMODE) [operation: FILEMODE]
The SFA should return a list of adjectives describing itself. If the
SFA cannot support the FILEMODE operation (as determined by a
system-performed WHICH-OPERATIONS check), then a list of the form:
((SFA) {results of a WHICH-OPERATIONS call})
is returned.
2) Functions applicable to SFAs which can do input
If the SFA is to support normal LISP reading other than TYI, it must
support at least TYI and UNTYI, and preferably TYIPEEK also.
A) TYI
The SFA should return a fixnum representing a character. This operation
may be called from a READ, READLINE, or a straight TYI (the SFA cannot
depend upon being able to determine the context). The argument is the
value to return when EOF is reached (this is true for all input functions
including IN).
B) UNTYI
The SFA should put the argument into its input stream and on the next
TYI should spew forth the saved character rather than the next one in the
"stream". Note that an arbitrary number of UNTYI's may be done in a row.
C) TYIPEEK
The SFA should return a fixnum (as in TYI) but should not remove the
character from the input flow. Therefore, subsequent TYIPEEK's will read
the same character. If the SFA cannot handle TYIPEEK, it will be simulated
by a TYI/UNTYI pair of operations.
D) IN
The value returned should be a fixnum that is the next binary value in
the input stream. The argument is the EOF value.
E) READLINE
The value returned should be a symbol that represents one "line" of the
input stream. If the SFA cannot handle this operation, it is simulated in
terms of TYI. The argument is the value to return upon EOF.
F) READ
The value returned should be the next "object" in the input stream. If
the SFA cannot handle this operation, it is simulated in terms of
TYI/UNTYI. The argument is the value to return upon EOF.
G) LISTEN
CLEAR-INPUT
For LISTEN, the total number of items (or "characters") currently
held in the input buffer is returned; for CLEAR-INPUT, the input buffer
is just thrown away.
3) Functions applicable to SFAs which can do output
A) TYO
The argument is a fixnum representing a character. The value returned
should be T if the SFA succeeds, and NIL if the output was unsuccessful.
B) OUT
The argument is a fixnum and is output as such. The TYO/OUT pair of
functions is the analog to the TYI/IN pair.
C) FORCE-OUTPUT
CLEAR-OUTPUT
For FORCE-OUTPUT, the SFA should empty its output buffer to the logical
"device" to which it is connected; for CLEAR-OUTPUT, any buffer is just
thrown away.
D) PRIN1
The SFA should print the arg in its slashified form. If the SFA
cannot perform this operation, it is simulated in terms of TYO.
E) PRINT
The SFA should print the arg in its slashified form preceded by a
<NEWLINE> and terminated by a <SPACE>. If the SFA cannot perform this
operation, it is simulated in terms of TYO.
F) PRINC
The SFA should print the arg in its unslashified form. If the SFA
cannot perform this operation, it is simulated in terms of TYO.
G) CURSORPOS
The SFA will receive a list of the data args given to CURSORPOS. There
may be 0, 1 or 2 of these. If given the horizontal and vertical position,
there will be two elements in the list (<Vertical> <Horizontal>) which are
zero-indexed fixnums (or NIL for either/both meaning use the current value).
If given a single ascii character object, a relative cursor movement will
be expected (eg, C=Clear Screen, U=Move Up, ... [See .INFO.;LISP CURSOR])
If an empty data list is received, the SFA-handler should return the current
cursor position as a dotted pair (<Vertical> . <Horizontal>). There is
no clear global semantic meaning of this operation when
performed on non-tty's.
H) RUBOUT
The sfa should try to delete the last character output (either
by main output, or by echo output). See documentation on the MacLISP
RUBOUT function.
III) System functions for manipulating SFAs
A) (SFA-CREATE <old-SFA or SFA-function>
               <amount-of-local-user-storage>
               <printed-representation>)
SFA-CREATE returns an SFA-object which represents a function and
associated local storage. If the first arg is a SYMBOL, then that symbol
is taken as the SFA-function; if the first arg is an SFA-object, then the
function associated with it is used as the SFA-function, and the local
storage is possibly reclaimed (the first arg being an SFA is not yet
implemented). The second arg to this function should be the number of user
locations to allocate in the SFA-object.
When the SFA-CREATE is done, the SFA is immediately invoked with a
WHICH-OPERATIONS call. A mask is then created corresponding to the
internal functions that the SFA knows how to do. This mask is used for
fast error-checking when a predefined operation is handed an SFA (e.g.
TYO).
B) (SFA-CALL <SFA-object> <operation> [<argument to SFA>])
SFA-CALL is used to perform an arbitrary operation on an arbitrary
SFA-object. For example, (TYO 1 <SFA>) is equivalent to
(SFA-CALL <SFA> 'TYO 1).
C) (SFAP <lisp-object>)
Returns T iff <lisp-object> is a SFA-object else returns NIL.
D) (SFA-GET <SFA-object> <fixnum or system-location-name>)
SFA-GET reads the local storage of <SFA-object>. If the second
arg is a fixnum, then location <fixnum> of the user's local storage is
accessed. This number may range from 0 to the maximum as specified when
the SFA-object is created {see SFA-CREATE}. If the second arg is a
symbol, then one of the "named" locations is accessed. The names are as
follows:
FUNCTION          The SFA-function
WHICH-OPERATIONS  A list of operations of interest to the system that
                  the SFA can perform. This is a subset of the SFAs
                  WHICH-OPERATIONS list.
PNAME             The object to print as the "name" of an SFA
PLIST             A general property list, for use by the user.
XCONS             A "correspondent" for a bi-directional SFA;
                  akin to the TTYCONS concept for TTY's.
E) (SFA-STORE <SFA-object> <fixnum or system-location-name> <value>)
<value> is stored in the location specified by the second arg to
SFA-STORE. Locations are as for SFA-GET. It is strongly recommended that
the SFA-function NEVER be changed.
IV) Higher level tools.
A) DEFSFA -- Define a SFA
[requires some runtime support, which will be autoloaded in]
(DEFSFA <sfa-name> (<self-var> <op-var>) <slot-specs> <options>
        &rest <clauses>)
<sfa-name> -- The name of this type of SFA.
<self-var> -- a variable to be bound to the SFA itself when it is sent a
message
<op-var> -- a variable to be bound to the name of the message.
<slot-specs> -- a list of slot names, or (<slot-name> <initial-value>).
Each slot will have an accessor macro defined for it, with a name formed by
(SYMBOLCONC <sfa-name> '- <slot-name>).
<options> -- a list of option specs. An option spec is either a bare
symbol, denoting that the specified option is TRUE, or a list of an option
name and a value. The only option defined currently is :INIT, which
suppresses newly consed SFA's of this type getting a :INIT message if no
initial values are supplied. (If initial values are supplied, the :INIT
message must be sent in order to store them).
<clauses> -- a list of clauses. Each clause has the form
(<op or list of ops> <argspec> . <body>)
<op or list of ops> -- the ops are message keys, ala the second argument
to SEND or SFA-CALL.
<argspec> -- This is either a symbol, in which case it is bound to the
3rd argument, or a list of two or more symbols, in which
case it is bound to the 3rd-<n+2>th arguments. No error
checking on number of arguments will be done. If the symbol
is atomic, it will be reachable via either SFA-CALL or SEND,
otherwise it will be reachable only by SEND. This is
because SFA-CALL does not allow other than 3 arguments to be
passed.
<body> -- list of forms to be evaluated. The last one's value will be
returned.
This defines accessor macros for each slot in the SFA, plus a macro for
creating the SFA, named by prepending CONS-A- to the sfa-name. The
CONS-A-mumble macro does the SFA-CREATE, and immediately sends it a :INIT
message with a list of alternating slot-names and values.
All this works via the following lower-level mechanisms:
B) (SFA-UNCLAIMED-MESSAGE <sfa> <operation> <data>) Basically, this reports
the error. But first, it handles several special cases. The first is
turning a SEND into an SFA-CALL. When a SEND is performed on a SFA, it is
simulated by doing an SFA-CALL with an operation of :SEND, and a 3rd
argument of (LIST <old operation> <3rd argument> ... <nth argument>). If
:SEND is the operation to SFA-UNCLAIMED-MESSAGE, AND the SFA claims to
implement the <old operation> (in response to a WHICH-OPERATIONS-INTERNAL
query), it will turn it back into an SFA-CALL of the <old operation>, with
any extra arguments being thrown away. If it DOESN'T claim to handle it,
the message is passed off to the superclass via SEND-AS.
A subcase of the :SEND case is when the <old operation> is
WHICH-OPERATIONS. This implies a (SEND <sfa> 'WHICH-OPERATIONS) was done.
So this is simulated by merging any operations obtainable from the
superclass of SFA-CLASS. (If the EXTEND package isn't loaded, it doesn't
try to get them though.)
If the message is WHICH-OPERATIONS-INTERNAL, then the handler did not
implement WHICH-OPERATIONS-INTERNAL. We assume WHICH-OPERATIONS will
do the job and SFA-CALL that.
If the message is something other than :SEND, it just passes the message on
up to the superclass with SEND-AS.
It is recommended that any SFA's which for one reason or another do not use
DEFSFA should use SFA-UNCLAIMED-MESSAGE. Essentially the correct behaviour
with respect to SEND should result.
******************************
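In case the MacLISP specifics are hard to follow, here is a very rough
Common Lisp analogue of the core mechanism described above. It is only
an illustration of the shape of the thing, not the original SFA code:
the names are mine (the handler slot here plays the role of the
SFA-function), only fixnum-indexed user locations are supported, and
error checking is omitted.

  ;; An "SFA" here is just a handler function plus a vector of user
  ;; locations; SFA-CALL passes the object, the operation, and one datum.
  (defstruct (sfa (:constructor %make-sfa))
    handler    ; the handler: (lambda (sfa operation data) ...)
    storage    ; vector of user locations
    pname)     ; printed representation

  (defun sfa-create (handler n-locations pname)
    (%make-sfa :handler handler
               :storage (make-array n-locations :initial-element nil)
               :pname pname))

  (defun sfa-call (sfa operation &optional data)
    (funcall (sfa-handler sfa) sfa operation data))

  (defun sfa-get (sfa index) (aref (sfa-storage sfa) index))
  (defun sfa-store (sfa index value)
    (setf (aref (sfa-storage sfa) index) value))

  ;; A toy input SFA that hands out character codes from a list kept in
  ;; user location 0, returning DATA as the end-of-file value.
  (defun string-input-handler (self operation data)
    (case operation
      (which-operations '(which-operations tyi untyi))
      (tyi (let ((codes (sfa-get self 0)))
             (cond ((null codes) data)
                   (t (sfa-store self 0 (cdr codes))
                      (car codes)))))
      (untyi (sfa-store self 0 (cons data (sfa-get self 0))))))

  ;; e.g.
  ;; (let ((s (sfa-create #'string-input-handler 1 "string-input")))
  ;;   (sfa-store s 0 (map 'list #'char-code "hi"))
  ;;   (list (sfa-call s 'tyi -1) (sfa-call s 'tyi -1) (sfa-call s 'tyi -1)))
  ;; => (104 105 -1)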
> E) FILEPOS
>
> If an argument is given to FILEPOS, it is NCONS'ed and handed to the
> SFA, else the SFA is given NIL.
What did NCONS do?
Heh.
(defun ncons (x) (cons x nil))
Btw, lest you think this is the same as LIST (which, ok, it is :-),
on the Lisp Machine, (ncons x) has the subtle difference in effect
from (list x) that it creates a non-cdr-coded list rather than a
cdr-coded list. Since the net effect of cdr-coding is that rplacd
can cause consing if you haven't "planned" for it, you call NCONS
when you're going to rplacd the cons and LIST when you're saying
"this is the entire anticipated backbone, go ahead and store it compactly".
So you would use NCONS as the base case in a recursive list creation
and LIST in the case where you had global knowledge, as in making a
non-dotted alist by doing (list key value).
The great thing about cdr-coding is that in most cases, people's
programming style at the time it was introduced already matched this.
It's kind of a pity that CL didn't introduce enough subtlety to
accommodate cdr-coding, but then it's also a pity that lisp-based
hardware is such a dinosaur these days.
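A small sketch of the usage pattern Kent describes, in portable CL.
NCONS is not standard CL, so it is defined here, and without cdr-coding
the distinction is of course purely stylistic:

  (defun ncons (x) (cons x nil))

  (defun count-up-to (n)
    ;; Build the list front to back by destructively extending the last
    ;; cell: each new cell is an NCONS precisely because its cdr is
    ;; about to be RPLACD'd, so there is no point asking for a
    ;; cdr-coded cell.
    (let* ((head (ncons 0))
           (tail head))
      (dotimes (i (1- n) head)
        (let ((new (ncons (+ i 1))))
          (rplacd tail new)
          (setq tail new)))))

  ;; (count-up-to 4) => (0 1 2 3)
  ;; Whereas an alist entry whose backbone is complete when it is made
  ;; is the natural place for LIST, e.g. (list key value): "this is the
  ;; whole anticipated backbone, go ahead and store it compactly".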
That makes a lot more sense than what my instincts said: "non-consing CONS".
Which of course was just confusing.
--
/|_ .-----------------------.
,' .\ / | No to Imperialist war |
,--' _,' | Wage class war! |
/ / `-----------------------'
( -. |
| ) |
(`-. '--.)
`. )----'