
When to recompile/reeval?


Peter Seibel

Oct 9, 2002, 12:54:54 AM

Suppose I have a bunch of functions and macros spread out over a
couple of different .lisp files. And I'm working on the code
willy-nilly. I imagine that there are rules about what needs to be
re-evaluated, and when, as I make various kinds of changes. Unfortunately my
understanding of those rules is very fuzzy: I believe that if I have
functions foo and bar and foo calls bar, I can redefine bar and a call
to foo will reflect the new behavior of bar. (Though I have an idea
that if I've compiled foo, as opposed to just eval'ing the definition,
that might not be true.) But if bar was a macro, then I'd have to
re-eval the definition of foo if I wanted it to use a new definition
of bar, right? And then there's compile-file and load. For the moment
I tend to kill my Lisp process every so often and recompile and reload
everything to make sure I've got a consistent image. Which seems less
than optimal.

Anyway, what I'm really hoping is that someone can give me a pointer
to the right thing to read to wrap my head around these issues. (For
one, I don't even know if this is a language issue that I should be
reading about in the HyperSpec or an implementation issue.)


-Peter

--
Peter Seibel
pe...@javamonkey.com

Vassil Nikolov

Oct 9, 2002, 4:24:53 AM

On Wed, 09 Oct 2002 04:54:54 GMT, Peter Seibel <pe...@localhost.localdomain> said:

[...]
PS> if I have
PS> functions foo and bar and foo calls bar, I can redefine bar and a call
PS> to foo will reflect the new behavior of bar. (Though I have an idea
PS> that if I've compiled foo, as opposed to just eval'ing the definition,
PS> that might not be true.)

That would still be true (except in the case when BAR is
declared/proclaimed inline, but it is likely that you don't have to
concern yourself with that, at least for now). By the way,
evaluating a DEFUN form might automatically compile the function.

PS> But if bar was a macro, then I'd have to
PS> re-eval the definition of foo if I wanted it to use a new definition
PS> of bar, right?

Yes. More precisely, you have to redefine FOO each time you change
the definition of a macro it uses. You can redefine FOO by:

* re-evaluating the DEFUN form; or

* reloading the file where FOO is defined; or

* recompiling and then reloading the file where FOO is defined.

All these are equally good as far as having FOO pick up the new
macro definition is concerned.
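
A quick REPL check of the function case (a sketch, but this is
standard behavior):

(defun bar () 1)
(defun foo () (bar))
(foo) => 1
(defun bar () 2)   ; redefine BAR only
(foo) => 2         ; FOO picks up the new BAR, compiled or not

If BAR were a macro instead, an already-compiled FOO would keep
returning 1 until FOO itself was redefined.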

PS> And then there's compile-file and load. For the moment
PS> I tend to kill my Lisp process every so often and recompile and reload
PS> everything to make sure I've got a consistent image.

It should be enough to reload the files, if you have lost track of
which individual definitions need to be redone. While developing,
especially if things change very often, you may want to load the
source files, and compile the files later when things are more
stable. Loading the source files does not necessarily mean you
cannot run compiled code: invoking COMPILE by hand is tedious, but
if you find the switch that tells the implementation to
automatically compile on DEFUN, it will do it for you. (This one,
of course, is an implementation-specific feature; not all of them
would have it.)

PS> Anyway, what I'm really hoping is that someone can give me a pointer
PS> to the right thing to read to wrap my head around these issues. (For
PS> one, I don't even know if this is a language issue that I should be
PS> reading about in the HyperSpec or an implementation issue.)

Start at www.lisp.org.

---Vassil.

--
Garbage collection is charged at 0.19e-9 cents a cons. Bulk rates
are also available: please contact memory management for details.

Alain Picard

Oct 9, 2002, 5:17:50 AM

Peter Seibel <pe...@localhost.localdomain> writes:

> I tend to kill my Lisp process every so often and recompile and reload
> everything to make sure I've got a consistent image. Which seems less
> than optimal.

There are two answers to your question; the one explaining when/why
you need to recompile and reload a file; and one explaining how
to set your environment so that tracking these changes is done for
you so you don't have to think about it. Killing your lisp should
very rarely be necessary; I regularly hack on the same image for days
and days without introducing any dependency problems.

> Anyway, what I'm really hoping is that someone can give me a pointer
> to the right thing to read to wrap my head around these issues.

This is really quite simple, if you understand what macros do.
A macro defines a function which takes a form and computes a new form.
If you have function FOO which makes a call to macro BAR, and compile
the file in which FOO resides, the compiler calls the macroexpansion
function to get the new form, then compile _that_ new form as part
of the body of FOO.

Now, if you later change your mind about what BAR does, you need to
recompile and reload all files in which BAR is used, because those
functions never make any calls to BAR; the compiler does that, _once_,
at compilation time.

Functions are different; if BAR is also a function, then, at _runtime_,
the function object associated with the symbol BAR is retrieved when FOO
is called, so recompiling BAR is sufficient.
[Recompiling BAR or simply re-evaluating it are precisely the same modulo
runtime efficiency. But I always just hit C-c C-c in ilisp, so I always
run compiled code, since the evaluator doesn't issue warnings.]

Finally, if you declared a function INLINE, then for the purposes of
dependencies, it behaves as a macro, as FOO will (or, more properly, may)
have been compiled to simply include the body of BAR in the body of FOO.
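
For example (a sketch; whether inlining actually happens is up to the
implementation and your optimization settings):

(declaim (inline bar))
(defun bar (x) (* x x))
(defun foo (x) (+ (bar x) 1))   ; BAR's body may be copied in here

After changing BAR, recompile FOO, just as you would after changing a
macro.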

Lastly, the way to not worry about such things is to use a build system
like DEFSYSTEM, and to think hard about what goes where. You teach
the build system the compile- and load-time dependencies of your application,
and, when you've hacked willy-nilly, as you say, you just ask the build
system to recompile/reload your application, and it should be smart enough
to only recompile/reload the necessary files.
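
With MK:DEFSYSTEM, for instance, a system declaration might look
something like this (paths and file names invented for illustration):

(mk:defsystem :my-app
    :source-pathname "/home/me/my-app/"
    :source-extension "lisp"
    :components ((:file "macros")
                 (:file "utils" :depends-on ("macros"))
                 (:file "main" :depends-on ("macros" "utils"))))

With the dependencies declared, (mk:compile-system :my-app) should
recompile "utils" and "main" whenever "macros" changes.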


P.S. The only time this breaks for me is when I need to redefine a structure;
unlike classes, the standard does not mandate that you can redefine them
and expect sensible results. Under LispWorks, I find that redefining
the structure is OK as long as you do it in the interpreter, but breaks
if you try to recompile. Anybody know a workaround (other than killing
the image, that is)?

Faried Nawaz

Oct 9, 2002, 8:24:40 AM

Peter Seibel <pe...@localhost.localdomain> wrote in message news:<m3bs64w...@localhost.localdomain>...

> Suppose I have a bunch of functions and macros spread out over a
> couple of different .lisp files. And i'm working on the code
> willy-nilly. I imagine that there are rules about what needs to be
> re-evaluated when as I make various kinds of changes.

I don't know what happens when packages are involved, but I don't believe you
need to re-eval anything. Remember -- when you enter the final closing paren
of the Lisp form, the entire form is automatically re-evaluated (and with most
Lisp environments, recompiled).

Even with CLOS the instances change when you change their class definitions:


* (defclass animal ()
    ((name :accessor name :initarg :name :initform nil)
     (age :accessor age :initarg :age :initform nil)))

#<STANDARD-CLASS ANIMAL {48038355}>
* (defclass bird (animal)
    ((can-fly-p :accessor can-fly-p :initarg :can-fly-p :initform t)))

#<STANDARD-CLASS BIRD {4808C5FD}>
* (defvar *zebra* (make-instance 'animal :name "Zebby" :age 4))

*ZEBRA*
* (defvar *parrot* (make-instance 'bird :name "Polly" :age 2))

*PARROT*
* (describe '*zebra*)

*ZEBRA* is an internal symbol in the COMMON-LISP-USER package.
It is a special variable; its value is #<ANIMAL {480AE755}>.

#<ANIMAL {480AE755}> is an instance of class #<Standard-Class ANIMAL {4803161D}>:
The following slots have :INSTANCE allocation:
AGE 4
NAME "Zebby"
*

;;; redefine animal

* (defclass animal ()
    ((name :accessor name :initarg :name :initform nil)
     (age :accessor age :initarg :age :initform nil)
     (owner :accessor owner :initarg :owner :initform "me")))


#<STANDARD-CLASS ANIMAL {48038355}>
* (describe '*parrot*)

*PARROT* is an internal symbol in the COMMON-LISP-USER package.
It is a special variable; its value is #<BIRD {480B6495}>.

#<BIRD {480B6495}> is an instance of class #<Standard-Class BIRD {48085D4D}>:
The following slots have :INSTANCE allocation:
OWNER "me"
AGE 2
NAME "Polly"
CAN-FLY-P T
*

If it is an issue somewhere, try the HyperSpec. My (admittedly cursory) search
hasn't revealed anything.


Faried.
--
The Great GNU has arrived, infidels, behold his wrath!
If I wanted a GF, I'd use CL. Values, not variables.

Martin Simmons

Oct 9, 2002, 8:38:45 AM

"Alain Picard" <apicard+die...@optushome.com.au> wrote in message
news:86k7ksv...@gondolin.local.net...

> p.s. the only time this breaks for me is when I need to redefine a structure;
> unlike classes, the standard does not mandate that you can redefine them
> and expect sensible results. Under Lispworks, I find that redefining
> the structure is OK as long as you do it in the interpreter, but breaks
> if you try to recompile. Anybody know a workaround (other than killing
> the image, that is)?

It might be the inlining of constructors and accessors that is causing problems.
After redefining a structure, you need to recompile all the calls to these and
also make sure that no old instances of the structure are used.
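
A sketch of the failure mode (names invented):

(defstruct point x y)            ; constructor/accessors may be inlined
(defun norm (p)
  (sqrt (+ (expt (point-x p) 2)
           (expt (point-y p) 2))))

;; Redefining the structure, e.g. as (defstruct point x y z), can leave
;; a compiled NORM reading slots at the old offsets, and POINT instances
;; made earlier still have the old layout -- hence recompile the callers
;; and remake the instances.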
--
Martin Simmons, Xanalys Software Tools
zne...@xanalys.com
rot13 to reply

JP Massar

Oct 9, 2002, 1:13:30 PM

On Wed, 09 Oct 2002 04:54:54 GMT, Peter Seibel
<pe...@localhost.localdomain> wrote:

> And then there's compile-file and load. For the moment
>I tend to kill my Lisp process every so often and recompile and reload
>everything to make sure I've got a consistent image. Which seems less
>than optimal.
>

It is much less than optimal. Something like DEFSYSTEM (discussed
by others in this thread) would help you.

But if you want something really quick and dirty then just create
a file/function that contains code to compile and then load each of
your files in turn and in the right order. Instead of killing your
lisp just execute this function.

Barring some seriously screwed up interdependencies among your files
this will work just fine.
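
Something like this is all it takes (file names invented; the order is
significant):

(defparameter *system-files*
  '("macros.lisp" "utils.lisp" "main.lisp"))

(defun rebuild ()
  (dolist (file *system-files*)
    (load (compile-file file))))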

This should take significantly less time than killing and restarting
the lisp (depending on how huge your system is -- but if your system
is getting big then you really want to get a DEFSYSTEM or similar).

Peter Seibel

Oct 10, 2002, 12:11:19 AM

mas...@alum.mit.edu (JP Massar) writes:

> On Wed, 09 Oct 2002 04:54:54 GMT, Peter Seibel
> <pe...@localhost.localdomain> wrote:
>
> > And then there's compile-file and load. For the moment
> >I tend to kill my Lisp process every so often and recompile and reload
> >everything to make sure I've got a consistent image. Which seems less
> >than optimal.
> >
>
> It is much less than optimal. Something like DEFSYSTEM (discussed
> by others in this thread) would help you.

Yeah. I actually wrote my original post right after downloading
mk.defsystem from the CLOCC. I haven't really looked at it too
carefully yet, but I realize that it, or something like it, is in my
future. I just wanted to get a little theoretical understanding under
my belt. Coming from the world of 'make' (i.e. Java/C/etc) I've seen
people do horrible things with a makefile because they don't
understand how it's actually supposed to work.

Anyway, based on the responses, it seems like a "proper" (for
some--probably over-engineered--definition of "proper") Lisp build
system would keep track of dependencies at a function/macro level
(since that's the smallest unit you can compile). Do any of the
systems out there actually work that way? That is, you could imagine
keeping a big DAG of all the functions and macros (and, I guess,
defvars, defclasses, etc) and when you say (make) it would check which
ones had changed and recompile them in the proper order. And given the
code-is-data, data-is-code thing, it seems a lot easier to maintain
that information automatically than it is in languages like C or Java
where you don't necessarily have access to a parser for the language.
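
(The bookkeeping I'm imagining is something like this -- just a sketch
with invented names:

(defvar *macro-users* (make-hash-table))  ; macro name -> function names

(defun note-macro-use (macro user)
  (pushnew user (gethash macro *macro-users*)))

(defun users-of (macro)
  (gethash macro *macro-users*))

and (make) would walk that table for each changed macro.)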

But that's probably crazy. I guess I'll go read about defsystem and
see if I can find that paper Kent Pitman wrote about what he wants
from a build system. Thanks to everyone who responded.

Alain Picard

Oct 10, 2002, 5:11:17 AM

"Martin Simmons" <zne...@xanalys.com> writes:

>
> It might be the inlining of constructors and accessors that is causing problems.
> After redefining a structure, you need to recompile all the calls to these and
> also make sure that no old instances of the structure are used.

Yes. It's getting rid of the old instances which is usually
such a pain that I have to exit the image.

Thanks,

Thomas A. Russ

Oct 11, 2002, 7:40:53 PM

Summary: Macros and Defstructs will cause problems, others won't

Function calls work with redefinition in both interpreted and compiled
code. This is one of the nice features of Lisp that allows hot patching
of running applications, as well as providing various debugging and code
development advantages.

Changing the definition of CLOS classes will also work on the fly
without requiring changes to the using code -- even if you add or delete
slots from the class. (Unless of course the code happens to use slots
you decide to delete...)

The changes that require re-evaluation or re-compilation of functions are
changing a macro (since that is expanded when the function is
compiled) and changing the structure of a defstruct (since the
accessor functions can be open-coded).

One more systematic approach to this (if your project gets big enough to
be in multiple files) is to use one of the DEFSYSTEM utilities that
manage the dependencies among source files. Put macros and defstructs
in their own files, and mark the dependencies.


--
Thomas A. Russ, USC/Information Sciences Institute t...@isi.edu

Erik Naggum

Oct 12, 2002, 3:22:44 PM

* t...@sevak.isi.edu (Thomas A. Russ)

| Changing the definition of CLOS classes will also work on the fly
| without requiring changes to the using code -- even if you add or delete
| slots from the class. (Unless of course the code happens to use slots
| you decide to delete...)

It may help to understand how slot access works in CLOS compared to
inferior object models. If you try to access a slot with `slot-value´,
and it has been deleted, the system invokes an error handler that may
still return a useful value. If you read a slot with an accessor generic
function, that function can still exist and do useful things, and it
might just be able to know enough about the caller to write a log entry
that a maintenance programmer can use to update or correct it. These are
features that are not available in the simple-minded object systems that
are used in "static" languages.
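
A sketch of the hook involved (the class name WIDGET and the default
value are invented):

(defmethod slot-missing (class (object widget) slot-name operation
                         &optional new-value)
  (declare (ignore class new-value))
  (if (eq operation 'slot-value)
      (progn
        (warn "~S: slot ~S is gone; returning a default." object slot-name)
        :some-default)
      (call-next-method)))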

| One more systematic approach to this (if your project gets big enough to
| be in multiple files) is to use one of the DEFSYSTEM utilities that
| manage the dependencies among source files. Put macros and defstructs in
| their own files, and mark the dependencies.

Reasonably complete cross-referencing functions are available in the real
Common Lisp environments that can tell you which functions need to be
recompiled if you change a macro (and indirectly a defstruct). These
features are enormously helpful in locating both the source file and the
functions and macros calling and called from a function or macro. They
can also help you build a patch file, for compilation as a patch, of the
functions that have changed after a macro has been changed.

Depending on how you store your source files, extracting individual
functions for recompilation may be a problem or no problem at all.

--
Erik Naggum, Oslo, Norway

Act from reason, and failure makes you rethink and study harder.
Act from faith, and failure makes you blame someone and push harder.

Thomas F. Burdick

Oct 13, 2002, 1:09:46 PM

Erik Naggum <er...@naggum.no> writes:

> * t...@sevak.isi.edu (Thomas A. Russ)
> | Changing the definition of CLOS classes will also work on the fly
> | without requiring changes to the using code -- even if you add or delete
> | slots from the class. (Unless of course the code happens to use slots
> | you decide to delete...)
>
> It may help to understand how slot access works in CLOS compared to
> inferior object models. If you try to access a slot with `slot-value´,
> and it has been deleted, the system invokes an error handler that may
> still return a useful value. If you read a slot with an accessor generic
> function, that function can still exist and do useful things, and it
> might just be able to know enough about the caller to write a log entry
> that a maintenance programmer can use to update or correct it. These are
> features that are not available in the simple-minded object systems that
> are used in "static" languages.

One of the nice things about doing slot access via accessor functions
is that if/when you decide that some piece of information is living in
the wrong class, you can move the slot to the class it belongs in, and
often write a replacement for the previously-DEFCLASS-generated
accessor function that finds the value in its new home. You can
evaluate these new definitions in your Lisp image, and continue
development without interruption. All your old code works, and all
your instances are still valid.
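
For example (class and slot names invented), if NAME moves from PERSON
to an associated PASSPORT:

(defclass passport ()
  ((name :accessor name :initarg :name)))
(defclass person ()
  ((passport :accessor person-passport :initarg :passport)))

;; Hand-written replacement for the reader DEFCLASS used to generate:
(defmethod name ((p person))
  (name (person-passport p)))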

(Assuming your implementation is even kind of conforming...
[144]> ;;;Evaluating defclass basic-block
WARNING:
DEFCLASS: Class BASIC-BLOCK is being redefined, instances are obsolete
#<STANDARD-CLASS IR:BASIC-BLOCK>
Whee! I get to restart CLISP! *sigh*)

--
/|_ .-----------------------.
,' .\ / | No to Imperialist war |
,--' _,' | Wage class war! |
/ / `-----------------------'
( -. |
| ) |
(`-. '--.)
`. )----'

Erik Naggum

Oct 14, 2002, 9:58:17 AM

* Peter Seibel

| I guess I'll go read about defsystem and see if I can find that paper
| Kent Pitman wrote about what he wants from a build system. Thanks to
| everyone who responded.

Since nobody has even mentioned this yet, I would like you to consider
another option: Do not compile your code in the first place. The great
value of running interpreted is that macros you change will affect their
uses immediately. If you intend to do a lot of macro programming, you
will find that compiling macros and functions using macros is a serious
drag on programmer performance. While testing and developing your code
interactively, chances are pretty good that you will be far slower than
the interpreted environment of your Common Lisp system and that any time
you might save in executing compiled code is more than offset by the
time you waste with your what-to-compile-when problem.

Programming in Common Lisp is all about not worrying about needless
things, and if you need to worry about when a macro is being used because
the caller has been compiled before you changed it, you should not ask
what to recompile, but what to compile in the first place.

Peter Seibel

Oct 15, 2002, 1:04:53 AM

Erik Naggum <er...@naggum.no> writes:

> * Peter Seibel
> | I guess I'll go read about defsystem and see if I can find that
> | paper Kent Pitman wrote about what he wants from a build
> | system. Thanks to everyone who responded.
>
> Since nobody has even mentioned this yet, I would like you to
> consider another option: Do not compile your code in the first
> place. The great value of running interpreted is that macros you
> change will affect their uses immediately. If you intend to do a
> lot of macro programming, you will find that compiling macros and
> functions using macros is a serious drag on programmer
> performance. While testing and developing your code
> interactively, chances are pretty good that you will be far slower
> than the interpreted environment of your Common Lisp system and
> that any time you might save in executing compiled code is more
> than offset by the time you waste with your
> what-to-compile-when-problem.
>
> Programming in Common Lisp is all about not worrying about
> needless things, and if you need to worry about when a macro is
> being used because the caller has been compiled before you changed
> it, you should not ask what to recompile, but what to compile in
> the first place.

Good point. Does that work even in implementations that don't make
much of a distinction between compiled and interpreted code? For
example, I've been using SBCL and was reading something about it the
other day that said (paraphrasing from memory) that they barely have
an interpreter and pretty much compile everything all the time. I know
there are some places in the CL spec that talk about the difference
between compiled and interpreted code but don't recall the
details. [Goes to browse through the HyperSpec]. Okay, I see "3.2.2.3
Semantic Constraints" talking about the things that you can count on
being the *same* between compiled and interpreted evaluation; is there
somewhere else that talks about required *differences*? Are there any?

Erik Naggum

Oct 15, 2002, 10:47:28 AM

* Peter Seibel

| Does that work even in implementations that don't make much of a
| distinction between compiled and interpreted code? For example, I've been
| using SBCL and was reading something about it the other day that said
| (paraphrasing from memory) that they barely have an interpreter and
| pretty much compile everything all the time.

You would have to find out if SBCL expands macros too early even when you
want to run interpreted. In my view, that is a serious disservice to its
programmers, when the value of actually working with interpreted code is
so obvious when you actually need it. Performance is the key, but Common
Lisp has generally been willing to consider programmer performance as a
higher value than code performance in some stages of development.

| [Goes to browse through the HyperSpec].

:)

| Okay, I see "3.2.2.3 Semantic Constraints" talking about the things that
| you can count on being the *same* between compiled and interpreted
| evaluation; is there somewhere else that talks about required
| *differences*? Are there any?

I expect to be able to change a macro definition and its interpreted
users will reflect the change while its compiled users would not.

(defmacro macro-expansion-test ()
  (print "I'm being expanded!")
  `(print "I'm the expansion!"))

(defun expanded-when ()
  (macro-expansion-test))

Evaluate these and see what `(expanded-when)´ produces when you evaluate
it twice in a row. If you get the first `print´ while defining it, your
environment is very eager. If you get the first `print´ only once, it
does the expansion on the first evaluation. I expect it on both. Then
`(compile 'expanded-when)´ and it should print "I'm being expanded!" and
calling it should only produce "I'm the expansion!".

Peter Seibel

Oct 15, 2002, 11:53:09 PM

Erik Naggum <er...@naggum.no> writes:

So I take it I'm pretty much always running in compiled code given
the output below from SBCL 0.7.8:

* (defmacro macro-expansion-test () (print "I'm being expanded!") `(print "I'm the expansion!"))
MACRO-EXPANSION-TEST
* (defun expanded-when () (macro-expansion-test))
"I'm being expanded!"
EXPANDED-WHEN
* (expanded-when)
"I'm the expansion!"
"I'm the expansion!"
* (expanded-when)
"I'm the expansion!"
"I'm the expansion!"
* (compile 'expanded-when)
EXPANDED-WHEN
NIL
NIL
* (expanded-when)
"I'm the expansion!"
"I'm the expansion!"
*

Erik Naggum

Oct 16, 2002, 2:04:42 AM

* Peter Seibel

| So I take it I'm pretty much always running in compiled code given the
| output below from SBCL 0.7.8:
| * (defun expanded-when () (macro-expansion-test))
| "I'm being expanded!"

Yes. That premature line is telling.

If SBCL is a reasonable Common Lisp environment, it should have a way to
disable the premature macroexpansion. If not, it should have it in the
next version if its authors are serious about offering Common Lisp's
/dynamic/ and /interactive/ development tradition. However, it may be
that it is not intended to be a living Common Lisp environment. CLISP is
not able to accommodate living Common Lisp applications, either, with its
lack of ability to update class instances as required by the standard,
but that does not seem to deter a significant number of people. GCL is
also unable to accommodate demanding Common Lisp users, so there does
appear to be a market for dumbed-down and fake Common Lisp systems.

Martti Halminen

Oct 16, 2002, 4:27:24 AM

Erik Naggum wrote:
>
> * Peter Seibel
> | So I take it I'm pretty much always running in compiled code given the
> | output below from SBCL 0.7.8:
> | * (defun expanded-when () (macro-expansion-test))
> | "I'm being expanded!"
>
> Yes. That premature line is telling.
>
> If SBCL is a reasonable Common Lisp environment, it should have a way to
> disable the premature macroexpansion. If not, it should have it in the
> next version if its authors are serious about offering Common Lisp's
> /dynamic/ and /interactive/ development tradition.

For historical interest, I tried this on a Symbolics. Got the "I'm being
expanded!" three times when evaluating the defun, never on
(expanded-when) and once on (compile 'expanded-when).

--

Thomas F. Burdick

Oct 16, 2002, 2:36:28 PM

Erik Naggum <er...@naggum.no> writes:

> * Peter Seibel
> | So I take it I'm pretty much always running in compiled code given the
> | output below from SBCL 0.7.8:
> | * (defun expanded-when () (macro-expansion-test))
> | "I'm being expanded!"
>
> Yes. That premature line is telling.
>
> If SBCL is a reasonable Common Lisp environment, it should have a way to
> disable the premature macroexpansion. If not, it should have it in the
> next version if its authors are serious about offering Common Lisp's
> /dynamic/ and /interactive/ development tradition. However, it may be
> that it is not intended to be a living Common Lisp environment.

It is a project with limited development means, and a version number
less than 1. Obviously it's not for you (at the moment). Given that
the primary developers use it for their day-to-day work, it's
unnecessarily provocative to say that it is "not intended to be a
living Common Lisp environment." Obviously, it is so intended.

Matthew X. Economou

Oct 16, 2002, 4:27:00 PM

Erik Naggum recently made the point that Lisp in an interpreted mode
is ideal for writing and debugging macros (among other programming
tasks). As I understand it, the typical Lisp implementation does not
remember that a compiled function depends on the result of a macro
expansion, so if the macro function is changed in such a way that
affects the expansion, any compiled function must be manually
recompiled in order to take advantage of the updated macro. Again, as
I understand it, the macro expansion is performed each time an
interpreted function is called, so if the macro function is changed,
there is no need for the interpreted function to be reloaded in order
to take advantage of the updated macro.

That said, what's stopping an implementation's compiler from storing
this dependency information in (or with) the object code of a compiled
function, so that the function gets recompiled automatically if a
macro on which the function is dependent is changed?

--
Matthew X. Economou <xeno...@irtnog.org> - Unsafe at any clock speed!
I'm proud of my Northern Tibetian heritage! (http://www.subgenius.com)
Max told his friend that he'd just as soon not go hiking in the hills.
Said he, "I'm an anti-climb Max." [So is that punchline.]

Barry Margolin

Oct 16, 2002, 5:42:57 PM

In article <w4osmz6dt...@eco-fs1.irtnog.org>,

Matthew X. Economou <xenopho...@irtnog.org> wrote:
>That said, what's stopping an implementation's compiler from storing
>this dependency information in (or with) the object code of a compiled
>function, so that the function gets recompiled automatically if a
>macro on which the function is dependent is changed?

Suppose you're incrementally developing a macro. You have a version that
works, and you compile something with it. Then you start modifying the
macro, and you want to test it in a new function, but don't want to break
functions that were compiled with the previous version. Since the language
specification says that macros are expanded when the function is compiled,
you should be able to depend on the fact that modifying the macro won't
affect those functions.

There are some relatively easy ways to have your cake and eat it, too, in
many cases. One way is to have your macro expand into a call to a helper
function that does most of the work:

(defmacro with-foo ((&rest options) &body body)
  `(with-foo-internal #'(lambda () ,@body) ,@options))

(defun with-foo-internal (body-fun &key ...)
  ...
  (funcall body-fun)
  ...)

Now you can redefine WITH-FOO-INTERNAL and it will affect all the uses.
The WITH-FOO macro is so trivial that it's unlikely you'll ever need to
modify it.

This technique also makes the macro hygienic automatically.
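
Filled in with a concrete (made-up) example, the pattern looks like:

(defmacro with-timing ((&key (stream '*trace-output*)) &body body)
  `(with-timing-internal #'(lambda () ,@body) :stream ,stream))

(defun with-timing-internal (body-fun &key stream)
  (let ((start (get-internal-real-time)))
    (multiple-value-prog1 (funcall body-fun)
      (format stream "~&elapsed: ~D ticks~%"
              (- (get-internal-real-time) start)))))

Redefining WITH-TIMING-INTERNAL changes the behavior of every
already-compiled use of WITH-TIMING, with no recompilation.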

--
Barry Margolin, bar...@genuity.net
Genuity, Woburn, MA
*** DON'T SEND TECHNICAL QUESTIONS DIRECTLY TO ME, post them to newsgroups.
Please DON'T copy followups to me -- I'll assume it wasn't posted to the group.

Erik Naggum

Oct 16, 2002, 8:53:08 PM

* Thomas F. Burdick

| It is a project with limited development means, and a version number less
| than 1. Obviously it's not for you (at the moment).

And this was a counter-argument to what, precisely? Should I shut up
about where I think they should be going just because you -- what, felt
bad or something?

| Given that the primary developers use it for their day-to-day work, it's
| unnecessarily provocative to say that it is "not intended to be a living
| Common Lisp environment."

Yes, you were unnecessarily provoked. That does not make what I said
unnecessarily provocative. My God, after all these years, people still
cannot deal with "provocative" statements rationally. Gimme a break!

| Obviously, it is so intended.

*sigh* It is the Common Lisp environment that is living when it adapts
well to changes. If you force compilation prematurely, you force your
programmers to consider the cost of changing a macro, and this is anathema
to the interactive, dynamic programming that Common Lisp is so good at
supporting. As I said, there is a market for Common Lisp environments
that do not support this crucial element of Common Lisp, but people
should know about alternatives and consider the fact that SBCL made an
engineering decision between CMUCL and SBCL to force premature macro
expansion. So somebody made the wrong choice.

Unnecessarily provocative, my ass.

Christophe Rhodes

Oct 17, 2002, 4:12:26 AM

Erik Naggum <er...@naggum.no> writes:

[snip]

> As I said, there is a market for Common Lisp environments
> that do not support this crucial element of Common Lisp, but people
> should know about alternatives and consider the fact that SBCL made an
> engineering decision between CMUCL and SBCL to force premature macro
> expansion. So somebody made the wrong choice.

Before I start, I don't want to give the impression of speaking /ex
cathedra/ here on SBCL's development; I happen to be one of the people
with CVS commit access, but it hasn't been for all that long, and I
certainly can't take full responsibility for development direction.

Firstly, on the subject of restoring non-eager
interpretation/compilation for the next release: given that according
to our (admittedly informal) release schedule sbcl-0.7.9 will be
released in the next ten to fifteen days, I can state with a fair
amount of confidence that it's unlikely that this functionality will
be included in the next release. Sorry.

Secondly, yes, it was an engineering decision to remove the IR1
interpreter (and the byte compiler) from SBCL; this was not, however,
motivated by the attempt to cripple SBCL as a development environment,
but a step taken to improve the maintainability of the SBCL codebase
as a whole. The point here being that I don't believe that there's
any political or technical problem with restoring this
interpreter-like functionality into SBCL; it shouldn't be
characterized as an irrevocable technical design decision.

However, there is the matter of priorities. From the PRINCIPLES file
of the SBCL source distribution:
1. conforming to the standard
2. being maintainable
3. being portable
4. making a better programming environment
As far as I am aware, SBCL (in its treatment of macros in function
definitions) is conforming to the ANSI standard. The possible
alternative behaviour pointed out by Erik is likewise conforming, of
course, which means that either behaviour would be fine by the first
guiding principle. Thus, as long as the code implementing this
behaviour is maintainable (and doesn't impact on portability) it would
seem that we have a definite win. However, I do not propose to spend
the next ten days of my time working to implement it, I'm afraid, as
there are a number of ANSI and AMOP compliance bugs that I view as
significantly more urgent than a programming convenience. Speaking
for myself, I would at this moment rather work on ensuring that users
can have the confidence that SBCL implements the ANSI specification.

I hope that clarifies my point of view; please don't hesitate to ask
questions, and comments are always welcome.

Christophe
--
http://www-jcsu.jesus.cam.ac.uk/~csr21/ +44 1223 510 299/+44 7729 383 757
(set-pprint-dispatch 'number (lambda (s o) (declare (special b)) (format s b)))
(defvar b "~&Just another Lisp hacker~%") (pprint #36rJesusCollegeCambridge)

William Newman

Oct 17, 2002, 4:21:04 PM

Erik Naggum <er...@naggum.no> wrote in news:<32437370...@naggum.no>

> If SBCL is a reasonable Common Lisp environment, it should have
> a way to disable the premature macroexpansion. If not, it
> should have it in the next version if its authors are serious
> about offering Common Lisp's /dynamic/ and /interactive/
> development tradition. However, it may be that it is not
> intended to be a living Common Lisp environment. CLISP is not
> able to accommodate living Common Lisp applications, either,
> with its lack of ability to update class instances as required
> by the standard, but that does not seem to deter a significant
> number of people. GCL is also unable to accommodate demanding
> Common Lisp users, so there does appear to be a market for
> dumbed-down and fake Common Lisp systems.
Gack.


For the lurkers: ANSI compatibility is an extremely high
priority for SBCL, including of course behavior which is
important for incremental development. Thus I was particularly
annoyed and provoked by Erik's comparisons to CLISP and gcl
above. (Lack of ability to update things as required by the
ANSI standard? Yikes!) Furthermore, between that and his
remarks in a subsequent message,
news:<32438047...@naggum.no>,

> As I said, there is a market for Common Lisp environments
> that do not support this crucial element of Common Lisp, but
> people should know about alternatives and consider the fact
> that SBCL made an engineering decision between CMUCL and SBCL
> to force premature macro expansion. So somebody made the
> wrong choice.

there seems to be the suggestion that SBCL has been backsliding
away from ANSI-specified behavior which supports incremental
development. This is not the case. There are various
shortcomings in SBCL, notably the way that I personally have
pessimized compiler performance and dropped popular extensions.
And we also have some long-standing ANSI-compliance problems
(dating back to the original CMU CL code) which we haven't
fixed. But other than temporary stumbles when we inadvertently
introduce temporary bugs, as far as I am aware SBCL's ANSI
compliance has only increased over time, with no significant
backsliding in any area, including this one.

Besides taking care not to backslide, we (originally me, but
over the last year mostly others) have been rather energetic
about improving ANSI compliance in many areas. As two examples
chosen to be relevant to incremental development, we fixed the
long-standing non-ANSI
(compile (defun foo () nil))
(compile (defun bar () (if (foo) :true :false)))
(defun foo () 12)
(bar)
=> :FALSE
over-eager type-inference behavior that we inherited from old
CMU CL, and fixed a recently-discovered PCL bug which occurs
when EXPORT operations are done on internal symbols sometime
after they're used in CLOS definitions.

So while SBCL may have ditched support for extras that Erik
likes very much ("crucial element", check), please do not go
away with the impression that we mess up ANSI-specified basics
like the redefinition of class instances mentioned
above.


Erik: You stand out from the crowd by being unusually smart,
knowledgeable, articulate, prominent, opinionated, and
undiplomatic, but you are by no means the first person to
criticize SBCL for not supporting a favorite allowed-by-ANSI
extension or idiosyncrasy, and you probably won't be the last.

If I understand correctly, you want macro redefinition DWIM in
interpreted code, so that as long as you don't COMPILE-FILE or
COMPILE, you don't need to think about rebuilding the world
from scratch when you redefine a macro. That does sound like
handy behavior, and it's certainly deliciously dynamic. If
there were a clean and simple way to make it happen, we might
well do it. (Though as explained by Christophe Rhodes in another
post, even then we probably wouldn't do it next release, i.e.
late this month, despite your perhaps-well-meant "if SBCL is
a reasonable Common Lisp environment" remark quoted above.)
Whether in reality we will actually do it, I don't know, since
* Our abilities are not unlimited. (though still formidable:-)
* I place some priority on minimizing the number of options and
extras in the system, so having to control such behavior with
a non-ANSI flag would be a little distasteful to me.
* It's not immediately obvious to me that the semantics of this
are well-specified. E.g. I have (in an application program) code
implementing functor-like constructs using elaborate lexical
closures which are then sometimes stuffed into (FDEFINITION 'FOO),
and maybe if I thought about it I could find code within the
implementation of SBCL which has similar issues, too. I'm not
quite sure how to make macro redefinition DWIM on things like
that. (Though maybe it's known and I'm just not aware of it, or
maybe everything just works out naturally as long as the
interpreter is cautious or simple-minded enough not to try to
optimize away unused-under-current-macro-definitions elements
of the lexical environment.)

Also, I suspect that you may be writing from incomplete
knowledge when you give the impression that the
macro-redefinition behavior in the old implementation of
interpreted functions was clean before "SBCL" -- in this case,
me -- "made an engineering decision" to punt the IR1 and byte
interpreters in sbcl-0.7.0. Perhaps in your programming style
you don't need to worry about stuff like the persistent-closure
issues above (where both "interpreters" did various
precompilation and caching including, IIRC, optimizing away
elements of the lexical environment based on what was needed by
the macro definition in use at initial "interpretation" time).
But even if that wouldn't have bitten you, I doubt you were in the
habit of taking care to execute EVAL:FLUSH-INTERPRETED-FUNCTION-CACHE
when you redefined a macro.


CMU CL developers: Now that I think of it, maybe %DEFMACRO or
(SETF MACRO-FUNCTION) or some such thing should automatically
call FLUSH-INTERPRETED-FUNCTION-CACHE? (Unless of course there's
some other kind of magic going on that I didn't find with
"egrep flush-interp" on recent CMU CL CVS source.)

--
William Harold Newman
"Lisp does have a problem: it doesn't come with a big enough
stick with which to hit the ignorant fucks so they get out of
their retarded state." -- Erik Naggum in comp.lang.lisp 7 May 1999

Thomas F. Burdick

Oct 17, 2002, 5:09:33 PM

Erik Naggum <er...@naggum.no> writes:

> * Thomas F. Burdick
> | It is a project with limited development means, and a version number less
> | than 1. Obviously it's not for you (at the moment).
>
> And this was a counter-argument to what, precisely? Should I shut up
> about where I think they should be going just because you -- what, felt
> bad or something?

No, you should avoid giving the wrong impression about the project.
It's not like CLISP, which has a broken-by-design object system, and a
2.x version number. SBCL 0.7.x means you shouldn't expect a
ready-for-everybody system. It's quite good about ANSI compliance,
though, which is why it's one of the systems I use.

> Unnecessarily provocative, my ass.

It was. I don't mind provocative /per se/, but what you were saying
gives the impression that SBCL is willfully bad, as opposed to in
development. If you said that most people shouldn't use it until it
gets a sufficiently good, dynamic development environment, fine. But
lumping it in with willfully noncompliant systems for this reason,
given its version number, is inappropriate. People who want a better
development environment should check back on SBCL later, to see if
it's grown one, yet. I wouldn't imagine people who make heavy use of
meta-object programming would check in on CLISP periodically to see if
it had grown a full CLOS, because its lack thereof is a design
decision in a mature product, not a missing extra-standard feature in
a work-in-progress.

Erik Naggum

Oct 17, 2002, 5:49:44 PM

* William Newman
| Gack.

That was a particularly depressing opening.

| Thus I was particularly annoyed and provoked by Erik's comparisons to
| CLISP and gcl above.

In /addition/ to being egoistically concerned with your own feelings, how
about also considering how annoyed and provoked I was when I noticed that
SBCL had removed this crucial feature from CMUCL? I have come to rely on
the interpreted behavior of macros in Allegro CL because I have

| there seems to be the suggestion that SBCL has been backsliding away from
| ANSI-specified behavior which supports incremental development.

You know, I consider ANSI compliance the baseline. It is in my view quite
irrelevant when arguing for a particular implementation over another. It
would be like marketing books with "now conforming to standard spelling
and grammar" on the cover. In other words, I do not look for compliance,
I expect it, the same way I expect people to think and get annoyed when
they do not, instead of applauding them when they do, like the touchy-feely guys.

| So while SBCL may have ditched support for extras that Erik likes very
| much ("crucial element", check) please do not go away with the impression
| that we mess up ANSI-specified basics like being able the redefinition of
| class instances mentioned above.

I do not think there were any grounds to think that the annoyances of one
implementation would be contagious just because it was juxtaposed to
another implementation, so I do not believe anyone would go away with
that impression, but had I known you were nervous about compliance, I
would not have antagonized you on that aspect. As far as I am concerned,
CMUCL (and hence SBCL, which I have only recently installed for fun)
stands head and shoulders above the other no-cost alternatives.

| Erik: You stand out from the crowd by being unusually smart,
| knowledgeable, articulate, prominent, opinionated, and undiplomatic, but
| you are by no means the first person to criticize SBCL for not supporting
| a favorite allowed-by-ANSI extension or idiosyncrasy, and you probably
| won't be the last.

On my list of impressions that I want people to go away with are being
unusually smart, knowledgeable, articulate, prominent, opinionated, and
undiplomatic, but I have no intention of being remembered as the first or
last person to criticize SBCL, so I mark that down as one success. :)

| If I understand correctly, you want macro redefinition DWIM in interpreted
| code, so that as long as you don't COMPILE-FILE or COMPILE, you don't
| need to think about rebuilding the world from scratch when you redefine a
| macro.

Precisely.

| That does sound like handy behavior, and it's certainly deliciously dynamic.

Great that we see this the same way.

| If there were a clean and simple way to make it happen, we might well do it.

Very good! If you find that it is easier to keep a record of compiled
functions that used a particular macro and report this upon redefinition
of the macro in a way that could be used to recompile functions as needed,
that would be even better because it would allow for automated tools to
create patches and would influence the system-building features.

| Also, I suspect that you may be writing from incomplete knowledge [...]

I started to learn Common Lisp for real (after on-and-off use and toying
for at least a decade) in 1994 with CMUCL, but got my hands on Allegro CL
and found that it was really remarkably different and so much better than
CMUCL for software development. I have stayed with it since, thanks in
large part to the great people at Franz Inc.

--
Erik Naggum, Oslo, Norway http://www.beltwaybuzz.com/Story%20Four.htm

Erik Naggum

Oct 17, 2002, 6:35:47 PM

* Thomas F. Burdick

| I don't mind provocative /per se/, but what you were saying gives the
| impression that SBCL is willfully bad, as opposed to in development.

If you could at least have recognized that it gave /you/ that impression,
I could have expressed some regret at it, but when you pretend that it is
some objective fact, I have to disagree very strongly.

| But lumping it in with willfully noncompliant systems for this reason,
| given its version number, is inappropriate.

No, /you/ think it is inappropriate. As long as you have no room for the
fact that I think it is appropriate as far as user expectations go, I only
want to voice my opinion more strongly to make you see my point of view, which
you actually deny.

| People who want a better development environment should check back on
| SBCL later, to see if it's grown one, yet.

What part of what I wrote could possibly be interpreted otherwise? When
I do not even close the lid on people who so richly deserve it, how could
you think I could possibly have closed the lid on SBCL? When I say it
sucks, that is the observation there and then. People change their mind
and do something better, products get updated and improved with user
input, and "it sucks" therefore means "if you do something about the
things I have criticized, it would only suck if it sucks for more reasons",
which is often a distinct probability, but you should at the very least
acknowledge the possibility that it would cease to suck. I really wonder
what this "eternal damnation" thing is all about, because it is so very
irrational and /primitive/. It may have worked on the savannah where
things stayed the same for the duration of an entire human life (all 25
years of it), but we no longer live in a world where /anything/ can be
expected to remain unchanged through your life (all 80+ years of it).

| I wouldn't imagine people who make heavy use of meta-object programming
| would check in on CLISP periodically to see if it had grown a full CLOS,
| because its lack thereof is a design decision in a mature product, not a
| missing extra-standard feature in a work-in-progress.

When is the decision made not to include something in a future release?
This is not knowable. Even the CLISP crowd could evolve to get rid of
their annoying "we know better than the standard" attitude, for which
they think I am being unfair towards them every time I bring it up, but they
keep insisting on being smarter than the standard. But just like people
who keep insisting on being smarter than the law occasionally wake up and
want to obey it for no better reason than that it is there and that people
expect them to before they want to deal with them, even they could wake
up one day and exclaim "hey, we could gain something by conforming to the
standard!". However, far be it for me to impute intent to people -- for
all I know, CLISP may be bogged down in implementation decisions that
make it intractable to move closer to conformance without backing out a
fair bit to use a different strategy that would take a lot of time to
make as good as what they have now. When you want to climb mountains,
you usually have to cross valleys to climb the next higher one.

Pekka P. Pirinen

Oct 18, 2002, 8:35:15 AM

> as I understand it, the macro expansion is performed each time an
> interpreted function is called, so if the macro function is changed,
> there is no need for the interpreted function to be reloaded in
> order to take advantage of the updated macro.

There's nothing in the standard that requires this, and even if there
was, there's nothing in the standard that requires an interpreter to
be present at all (i.e., what looks like an interpreter can be
regarded as just another compiler). There are useful interpreter
implementation techniques that expand macros more or less often than
at the call to the containing function. A surrounding macro might
walk the code in its body and expand macros as it goes. You just have
no guarantees.

Usually, you can tell when your macro is being expanded. TRACE is
your friend here. If it's not often enough, there are other ways:
Just recompile - modern compilers are fast, and DEFSYS automates
compilation; you can define a helper function that encapsulates the
issue that you're working on; or you can test the macro interactively
(I find (PPRINT (MACROEXPAND-1 '...)) the most useful tool for macro
writing) or in a test bed - Lisp makes this easy.
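
For example, taking the WITH-FOO macro from elsewhere in this thread
(the body forms are invented):

(pprint (macroexpand-1 '(with-foo (:log t)
                          (do-something)
                          (do-something-else))))

shows exactly the form the compiler will see, without compiling or
running anything.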
--
Pekka P. Pirinen
On Usenet everyone is an expert. - Todd Michel McComb

Erik Naggum

Oct 18, 2002, 12:17:53 PM

* Pekka P. Pirinen

| Just recompile - modern compilers are fast

An excellent argument against incremental development and compilation!
Why not just stop and restart applications when you make changes, too?
Or better yet, just reboot after you change a function. Modern computers
are fast.

Some people actually work on applications that take a /long/ time to
compile, even on very fast computers. Having to figure out what to
compile when a macro changes is time-consuming, but less so than to
recompile indiscriminately. Instead, run the parts under development
interpreted with late macro-expansion, and the difference in performance
between a compiled version, which would take a coffee break to achieve,
and the interpreted version may be compensated for by typing one less
letter in the command to run it. Modern computers /are/ fast. The
desire to compile everything all the time is simply wrong. Common Lisp
environments that force people back into the old ways of compiling the
whole system or major parts of it or having to go through a build procedure
after every batch of changes are doing their users a great disservice.

Why do you guys think that "scripting" is such a hit? Because it does not
need the stupid build procedure when changes are made! Figure it out!

Lisp has supported this mode of development for 40+ years, but now that
it is becoming popular, let's make Common Lisp compiled-only and ask
people to compile everything! I would expect this shit from people who
wanted people to move away from Common Lisp to, say, Python or Arc.

Joe Marshall

Oct 18, 2002, 12:32:45 PM

Erik Naggum <er...@naggum.no> writes:

> * Pekka P. Pirinen
> | Just recompile - modern compilers are fast
>
> An excellent argument against incremental development and compilation!
> Why not just stop and restart applications when you make changes, too?
> Or better yet, just reboot after you change a function.

Isn't this how Windows works?

Erik Naggum

Oct 18, 2002, 12:59:59 PM

* Joe Marshall

| Isn't this how Windows works?

Yes. The subtle reference to the Third Way of doing things was intended.
I mean, we have the Right Way, the New Jersey Way, and the Redmond Way.

Joe Marshall

Oct 18, 2002, 1:59:01 PM

Erik Naggum <er...@naggum.no> writes:

> * Joe Marshall
> | Isn't this how Windows works?
>
> Yes. The subtle reference to the Third Way of doing things was intended.
> I mean, we have the Right Way, the New Jersey Way, and the Redmond Way.

Sort of like the right hand, the left hand, and the wanking hand.

Will Deakin

Oct 18, 2002, 2:44:11 PM

Joe Marshall wrote:
> Sort of like the right hand, the left hand, and the wanking hand.
If only I had a full set, I would never have to post on c.l.l. again...

;)

Kaz Kylheku

Oct 18, 2002, 3:26:20 PM

"Matthew X. Economou" <xenopho...@irtnog.org> wrote in message news:<w4osmz6dt...@eco-fs1.irtnog.org>...

> Erik Naggum recently made the point that Lisp in an interpreted mode
> is ideal for writing and debugging macros (among other programming
> tasks). As I understand it, the typical Lisp implementation does not
> remember that a compiled function depends on the result of a macro
> expansion, so if the macro function is changed in such a way that
> affects the expansion, any compiled function must be manually
> recompiled in order to take advantage of the updated macro.

Not only to take advantage, but to *work*! The obsolete expansion may
quite simply be invalid.

> That said, what's stopping an implementation's compiler from storing
> this dependency information in (or with) the object code of a compiled
> function, so that the function gets recompiled automatically if a
> macro on which the function is dependent is changed?

Not having the source code, for one thing.

Tim Bradshaw

Oct 18, 2002, 5:13:29 PM

* Erik Naggum wrote:

> Some people actually work on applications that take a /long/ time to
> compile, even on very fast computers.

Lisp applications which take a long time to recompile should, I guess,
either be doing something very complex in the way of really hairy
macros, say, or be huge. My system of ~22,000 lines takes 13 seconds
to build from cold, or ~26 seconds to build + dump an image. Typical
compile times in the development environment are well under a second.
This is on a non-state-of-the-art machine (1.7GHz pentium-something
with enough memory). Personally, I never run anything interpreted.

*However* I am not disagreeing with your point. In the absence of a
tool which will automatically tell you what dependencies are for Lisp
systems and construct a system definition which respects these
dependencies (and I've never seen such a tool, though one could
obviously be built based on a who-calls database), it is *very* hard
to write a system definition which really respects macro-dependencies.
Sometimes if I change a major macro I just blow everything away and
rebuild, because I know my system declarations have bugs. A
late-expanding interpreter solves this in the absence of such a tool.

It annoys me that no such tool exists though, especially as one of the
things the system above does (or did) was to construct these kinds of
dependency relationships for C++/CORBA systems automatically...

--tim

sv0f

Oct 18, 2002, 6:36:40 PM

In article <3cr34o...@ccs.neu.edu>, Joe Marshall <j...@ccs.neu.edu> wrote:

>Erik Naggum <er...@naggum.no> writes:
[...]


>> Yes. The subtle reference to the Third Way of doing things was intended.
>> I mean, we have the Right Way, the New Jersey Way, and the Redmond Way.
>
>Sort of like the right hand, the left hand, and the wanking hand.

^^^^^^^
So this is why Pournelle and Niven called it the "gripping" hand.

Vassil Nikolov

Oct 19, 2002, 1:01:30 AM

On 18 Oct 2002 22:13:29 +0100, Tim Bradshaw <t...@cley.com> said:

TB> Lisp applications which take a long time to recompile should, I guess,
TB> either be doing something very complex in the way of really hairy
TB> macros, say, or be huge.

Sometimes there are more prosaic reasons, like compiling from and
to a slow network file system.

---Vassil.

--
Non-googlable is googlable.

Paolo Amoroso

Oct 19, 2002, 1:25:07 PM

On 18 Oct 2002 16:17:53 +0000, Erik Naggum <er...@naggum.no> wrote:

> Or better yet, just reboot after you change a function. Modern computers
> are fast.

They sure are, but they apparently lose all their speed at boot time :)


Paolo
--
EncyCMUCLopedia * Extensive collection of CMU Common Lisp documentation
http://www.paoloamoroso.it/ency/README

Don Geddis

unread,
Oct 19, 2002, 3:56:49 PM10/19/02
to
Erik Naggum <er...@naggum.no> writes:
> I started to learn Common Lisp for real (after on-and-off use and toying
> for at least a decade) in 1994 with CMUCL, but got my hands on Allegro CL
> and found that it was really remarkably different and so much better than
> CMUCL for software development.

I've used both myself. Just curious: would you mind listing a few specific
features that you've found in ACL to make it "different and so much better"
than CMUCL? (You've mentioned Franz's excellent technical support in the past;
I'm curious about the advantages of the software product itself that you see.)
_______________________________________________________________________________
Don Geddis http://don.geddis.org d...@geddis.org
It is a great advantage for a system of philosophy to be substantially true.
-- George Santayana

Erik Naggum

unread,
Oct 19, 2002, 5:20:58 PM10/19/02
to
* Don Geddis

| I've used both myself. Just curious: would you mind listing a few
| specific features that you've found in ACL to make it "different and so
| much better" than CMUCL?

Two things really stand out: (1) Multithreaded support directly from
Emacs, which meant background compilation and support functions. I
quickly became used to running different listeners in different packages
and even to running various minor services from Emacs. (2) The cross-reference
facility is amazingly useful. From Emacs, M-x fi:edit-who-calls and M-x
fi:edit-who-is-called-by remove the need to go hunting through the
source files with grep and tags files.

The arglist facility is also one of the most useful things around,
though this is sort of a given; beyond that, the way Emacs talks to the
Common Lisp process was so much more streamlined and integrated.

(For a time, I hosted the ILISP mailing list, but I have not used it in
many years. One of the reasons I have never put LispWorks to the test is
that their user interface is not sufficiently compatible with Emacs.)

--
Erik Naggum, Oslo, Norway

Act from reason, and failure makes you rethink and study harder.

Tim Bradshaw

unread,
Oct 20, 2002, 7:56:53 AM10/20/02
to
* Vassil Nikolov wrote:

> Sometimes there are more prosaic reasons, like compiling from and
> to a slow network file system.

I think if development environments have slow networked filesystems
then someone is not thinking. There's no real excuse for a local-area
filesystem which can sustain significantly less than 5MB/sec (I'm not
sure what the sustained throughput with NFS-over-100Mb switched
ethernet is but it's likely around that, or more, and I suspect SMB is
about the same). If the server is a bottleneck then that should be
because you have a lot of developers, so *buy a bigger one*.
Alternatively use a system - like CVS - which means you never need to
do compiles over a network filesystem.

This isn't to say that lots of people don't end up working in
environments where things are throttled by filesystem throughput
(badly configured clearcase environments are probably a common
example), just that I think that these are generally examples of bad
system management...

--tim

Greg Menke

unread,
Oct 20, 2002, 9:31:20 AM10/20/02
to

Tim Bradshaw <t...@cley.com> writes:
> * Vassil Nikolov wrote:
>
> > Sometimes there are more prosaic reasons, like compiling from and
> > to a slow network file system.
>
> I think if development environments have slow networked filesystems
> then someone is not thinking. There's no real excuse for a local-area
> filesystem which can sustain significantly less than 5MB/sec (I'm not
> sure what the sustained throughput with NFS-over-100Mb switched
> ethernet is but it's likely around that, or more, and I suspect SMB is
> about the same). If the server is a bottleneck then that should be
> because you have a lot of developers, so *buy a bigger one*.
> Alternatively use a system - like CVS - which means you never need to
> do compiles over a network filesystem.

I've quite easily gotten sustained 5-6 megabytes/sec through NFS on
100baseT, but SMB has been lots slower on the same hardware- maybe up
to 2 megs a second. Perhaps my experience w/ slow SMB is a
consequence of poor tuning, but having messed with it a fair bit, I've
never seen much of an improvement. I've also seen this same poor
performance between 2 Windows systems.

Gregm

Erik Naggum

unread,
Oct 20, 2002, 9:51:01 AM10/20/02
to
* Tim Bradshaw

| Lisp applications which take a long time to recompile should, I guess,
| either be doing something very complex in the way of really hairy
| macros, say, or be huge.

Just to clarify: More than 100 ms is a long time if you have to do it in
order to test-run a macro. That is sufficient to interrupt your flow of
thinking and working. When I develop with Allegro CL and Emacs, I tell
it not to compile with (setq fi:lisp-evals-always-compile nil) on the
Emacs side, and I happily go about M-C-x'ing definitions, run stuff in
the background, and test code. Code that matures is saved, compiled, and
loaded, but code in development generally runs interpreted. I really
like that Allegro CL makes this so streamlined and painless.

| My system of ~22,000 lines takes 13 seconds to build from cold, or ~26
| seconds to build + dump an image.

And I think Unix `ls´ takes a long time to run on directories with 1000
files, so Emacs `dired´ is something other than instantaneous.

| Sometimes if I change a major macro I just blow everything away and
| rebuild, because I know my system declarations have bugs.

Precisely. And you cannot do this every time you change a macro.

I have always developed mini-languages, even when I worked mainly in C,
and macros give me the opportunity to reprogram things and have code that
reasons about the code I write. Macros do a lot of the work in my code.
When I want to change something important, it is always in the macros,
and minor changes can have major effects. Neat protocol descriptions are
usually reworked by macros to produce reams of code that is extremely
hard to follow, but very efficient. Perhaps I "overuse" macros, but the
ability to run interpreted correctly is essential. If macros were harder
to use, or function redefinition did not work, I would not be able to
write this kind of code at all. What I would have done then is hard to
tell, but that I would have ended up with less intelligent and messier
code is virtually guaranteed.

E.g., I have been working on and off on a regular expression machine that
does as much as possible at compile-time, such as maintain multiple paths
through the graph in parallel (breadth-first) instead of backtracking and
wasting lots of time if the order of alternatives is not optimal. This
came up when I sat down to reimplement the Common Lisp reader: if I could
compute the integer value while reading a potential number digit by digit
and make other decisions along the way, I could exercise the cache more
fully and read only once from uncached memory. Writing code that does
this manually is very, very hard, especially if you want to get it right.
Writing macros that arrange multiple execution paths in parallel is not
easy, but at least it is much easier than writing what they produce.
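
To make the single-pass idea concrete, here is a toy sketch (plain
functions, nothing like the real macro-generated machine; the name is
made up): a token is scanned once, with the "this is an integer"
hypothesis and its accumulated value carried along in parallel with the
"this is a symbol" fallback, so no second pass over the characters is
ever needed.

  (defun read-token-once (string)
    "Scan STRING once, deciding integer-vs-symbol along the way and
  accumulating the integer's value digit by digit."
    (let ((integerp (plusp (length string)))  ; hypothesis still alive?
          (value 0))
      (loop for ch across string
            do (let ((d (digit-char-p ch)))
                 (if (and integerp d)
                     (setf value (+ (* value 10) d))  ; extend the value
                     (setf integerp nil))))           ; kill the hypothesis
      (if integerp
          (values :integer value)
          (values :symbol string))))

  (read-token-once "123")   ; => :INTEGER, 123
  (read-token-once "12a")   ; => :SYMBOL, "12a"

The breadth-first regular-expression machine generalizes this: instead
of two hand-maintained hypotheses, the macro expansion keeps a whole set
of live paths through the graph and advances them together on each
character.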

| It annoys me that no such tool exists though, especially as one of the
| things the system above does (or did) was to construct these kinds of
| dependency relationships for C++/CORBA systems automatically...

This may be related to the curse of macros: You cannot in general expect
to understand fully what some code would compile to without being the
compiler. In inferior languages, the code you write is probably the code
the machine will run. In Common Lisp with lots of macros, the code you
write is only data for lots of other programs before it becomes code for
the compiler to arrange for the machine to run. It takes some getting
used to.
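
The standard tools for looking at that intermediate data are
MACROEXPAND-1 (one step) and MACROEXPAND. A throwaway example (the
macro is invented for illustration, and note it carelessly evaluates
EXPR twice):

  (defmacro with-squared ((var expr) &body body)
    "Bind VAR to EXPR squared around BODY (toy: evaluates EXPR twice)."
    `(let ((,var (* ,expr ,expr)))
       ,@body))

  (macroexpand-1 '(with-squared (x 3) (print x)))
  ;; => (LET ((X (* 3 3))) (PRINT X)), T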

Tim Bradshaw

unread,
Oct 20, 2002, 10:47:34 AM10/20/02
to
* Erik Naggum wrote:

> Just to clarify: More than 100 ms is a long time if you have to do it in
> order to test-run a macro. That is sufficient to interrupt your flow of
> thinking and working.

Yes, I agree. I'm always amazed that people who use C/C++ get
anything done at all...

> This may be related to the curse of macros: You cannot in general expect
> to understand fully what some code would compile to without being the
> compiler. In inferior languages, the code you write is probably the code
> the machine will run. In Common Lisp with lots of macros, the code you
> write is only data for lots of other programs before it becomes code for
> the compiler to arrange for the machine to run. It takes some getting
> used to.

Yes, this is why I think these tools need to be things the system
provides. It annoys me that so many of them so nearly provide it but
don't quite manage. For instance, a who-calls which understands
macros, combined with source location recording (which all the
professional systems have, I think), is nearly enough to make
redefining a macro go and recompile all the code that uses it,
recursively...

--tim

Erik Naggum

unread,
Oct 22, 2002, 1:27:12 PM10/22/02
to
* Tim Bradshaw

| It annoys me that so many of them so nearly provide it but don't quite
| manage (for instance: a who-calls which understands macros, and source
| location recording, which all the professional systems have I think, is
| nearly enough to make it be the case that redefining a macro will go and
| recompile all the code that uses it, recursively...

I have just tried to make Allegro CL report information on a macro-to-be-
compiled such that it would advise the functions that had used a compiled
version of same to recompile themselves the next time they were called,
advice that would obviously be overridden if the function was compiled
along with the new macro. The experiment failed spectacularly (the lisp
image died!) and will result in a bug report if I can convince myself that
it was not a dumb thing to do. But I somehow think it should have worked.
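
What Erik describes can at least be sketched portably, with nothing
Allegro-specific (every name below is invented for illustration; a real
implementation would hook the compiler rather than make callers declare
their macro uses by hand):

  (defvar *function-sources* (make-hash-table :test #'eq)
    "Function name -> its DEFUN form, so it can be recompiled.")

  (defvar *macro-dependents* (make-hash-table :test #'eq)
    "Macro name -> list of functions compiled against its expansion.")

  (defmacro defun/tracked (name lambda-list &body body)
    "DEFUN that records its source; a leading (:uses macro...) form
  declares which macros the body expands, in lieu of code-walking."
    (let ((uses (when (and (consp (first body))
                           (eq (first (first body)) :uses))
                  (rest (pop body)))))
      `(progn
         (setf (gethash ',name *function-sources*)
               '(defun ,name ,lambda-list ,@body))
         ,@(loop for m in uses
                 collect `(pushnew ',name (gethash ',m *macro-dependents*)))
         (defun ,name ,lambda-list ,@body))))

  (defmacro defmacro/tracked (name lambda-list &body body)
    "DEFMACRO that recompiles recorded dependents after redefinition."
    `(progn
       (defmacro ,name ,lambda-list ,@body)
       (dolist (fn (gethash ',name *macro-dependents*))
         (let ((src (gethash fn *function-sources*)))
           (when src
             (eval src)           ; re-expand against the new macro
             (compile fn))))
       ',name))

  ;; Usage:
  ;;   (defmacro/tracked twice (x) `(+ ,x ,x))
  ;;   (defun/tracked doubler (n) (:uses twice) (twice n))
  ;;   (compile 'doubler)
  ;;   ;; Redefining TWICE now recompiles DOUBLER automatically:
  ;;   (defmacro/tracked twice (x) `(* 2 ,x))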

Duane Rettig

unread,
Oct 22, 2002, 2:00:01 PM10/22/02
to
Erik Naggum <er...@naggum.no> writes:

> * Tim Bradshaw
> | It annoys me that so many of them so nearly provide it but don't quite
> | manage (for instance: a who-calls which understands macros, and source
> | location recording, which all the professional systems have I think, is
> | nearly enough to make it be the case that redefining a macro will go and
> | recompile all the code that uses it, recursively...
>
> I have just tried to make Allegro CL report information on a macro-to-be-
> compiled such that it would advise the functions that had used a compiled
> version of same to recompile themselves the next time they were called,
> advice that would obviously be overridden if the function was compiled
> along with the new macro. The experiment failed spectacularly (the lisp
> image died!) and will result in a bug report if I can convince myself that

=================================================^^


> it was not a dumb thing to do. But I somehow think it should have worked.

The only "dumb thing to do" (unless you find that it was a bug in your own
code) would be to not report it, especially if you believe that it should
have worked. But "dumb things" notwithstanding, the reporting of a problem
is never a dumb thing to do. If Lisp is to be regarded as an
intuitive language, then intuitive things should work, and the reporting of
non-intuitive behaviors is just as important as the reporting of outright
bugs. And even if the Lisp's behavior is intuitive and the customer's
intuitions are unreasonable, if the reason for the mindset can be
determined, then we can work further to preclude future mistakes by
other customers in similar situations.

--
Duane Rettig du...@franz.com Franz Inc. http://www.franz.com/
555 12th St., Suite 1450 http://www.555citycenter.com/
Oakland, Ca. 94607 Phone: (510) 452-2000; Fax: (510) 452-0182

Edi Weitz

unread,
Oct 31, 2002, 5:44:48 PM10/31/02
to
Erik Naggum <er...@naggum.no> writes:

> I expect to be able to change a macro definition and its
> interpreted users will reflect the change while its compiled users
> would not.
>
> (defmacro macro-expansion-test ()
> (print "I'm being expanded!")
> `(print "I'm the expansion!"))
>
> (defun expanded-when ()
> (macro-expansion-test))
>
> Evaluate these and see what `(expanded-when)´ produces when you
> evaluate it twice in a row. If you get the first `print´ while
> defining it, your environment is very eager. If you get the first
> `print´ only once, it does the expansion on the first evaluation.
> I expect it on both. Then `(compile 'expanded-when)´ and it
> should print "I'm being expanded!" and calling it should only
> produce "I'm the expansion".

Sorry for being a little bit late on this topic but I just got bitten
by this in my project so here's my question:

I ran Erik Naggum's test with the following CL implementations that I
happen to have currently installed on my laptop:

- Allegro CL 6.2 trial edition
- LispWorks professional 4.2.7
- CMUCL 18d+ (built 2002-09-13)
- SBCL 0.7.6
- SCL 1.1.1 trial edition
- CLISP 2.29

It turns out that (with the notable exception of ACL as was already
mentioned by Erik Naggum) none of them "passes" the test,
i.e. interpreted users of a macro won't reflect the change of the
macro they're using. (SBCL will print "I'm being expanded" only when
EXPANDED-WHEN is defined; CMUCL, CLISP, and LW will print it when it
is defined and when it is compiled; SCL prints it when the function is
called for the first time and when it is compiled.)

Now, if I remember correctly, part of this thread was a discussion
about the fact that SBCL has changed CMUCL's behaviour with respect to
this. While I see that CMUCL and SBCL do behave differently, they seem
to be essentially the same as far as reflecting macro changes is
concerned. Someone mentioned that some implementations might have a
switch to control this (whether macros will always be expanded or
not). Is this true? I searched the docs of CMUCL and LW 'cause I'm
working with these most of the time, but I couldn't find anything.

Thanks in advance,
Edi.

Eric Marsden

unread,
Nov 6, 2002, 2:19:15 PM11/6/02
to
>>>>> "ew" == Edi Weitz <e...@agharta.de> writes:

ew> It turns out that (with the notable exception of ACL as was
ew> already mentioned by Erik Naggum) none of them "passes" the
ew> test, i.e. interpreted users of a macro won't reflect the change
ew> of the macro they're using.

With CMUCL, you can obtain the behaviour that Erik describes by
calling EVAL:FLUSH-INTERPRETED-FUNCTION-CACHE each time you redefine a
macro; this can be automated by encapsulating #'(setf macro-function).
It only works with interpreted code, of course.

,---- CMUCL 18d+ ---
| USER>
| (ext:encapsulate '(setf macro-function) 'ecm-flush-interpreted-macroexpansions
|   '(progn
|      (eval:flush-interpreted-function-cache)
|      (apply ext:basic-definition ext:argument-list)))
| #<Closure Over Function "DEFUN ENCAPSULATE" {4822BBE1}>
| USER> (defmacro macro-expansion-test ()
|         (print "I'm being expanded!")
|         `(print "I'm the expansion!"))
| macro-expansion-test
| USER> (defun expanded-when ()
|         (macro-expansion-test))
| expanded-when
| USER> (expanded-when)
| "I'm being expanded!"
| "I'm the expansion!"
| "I'm the expansion!"
| USER> (expanded-when)
| "I'm the expansion!"
| "I'm the expansion!"
| USER> (defmacro macro-expansion-test ()
|         (print "I'm being expanded again!")
|         `(print "I'm the new expansion!"))
| macro-expansion-test
| USER> (expanded-when)
| "I'm being expanded again!"
| "I'm the new expansion!"
| "I'm the new expansion!"
| USER> (expanded-when)
| "I'm the new expansion!"
| "I'm the new expansion!"
| USER> (compile 'expanded-when)
| Compiling lambda nil:
| Compiling Top-Level Form:
| "I'm being expanded again!"
| expanded-when
`----

--
Eric Marsden <URL:http://www.laas.fr/~emarsden/>
