This has made me doubt the real necessity of macros, as they always seem
to cover up for one or another language deficiency.
In general, macros are a messy thing:
- Since the macros are actually source code generators of some sort,
they create trouble such as namespace collisions, which result in
messy workarounds, such as gensym.
- Since macros are inline by nature, they make it much harder to
separate the low-level code into units of functions.
Macros usually have a function-like syntax, and perform 'operations'
on the arguments (is this the right term for the case?) of the macro.
Those operations are commonly unusable on function arguments in the
same way, because function arguments are COPIES of the original values
and not transparently pointing at them. This is the most common use
of macros, I believe, and is only there to make up for a method of
calling functions 'By value'. Can someone explain to me the
advantages of the implicit copying when sending 'By value'? This, I
would consider a language deficiency, because it requires the use of a
messy solution (namely macros) when a non-messy one (functions) WOULD
have been possible.
Most other uses of macros I would consider ABUSES of macros; however,
some non-abusive forms also exist: those that create 'patterns' which
the generated source code follows. I believe the language could
offer the ability to create those patterns using powerful language
features rather than the messy solution.
I would appreciate an example of something a macro can be used for,
for which a function, in a more powerful language (In terms of what
functions can do), cannot be used.
Thanks, Eyal Lotem
--
Micros~1 is history (www.microsoft.com).
Linux is the present (www.linux.org).
EROS is the future (www.eros-os.org).
> I have been thinking a lot about language design lately.
>
> This has made me doubt the real necessity of macros, as they always seem
> to cover up for one or another language deficiency.
>
> In general, macros are a messy thing:
>
> - Since the macros are actually source code generators of some sort,
> they create trouble such as namespace collisions, which result in
> messy workarounds, such as gensym.
You are partially correct. Macros are not necessary, but they also are
not a "coverup of language deficiency." They are a convenience. When
used appropriately, they allow you to define a language which allows
you to write code in a much more concise manner than you otherwise
would have to.
In my opinion, the convenience afforded by appropriate use of macros
far outweighs the drawbacks which you mention above.
But you are right, anything which you can produce with a macro can
also be written by hand. Assembler language can be written by hand, as
well. The point of macros is much like the point of a compiler - to
get the computer to do much of the grunt work of writing your program
for you.
-dave
--
David J. Cooper Jr, Chief Engineer Genworks International
dco...@genworks.com 5777 West Maple, Suite 130
(248) 932-2512 (Genworks HQ/voicemail) West Bloomfield, MI 48322-2268
(248) 407-0633 (pager) http://www.genworks.com
Peaker <pea...@makif.omer.k12.il> wrote in message
news:390371F0...@makif.omer.k12.il...
> I have been thinking a lot about language design lately.
>
> This has made me doubt the real necessity of macros, as they always seem
> to cover up for one or another language deficiency.
>
> In general, macros are a messy thing:
>
> - Since the macros are actually source code generators of some sort,
> they create trouble such as namespace collisions, which result in
> messy workarounds, such as gensym.
I'd agree that they _can_ be messy, which is why they should be used
somewhat cautiously. They aren't necessarily messy, though, and can be
invaluable.
> You are partially correct. Macros are not necessary, but they also are
> not a "coverup of language deficiency." They are a convenience. When
> used appropriately, they allow you to define a language which allows
> you to write code in a much more concise manner than you otherwise
> would have to.
>
> In my opinion, the convenience afforded by appropriate use of macros
> far outweighs the drawbacks which you mention above.
That's the key phrase: "appropriate use". I regard (and use) macros (in any
language) the same way I do chainsaws -- powerful and sometimes invaluable,
but only a fool would use them anywhere except where they really are the
best answer.
Larry
> (defun f ( str &rest body )
> `(format t ,str ,@body)
> )
...
> Actually, if there's a way of doing that *without* macros, I sure
> would like to know about it.
(defun format-t (string &rest args)
(apply #'format t string args))
Robert
I view a language without metalinguistic constructs as hamstrung.
Java, for example is soooo not extensible. Standard macros in C++
and ANSI C are barely a solution, although the gcc style macros
at least offer variadic argument support. In lisp, I've found that
macros are outright mandatory when you want to pass off a &rest
form to a function which expects that &rest form as not a list. e.g.,
(defun f ( str &rest body )
`(format t ,str ,@body)
)
(example of writing a format that passes through to format
a bit silly, but gets the point across; hope I got the syntax
right, not at work and don't have all this memorized yet).
Actually, if there's a way of doing that *without* macros, I sure
would like to know about it.
Overall, I like macros, however, as long as the programmer
doesn't go so overboard with them that the code becomes "write
only". :)
You do learn MACROEXPAND and MACROEXPAND-1 pretty quickly
using them, though. :)
C/
Sorry, macros are not "mandatory" at all for doing that. There's a
straightforward functional way to do it.
+---------------
| Actually, if there's a way of doing that *without* macros, I sure
| would like to know about it.
+---------------
It's called "apply". See:
<URL:http://www.xanalys.com/software_tools/reference/HyperSpec/Body/fun_apply.html>
-Rob
-----
Rob Warnock, 41L-955 rp...@sgi.com
Applied Networking http://reality.sgi.com/rpw3/
Silicon Graphics, Inc. Phone: 650-933-1673
1600 Amphitheatre Pkwy. PP-ASEL-IA
Mountain View, CA 94043
Thank you! That's very helpful.
C/
Come to think of it, that neatly wipes out most of the macros
I've used in my current project. Still, I sleep better knowing
that `(,@) is available. :)-
C/
> (defun format-t (string &rest args)
> (apply #'format t string args))
Are either of these forms preferred over one another?
Or is it just a matter of style?
C/
> I have been thinking a lot about language design lately.
>
> This has made me doubt the real necessity of macros, as they always seem
> to cover up for one or another language deficiency.
OK, please don't be entirely surprised if the ng has a defensive
reaction to this. Your comments sound like, well, like you haven't
used Lisp long enuff to understand its macros. Which is not meant as
a flame, just an observation.
> In general, macros are a messy thing:
>
> - Since the macros are actually source code generators of some sort,
> they create trouble such as namespace collisions, which result in
> messy workarounds, such as gensym.
I don't consider an occasional (let ((... (gensym))) ...) to be so
messy. But supposing it were, IMO a better solution would be to make
it prettier. EG, perhaps add something to backquote syntax to
indicate that a given symbol must be a gensym.
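To make the capture problem concrete, here is a sketch (the names are
made up):

(defmacro my-or-2 (a b)                 ; naive: X can capture a free X in B
  `(let ((x ,a)) (if x x ,b)))
;; (let ((x 42)) (my-or-2 nil x)) returns NIL -- B's X was captured.

(defmacro my-or-2 (a b)                 ; the GENSYM fix
  (let ((tmp (gensym)))
    `(let ((,tmp ,a)) (if ,tmp ,tmp ,b))))
;; Now (let ((x 42)) (my-or-2 nil x)) returns 42.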
> - Since macros are inline by nature, they make it much harder to
> separate the low-level code into units of functions.
I don't follow this.
> Macros usually have a function-like syntax,
More than that, in Lisp they *are* functions that are entered "funny".
> and perform 'operations'
> on the arguments (is this the right term for the case?) of the macro.
> Those operations are commonly unusable on function arguments in the
> same way, because function arguments are COPIES of the original values
> and not transparently pointing at them. This is the most common use
> of macros, I believe, and is only there to make up for a method of
> calling functions 'By value'.
Whoa! Back way up, please! It sounds like you are describing some
completely different language. True, it's *possible* to abuse Lisp
macros that way, but that's not what they are typically used for.
They operate lexically on forms before those forms are "really"
evaluated.
> I would appreciate an example of something a macro can be used for,
> for which a function, in a more powerful language (In terms of what
> functions can do), cannot be used.
Defining a function. I say that off the top of my head because that's
what the last macro I dealt with a short time before reading Usenet
did.
--
Tom Breton, http://world.std.com/~tob
Not using "gh" since 1997. http://world.std.com/~tob/ugh-free.html
Rethink some Lisp features, http://world.std.com/~tob/rethink-lisp/index.html
Some vocal people in cll make frequent, hasty personal attacks, but if
you killfile them cll becomes usable.
>
> I view a language without metalinguistic constructs as hamstrung.
> Java, for example is soooo not extensible. Standard macros in C++
> and ANSI C are barely a solution, although the gcc style macros
> at least offer variadic argument support. In lisp, I've found that
> macros are outright mandatory when you want to pass off a &rest
> form to a function which expects that &rest form as not a list. e.g.,
>
> (defun f ( str &rest body )
> `(format t ,str ,@body)
> )
>
> (example of writing a format that passes through to format
> a bit silly, but gets the point across; hope I got the syntax
> right, not at work and don't have all this memorized yet).
I assume you meant:
(defmacro f ( str &rest body )
`(format t ,str ,@body))
> Actually, if there's a way of doing that *without* macros, I sure
> would like to know about it.
Without backquote?
(defmacro f ( str &rest body )
(list* 'format t str body))
or equivalently:
(defmacro f ( str &rest body )
(append (list 'format t str) body))
Without using any macro? No, unless you care to write it out longhand
every time, which defeats the purpose.
> Overall, I like macros, however, as long as the programmer
> doesn't go so overboard with them that the code becomes "write
> only". :)
> You do learn MACROEXPAND and MACROEXPAND-1 pretty quickly
> using them, though. :)
Ooh yeah.
> Courageous <jkra...@san.rr.com> writes:
>
> > Actually, if there's a way of doing that *without* macros, I sure
> > would like to know about it.
>
> Without backquote?
>
> (defmacro f ( str &rest body )
> (list* 'format t str body))
>
> or equivalently:
>
> (defmacro f ( str &rest body )
> (append (list 'format t str) body))
>
> Without using any macro? No, unless you care to write it out longhand
> every time, which defeats the purpose.
Egg on my face! I was focussed on creating that precise lexical form,
but of course that misses the point. Use:
(defun f ( str &rest body )
(apply #'format t str body))
> (defun f ( str &rest body )
> `(format t ,str ,@body)
> )
>
> (example of writing a format that passes through to format
> a bit silly, but gets the point across; hope I got the syntax
> right, not at work and don't have all this memorized yet).
>
> Actually, if there's a way of doing that *without* macros, I sure
> would like to know about it.
(defun my-format (str &rest arguments)
(apply #'format t str arguments))
or
(defun my-format (str &rest arguments)
(format t "~?" str arguments))
--
~jrm
> I have been thinking a lot about language design lately.
> This has made me doubt the real necessity of macros, as they always seem
> to cover up for one or another language deficiency.
What do you mean by `deficiency'? Macros allow you to extend the
syntax of a language in ways that the language designers didn't
foresee. This is analogous to the way procedures allow you to extend
the functionality of the language in ways the designers didn't
foresee.
Macros can also be used to extend the language compiler in ways that
the compiler writer didn't foresee. This use of macros is less
common.
Some languages are so poorly designed that it is difficult or
impossible to program without using lots of macros, and some languages
are so poorly designed that macros cannot be used without a complete
understanding of how they expand.
> In general, macros are a messy thing:
>
> - Since the macros are actually source code generators of some sort,
> they create trouble such as namespace collisions, which result in
> messy workarounds, such as gensym.
Gensym is much less messy than some alternatives (like making a `rule'
that macro writers must mangle the names of any variables they
introduce). `Hygienic' macros are another solution popular in the
Scheme world.
> - Since macros are inline by nature, they make it much harder to
> separate the low-level code into units of functions.
Not if you use them correctly.
> Macros usually have a function-like syntax, and perform 'operations'
> on the arguments (is this the right term for the case?) of the macro.
> Those operations are commonly unusable on function arguments in the
> same way, because function arguments are COPIES of the original values
> and not transparently pointing at them. This is the most common use
> of macros, I believe, and is only there to make up for a method of
> calling functions 'By value'. Can someone explain to me the
> advantages of the implicit copying when sending 'By value'? This, I
> would consider a language deficiency, because it requires the use of a
> messy solution (namely macros) when a non-messy one (functions) WOULD
> have been possible.
Macros *can* be used to simulate call-by-reference, but I try to
eschew this use. When people see a form like (FOO A), they
generally do *not* expect the value of A to be modified, especially
if A is atomic. Unless I have a really good reason to do otherwise, I
would suggest that (SETQ A (FOO A)) would be a far better alternative
than a macro.
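To be concrete, the simulation would look something like this (FOO here
stands for any ordinary function; the names are made up):

(defmacro foof (var)
  `(setq ,var (foo ,var)))

(foof a) expands into (setq a (foo a)), so the assignment is explicit
in the macro's definition but invisible at the call site -- which is
exactly the objection.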
> Most other uses of macros I would consider ABUSES of macros, however
> some non-abusive forms also exist, those are creating 'patterns' by
> which the generated source code follows. I believe the language could
> offer the ability to create those patterns using powerful language
> features rather than the messy solution.
>
> I would appreciate an example of something a macro can be used for,
> for which a function, in a more powerful language (In terms of what
> functions can do), cannot be used.
Here's how I use macros:
1. Creating new concise syntax for a particular problem:
(define-alu-operation MOVI (register destination) (immediate value))
2. Creating new language extensions that deliberately mimic existing
special forms. For example, you could imagine a macro called
TOGGLEF that was essentially (SETF ,place (NOT ,place)). Then you
would call (togglef (light-switch (current-room)))
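A minimal TOGGLEF is a one-liner (a sketch; a production version
should guard against evaluating subforms of PLACE twice):

(defmacro togglef (place)
  `(setf ,place (not ,place)))

or, getting the multiple-evaluation issue handled for free:

(define-modify-macro togglef () not)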
3. Extending other macros or abstracting certain features that cannot
be done via functions.
(with-log-file (stream "foo") ...) =>
(with-open-file (stream (make-pathname :name "foo" :type "log")
:direction :output :if-does-not-exist :create)
....)
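The definition behind that expansion would be roughly:

(defmacro with-log-file ((stream name) &body body)
  `(with-open-file (,stream (make-pathname :name ,name :type "log")
                            :direction :output :if-does-not-exist :create)
     ,@body))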
4. Truly hairy preprocessing (examples would be too large to include).
I generally avoid using macros just to delay argument evaluation. It
is just as easy to wrap a LAMBDA around the appropriate argument, and
it avoids creating code that isn't applicative order. So you might
see this in my code:
(call-with-temp-file #'(lambda (file) ....))
whereas other lisp hackers would write a macro
(with-temp-file (file) ....)
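Spelled out, the two styles might look like this (MAKE-TEMP-FILE is a
made-up helper standing in for whatever creates the file):

(defun call-with-temp-file (receiver)
  (let ((file (make-temp-file)))
    (unwind-protect (funcall receiver file)
      (delete-file file))))

(defmacro with-temp-file ((var) &body body)
  `(call-with-temp-file #'(lambda (,var) ,@body)))

Note the macro is only a thin layer of syntax; all the work stays in
the first-class function.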
Obviously, I was brought up on the wrong side of the tracks.
--
~jrm
> I suggest that you read "On Lisp" by Paul Graham. He gives many examples of
> both the convenience aspects of macros, and their use in "language design",
> and shows how macros extend the power of lisp, for example where CLOS
> originally was implemented.
Their convenience, and CLOS, are examples of how macros extend
LISP. As I tried to explain, I'm not referring to macros as
unnecessary in LISP, but as unnecessary in a different
language design that allows ANY desired feature of macros
through other means and features.
> > You are partially correct. Macros are not necessary, but they also are
> > not a "coverup of language deficiency." They are a convenience. When
> > used appropriately, they allow you to define a language which allows
> > you to write code in a much more concise manner than you otherwise
> > would have to.
The point I tried to make, though, was that the convenience, or ANY other
feature macros can provide, CAN be made available by a non-macro language,
and the fact that macros ARE indeed a convenience suggests that the language
had a deficiency. The reason I would call that a deficiency is that to
achieve the convenience, you are forced to use inline code, with a lot of
potential namespace conflicts, and extra complexity overall.
> That's the key phrase: "appropriate use". I regard (and use) macros (in any
> language) the same way I do chainsaws -- powerful and sometimes invaluable,
> but only a fool would use them anywhere except where they really are the
> best answer.
Your accurate laser isn't powerful/accurate enough :)
I view a language without metalinguistic constructs as hamstrung.
Java, for example is soooo not extensible. Standard macros in C++
and ANSI C are barely a solution, although the gcc style macros
at least offer variadic argument support. In lisp, I've found that
macros are outright mandatory when you want to pass off a &rest
form to a function which expects that &rest form as not a list. e.g.,
(defun f ( str &rest body )
`(format t ,str ,@body)
)
(example of writing a format that passes through to format
a bit silly, but gets the point across; hope I got the syntax
right, not at work and don't have all this memorized yet).
Actually, if there's a way of doing that *without* macros, I sure
would like to know about it.
I am not sure if LISP has a way of making that work, but I do know
that in another language design, it would definitely be possible
without the use of a macro.
Overall, I like macros, however, as long as the programmer
doesn't go so overboard with them that the code becomes "write
only". :)
> I would appreciate an example of something a macro can be used for,
> for which a function, in a more powerful language (In terms of what
> functions can do), cannot be used.
Check:
"On Lisp - Advanced Techniques for Common Lisp"
Paul Graham
Prentice Hall, 1994
ISBN 0-13-030552-9
Paolo
--
EncyCMUCLopedia * Extensive collection of CMU Common Lisp documentation
http://cvs2.cons.org:8000/cmucl/doc/EncyCMUCLopedia/
> I am not sure if LISP has a way of making that work, but I do know
> that in another language design, it would definitely be possible
> without the use of a macro. I am not sure that the Java language
> design is complete in removing the need for macros, and I do not
> think it is provable that a language makes macros obsolete.
> However, I would request that you bring a few cases where
> proper C++ macros are necessary (ones that form a complete
> syntax expression). Excluding for example:
> #define ImproperMacro(x) ((x) +
> Any Java problems resulting from the lack of macros would also be
> interesting to view, and perhaps I can offer solutions by either
> proposing a more complete language design, or by demonstrating a
> non-macro solution that is already existent.
I don't think you get the point of macros. You can obviously cover
any *specific* case where Lisp needs a macro by simply adding that
special case to the syntax of the basic language. But that doesn't
solve the problem at all, because you've lost the generalisation which
is that the language has an extensible syntax, in a way that C and
Java just don't.
I don't understand why you are talking about C++ and Java by the way.
--tim
On Mon, 24 Apr 2000, Courageous wrote:
>
> > > (defun f ( str &rest body )
> > > `(format t ,str ,@body)
>
Assuming that's
(defmacro f (str &rest body)
`(format t ,str ,@body))
> > (defun format-t (string &rest args)
> > (apply #'format t string args))
>
>
> Are either of these forms preferred over one another?
> Or is it just a matter of style?
For this kind of simple thing the second is preferred; it's a first-class
function, so you can do a lot more with it. For example:
(defun flexi-format (formats vals)
(mapc #'format-t formats vals))
(flexi-format '("foo: ~S~%" "bar: ~D~%" "naggum: ~@R~%")
'(silade 100 2112))
(It would be an interesting diversion to write flexi-format using only
format strings :)
Tim
Namespace conflicts are a consequence of a specific macro mechanism.
Scheme, for instance, has hygienic macros, so namespace conflicts are not a
problem. Common Lisp has chosen to keep the macro expansion mechanism
simple, shifting the burden of preventing namespace conflicts onto the
programmer. Even if you don't GENSYM the variables introduced by the
macro, the package system usually prevents unexpected conflicts.
You're correct that macros sometimes indicate a missing feature in the
basic language. But it's impossible to expect language designers to
foresee everything, so macros are the way in which the language evolves.
For instance, MacLisp didn't originally have LET, SETF, or LOOP -- they
were all added to the language using macros, and became so popular that
they were included as built-in features when Common Lisp was being defined.
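For instance, a LET that isn't built in can be provided as a macro over
LAMBDA, which is essentially how it entered the language (a sketch,
ignoring declarations and degenerate binding forms):

(defmacro my-let (bindings &body body)
  `((lambda ,(mapcar #'first bindings) ,@body)
    ,@(mapcar #'second bindings)))

;; (my-let ((x 1) (y 2)) (+ x y)) expands to ((lambda (x y) (+ x y)) 1 2)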
However, many macros are application-specific, and do not indicate a
language deficiency at all.
--
Barry Margolin, bar...@genuity.net
Genuity, Burlington, MA
*** DON'T SEND TECHNICAL QUESTIONS DIRECTLY TO ME, post them to newsgroups.
Please DON'T copy followups to me -- I'll assume it wasn't posted to the group.
In other words, a macro system which is not called a macro system.
Not very interesting. Certainly there are no extant programming
languages where you can achieve the same sort of context specific
uplift of the base language that you have with macros.
I think the core issue here is that you don't understand the how, why,
what, and where of Lisp macros. You need to get this square in your
own head before you can even begin further discussion. As someone
else pointed out, Graham's On Lisp book would be an excellent source
for gaining such understanding.
/Jon
--
Jon Anthony
Synquiry Technologies, Ltd. Belmont, MA 02478, 617.484.3383
"Nightmares - Ha! The way my life's been going lately,
Who'd notice?" -- Londo Mollari
> Macros usually have a function-like syntax, and perform 'operations'
> on the arguments (is this the right term for the case?) of the macro.
> Those operations are commonly unusable on function arguments in the
> same way, because function arguments are COPIES of the original values
> and not transparently pointing at them.
You're completely lost. A macro's arguments are the "bits of the
input stream" making up the syntactic elements of the macro call. A
macro is just like a function, it just happens to run at compile time
and _typically_ returns transformed source.
Functions do not get "copies" (whatever this might mean) of their
arguments. They get the current evaluated _values_ of the objects
passed in at a call point during execution. Just like macros.
> This is the most common use of macros, I believe, and is only there
> to make up for a method of calling functions 'By value'.
This has no discernible semantic content.
> Can someone explain to me the advantages of the implicit copying
> when sending 'By value'?
Since this doesn't happen, I believe the answer is no.
You really need to first get an _understanding_ of macros before
proceeding with this, otherwise you will continue to be completely
lost, say incorrect or semantically empty things as if they were fact,
and in the process honk off a bunch of people who do know what is
going on.
> > > (defun f ( str &rest body )
> > > `(format t ,str ,@body)
>
> > (defun format-t (string &rest args)
> > (apply #'format t string args))
>
>
> Are either of these forms preferred over one another?
> Or is it just a matter of style?
Well, using a macro instead of a function adds one more level of
indirection, so don't use it unless it accomplishes something extra.
And as someone else pointed out, the function can be used, well,
functionally, the macro can't.
> Tim Bradshaw wrote:
> I don't think you get the point of macros. You can obviously cover
> any *specific* case where Lisp needs a macro by simply adding that
> special case to the syntax of the basic language. But that doesn't
> solve the problem at all, because you've lost the generalisation which
> is that the language has an extensible syntax, in a way that C and
> Java just don't.
First, I would be very interested to see where macros are actually used to extend
the LANGUAGE (syntax) and not program model. Second, I would propose converting
all macros to functions, and essentially allow functions to do whatever macros
do in compile-time, OR in run-time, by using either "Syntax Lists" for example,
which can make the fact a certain piece of code is not written in normal LISP but
some new defined syntax explicit. For example, a macro that would expand:
(+ 2 2) to (+ 4 4) and (* 1 3) to (* 1 9), can be implemented by sending a
"Syntax list" (for example {(+ 4 4)}, or quoting the list) to the function, which
can parse the syntax list the same way a macro would. Then I would propose
extending the function paradigm, so that every function can be "expanded" (as a
macro is, at compile time), whenever it is beneficial to do so.
Another extension to functions that can be used to replace macros, is specifying
certain arguments are to be treated just as macro arguments are, and not to be
evaluated. You may ask what the point of all this is - and I will answer that it
makes the whole definition syntax of functions and macros simpler, more unified
and consistent. It also makes it possible to optimize code to the best desired
efficiency in each instance of the macro/function usage.
The next question I may ask though, is: Are "Syntax Lists" ever useful, or is
there always a simpler, cleaner solution (Which I always seemed to be able to
find)? Why would you want the compile-time language to support new ways to
express programming-related concepts, when the original language expresses any
thinkable ones very well?
"A more complete language design" seems to mean extending the compiler to
support a new linguistic abstraction. Macros (and let's focus on Lisp
macros in this newsgroup) are precisely a mechanism for a programmer to
express novel linguistic abstractions, thus providing a more complete
language design for their specific problems. You seem to believe the
language designer can and should foresee all such abstractions; the Lisp
philosophy is that the language designer should instead focus on providing
two things:
1. A rich set of base abstractions.
2. A powerful mechanism for defining new abstractions.
From this perspective, it's not clear what your beef with macros really
amounts to.
-- Harley
> The problem with these types of functions (FEXPRs) is that they complicate
> the evaluation model significantly.
Another problem with FEXPRs, if I understand correctly, is that they
stymie compilation.
~jrm
> First, I would be very interested to see where macros are actually
> used to extend the LANGUAGE (syntax) and not program model. Second,
> I would propose converting all macros to functions, and essentially
> allow functions to do whatever macros do in compile-time, OR in
> run-time, by using either "Syntax Lists" for example,
...
> Why would you want the compile-time language to support new ways to
> express programming-related concepts, when the original language
> expresses any thinkable ones very well?
PLEASE, get a book (some have already been suggested) and learn
something about this before you proceed any further. All the above
shows you are _completely_ lost on this subject and need to get some
very basic fundamental things straight before you even begin any more
attempts at discussion. If you continue wasting everyone's time and
bandwidth making pronouncements that show an utter lack of
understanding (while stating them as if fact) you will only succeed in
starting an irrelevant flame war.
Or are you just a troll and starting such a flame fest is your real
intent??
I mentioned several examples in my earlier message: SETF, LET, and LOOP
were originally macros that extended the Maclisp language.
>Another extension to functions that can be used to replace macros, is specifying
>certain arguments are to be treated just as macro arguments are, and not to be
>evaluated. You may ask what the point of all this is - and I will answer that it
>makes the whole definition syntax of functions and macros simpler, more unified
>and consistent. It also makes it possible to optimize code to the best desired
>efficiency in each instance of the macro/function usage.
Maclisp had something called "fexprs" that were functions that received
their arguments unevaluated. Zetalisp implemented a variant of this by
having &QUOTE and &EVAL lambda list keywords, to indicate that the
arguments between them should not be evaluated. The problem with these
types of functions is that they complicate the evaluation model
significantly, since it's often the case that you want to evaluate those
functions eventually; it's just being deferred so that the evaluation can
be done in a special way (for instance, you might be defining a conditional
like COND or IF, which should only evaluate certain arguments or parts of
arguments). But in a lexically-scoped language, you then have to arrange
for the evaluation to take place in the proper environment (Maclisp was
dynamically scoped, so this wasn't an issue -- the fexpr could simply call
EVAL).
Macros make this all much, much easier, which is why little consideration
at all was given to including user-defined special operators in Common
Lisp. Pretty much anything you could do with fexprs can be done just as
easily with macros. The evaluation model of macros is simple, so no
special hooks into the evaluator are needed.
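For example, a user-defined conditional costs nothing as a macro,
because the deferred arguments are simply spliced back into code the
compiler sees:

(defmacro my-unless (test &body body)
  `(if ,test nil (progn ,@body)))

The body forms are evaluated in their original lexical environment for
free; a fexpr would have to capture and re-enter that environment
itself.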
I think this misses the point.
A fuzzy substitution of
s/macro/recursion/
provides:
"The convenience, or any other feature, that recursion can provide,
CAN be made available via explicit use of stacks, in a nonrecursive
language. The fact that recursions _are indeed_ a convenience
suggests that the language had a deficiency. The reasoning is that
to achieve convenience, you have to have functions call themselves,
with potential namespace conflicts, and overall additional
complexity."
Trolls periodically wander to comp.lang.lisp, actually claiming things
similar to the above statement.
I could add in any language feature that is a "convenience" and
contend that the desirability of addition of the "convenience" is
indicative of the language having some deficiency.
Reduction can thus continue to the point of absurdity until we arrive
at some language that can be trivially encoded onto one of the minimal
Turing Machine representations.
The _real_ point is that if some language abstraction makes it
considerably more _convenient_ to build a notation to represent the
characteristics of a particular problem, this is a useful thing, and
is quite likely to be desirable.
--
BASIC is not a language. It's a plot to sucker poor unsuspecting
consumers into believing that they should buy a computer because
ANYONE can learn how to program.
cbbr...@hex.net- <http://www.ntlug.org/~cbbrowne/lsf.html>
> First, I would be very interested to see where macros are actually used to extend
> the LANGUAGE (syntax) and not program model.
LOOP, DEFUN, LAMBDA...
> [Elided stuff where you give what looks like a grotesquely
> over-complex equivalent to macros]
Would it not be easier simply to allow the language to have functions
which operate on a representation of its source code, to produce other
source code?
> The next question I may ask though, is: Are "Syntax Lists" ever useful, or is
> there always a simpler, cleaner solution (Which I always seemed to be able to
> find)? Why would you want the compile-time language to support new ways to
> express programming-related concepts, when the original language expresses any
> thinkable ones very well?
Because that's what Lisp is *about*! If you don't see that this is
just fundamental to the Lisp approach then you need to learn something
about the way it works. If you want some fixed little language, then
program in C or hex, and leave us to do our own thing.
--tim
Thanks for the help, everybody. Looking at my code today,
it was much worse than the (defmacro...) form you
all assumed it was. It was actually embedded in an eval
inside a defmethod; a syntactically horrific aberration
that existed primarily because ,@ was the way I knew how
to do what I needed to do.
One poster pointed me to list* at the same time he pointed
me to apply. Most helpful. I also got a good half dozen
explanations of the same thing in my email box, but
better too many than too few.
Thanks again,
C/
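[For anyone else following along, the two operators mentioned above do
related but distinct jobs; a quick illustration -- ed.:

    ;; LIST* is like LIST, except the final argument becomes the tail:
    (list* 1 2 '(3 4))       ; => (1 2 3 4)

    ;; APPLY calls a function, spreading its final list argument:
    (apply #'+ 1 2 '(3 4))   ; => 10

So LIST* builds list structure, while APPLY performs a call.]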
Courageous <jkra...@san.rr.com> writes:
> Are either of these forms preferred over one another? Or is it just
> a matter of style?
Those two functions perform two completely different tasks, so I guess
the answer is no; it is not a matter of style. If you can't see that,
I'd suggest you study the _basics_ of lisp more carefully before you
try to understand the finer points of lisp style.
Tom Breton <t...@world.std.com> writes:
> Well, using a macro instead of a function adds one more level of
> indirection, so don't use it unless it accomplishes something extra.
What do you mean by "one more level of indirection"? Myself, while
programming lisp, _very_ rarely make considerations like "should I use
a macro for this or a function". That _was_ something I did when
programming C, however. In lisp, macros and functions are used for
completely different things (more or less), although they accomplish
it by the same interface (the lisp language).
--
Frode Vatvedt Fjeld
it is important to have an explicit goal when thinking about language
design. your goal seems implicit, given that you don't mention it at
all. could you talk a little bit about what you want to accomplish?
| This has made me doubt the real necessity of macros, as they always seem
| to cover up for one or another language deficiency.
yes, that's what they do. the problem is, however, that _solving_ all
those deficiencies is impossible, and without macros, they would merely
be visible deficiencies, instead of covered-up deficiencies. now, macros
may cover up deficiencies so well that it would take an inordinate amount
of deconstructionism to find the deficiency, but I take this as evidence
that "cover up" is misapplied: the deficiency no longer _exists_ because
of the macro solution.
in short, macros are about building languages. _any_ such facility would
be messy if it were able to do its job well, and it would only be "neat"
if it were able to build things people had thought of while designing it.
trust me on this: we don't want that.
the core problem I sense in your approach to your language design has
been discussed at times, but seldom directly, so I'll summarize my
position on it and see if you recognize it: (Common) Lisp is such an
obviously elegant solution to the programming language problem that when
people look carefully at it, they get disappointed by warts and spots
with a rough finish, and then they start to think about how the language
would be if it were _all_ elegant. other languages are equally obvious
inelegant solutions to the programming language problem, and individual
features are hailed as elegant (or "cool hacks") when discovered, which
means that digging deeper uncovers isolated inelegances in Common Lisp
and isolated elegances in most other languages. since we all appreciate
more elegance rather than less, users of inelegant languages learn by the
rewards they get from doing it that digging deeper in particular ways is
a good way to _improve_ their language appreciation, and so they continue
with this habit when they come to Common Lisp, only to be disappointed.
if you don't realize the ramifications of this reversal, you will lose
track of the big picture: you should not need to dig deep into a language
to find its elegance, and it doesn't matter whether the innards that you
don't know about are inelegant.
#:Erik
Great fun!
And following your example with fuzzy substitution, I get the
following:
WINDOWS is not an os. It's a plot to sucker poor unsuspecting
consumers into believing that they should buy a computer because
ANYONE can learn how to use it.
:-)
Rolf Rander
--
(c) 2000 Rolf Rander Næss
http://www.pvv.org/~rolfn/
My mailer limits .sigs to 4 lines. But ingeniously I bypassed this by
While I certainly don't consider BASIC to be the epitome of programming
languages, I don't think I would be where I am today if it weren't for
BASIC. It allowed me to learn programming on my own pretty easily at a
time when there weren't many resources (the late 70's).
Peaker wrote:
> > Tim Bradshaw wrote:I don't think you get the point of macros. You can
> > obviously cover
> > any *specific* case where Lisp needs a macro by simply adding that
> > special case to the syntax of the basic language. But that doesn't
> > solve the problem at all, because you've lost the generalisation which
> > is that the language has an extensible syntax, in a way that C and
> > Java just don't.
>
> First, I would be very interested to see where macros are actually used to extend
> the LANGUAGE (syntax) and not program model.
Look at it this way: you can take lisp without many of its language
features and build most of them back in via macros. Even very basic things
such as do, loop, dolist, defun, let, and let*.
> Second, I would propose converting
> all macros to functions, and essentially allow functions to do whatever macros
> do in compile-time, OR in run-time, by using either "Syntax Lists" for example,
> which can make the fact a certain piece of code is not written in normal LISP but
> some new defined syntax explicit. For example, a macro that would expand:
> (+ 2 2) to (+ 4 4) and (* 1 3) to (* 1 9), can be implemented by sending a
> "Syntax list" (for example {(+ 4 4)}, or quoting the list) to the function, which
> can parse the syntax list the same way a macro would. Then I would propose
> extending the function paradigm, so that every function can be "expanded" (as a
> macro is, in compiletime), whenever it is beneficial to do so.
Very nice, but what's the point? What you're suggesting is already very like
macros anyways. There are ways to design languages that are conceptually
cleaner than lisp ( *gasp* ), but lisp is largely about getting the job done. The
elegance it has is there in cases where it makes coding in it easier. After
all, compilers are already allowed to inline functions as they see fit.
I was designing a language for myself as an exercise that was extremely elegant
and conceptually concise, but I asked myself, would it be easier to write code in
this as opposed to lisp? I realized that it really was not, but it was still an
interesting exercise.
dave
*Excellent* examples! I too use macros for all of these purposes.
> I generally avoid using macros just to delay argument evaluation. It
> is just as easy to wrap a LAMBDA around the appropriate argument, and
> it avoids creating code that isn't applicative order. So you might
> see this in my code:
>
> (call-with-temp-file #'(lambda (file) ....))
>
> whereas other lisp hackers would write a macro
>
> (with-temp-file (file) ....)
>
> Obviously, I was brought up on the wrong side of the tracks.
I nearly always define, document, and export both: the macro WITH-xxx
expands into the function CALL-WITH-xxx. Or in the case of iteration
constructs, DO-xxx expands into MAP-xxx. The user (often me!) can
then use whichever is more convenient in a specific context. I find
that if I just take the time to define both such constructs whenever I
first see the need for one of them, then I eventually find a natural
use for the other.
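[The pattern described above might be sketched like so; note that
MAKE-TEMP-FILE is a hypothetical helper, not standard Common Lisp -- ed.:

    ;; Function version: takes a closure, guarantees cleanup.
    (defun call-with-temp-file (function)
      (let ((file (make-temp-file)))   ; MAKE-TEMP-FILE assumed to exist
        (unwind-protect
            (funcall function file)
          (delete-file file))))

    ;; Macro version: just expands into a call to the function.
    (defmacro with-temp-file ((var) &body body)
      `(call-with-temp-file #'(lambda (,var) ,@body)))

Because the macro expands into a function call, the lexical-scoping
and name-clash questions are settled by the ordinary rules for
closures, with no GENSYM needed in the macro itself.]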
Or perhaps it's just my increasingly anal-retentive nature.... ;)
Phil Stubblefield
Rockwell Palo Alto Laboratory 206/655-3204
http://www.rpal.rockwell.com/~phil ph...@rpal.rockwell.com
Many of Symbolics's macros are implemented this way. It solves the name
clash problem often without requiring any GENSYMs.
If users had to write out all the CALL-WITH-xxx forms, it would have been
much less convenient to use things like the table formatting facility.
Compare:
(formatting-table (...)
  (dolist ...
    (formatting-row (...)
      (present x)
      (present y))))
with:
(call-with-table-formatting ...
  :function
  #'(lambda (stream)
      (dolist ...
        (call-with-row-formatting ...
          :function #'(lambda (stream)
                        (present x stream)
                        (present y stream))))))
I know which one I find easier to read: the one without lots of
lambdas strewn about.
It's certainly possible to program in a Lisp-like language that merely has
lambda and the Y combinator, but I wouldn't want to. It's analogous to
writing in assembly language -- we design high-level languages to abstract
away all these details. Macros give that power to the programmer, not just
the language designer.
--
BASIC is not a language. It's a plot to sucker poor unsuspecting
consumers into believing that they should buy a computer because
ANYONE can learn how to program.
cbbr...@hex.net- <http://www.ntlug.org/~cbbrowne/lsf.html>
When I was ten, I could write all kinds of graphics stuff in BASIC, so
I thought I was a terrific programmer. Nobody could teach me anything;
I was convinced I knew all there was to know about programming.
I didn't really grasp the idea of a subroutine, though --- even though
I'd learned to write recursive subroutines in Logo years before, I
never really caught on.
Later on (when I was twelve?) I took another Logo class and started
doing more significant things (including programs composed of multiple
subroutines). But it wasn't until I was forced to take a Pascal class
that I really began to realize how much I had to learn. [I'm still not
sure I've shed that youthful arrogance, unfortunately.]
The languages one writes in guide one's mind --- they're not the only
influence, nor even the most important, but they are significant.
BASIC guides one to focus on syntax and makes things like local
variables, subroutines, parameter passing, and dynamic data structures
difficult enough that a novice like me would have had to invent and
implement them from scratch (and they would be more difficult than in
assembly, in fact).
Lisp guides one to treat syntax as trivial, to take dynamic data
structures, local variables, and subroutines for granted, and to focus
on the computation rather than the program.
--
<kra...@pobox.com> Kragen Sitaker <http://www.pobox.com/~kragen/>
The Internet stock bubble didn't burst on 1999-11-08. Hurrah!
<URL:http://www.pobox.com/~kragen/bubble.html>
The power didn't go out on 2000-01-01 either. :)
The "real" BASICs deployed from Dartmouth, as well as the ANSI and ISO
Standard BASICs are a far cry different from Microsoft BASIC. They
actually had matrix operators, for instance.
I've got an even more entertaining commentary on LOGO...
"LOGO is not a language. It's a way to simulate 'skid marks' made by
turtles with serious bowel control problems."
And I _do_ know that LOGO was the nearest "educational" equivalent
to Lisp... Probably about as much like CL as Scheme is...
--
Know the list of "large, chronic problems". If there is any problem
with the window system, blame it on the activity system. Any lack of
user functionality should be attributed to the lack of a command
processor. A surprisingly large number of people will believe that you
have thought in depth about the issue to which you are alluding when you
do.
-- from the Symbolics Guidelines for Sending Mail
cbbr...@hex.net - - <http://www.hex.net/~cbbrowne/lsf.html>
>
>
> Tom Breton <t...@world.std.com> writes:
>
> > Well, using a macro instead of a function adds one more level of
> > indirection, so don't use it unless it accomplishes something extra.
>
> What do you mean by "one more level of indirection"?
Sorry, I don't see what could be unclear. One more level of
indirection; an extra level above and beyond whatever layers of
indirection you have already used. Does that answer your question?
Perhaps you are thinking solely in terms of run-time indirection? Is
that the difficulty? I mean one more level of indirectness in
representation, if that's any help.
> BASIC is not a language. It's a plot to sucker poor unsuspecting
> consumers into believing that they should buy a computer because
> ANYONE can learn how to program.
> cbbr...@hex.net- <http://www.ntlug.org/~cbbrowne/lsf.html>
Hey, it worked on me. (More exactly, it was my dad's TRS-80 when I
was a wee child)
> What do you mean by "one more level of indirection"?
Tom Breton <t...@world.std.com> writes:
> Perhaps you are thinking solely in terms of run-time indirection?
> Is that the difficulty? I mean one more level of indirectness in
> representation, if that's any help.
I'm not thinking in terms of anything, as "indirection" is such a
general term. Representation of what? Code? That would translate in my
terms to "one more level of code transformation", and I don't really
see why you'd want to worry about that, in general.
--
Frode Vatvedt Fjeld
> While I certainly don't consider BASIC to be the epitome of programming
> languages, I don't think I would be where I am today if it weren't for
> BASIC. It allowed me to learn programming on my own pretty easily at a
> time when there weren't many resources (the late 70's).
The only reason I learnt BASIC is because I couldn't find a Lisp compiler
for my Timex/Sinclair 2068 (the early 80's) ;-(
--
Fernando D. Mato Mira
Real-Time SW Eng & Networking
Advanced Systems Engineering Division
CSEM
Jaquet-Droz 1 email: matomira AT acm DOT org
CH-2007 Neuchatel tel: +41 (78) 778 FDMM
Switzerland 3366
www.csem.ch www.vrai.com ligwww.epfl.ch/matomira.html
I had more luck. At that time there was LeLisp 80 for the TRS-80.
It worked well with 48K of RAM and a 128K floppy disk....
Marc Battyani
^^^^^^^^^^^^^^^^^
I hate you ;)
--
Fernando D. Mato Mira
Real-Time SW Eng & Networking
Advanced Systems Engineering Division
CSEM
Jaquet-Droz 1 email: matomira AT acm DOT org
CH-2007 Neuchatel tel: +41 (32) 720-5157
Switzerland FAX: +41 (32) 720-5720
www.csem.ch www.vrai.com ligwww.epfl.ch/matomira.html
> Frode Vatvedt Fjeld <fro...@acm.org> writes:
>
> > What do you mean by "one more level of indirection"?
>
>
> Tom Breton <t...@world.std.com> writes:
>
> > Perhaps you are thinking solely in terms of run-time indirection?
> > Is that the difficulty? I mean one more level of indirectness in
> > representation, if that's any help.
>
> I'm not thinking in terms of anything, as "indirection" is such a
> general term. Representation of what? Code?
Yes.
> That would translate in my
> terms to "one more level of code transformation",
Yes.
> and I don't really
> see why you'd want to worry about that, in general.
All else being equal, it's better to make fewer transformations on
your code; it's less machinery to think about. IMO. Don't you agree?
> All else being equal, it's better to make fewer transformations on
> your code; it's less machinery to think about. IMO. Don't you agree?
>
Why would you make a macro if it made writing the code harder?
The point of a macro is to allow for certain constructs which make
writing code easier. Maybe you need to rethink why you're writing
whichever macros make writing code harder for you.
--
-> -\-=-=-=-=-=-=-=-=-=-/^\-=-=-=<*><*>=-=-=-/^\-=-=-=-=-=-=-=-=-=-/- <-
-> -/-=-=-=-=-=-=-=-=-=/ { Rahul -<>- Jain } \=-=-=-=-=-=-=-=-=-\- <-
-> -\- "I never could get the hang of Thursdays." - HHGTTG by DNA -/- <-
-> -/- http://photino.sid.rice.edu/ -=- mailto:rahul...@usa.net -\- <-
|--|--------|--------------|----|-------------|------|---------|-----|-|
Version 11.423.999.210020101.23.50110101.042
(c)1996-2000, All rights reserved. Disclaimer available upon request.
> All else being equal, it's better to make fewer transformations on
> your code; it's less machinery to think about. IMO. Don't you agree?
High-level languages are a pretty good indicator that all else is
seldom equal.
--tim
> All else being equal, it's better to make fewer transformations on
> your code; it's less machinery to think about. IMO. Don't you
> agree?
Yes, but if all else was equal, I'd have to call you a lousy macro
writer, no? You shouldn't write a macro unless it gives you some extra
benefit. The benefit should, however, be great enough to justify the
additional syntax the programmer has to remember, as was discussed in
a thread some weeks ago. But this applies equally to macros and
functions.
--
Frode Vatvedt Fjeld
I don't agree with this silliness. good macros are abstractions.
I also don't agree with your view of optimization, which seems to
include manual macroexpansion and reducing the amount of work no
human being should ever be doing in the first place. I'm also glad
you don't read this, as I would hate to see the silly response.
#:Erik
Language extension is a reasonable use for such a facility; however, the
problem is that a common use of macros is not only to extend the language
and its syntax, but also to extend the program model.
Drawing the line between the program model and the language itself is not
as easy as it may seem, given that function calls are syntactically very
close to special forms or macro forms. What may seem as an extension of the
language itself, under one definition ('if', for example), would seem as the
extension of the program or object model under another ('if' being a method
of a 'Boolean' object).
> in short, macros are about building languages. _any_ such facility would
> be messy if it were able to do its job well, and it would only be "neat"
> if it were able to build things people had thought of while designing it.
> trust me on this: we don't want that.
I understand what you're saying, and I'd take your word if I wasn't
after understanding it myself. Therefore, I would appreciate some small
examples of language extensions that are NOT possible to implement as an
object-model extension, using, for example, SmallTalk.
(The point of this is to verify that the functionality of extending the
syntax, achievable by macros, cannot be achieved using extensions to the
object model [of a language that has built-in object support, of course].)
> the core problem I sense in your approach to your language design has
> been discussed at times, but seldom directly, so I'll summarize my
> position on it and see if you recognize it: (Common) Lisp is such an
> obviously elegant solution to the programming language problem that when
> people look carefully at it, they get disappointed by warts and spots
> with a rough finish, and then they start to think about how the language
> would be if it were _all_ elegant. other languages are equally obvious
> inelegant solutions to the programming language problem, and individual
> features are hailed as elegant (or "cool hacks") when discovered, which
> means that digging deeper uncovers isolated inelegances in Common Lisp
> and isolated elegances in most other languages. since we all appreciate
> more elegance rather than less, users of inelegant languages learn by the
> rewards they get from doing it that digging deeper in particular ways is
> a good way to _improve_ their language appreciation, and so they continue
> with this habit when they come to Common Lisp, only to be disappointed.
> if you don't realize the ramifications of this reversal, you will lose
> track of the big picture: you should not need to dig deep into a language
> to find its elegance, and it doesn't matter whether the innards that you
> don't know about are inelegant.
Some unnecessary inelegance of macros does appear to exist at the user level:
the macro definitions themselves. I would suggest a small change to the macro
system that, in my opinion, would make it more elegant: allow 'unevaluated
parameters' to functions (macro-parameters), that would effectively turn them
into compile-time or (using run-time expression representations) run-time
functions. This would require unifying functions and macros, and allowing
both to be "expanded" or "executed", as the compiler sees fit for the case.
--
Micros~1 is history (www.microsoft.com).
Linux is the present (www.linux.org).
EROS is the future (www.eros-os.org).
Notice that in a language like Smalltalk, where 'if' is a method rather
than a macro or special operator like it is in Lisp, you have to wrap the
consequent clauses in special syntax to prevent them from being evaluated
prematurely. Smalltalk has made this convenient by using a simple syntax,
[...], for what Lisp uses LAMBDA expressions for; the brackets look similar
to the braces or begin/end keywords that many languages require as part of
their conditional syntax, so programmers hardly realize that they're doing
it.
So, basically, this all really comes down to different varieties of
syntactic sugar. The Smalltalk philosophy is to try to do as much as
possible using the object model, and they've made it easy to deal with
program blocks as objects. Lisp has first-class functions as well, and we
could do things similarly, but we've chosen instead to use them more
sparingly, preferring to use macros to hide things like this.
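[To make the comparison concrete, here is roughly what a conditional
looks like when it is an ordinary function taking thunks, which is
essentially what Smalltalk's bracket blocks provide; IF-FN is a
made-up name -- ed.:

    ;; A functional IF: both branches must be wrapped in closures,
    ;; or they would be evaluated before the choice is made.
    (defun if-fn (test then-thunk else-thunk)
      (if test
          (funcall then-thunk)
          (funcall else-thunk)))

    (if-fn (> 3 2)
           #'(lambda () 'yes)
           #'(lambda () 'no))   ; => YES

Smalltalk's [...] is just lighter-weight surface syntax for exactly
these explicit closures.]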
>Some unnecessary inelegance of macros does appear to exist at the user level:
>the macro definitions themselves. I would suggest a small change to the macro
>system that, in my opinion, would make it more elegant: allow 'unevaluated
>parameters' to functions (macro-parameters), that would effectively turn them
>into compile-time or (using run-time expression representations) run-time
>functions. This would require unifying functions and macros, and allowing
>both to be "expanded" or "executed", as the compiler sees fit for the case.
As I mentioned a day or two ago, this was tried, and found lacking. Doing
it portably would require standardizing too much of the way that the
compiler works.
You seem to be rejecting macros precisely because of the features that make
them so great. They're a very simple mechanism that is nonetheless quite
elegant and powerful. Given a handful of primitive special forms and a
fully-programmable macro language like Lisp's, you can implement just about
any language extension.
Tom Breton wrote:
>
> All else being equal, it's better to make fewer transformations on
> your code; it's less machinery to think about. IMO. Don't you agree?
Actually, no, I don't see that that necessarily correlates. It's possible that
fewer transformations might mean the programmer having to write more
code, and harder code, to get the same job done.
dave
[snip]
> > in short, macros are about building languages. _any_ such facility would
> > be messy if it were able to do its job well, and it would only be "neat"
> > if it were able to build things people had thought of while designing it.
> > trust me on this: we don't want that.
>
> I understand what you're saying, and I'd take your word if I wasn't
> after understanding it myself. Therefore, I would appreciate some small
> examples of language extensions that are NOT possible to implement as an
> object-model extension, using, for example, SmallTalk.
Quick note: it's spelled 'Smalltalk'.
Quick caveat: I'm not a Common Lisper, though I do know something of the
language. I know Scheme much better, which may or may not be a handicap
in the current situation. I'm coming in as a Smalltalker.
(OK, the caveat wasn't so quick...)
My understanding of Lispy macros is that they provide for the
modification or extension of *syntax*, in essence allowing one to add
new special forms. In Smalltalk, at least in most implementations, if
you wish to change the *syntax*, you modify the parser. This is rather
easy, given the simplicity of the base syntax and the availability of
the parsing classes. But it's a rather different style of operation, and
certainly not as flexible or scalable as the macro approach (for the
obvious reasons: changes are global, and the more special cases you add
the more complicated the parser gets).
Now, it's quite easy to change the object model (e.g., to prototypes)
and, thus, to get a rather different language (tailored to the
application domain, one might say), but the basic syntactic and
statement-level evaluation mechanisms don't change.
(Obviously, this is much more like using the MOP. So, perhaps, if you
were to compare the kind of "language extension" achieved via the MOP
and that achieved via macros, you might get a better sense of the vital
difference. It seems instructive that *both* mechanisms are present in
Common Lisp.)
Of course, I find it more congenial to look at Smalltalk's syntax as
*one* way, of many, of manipulating the system of objects embodied in
the image. A way that is coequal to other "IDE" tools. Again, a rather
different approach.
I'll add that in Smalltalk-72 (?...it might be Smalltalk-76) one could
change the syntax on a per class basis, though I confess to be ignorant
of the details.
Oh, one should probably point at Prolog as well, with its op function,
which allows one to define the associativity and precedence of arbitrary
predicates. Very handy for quickly building domain-specific sugar.
--
Bijan Parsia
http://monkeyfist.com/
...among many things.
> the macro definitions themselves. I would suggest a small change to the macro
> system, that in my oppinion, would make it more elegant: Allow 'unevaluated
> parameters' to functions (macro-parameters), that would effectively turn them
> into compile-time or (using run-time expression representations) run-time
> functions. This would require unifying functions and macros, and allowing
> both to be "expanded" or "executed", as the compiler sees fit for the case.
That's what fexprs were. There's a reason they went away. In 41
years a lot of things have been tried in Lisp...
--tim
An Introduction to Lisp Macros
<http://www.apl.jhu.edu/~hall/Lisp-Notes/Macros.html> provides some
examples.
The main point of the exercise is that of controlling the evaluation
of arguments.
(defmacro Iff (Test Then &optional Else)
  "A replacement for IF, takes 2 or 3 arguments.  If the first evaluates to
non-NIL, evaluate and return the second.  Otherwise evaluate and return
the third (which defaults to NIL)."
  `(cond
     (,Test ,Then)
     (t ,Else)))
You couldn't do this using a function, because passing the THEN and
ELSE values to a function would mean that both would get evaluated
before being passed in.
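[A one-line experiment makes the difference visible; IFF-FN is a
hypothetical function version of the macro above -- ed.:

    ;; With the macro, only the chosen branch runs:
    (iff t (print "then") (print "else"))   ; prints only "then"

    ;; With a function, both arguments are evaluated before the call:
    ;; (iff-fn t (print "then") (print "else"))
    ;; would print both "then" and "else" before IFF-FN even ran.

Side effects (and non-termination) in the unused branch are what make
argument evaluation order observable here.]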
If the plan is to create some sort of specialized control structure,
whether:
a) A nifty variation on "IF", as shown above, or
b) A control structure that is given some data structure, "opens" it,
loops on it, and then "closes it,"
this is useful.
So, for instance, you have:
(with-open-file (s p)
  (do ((l (read-line s) (read-line s nil 'eof)))
      ((eq l 'eof) "Reached end of file.")
    (format t "~&*** ~A~%" l)))
This macro does three things:
- Opens the file
- Repeats the body on each line in the file
- Closes the file
--> Guarantees that the file is closed, even if an error is signalled.
Another MAJOR point to the exercise is that of transforming another
language, namely the language of your problem, into Lisp. By
transforming it into Lisp, the optimizations available to the Lisp
compiler may be used on your language without the need for a
specialized optimizer.
>> the core problem I sense in your approach to your language design has
>> been discussed at times, but seldom directly, so I'll summarize my
>> position on it and see if you recognize it: (Common) Lisp is such an
>> obviously elegant solution to the programming language problem that when
>> people look carefully at it, they get disappointed by warts and spots
>> with a rough finish, and then they start to think about how the language
>> would be if it were _all_ elegant. other languages are equally obvious
>> inelegant solutions to the programming language problem, and individual
>> features are hailed as elegant (or "cool hacks") when discovered, which
>> means that digging deeper uncovers isolated inelegances in Common Lisp
>> and isolated elegances in most other languages. since we all appreciate
>> more elegance rather than less, users of inelegant languages learn by the
>> rewards they get from doing it that digging deeper in particular ways is
>> a good way to _improve_ their language appreciation, and so they continue
>> with this habit when they come to Common Lisp, only to be disappointed.
>> if you don't realize the ramifications of this reversal, you will lose
>> track of the big picture: you should not need to dig deep into a language
>> to find its elegance, and it doesn't matter whether the innards that you
>> don't know about are inelegant.
>
>Some unnecessary inelegance of macros does appear to exist at the
>user level: the macro definitions themselves. I would suggest a
>small change to the macro system that, in my opinion, would make it
>more elegant: Allow 'unevaluated parameters' to functions
>(macro-parameters), that would effectively turn them into
>compile-time or (using run-time expression representations)
>run-time functions. This would require unifying functions and
>macros, and allowing both to be "expanded" or "executed", as the
>compiler sees fit for the case.
That may remove the need for some of the more trivial uses of macros,
but not the more interesting uses, I'd think...
--
(1) Sigs are preceded by the "sigdashes" line, ie "\n-- \n" (dash-dash-space).
(2) Sigs contain at least the name and address of the sender in the first line.
(3) Sigs are at most four lines and at most eighty characters per line.
cbbr...@ntlug.org- <http://www.ntlug.org/~cbbrowne/lsf.html>
About the only thing I can say about macros is that it's a
bit irritating if your own code breaks in someone else's
macro. Usually, though, the MACROEXPAND-1 function will
help. Still... I'm currently making use of an environment
that involves extensive macros (and I believe reader macros),
and god it behaves badly sometimes: many errors that would
ordinarily be caught by the compiler don't get caught until
runtime. I still haven't figured out why that is.
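[For reference, MACROEXPAND-1 performs a single step of expansion
without evaluating the result, which is often the quickest way to see
why a macro call misbehaves. Using the IFF macro defined earlier in
this thread -- ed.:

    ;; Show one step of expansion, without running the code:
    (macroexpand-1 '(iff (> x 0) :pos :neg))
    ;; => (COND ((> X 0) :POS) (T :NEG))

Plain MACROEXPAND repeats this until the head of the form is no longer
a macro, which helps when macros expand into other macros.]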
It will, for example, happily allow me to transpose
characters on an initarg (there are special macros for
implementing "concepts", which are an AI-ish equivalent
of classes). Most bizarre.
Anyway,
Having used C/C++ macros quite a bit, I can say that working
with lisp macros/metalinguistic constructs is quite a
pleasure in comparison.
C/
> Tom Breton <t...@world.std.com> writes:
>
> > All else being equal, it's better to make fewer transformations on
> > your code; it's less machinery to think about. IMO. Don't you
> > agree?
>
> Yes, but if all else was equal, I'd have to call you a lousy macro
> writer, no?
The question was whether to prefer an equivalent function over a
macro. That's equal *by definition*.
> You shouldn't write a macro unless it gives you some extra
> benifit.
Didn't I write that exact thing a couple of messages back?
Am I the only one who thinks that macros are harder to debug
and sometimes obfuscating?
Admittedly, working with macros in lisp seems quite
a bit easier than those preprocessor aberrations in C,
and certainly has a much more native fit than any of
those beasts, but still... (actually, I'm not sure
what we're all going on about; I think not a one of
us disagrees that macros have their place.... god
forbid, even in C).
C/
Well, the idea is that when the syntax is so small, you may not ever need
to change it, as the language consists of very little syntax and a lot of
object operations. In my question, I was aware that LISP macros allow
lower-level access to the syntax (changing the way things are parsed);
however, I am suggesting that this may not be necessary, and that an
extendible object model may be all you'll ever need to change about the
language (which is what I want you to contradict, by bringing practical
examples of things possible to achieve by affecting the way things are
parsed (macros), and not by extending the object model).
> Now, it's quite easy to change the object model (e.g., to prototypes)
> and, thus, to get a rather different language (tailored to the
> application domain, one might say), but the basic syntactic and statement
> level evaluation mechanisms don't change.
Again, it may not be useful to change that basic syntactic,
statement-level evaluation.
> Oh, one should probably point at Prolog as well, with its op function
> which allows one to define the associativity and precedence of arbitrary
> predicates. Very handy for quickly building domain-specific sugar.
>
If I understand this correctly, it seems to be part of that same mechanism,
if you consider ops (operators) to be object methods and part of the
extendible object model. This only reinforces the idea that the high-level
syntax is the low-level one, combined with the basic object model.
My theory is that the high-level syntax is extendible when the basic
object model is extendible enough, and that the low-level syntax need not
change, which is why I dislike LISP's way of providing means to change
the low-level syntax.
where do you draw the line between the two? I suspect you have an
artificial view of which is which.
| Drawing the line between the program model and the language itself,
| is not as easy as it may seem, given that function calls are very
| syntactically close to special forms or macro forms.
right. that's the beauty of the Lisp syntax.
| What may seem as an extension of the language itself, under one
| definition ('if', for example), would seem as the extension of the
| program or object model under another ('if' being a method of a
| 'Boolean' object).
yes. isn't it great? language design paradigms come in many
flavors that you can actually choose from. Common Lisp has chosen
some of them. you seem to be unhappy with the paradigms that those
who like Common Lisp favor the most.
| Therefore, I would appreciate some small examples of language
| extensions that are NOT possible to implement as an object-model
| extension, using, for example, SmallTalk.
you want examples of language extension in language X that aren't
possible in language Y? really. this is _ridiculous_ on its face.
| I would suggest a small change to the macro system that, in my
| opinion, would make it more elegant: ...
really? well, it's been tried. it was a bad idea.
first grok, then suggest. that's general advice.
#:Erik
this is not true. you seem to be confusing reader macros with
macros, as many complete novices do.
| My theory is, that the high-level syntax is extendible, when the
| basic object model is extendible enough, and that the low-level
| syntax need not change, which is why I dislike LISP's way of
| providing means to change the low-level syntax.
no, you dislike it because you don't understand it on its own terms,
but rather want to force it into a different model. you must make a
distinction between disliking the model and its expression before
you can proceed usefully. but since you appear to dislike the Lisp
model a priori, I don't think there's much point in proceeding in
this direction. just find a language whose model you like and use
it productively.
#:Erik
TIA
- DM
> Well, the idea is that when the syntax is so small, you may not ever need
> to change it, as the language consists of very little syntax and a lot of
> object operations. In my question, I was aware that LISP macros allow
> lower-level access to the syntax (changing the way things are parsed);
You *can* change the way things are parsed, but that is via the
readtable, not macros.
You can differentiate `syntax' (how groups of tokens are taken to form
expressions) from `microsyntax' (how groups of characters are taken to
form tokens). People don't often fool around with the microsyntax of
lisp.
> however, I am suggesting that it may not be necessary, and that an
> extendible object model may be all you'll ever need to change about the
> language (which is what I want you to contradict by bringing practical
> examples of things possible to achieve by affecting the ways things are
> parsed (macros), and not by extending the object model).
That's true. In fact, *objects* aren't needed. Neither are *names*.
If you want, we can reduce the entire language down to strings of S
and K operators. For some reason that idea hasn't caught on, though.
The point of any programming language construct is to allow you to
write down *what you mean* as directly and clearly as possible to
communicate it to other programmers *and* to the computer. Macros are
one tool for doing this, objects are another. You should use either
one as necessary.
> My theory is, that the high-level syntax is extendible, when the basic
> object model is extendible enough, and that the low-level syntax need not
> change, which is why I dislike LISP's way of providing means to change
> the low-level syntax.
So don't use it.
--
~jrm
Fexprs were in Maclisp, and they were simply functions whose arguments were
not automatically evaluated. The function itself had to call EVAL when
necessary.
For a reference, I guess you'll have to find one of the old Maclisp
reference manuals.
Not if you wrapped them in LAMBDA expressions:
(defun if-function (test then &optional else)
  (cond (test (funcall then))
        (else (funcall else))))

(if-function (> x y)
             #'(lambda () (format t "~&X is larger than Y.~%"))
             #'(lambda () (format t "~&X is less than or equal to Y.~%")))
This is essentially what Smalltalk does; it spells #'(lambda () ...) as
[...].
In fact, we could do this in Lisp if we wanted -- it would be simple to
make [ a reader-macro that expands into the appropriate lambda expression.
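The same delayed-evaluation trick works in any language with first-class
functions; here is a minimal sketch in Python (all names invented here),
where the two branches are passed as thunks:

```python
def if_function(test, then, else_=None):
    """An 'if' written as a plain function: the branches are thunks
    (zero-argument functions), so only the chosen one ever runs."""
    if test:
        return then()
    elif else_ is not None:
        return else_()

x, y = 5, 3
message = if_function(x > y,
                      lambda: "X is larger than Y.",
                      lambda: "X is less than or equal to Y.")
```

The thunks are what the hypothetical [ reader-macro would generate for you
automatically; without it, the caller must wrap the branches by hand.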
>So, for instance, you have:
> (with-open-file (s p)
>   (do ((l (read-line s) (read-line s nil 'eof)))
>       ((eq l 'eof) "Reached end of file.")
>     (format t "~&*** ~A~%" l)))
>
>This macro does three things:
>- Opens the file
>- Repeats the body on each line in the file
>- Closes the file
>--> Guarantees that the file is closed, even if an error is signalled.
It should probably be pointed out that Scheme uses functional analogues to
with-open-file: call-with-input-file and call-with-output-file. So the
above would be written:
(call-with-input-file p
  (lambda (s)
    (do ...)))
call-with-*-file provides the same guarantees as with-open-file.
As I've said in other messages, in many cases this is just a stylistic
choice that Lisp programmers have collectively made.
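Other languages made the same stylistic choice in the functional direction;
a sketch of the call-with-input-file idea in Python (the helper name is
invented here):

```python
import os
import tempfile

def call_with_input_file(path, fn):
    """Functional analogue of call-with-input-file: open the file,
    apply fn to the stream, and close it even if fn signals an error."""
    stream = open(path)
    try:
        return fn(stream)
    finally:
        stream.close()

# Usage: write a scratch file, then read it back through the wrapper.
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as tmp:
    tmp.write("Reached end of file.\n")
    path = tmp.name

contents = call_with_input_file(path, lambda s: s.read())
os.remove(path)
```

The guarantee lives in one place (the try/finally), and every caller gets
it for free, just as with call-with-input-file.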
"microsyntax" is usually referred to as something like "lexical structure"
(or something with the word "lexical" in it).
No; I have a colleague, every bit the Lisp bigot I am, who
complains about another colleague's heavy use of macrology for what he
sees as no useful purpose.
Personally, I have no problem dealing with macros and their expansions,
at least in anything resembling a modern Lisp programming environment
(i.e. one with some degree of editor integration, even if the editor is
not actually part of the Lisp environment).
In fact, I find macros one of the key reasons I prefer to program in
Common Lisp.
-- Chuck
--
Chuck Fry -- Jack of all trades, master of none
chu...@chucko.com (text only please) chuc...@home.com (MIME enabled)
Lisp bigot, car nut, photographer, sometime guitarist and mountain biker
The addresses above are real. All spammers will be reported to their ISPs.
fexprs were functions whose arguments were not evaluated. The
function itself would call EVAL to get the argument evaluated. I
think they were in things like Maclisp, though I came across them in
Cambridge Lisp and Standard Lisp, which had both spread (lots of
parameters) and nospread (one parameter which works like &rest)
versions of fexprs and normal functions (as well as macros!).
I think the worst problem with fexprs is that you have to know all
about how the compiler works to use them. For instance if Common Lisp
had an operator DEFF for defining fexprs, you might think that this:
(deff foo (x)
  (eval x))
is equivalent to
(defun foo (x)
  x)
But it isn't at all, because you've lost the environment:
(let ((y 1))
  (foo y))
is (almost certainly) an error for the fexpr, but not for the normal
function.
I don't know about maclisp, but the lisps I used which had fexprs were
dynamically scoped so this kind of thing would work. Actually I think
it probably wouldn't work in the compiler unless you did something to
tell it that y was fluid (which was the same as CL `special', normal
bindings in the compiler were dynamic extent but lexical scope, while
in the interpreter they were like CL `special').
In order to have something like this work for CL, I think you'd need
to have access to environment objects and generally expose a whole
bunch more internals than you'd like to.
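The lost-environment problem is easy to reproduce in any language with an
eval; a rough Python sketch (treating the unevaluated argument as source
text, names invented here):

```python
def fexpr_foo(expr):
    """Rough analogue of the DEFF example: the argument arrives
    unevaluated (here, as source text) and we call eval ourselves.
    eval runs in *this* function's scope, so the caller's lexical
    bindings are gone -- exactly the lost-environment problem."""
    return eval(expr)

def normal_foo(x):
    """The DEFUN version: x was evaluated in the caller, as usual."""
    return x

def caller():
    y = 1
    ok = normal_foo(y)        # fine: y evaluated in the caller's scope
    try:
        bad = fexpr_foo("y")  # NameError: no 'y' visible in fexpr_foo
    except NameError:
        bad = "lost the environment"
    return ok, bad
```

With dynamic scoping the eval would see the caller's y, which is why the
old dynamically scoped lisps could get away with fexprs.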
--tim
the purpose of macros is to simplify your code. once the macro is
written and debugged, you shouldn't have to think about it. you just
use it like a built in language construct. pretty much all control
constructs of hlls can be expressed as macros with simple ifs and gotos
inside. do you really think about what is going on inside a while loop?
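The ifs-and-gotos claim can be made concrete: lower a while loop by hand
to a test plus jumps, the way a macro expansion would. A small Python
sketch (the instruction format is invented here):

```python
def run(instrs, env):
    """Execute a list of (op, ...) tuples with an explicit program
    counter -- the ifs and gotos a WHILE macro would expand into."""
    pc = 0
    while pc < len(instrs):
        op = instrs[pc]
        if op[0] == "iffalse-goto":
            pc = op[2] if not op[1](env) else pc + 1
        elif op[0] == "goto":
            pc = op[1]
        else:  # ("do", thunk): run one body statement
            op[1](env)
            pc += 1

# while i < 5: total += i; i += 1   -- lowered by hand:
env = {"i": 0, "total": 0}
run([
    ("iffalse-goto", lambda e: e["i"] < 5, 4),              # 0: test
    ("do", lambda e: e.update(total=e["total"] + e["i"])),  # 1: body
    ("do", lambda e: e.update(i=e["i"] + 1)),               # 2: body
    ("goto", 0),                                            # 3: loop
], env)                                                     # 4: exit
```

Nobody writing the while loop thinks about this expansion, which is
exactly the point.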
--
Hartmann Schaffer
> In article <sgghr2e...@corp.supernews.com>,
> David McClain <dmcc...@azstarnet.com> wrote:
> >I keep hearing about the historical "fexprs". Can someone please explain in
> >a bit more detail what these things are? Or, better yet, cite some reference
> >with a good historical explanation. I have seen them mentioned in some of
> >McCarthy's papers, but I have yet to see a good description of them and why
> >they went away....
>
> Fexprs were in Maclisp, and they were simply functions whose arguments were
> not automatically evaluated. The function itself had to call EVAL when
> necessary.
>
> For a reference, I guess you'll have to find one of the old Maclisp
> reference manuals.
Or try the following paper by Kent M. Pitman, which will also explain
some of the reasons that fexprs were found to be not a good idea(tm):
Special Forms in Lisp
http://world.std.com/~pitman/Papers/Special-Forms.html
(AFAICT it also appeared in paper form in the Conference Record of the
1980 Lisp Conference held at Stanford University, August 25-27, 1980.)
Regs, Pierre.
--
Pierre Mai <pm...@acm.org> PGP and GPG keys at your nearest Keyserver
"One smaller motivation which, in part, stems from altruism is Microsoft-
bashing." [Microsoft memo, see http://www.opensource.org/halloween1.html]
[snip]
>I keep hearing about the historical "fexprs". Can someone please explain in
>a bit more detail what these things are? Or, better yet, cite some reference
>with a good historical explanation.
[snip]
i ran across them in franzlisp. as i recall, there were lambdas,
fexprs, lexprs, and nlambdas. one of the latter two forms would
bundle all the arguments in a single list parameter, kinda like
writing (defun foo (&rest lis) ...). not that you'll take my
word for it, but it's a good thing all of this complexity went
away. i think kent pitman might have a paper on the downfall of
these older constructs somewhere on his web page.
sashank
While I believe that macros are a good mechanism, and consider myself an
accomplished macrologist, I will also concede Courageous's point. Complex
macros can sometimes be pretty difficult to troubleshoot. Often the
complexity isn't in the macro itself, but in the backquoting (especially
when writing macro-defining macros, which usually require multiple levels
of backquotes). Then again, complex functions can also be difficult to
troubleshoot, and there are usually more of them; if, say, 1% of functions
and 2% of macros have a bug, there's a good chance that a large program
will have several hundred functions, hence several buggy ones, but maybe
only 10-20 macros, in which case there's a good chance that none of them
have bugs. There's rarely a need to go overboard on macros (the Common
Lisp language itself has 5-6 times as many functions as macros, but I think
most applications would have a much higher ratio).
However, I still believe in them because the benefit they provide once they
work can be quite good. Also, there are some techniques that experienced
macro writers learn to make it easier to debug them in the first place.
For instance, many macros can simply transform the original code into a
functional style, e.g. translate WITH-OPEN-FILE into CALL-WITH-OPEN-FILE.
The first is the WITH-OPEN-FILE macro from Common Lisp itself. The
fundamental issue is to provide a nice, simple and safe environment for
doing file operations. When a program opens a file, it should close the
file -- even (especially?) if an error occurs while the program is
reading from the file. The way to do this nicely in CL is:
(with-open-file (input-stream "name-of-my-file")
  (read-some-stuff input-stream)
  (read-other-stuff input-stream))
Now, this can be done using the UNWIND-PROTECT form in CL. (If you
aren't familiar with this form, it is similar to Java's
"try {...} finally {...}" construct.)
In fact, that is how the Common Lisp macro is implemented. Looking at
the macroexpansion from Allegro CL version 4.3 I notice that the
implementation is pretty close to what one would naively expect:
(LET ((INPUT-STREAM (OPEN "name-of-my-file")) (#:G15580 T))
(UNWIND-PROTECT
(MULTIPLE-VALUE-PROG1 (PROGN (READ-SOME-STUFF INPUT-STREAM)
(READ-OTHER-STUFF INPUT-STREAM))
(SETQ #:G15580 NIL))
(WHEN (STREAMP INPUT-STREAM) (CLOSE INPUT-STREAM :ABORT #:G15580))))
I also would point out that there are a few minor touches (such as the
:ABORT argument) that make the packaged code a bit safer. In fact,
since the WITH-OPEN-FILE macro only needs to be written once, it is more
likely that such extra effort for safety would be taken than if the
programmers had to include it every time they used an UNWIND-PROTECT
form. That is one of the key arguments in favor of using macros to extend
the language and package commonly useful operations in more convenient
forms. It becomes easier to have uniform code.
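That :ABORT detail can be mimicked even in a purely functional wrapper; a
Python sketch (helper name invented here) where a local flag plays the
role of the #:G15580 gensym in the expansion above:

```python
import os
import tempfile

def with_output_file(path, body):
    """Sketch of the expansion above: 'abort' stays true unless the
    body completes normally; on abort the half-written file is
    removed, mimicking the effect of CLOSE's :ABORT argument."""
    stream = open(path, "w")
    abort = True
    try:
        result = body(stream)
        abort = False
        return result
    finally:
        stream.close()
        if abort and os.path.exists(path):
            os.remove(path)

# A normal run leaves the file behind; an aborted run cleans it up.
path = os.path.join(tempfile.mkdtemp(), "out.txt")
with_output_file(path, lambda s: s.write("done"))
survived = os.path.exists(path)
try:
    with_output_file(path, lambda s: 1 / 0)
except ZeroDivisionError:
    pass
cleaned = not os.path.exists(path)
```

Here too the safety lives in one place, so every caller gets it uniformly.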
I will also note that this particular item would be rather difficult to
package as an object extension, since the body of the macro is a series
of language forms that need to be executed in an environment established
by the macro itself.
One could do something like passing a list of functions with appropriate
parameters, to be executed once the environment is set up, but this is
often not done. I guess the closest that other languages would come to
this is to create instances of (possibly anonymous) classes which
implement some predefined method call, and then package this up in a call
to another function which then calls this magic function name. That is
certainly doable, but it is a lot more work (IMO), which doesn't
generally encourage programmers to do it.
Now the example macro shown is essentially a fairly simple substitution
macro, so it doesn't really exploit the ability of macros to do
extensive code reformulation. But it is hard to find any simple
examples of the latter. I suppose that the LOOP macro might qualify:
(loop for i in L collect i)
partially expands into the following monstrosity in ACL 4.3:
(LET ((I NIL) (#:G15584 L))
  (DECLARE (TYPE LIST #:G15584))
  (LET* ((#:G15585 (LIST NIL)) (#:G15586 #:G15585))
    (BLOCK NIL
      (TAGBODY
        (WHEN (ENDP #:G15584) (GO END-LOOP))
        (SETQ I (CAR #:G15584))
        (SETQ #:G15584 (CDR #:G15584))
       NEXT-LOOP
        (RPLACD #:G15586 (SETQ #:G15586 (LIST I)))
        (WHEN (ENDP #:G15584) (GO END-LOOP))
        (SETQ I (CAR #:G15584))
        (SETQ #:G15584 (CDR #:G15584))
        (GO NEXT-LOOP)
       END-LOOP
        (RETURN-FROM NIL (CDR #:G15585))))))
From other systems you can get additional complexity from macro
expansion; for example, from the Loom knowledge representation language
one can write a query against the knowledge base using the RETRIEVE
macro:
(retrieve (?person ?age)
(:and (person ?person)
(> (height ?person) 1.5m)
(age ?person ?age)))
which produces the following wonderful :) expansion, which is a program
for computing the answer to the query.
(PROGN (LOOM::ENSURE-VISIBILITY-OF-UPDATES)
(LET ((LOOM::*DEFAULTMODEP* T))
(DECLARE (SPECIAL LOOM::*DEFAULTMODEP*))
(LET ((LOOM::*INSIDEQUERYP* T))
(LOOM::FAST-REMOVE-EQUAL-DUPLICATES
(LET ((LOOM::RESULTTUPLES NIL))
(LOOP FOR ?PERSON IN
(LOOM::GENERATE-INSTANCES-IN-QUERY USER-THEORY^PERSON)
WHEN
(LET (LOOM::QUERYRESULT)
(LOOP FOR #:V-15588 IN
(FUNCALL (GET
'USER-THEORY^HEIGHT
:LOOM-NO-DUPLICATES-COMPILED-FUNCTION)
(LOOM::GET-TEMPORALLY-OFFSET-INSTANCE
?PERSON))
DO
(WHEN (MEASURES:DIM> #:V-15588 1.5m)
(SETQ LOOM::QUERYRESULT T)))
LOOM::QUERYRESULT)
DO
(LOOP FOR ?AGE IN
(FUNCALL (GET 'USER-THEORY^AGE
:LOOM-NO-DUPLICATES-COMPILED-FUNCTION)
(LOOM::GET-TEMPORALLY-OFFSET-INSTANCE
?PERSON))
DO (PUSH (LIST ?PERSON ?AGE) LOOM::RESULTTUPLES)))
LOOM::RESULTTUPLES)))))
One beauty of doing this as a macro is that the resulting query program
can be compiled (which will happen fairly automatically if the user
includes the query in code which is compiled), or (with a bit more work)
compiled at run-time and then stored for better performance.
I suppose it is now up to you to explain how to accomplish these feats
using other means.
--
Thomas A. Russ, USC/Information Sciences Institute t...@isi.edu
> > My understanding of Lispy macros is that they provide for the
> > modification or extension of *syntax*, in essence allowing one to add
> > new special forms.
[snip]
> Well, the idea is that when the syntax is so small, you may not ever need
> to change it, as the language consists of very little syntax and a lot of
> object operations.
Which idea? I certainly like the Smalltalk approach, and, at least as a
partial consequence, I do most of my programming in Smalltalk. But, um,
Smalltalk is different from Common Lisp, and it seems a tad odd to try
to force the latter into the mold of the former, especially as Smalltalk
isn't as linguistically oriented as Common Lisp. I'd say, for example,
that one is much more likely to extend the IDE than the language, in
Smalltalk, and the reverse in Common Lisp (or in Scheme or in Prolog).
> In my question, I was aware that LISP macros allow
> lower-level access to the syntax (changing the way things are parsed),
Well, evaluated, certainly. I believe others have pointed to the reader.
> however, I am suggesting that it may not be necessary, and that an
> extendible object model may be all you'll ever need to change about the
> language
Change for what end? That's the key question. It seems that serious and
experienced Common Lisp programmers (which I am not) find it useful and
sensible to build up their programs in part via macros.
Sounds reasonable to me. I certainly don't have any experience to
contradict it.
> (which is what I want you to contradict by bringing practical
> examples of things possible to achieve by affecting the ways things are
> parsed (macros), and not by extending the object model).
Sorry, can't help you. I'm not even sure that this is a reasonable
request. Perhaps you can supply a more specific one?
> > Now, it's quite easy to change the object model (e.g., to prototypes)
> > and, thus, to get a rather different language (tailored to the
> > application domain, one might say), but the basic syntactic and statement
> > level evaluation mechanisms don't change.
>
> Again, it may not be useful to change that basic syntactic,
> statement-level evaluation.
Well, maybe. If you want to keep the syntax simple, sure. If you want to
have the syntax bear some of the representation burden, probably not.
But this is mere idle speculation, eh?
> > Oh, one should probably point at Prolog as well, with its op function
> > which allows one to define the associativity and precedence of arbitrary
> > predicates. Very handy for quickly building domain-specific sugar.
> >
> If I understand this correctly, it seems to be part of that same mechanism,
> if you consider ops (operators) to be object methods and part of the
> extendible object model.
Huh?
Suppose one has a predicate 'has', normally one would write that as:
has(sally, information).
By first adding the rule:
:- op(600, xfx, has).
One can write the first clause as:
sally has information.
But no shift in the basic semantics has occurred. I'm under the impression
that many Prologs would transform both forms into the same internal
form, which is much closer to the former, which is essentially a sexpr.
Hmm. Perhaps this is more like a reader macro. I'm not sure (anyone?).
Prologs tend to have program representing predicates so that you can do
programs writing programs type things. I'm just not sure how this all
maps onto Common Lisp.
However, I would bet that it maps better onto Common Lisp than onto
Smalltalk.
> This only reinforces the idea that the high-level
> syntax is the low-level one, combined with the basic object model.
I frankly can make no sense of this. I tend to associate object models
with semantics.
Object new initialize.
and
(send (send Object new) initialize)
are hardly syntactically identical, but they certainly could express the
same thing in the same object model.
> My theory is, that the high-level syntax is extendible, when the basic
> object model is extendible enough,
Using your undefined terms for the moment: if the high-level syntax is
extendible, why not have it be extendible by either component?
> and that the low-level syntax need not
> change,
Why not the other way 'round?
> which is why I dislike LISP's way of providing means to change
> the low-level syntax.
I think you need, as several people mentioned, to go to the library.
> The first is the WITH-OPEN-FILE macro from Common Lisp itself. The
> fundamental issue is to provide a nice, simple and safe environment for
> doing file operations. When a program opens a file, it should close the
> file -- even (especially?) if an error occurs while the program is
> reading from the file. The way to do this nicely in CL is:
>
> (with-open-file (input-stream "name-of-my-file")
> (read-some-stuff input-stream)
> (read-other-stuff input-stream))
>
> Now, this can be done using the UNWIND-PROTECT form in CL. (If you
> aren't familiar with this form, it is similar to Java's
> "try {...} finally {...}" construct.
This is quite possible as an object operation. I'd declare a method of the
class InputStream (not of an instance; a method of the class itself) that
takes a file name and a closure (a piece of code along with its captured
environment). A bit of pseudo Smalltalk-like object-language code:

InputStream withFileName: "name-of-my-file" do:
    input-stream | [ "input-stream is an argument to this anonymous
                      closure function"
        input-stream read: some-stuff.
        input-stream read: some-more-stuff.
    ].

This would call InputStream's "withFileName:do:" method, which takes a file
name to open as an argument and a piece of closure code to execute under a
try{} block once the file is open. Note also that since the anonymous code
is a closure (I'm not sure it is under SmallTalk, but in a language that
wishes to avoid macros this would be a necessity), it has access to all the
variables external to this small piece of code and therefore does not
require argument-passing of the environment it needs.
> I will also note that this particular item would be rather difficult to
> package as an object extension, since the body of the macro is a series
> of language forms that need to be executed in an environment established
> by the macro itself.
If anonymous closure code is declared as easily as you can wrap a few
expressions within [], this should not be a problem.
I don't think this is more work; I think it is equivalent, and probably much
easier to debug and a lot less messy to declare :)
> ...
> examples of the latter. I suppose that the LOOP macro might qualify:
>
> (loop for i in L collect i)
I've been informed that Smalltalk has an implementation of 'collect' with
loops, but I will implement another one in a Smalltalk-like language
anyway:
L collectElements: element | [^ True].
"This would return a list collecting all elements for which the block
returns True. Here that is obviously all elements, but the block can also
be a boolean expression concerning the 'element'."
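For comparison, the same collectElements: idea as a plain higher-order
function in Python (names invented here); no macro is needed because the
block is just a closure:

```python
def collect_elements(collection, predicate):
    """Return a list of the elements for which the predicate block
    answers true -- the collectElements: idea as a plain function."""
    return [element for element in collection if predicate(element)]

L = [1, 2, 3, 4, 5]
everything = collect_elements(L, lambda element: True)
big_ones = collect_elements(L, lambda element: element > 3)
```

The predicate captures whatever surrounding variables it mentions, so no
environment has to be passed explicitly.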
> From other systems you can get additional complexity from macro
> expansion, for example from the Loom knowledge representation language
> one can get write a query against the knowledge base using the RETRIEVE
> macro:
>
> (retrieve (?person ?age)
> (:and (person ?person)
> (> (height ?person) 1.5m)
> (age ?person ?age)))
A slow (yet simple) item-by-item scan (that does not use a query-language
syntax) could use:

database collectElements:
    person | [
        ^(person isPerson & person height > 1.5m & person age = age).
    ].
To really duplicate the power of macros in this case, and create a query
'language' as well, some more language features are required, and I am
not aware of Smalltalk providing those (no, I am not referring to features
that replicate macros, or features that are specific to this case :)
This language would need some sort of static-typing scheme, and
operator/method overloading, so that a query-language syntax can be
provided by a piece of code, in a way similar to how it is provided by the
macro.
Inside the Query namespace (that must be selected for the overloaded
operators to be defined, and therefore the Query language syntax to be
available), a set of operators that take a message/method name as an
l-value, and an arbitrary value as an r-value, and return a Query can
be implemented. For example:
Returns Query, operator= (MethodOfQueryable, Any);
Note that the specified 'Query', 'MethodOfQueryable', 'Any' are type
specifications.
This allows an equation query to be declared as easily as:
(Age = 5)
(isPerson = True)
'Age' is a method of the 'Person' class, that returns its age.
'isPerson' is a method that returns whether or not the class is of the
'Person' type.
Another set of operators can be defined on the (Query) such as '&'
'|' etc, that return Queries which are combinations of existing
queries.
Then, a Database class would be one consisting of many instances
of some 'Queryable' type. Each Database would have attached
tables for all sorts of optimizations (such as a hash-table to optimize
equation checks), for each element in the queryable type.
For example, the database containing a lot of 'Person's would have a
hash-table attached, that allows quickly finding all the entries of
some Age.
When a 'retrieve' method call is issued on a database, it takes a query
(such as: (isPerson = True) & (height > 1.5m) & (age = some_age))
and can use the prepared hash-tables/optimization-tables to perform
the query as efficiently as possible. (Not requiring running code on a
per-element basis). For example, in this case, it would be able to use
the Age hash-table to easily find all entries matching 'some_age', and
run the 'isPerson' method and 'height' method on each of those to
complete the query.
This demonstrates that, using static typing along with operator
overloading, it is possible to extend the syntax in ways that provide
(IMO) all the useful power of macros, in a way that is much less
messy than macros.
Note that to replace macros, operator/method overloading, static
typing, and even namespace-shadowing are required mechanisms,
and therefore Smalltalk does not completely remove the need for the
convenience/power of macros; it was only used as an example
language whose features allowed enough extensibility of the
object model to demonstrate how it can extend the usable syntax.
Final example:
database query: (isPerson = True) & (height > 1.5m) & (age = some_age).
I find it a lot more readable, too, don't you?
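The operator-overloading scheme sketched above can be tried out even
without static typing; a minimal Python sketch (all class and attribute
names invented here), where comparisons build Query objects instead of
evaluating immediately:

```python
class Query:
    """A delayed predicate; '&' combines queries, as in the post."""
    def __init__(self, test):
        self.test = test
    def __and__(self, other):
        return Query(lambda obj: self.test(obj) and other.test(obj))

class Attr:
    """A queryable attribute: comparing it yields a Query, not a bool."""
    def __init__(self, name):
        self.name = name
    def __eq__(self, value):
        return Query(lambda obj: getattr(obj, self.name) == value)
    def __gt__(self, value):
        return Query(lambda obj: getattr(obj, self.name) > value)

class Database:
    def __init__(self, items):
        self.items = items
    def query(self, q):
        # A real engine would consult hash-table indexes here instead
        # of scanning item by item.
        return [item for item in self.items if q.test(item)]

class Person:
    def __init__(self, height, age):
        self.is_person, self.height, self.age = True, height, age

# Mirroring the final example in the post:
isPerson, height, age = Attr("is_person"), Attr("height"), Attr("age")
db = Database([Person(1.8, 30), Person(1.4, 30), Person(1.8, 40)])
matches = db.query((isPerson == True) & (height > 1.5) & (age == 30))
```

The query is a data structure before it is ever run, which is what lets a
database engine inspect and optimize it, much as a macro inspects its
forms.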
> In article <m3r9bsb...@world.std.com>,
> Tom Breton <t...@world.std.com> writes:
> > ...
> > All else being equal, it's better to make fewer transformations on
> > your code; it's less machinery to think about. IMO. Don't you agree?
>
> the purpose of macros is to simplify your code. once the macro is
> written and debugged, you shouldn't have to think about it.
Wow, this thread just keeps chewing at this statement for no apparent
reason.
> once the macro is written and debugged,
Sure, once it's *done*.
> you shouldn't have to think about it.
Sure, you *shouldn't* have to, that doesn't mean you never will.
probably because there is a good reason for it
>> once the macro is written and debugged,
>
> Sure, once it's *done*.
i have news for you: this is one of the essential tasks when you are
programming, esp. when you are writing code that other people have to
use.
>> you shouldn't have to think about it.
>
> Sure, you *shouldn't* have to, that doesn't mean you never will.
fact of life.
btw. if you dislike macros so much, nobody forces you to use them
--
Hartmann Schaffer
sorry, no. I find the object model extremely constraining, just
like infix syntax is simple and compact for severely constrained
contexts and circumstances. once you leave the constrained world,
you have to _fight_ both of these premature optimizations by going
"underneath" their expression and thinking very carefully about what
they do because it will be uncommon to leave the constrained world,
and this is itself partly because it is painful to leave it. (sort
of like Tom Breton and his "optimization" for fewer levels of code
transformation still thinks he has to do with macros. competent
programmers gain a deep trust in their ability to deal with such
abstractions and do not fear localized and contained complexity,
except that both the object model and infix syntax _export_ their
complexity so you have no choice but to rub against the constraints
when using either.)
#:Erik
> Fexprs were in Maclisp, and they were simply functions whose arguments were
> not automatically evaluated. The function itself had to call EVAL when
> necessary.
The same as nlambdas in Interlisp?
--
(espen)
Personally, I don't find that infix notation has much
merit, except for the fact that "that's what everyone is
used to". Really, I'd rather write (+ 3 4 5 6) than
(3 + 4 + 5 + 6) any day, and come now: is (+ 3 4) any more
syntactically unwieldy than (3 + 4)? Not that I can tell.
I suppose, perhaps, you're referring to all the very simple
expressions in other languages where the parentheses aren't
necessary? But then you and I both know that lisp programmers
stop seeing parens after a while...
I'm reminded of a study which showed that command-line
environments were harder to learn but had a higher payoff
in productivity in the end. I wouldn't really characterize
Lisp as being hard to learn at all, but I would still
characterize it somewhat like that: any initial investment of
time is certainly paid off by productivity in the end.
C/
> Personally, I have no problem dealing with macros and their expansions,
> at least in anything resembling a modern Lisp programming environment
If you have a 3Com Palm device, you may want to try the cute little Scheme
system LispMe :)
It has macros but, unfortunately, no macroexpansion facilities. I debug
LispMe macros by eliminating potential trouble spots and concentrating on
the rest of the code. I do this by double checking at the LispMe
toplevel--i.e. the equivalent of a listener--that forms that select macro
arguments, which are usually complex, do what I expect.
Paolo
--
EncyCMUCLopedia * Extensive collection of CMU Common Lisp documentation
http://cvs2.cons.org:8000/cmucl/doc/EncyCMUCLopedia/
> The point of any programming language construct is to allow you to
> write down *what you mean* as directly and clearly as possible to
The INTERCAL designers may disagree :)
The usual accusation is that "forcing" people into "unnatural" practices
like prefix/postfix notation rather than infix is "less intuitive."
It seems to me that this is rather like the mindless focus on having
applications have GUIs that are specifically designed so that users can
get started instantly with no apparent learning curve. The
point that gets pointed out is that this means that the computer is
accomodating the users rather than vice-versa.
The problem is that by starting off this way, the system may wind up
not being extensible, or terribly powerful. You may be able to write
a few expressions more simply by using infix notation, but if it makes
it impractical to do anything more sophisticated, that's a loss.
--
"The chat program is in public domain. This is not the GNU public
license. If it breaks then you get to keep both pieces."
(Copyright notice for the chat program)
cbbr...@ntlug.org - - <http://www.ntlug.org/~cbbrowne/lsf.html>
> On 27 Apr 2000 10:26:34 -0400, Joe Marshall <jmar...@alum.mit.edu> wrote:
>
> > The point of any programming language construct is to allow you to
> > write down *what you mean* as directly and clearly as possible to
>
> The INTERCAL designers may disagree :)
Oh, I don't know, suppose you want to effectively change all dynamic
binding of special variables into assignments. This is very directly
and clearly written in INTERCAL:
PLEASE ABSTAIN FROM STASHING + RETRIEVING
If only such power and expressibility were available in Common Lisp.
> Well, the idea is, that when the syntax is so small, you may not ever need
> to change it, as the language consists of very little syntax, and a lot of
> object operations.
Which idea? I certainly like the Smalltalk approach, and, at least as a
partial consequence, I do most of my programming in Smalltalk. But, um,
Smalltalk is different from Common Lisp, and it seems a tad odd to try
to force the latter into the mold of the former, especially as Smalltalk
isn't as linguistically oriented as Common Lisp. I'd say, for example,
that one is much more likely to extend the IDE than the language, in
Smalltalk, and the reverse in Common Lisp (or in Scheme or in Prolog).
> In my question, I was aware that LISP macros allow
> lower-level access to the syntax (changing the way things are parsed),
Well, evaluated, certainly. I believe others have pointed to the reader.
> however, I am suggesting that it may not be necessary, and that an
> extendible object model may be all you'll ever need to change about the
> language
Change for what end? That's the key question. It seems that serious and
experienced Common Lisp programmers (which I am not) find it useful and
sensible to build up their programs in part via macros.
Sounds reasonable to me. I certainly don't have any experience to
contradict it.
> (which is what I want you to contradict by bringing practical
> examples of things possible to achieve by affecting the ways things are
> parsed (macros), and not by extending the object model).
Sorry, can't help you. I'm not even sure that this is a reasonable
request. Perhaps you could supply a more specific one?
> > Now, it's quite easy to change the object model (e.g., to prototypes)
> > and, thus, to get a rather different language (tailored to the
> > application domain, one might say), but the basic syntactic and statement
> > level evaluation mechanism don't change.
>
> Again, those basic syntactic, statement-level evaluation mechanisms may
> not need to be changed.
Well, maybe. If you want to keep the syntax simple, sure. If you want to
have the syntax bear some of the representation burden, probably not.
> > Oh, one should probably point at Prolog as well, with its op function
> > which allows one to define the associativity and precedence of arbitrary
> > predicates. Very handy for quickly building domain specific sugar.
> >
> If I understand this correctly, it seems to be part of that same mechanism,
> if you consider op's (operators) to be object methods and part of the
> extendible object model.
Huh?
Suppose one has a predicate 'has'; normally one would write that as:
has(sally, information).
By first adding the rule:
:- op(600, xfx, has).
One can write the first clause as:
sally has information.
But no shift in the basic semantics has occurred. I'm under the impression
that many Prologs would transform both forms into the same internal
form, which is much closer to the former, which is essentially a sexpr.
Hmm. Perhaps this is more like a reader macro. I'm not sure (anyone?).
Prologs tend to have program representing predicates so that you can do
programs writing programs type things. I'm just not sure how this all
maps onto Common Lisp.
However, I would bet that it maps better onto Common Lisp than onto
Smalltalk.
> This only reinforces the idea that the high-level
> syntax is the low-level one, combined with the basic object model.
I frankly can make no sense of this. I tend to associate object models
with semantics.
Object new initialize.
and
(send (send Object new) initialize)
are hardly syntactically identical, but they certainly could express the
same thing in the same object model.
> My theory is, that the high-level syntax is extendible, when the basic
> object model is extendible enough,
Using your undefined terms for the moment, if the high-level syntax is
extendible, why not have it be extendible by either component?
> and that the low-level syntax need not
> change,
Why not the other way 'round?
> which is why I dislike LISP's way of providing means to change
> the low-level syntax.
I think you need, as several people mentioned, to go to the library.
> The idea I was trying to convey through this thread.
> And as much as many think of Smalltalk as a whole environment, I prefer
> taking just the syntactic and semantic part as an example of what types
> of *CLEANER* semantic/syntactic modifications can be used to replace
> the unclean macros.
What exactly do you mean by `unclean' or `*CLEANER*'? How are you
measuring `cleanliness'?
> I am criticising LISP for not providing powerful-enough alternative
> features such as a built-in powerful object model and syntax features
> concerning that, that can also change the ways things are evaluated, and
> provide a CLEANER means than macros, for the same purpose.
I guess if CLOS+MOP isn't powerful enough for you, you should consider
using another language. If the ability to modify and extend the
readtable, evaluator, and compiler isn't enough to change the way
things are evaluated, you're SOL as far as Lisp is concerned.
> What I am trying to request there is, the contradiction of the claim that in an
> ideal language, there is ALWAYS a cleaner way than macros, and therefore
> macros only provide extra mess.
We've been contradicting you.
> I don't see how I could have made that any more specific.
How about by defining cleanliness in a quantifiable way, defining what
you mean by `ideal language'?
> My claim is that the representation ability is just as good with an
> immutable syntax, when the objects, methods, operators, etc can be
> powerfully modified and extended.
So what's your beef?
--
~jrm
> As I mentioned earlier, LISP *DOES* require macros for the extra
> convinience and usefulness. I am not critizing LISP programmers for using
> macros. I am criticising LISP for not providing powerful-enough alternative
> features such as a built-in powerful object model and syntax features
> concerning that, that can also change the ways things are evaluated, and
> provide a CLEANER means than macros, for the same purpose.
Lisp has an object model more powerful than any other language I
know, certainly more powerful than smalltalk. It is also essentially
trivial to alter the evaluation rules in the way smalltalk does. You
could easily make things be so that:
(iff form
{...}
{...})
was just a normal function, and {...} is a tiny bit of magic to do
what [...] does in smalltalk.
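A sketch of how that might look (the {...} reader syntax and iff are of course illustrative, not part of any standard Lisp):

```lisp
;; Make { ... } read as (lambda () ...), like a Smalltalk block.
(set-macro-character
 #\{ (lambda (stream char)
       (declare (ignore char))
       `(lambda () ,@(read-delimited-list #\} stream t))))
;; Make } a terminating character, reusing the close-paren reader.
(set-macro-character #\} (get-macro-character #\)) nil)

;; IFF is then an ordinary function taking thunks:
(defun iff (test then else)
  (if test (funcall then) (funcall else)))

;; (iff (> 3 2) {(print "yes")} {(print "no")}) prints "yes":
;; the unchosen branch is never evaluated, because each brace-block
;; is just a closure that IFF may or may not call.
```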
You really need to learn something about Lisp I think.
--tim
<...tortuous accounts of how to "simplify" things ...>
Let's see. This is clearly more clumsy, more complex, inherently less
efficient, and less general (capable) than the current macro facility.
An obvious loser all the way around. Yet you think it's more "elegant".
I think that speaks volumes.
> database query: (isPerson = True) & (height > 1.5m) & (age = some_age).
>
> I find it a lot more readable, too, don't you?
No.
/Jon
--
Jon Anthony
Synquiry Technologies, Ltd. Belmont, MA 02478, 617.484.3383
"Nightmares - Ha! The way my life's been going lately,
Who'd notice?" -- Londo Mollari
Graham, in one or both of his books (_ANSI Common Lisp_ or _On Lisp_)
makes the point that macros must bear the burden of "breaking" the
usual evaluation rules. Macros therefore must save comparatively more
effort (in both writing and reading code) than a comparable function.
E.g., you might write a function and then use it only once to make
things clearer: it doesn't save any effort of writing code (except of
separate debugging), nor much effort reading code (still the same
number of lines.)
A macro like "with-open-file" doesn't obey function evaluation rules,
but saves so much effort getting things *safe* and *correct* that it
more than makes up for the added "wart" to some abstract "lambda
nature" of the language. Plus, it can be used all over the place.
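For reference, the safety with-open-file buys is roughly the following expansion (simplified; real implementations also handle the :if-exists options and abort-on-error semantics more carefully):

```lisp
;; (with-open-file (s "data.txt") (read-line s))
;; behaves roughly like:
(let ((s (open "data.txt")))
  (unwind-protect
       (read-line s)
    ;; The stream is closed on normal exit AND on any non-local exit,
    ;; which is exactly the boilerplate the macro saves every caller.
    (when s (close s))))
```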
Furthermore, to tie this to a (different?) thread, the proper use of
&body in the argument list makes the auto-formatting of the macro call
clear in code. Simply by looking at the indenting, one realizes that
with-open-file isn't a function, and its behavior as a wrapper becomes
visually clear, and really no more a burden to understanding than
"let" is.
Obviously, not every macro can be as great a boon.
Note that Graham's criterion really says more about using functional
abstraction vs. code manipulation than about when to use a macro in
isolation. In a language like Lisp with such powerful functional
abilities, macros *might* be relatively *less* desirable than in a
language like C or Fortran. That doesn't say much though, since
neither *offers* Lisp macros. Also, Graham appears to be influenced by
Scheme somewhat in his preference for constructing utility functions
for different types of iteration to using something like "loop".
Personal example:
I've been guilty, probably, of "mis-using" macros
<http://www.people.fas.harvard.edu/~jaoswald/lisp/magic/ident.lisp>
where I write a little "graphics language", allowing me to define
CLOS classes with constructor functions such as
#'make-sideways-number-9, by saying
(def-sideways-numeral 9 ((0 5 0 1) (2 5 2 3) (2 3 1 2) (4 5 1 2)))
specifying what pen strokes are used to write the glyph.
Basically, I did it to combine a defclass and a defun in the same
expression.
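A macro in that spirit (hypothetical names and slots; this is my sketch, not the actual code at the URL above) might look like:

```lisp
(defmacro def-sideways-numeral (digit strokes)
  "Define a CLOS class and a constructor function for one glyph.
Illustrative sketch only -- INTERN in the current package is what
creates the package-sensitivity mentioned in the text."
  (let ((class-name (intern (format nil "SIDEWAYS-NUMBER-~D" digit)))
        (ctor-name  (intern (format nil "MAKE-SIDEWAYS-NUMBER-~D" digit))))
    `(progn
       (defclass ,class-name ()
         ((strokes :initform ',strokes :reader strokes)))
       (defun ,ctor-name ()
         (make-instance ',class-name)))))
```

The point of the combination is that one form keeps the class definition and its constructor from drifting apart.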
I'd be hard pressed to justify this in the abstract. The macro is
pretty ugly, and I can't necessarily say this avoided errors, since I
probably botched the code to create the function symbols from the
digit, so that it must be used in the correct package. But I only use
it 10 times, and actually, since I had already written the function
once, it was pretty easy to write the macro by adding backquote
syntax in the right place, and abstracting things slightly to allow
for different digits to be described.
If I had to do something like this for a publically-exposed interface,
I would have spent much more time fixing the design, probably to use
a more functional interface, with a much simpler helper macro. For me,
in this instance, I let the Lisp compiler do a lot of typing for me,
and that got the job done somewhat quicker.
--Joe Oswald
Sent via Deja.com http://www.deja.com/
Before you buy.
> In article <m3wvljj...@world.std.com>,
> Tom Breton <t...@world.std.com> writes:
> > ...
[misleading snippage undone. I wrote:]
> > All else being equal, it's better to make fewer transformations on
> > your code; it's less machinery to think about. IMO. Don't you agree?
>
> >> the purpose of macros is to simplify your code. once the macro is
> >> written and debugged, you shouldn't have to think about it.
> >
> > Wow, this thread just keeps chewing at this statement for no apparent
> > reason.
>
> probably because there is a good reason for it
Now I know beyond a reasonable doubt that this thread isn't on the
level. I may not be such a gifted writer that I get my meaning across
100% of the time, but this is ridiculous. There is nothing in my
innocuous statement that should remotely evoke this sudden
insult-hurling conga-line. 3 people so far have attacked me for the
paragraph above - or should I say, 3 user-ids have.
I'm inclined to agree with the guy who suggested that we're seeing a
bunch of Naggum tentacles. Same style, same quick resort to "then
you're a bad programmer", same pattern of zealotry, same type of
conceptual blind spots.
While obviously no-one can be 100% sure, it's done a hell of a lot of
walking like a duck and quacking like a duck; I say it's a duck named
Erik Naggum. I'm disgusted that you would stoop so low as to harrass
the newsgroup under phony names. I believe you used the phrase
"obsessed stalker" in one of your recent attacks; you may want to
apply it closer to home.
I will continue to use this newsgroup and ignore you, regardless your
user-id. I've already killfiled your recent tentacles, I suppose I
shall have to keep doing so for each new tentacle. I assure you, it's
far less work to add you to my killfile each time than it is to create
each "Hartmann Schaffer".
> >> once the macro is written and debugged,
> >
> > Sure, once it's *done*.
>
> i have news for you: this is one of the essential tasks when you are
> programming, esp. when you are writing code that other people have to
> use.
Wow, you really are just trying to goad me, aren't you?
> >> you shouldn't have to think about it.
> >
> > Sure, you *shouldn't* have to, that doesn't mean you never will.
>
> fact of life.
>
> btw. if you dislike macros so much, nobody forces you to use them
Yup, just trying to goad me.
Ob-liten-up-with-some-humor: The insult-hurling conga line goes like
this:
"You're bad programm-ER!
You're bad programm-ER!
..."
> What exactly do you mean by `unclean' or `*CLEANER*'? How are you
> measuring `cleanliness'?
It's a complex subjective term, and I doubt I can define it well, but I can
say it directly relates to readability.
> > I am criticising LISP for not providing powerful-enough alternative
> > features such as a built-in powerful object model and syntax features
> > concerning that, that can also change the ways things are evaluated, and
> > provide a CLEANER means than macros, for the same purpose.
>
> I guess if CLOS+MOP isn't powerful enough for you, you should consider
> using another language. If the ability to modify and extend the
> readtable, evaluator, and compiler isn't enough to change the way
> things are evaluated, you're SOL as far as Lisp is concerned.
I did not say CLOS+MOP was not powerful. I think it is safe to say,
however, that the LISP syntax values simplicity over many things, that
the syntax/language itself is not related in any way to the object model,
and that it is simply not extendible by means other than macros (such as,
for example, operator overloading). Which is why I also consider examples
of how macros are useful in extending the LISP syntax simply invalid to
the argument of whether or not the concept of macros is useful.
As powerful as CLOS+MOP may be, I would not consider it an
alternative to macros in replacing the language extendibility abilities.
> > What I am trying to request there is, the contradiction of the claim that in an
> > ideal language, there is ALWAYS a cleaner way than macros, and therefore
> > macros only provide extra mess.
>
> We've been contradicting you.
I've not yet noticed a contradiction of that claim. The best way to contradict
that would probably be to bring some macro definitions that are simply not
doable using non-macro means. Thomas A. Russ brought a few nice
examples of how macros effectively extend the language. I brought
equivalent non-macro solutions, that show that all the specific macro
functionality from his examples can be implemented using the non-macro
means. I challenge you to contradict me by bringing examples of
extensions that I simply cannot replicate using non-macro means.
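The counter-example usually offered in such discussions is a macro that assigns to its argument *as a place*, which no call-by-value function can replicate; for instance, a simplified version of the standard ROTATEF (a sketch that ignores hygiene and multiple evaluation of subforms):

```lisp
(defmacro swap (a b)
  "Exchange the values of two places. A function cannot do this:
it would receive only the VALUES of A and B, never the places."
  `(let ((tmp ,a))
     (setf ,a ,b)
     (setf ,b tmp)))

;; (let ((x 1) (y 2)) (swap x y) (list x y)) => (2 1)
;; A functional (swap x y) would see 1 and 2, and X and Y would
;; remain unchanged in the caller.
```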
> > I don't see how I could have made that any more specific.
>
> How about by defining cleanliness in a quantifiable way, defining what
> you mean by `ideal language'?
I was actually referring to my original 'challenge' of bringing those
examples, which did not mention an ideal language or cleanliness. Cleanliness
is subjective, but some characteristics are quite absolute, and can roughly
measure the cleanliness of a piece of code. I would say the simplest way to
measure cleanliness is to measure readability (probably just as problematic,
but probably better defined than cleanliness).
By the concept of an 'ideal language', I obviously refer to my opinion of
an ideal language, which is a language that has all the extendibility, but not
the macros.
> * Peaker <GNUP...@yahoo.com>
> | I find it a lot more readable, too, don't you?
>
> sorry, no. I find the object model extremely constraining, just
> like infix syntax is simple and compact for severely constrained
> contexts and circumstances. once you leave the constrained world,
> you have to _fight_ both of these premature optimizations by going
> "underneath" their expression and thinking very carefully about what
> they do...
Can you give examples of contexts and circumstances where you'd
have to go underneath the expressions? I have noted the need to
actually understand the underlying implementation of such
abstractions existed, but only when the abstractions were implemented
incorrectly, or in a severely limited manner. Most implementations
have some imposed limitations, but when those are well documented,
they still do not require you to go underneath the expressions and
abstractions.
> .... competent
> programmers gain a deep trust in their ability to deal with such
> abstractions and do not fear localized and contained complexity,
> except that both the object model and infix syntax _export_ their
> complexity so you have no choice but to rub against the constraints
> when using either.)
Could you make yourself clearer on this point?
You do have to deal with some complexity with any
type of interface, but interfaces built of infix
operation definitions, and object models, tend to
make things simpler, and end up exporting less
complexity.
> [..] Same style, same quick resort to "then you're a bad
> programmer", same pattern of zealotry, same type of conceptual blind
> spots.
Blind spots for what, precisely? Who's to say who's got "blind spots"?
It could be you, even.
> While obviously no-one can be 100% sure [..]
You're being paranoid and quite rude.
--
Frode Vatvedt Fjeld
I'm always relieved when my critics turn out to be clinically insane.
#:Erik
simpler? simpler than what? it has been pretty obvious for a while
that you don't really understand alternatives to your own one view.
that means you are _unable_ to make comparisons with any merits.
the people you argue against typically know dozens of languages.
Common Lisp has more than 40 years of history behind it. it's quite
amazing that you think you can beat that with arguments from
ignorance of what has been done and without making comparisons of
languages each on their _own_ merits.
I'm sorry to say so, but you're an annoying waste of time when you
don't want to respect the past and the knowledge and the decisions
of this community, before you want to impose your own desires from a
different community at best.
#:Erik