What the paper was about
This paper begins by discussing how Scheme at this point in time was believed to have hygienic macros (or at least R5RS implied that it should), and how previous work had contributed to the advances leading up to that point. It then points out that, despite this belief, problems persist that allow referentially opaque macros to be written (i.e., in the very system designed to be hygienic, unhygienic macros can in fact still be written).
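To make the hygiene property concrete for myself, here is the classic textbook example (my own minimal sketch, not code from the paper): a my-or macro whose introduced temporary must not capture a user variable of the same name.

    (define-syntax my-or
      (syntax-rules ()
        ((_ a b)
         (let ((t a))        ; t is introduced by the macro
           (if t t b)))))

    (let ((t #t))
      (my-or #f t))   ; => #t under hygiene: the macro's t is 'painted'
                      ; a different color, so the user's t is not captured

Breaking hygiene here would mean arranging for the user's t to refer to the macro's t, so that the expression returns #f instead.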
In fact, the macro system is not technically written in Scheme itself but in a restricted sublanguage within Scheme (syntax-rules), designed to be more understandable and to avoid some of the problems of non-hygienic macros that had been highlighted in the past (even though it is still Turing complete). Even so, it fails to exclude all possible ways of writing referentially opaque macros, as it was intended to.
What this paper contributes
This paper contributes the formalized realization that although the system is supposed to be hygienic, it is in fact not. The author goes on to articulate specifically how the current system can be broken (in other words, how it can still behave improperly with respect to hygiene) by redefining macros within their own expansion. This allows the capture of variables that would normally have been 'painted' a different color, letting them be captured in the macro's expanded code.
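As a minimal sketch of the redefinition trick (my own toy example, not the paper's full capture machinery): a macro whose expansion rebinds the macro itself with let-syntax, so the use it generates refers to the new definition.

    (define-syntax self-shadow
      (syntax-rules ()
        ((_ x)
         (let-syntax ((self-shadow
                       (syntax-rules ()
                         ((_ y) (list 'redefined y)))))
           (self-shadow x)))))   ; this use sees the rebinding

    (self-shadow 1)   ; => (redefined 1)

The paper builds on this mechanism to arrange for introduced and user-supplied identifiers to end up with the same color, which is what makes the capture possible.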
He describes the distinction between the weak and the true notion of hygiene: the weak method rests on the assumption that a variable is either unbound or was defined early in global scope and has not been redefined since. The sanctioned alternative to covert capture is to make the identifier to be captured be explicitly passed to the macro.
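The explicit-passing workaround is worth a sketch (the macro name and usage here are hypothetical, mine rather than the paper's): the caller supplies the identifier to bind, so the capture is deliberate and hygiene is never subverted.

    (define-syntax while-with-break
      (syntax-rules ()
        ((_ break test body ...)
         (call-with-current-continuation
          (lambda (break)          ; binds the identifier the caller chose
            (let loop ()
              (if test
                  (begin body ... (loop)))))))))

    (let ((i 0))
      (while-with-break stop (< i 10)
        (if (= i 5) (stop i))
        (set! i (+ i 1))))   ; => 5; stop was passed in explicitly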
It was interesting to read the example of how an overloaded 'lambda' could be injected during expansion and how this new syntax rule would then spread like a 'virus' throughout the program.
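I tried to reduce the idea to its smallest form (my own sketch, far simpler than the paper's actual virus): shadowing lambda over a region with let-syntax already rewrites every lambda written in that region, however deeply nested; the paper's macro additionally re-injects the redefinition into code it expands, which is what makes it spread beyond its own region.

    (let-syntax
        ((lambda                       ; shadow lambda inside the body
          (syntax-rules ()
            ((_ formals body0 body1 ...)
             ;; this lambda is the original one: let-syntax transformers
             ;; are resolved in the enclosing scope
             (lambda formals
               (begin (display "enter") (newline)
                      body0 body1 ...))))))
      ((lambda (x)
         ((lambda (y) (+ x y)) 2))     ; the nested lambda is rewritten too
       3))
    ; prints "enter" twice and returns 5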
What I did/did not understand
The higher-level discussion made sense, and it was easy to see the result of the referentially opaque macros (the expanded terms with the 'colors' on the identifiers); however, I found it a little difficult to reason about all the nested syntax-rules definitions.
A Few Principles of Macro Design
What is this paper about
This paper confronts some of the realities of macros: they do allow for more expressive, natural extension of Scheme, but they do not always result in programs that are easily reasoned about. This is partially because they can arbitrarily manipulate the program in question, so some cannot be fully understood without a graphical IDE such as DrScheme to step through the macro expansion and actually observe what kinds of expansions are going on. Ideally, however, the author states that a macro can be well understood as a true abstraction if it is 'well-behaved'.
The paper gives an example of how to create an sexp comparison, an identifier test, and an identifier comparison using macros. This is meant to be an example of the introspective power of macros, showing how powerful computations can be expressed elegantly in macro form. More specifically, it is an example of moving computation to compile time, since the macro expansion itself does the work of solving the puzzle presented in the section. This seems exactly like what we were doing (or attempting to do) for the Prolog assignment in 330.
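For the identifier test, the trick I took away looks roughly like this (my reconstruction of the well-known syntax-rules idiom; the names symbol?? and the abracadabra probe follow that idiom and may differ from the paper's exact code):

    (define-syntax symbol??
      (syntax-rules ()
        ((_ (x . y) kt kf) kf)      ; a pair is not an identifier
        ((_ #(x ...) kt kf) kf)     ; a vector is not an identifier
        ((_ maybe-symbol kt kf)
         (let-syntax
             ((test (syntax-rules ()
                      ((_ maybe-symbol) kt)   ; if maybe-symbol is an identifier,
                      ((_ anything) kf))))    ; it is a pattern variable matching anything
           (test abracadabra)))))

    (symbol?? foo 'yes 'no)   ; => yes
    (symbol?? 5 'yes 'no)     ; => no

The comparison happens entirely during expansion: if the user's form is an identifier, it becomes a pattern variable in the inner macro and matches the arbitrary probe; if it is any other datum, only that exact datum would match, so the kf branch is taken.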
It then goes on to explore the definition of hygienic macros, shows an example of one that seems to break hygiene, and then states that the real question should be how to use hygienic macros correctly and clearly so that the program is easy to understand. To show that although current hygienic macros can still be flawed they are still the best thing out there, he brings up the fexprs Lisp feature, which takes in source code dynamically and thus can eliminate the compiler's ability to optimize. He argues hygienic macro expansion is far less troublesome because macros are restricted to performing their transformations at compile time. Does this mean the Lisp version/alternative was acting entirely as a preprocessor and not a compiler? With all this macro expansion we've been talking about, it seems odd to distinguish between compile-time expansion, which can have a plethora of levels and layers, and pre-compile-time behavior; couldn't that just be thought of as a first step of compilation?
He goes on to discuss specific principles of macro design. This was a relief, because all of the contrived, complicated examples we have seen that generate errors and show problems seem a little overly complicated to me at the moment. He begins this discussion by highlighting the importance of preserving alpha-equivalence; one way to do this is to ensure identifiers are used consistently in macros (not used in both free and bound positions). He additionally highlights the problems associated with quoting bound identifiers and with not treating subterms of the macro as atomic units. The atomic-unit problem highlights the idea that optimizations that can reasonably be done by a compiler should be left to the compiler, to keep the code simple and avoid problems.
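The quoting problem in particular clicked for me with a small sketch (my own hypothetical bad-let, not from the paper): once a macro quotes one of its bound identifiers, renaming that identifier at the use site changes the program's output, so alpha-equivalence is lost.

    (define-syntax bad-let
      (syntax-rules ()
        ((_ name val body)
         (let ((name val))
           (cons 'name body)))))   ; quoting name leaks its spelling

    (bad-let x 1 x)   ; => (x . 1)
    (bad-let y 1 y)   ; => (y . 1): alpha-equivalent use, different result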
In conclusion, the paper highlights the power of hygienic macros as well as their ability to cause complications, and thus the need for patterns of use that prevent these problems. From our previous discussions of design patterns, though, it makes me wonder whether there is not some way to better define or restrict macros so that these patterns are more natural, encouraged, or even enforced by the way macro support is built into the language.