Summary: The last macro system that did something genuinely different appeared around 1992. The paper is trying to build a model of what Racket is doing; it does not present anything new, though the way it compiles patterns is different. The approach in this paper is to turn each identifier into a macro: the identifier gets bound to the result of the macro that describes it. Pattern matching used to work like an interpreter; now the compilation step turns the patterns into a Racket environment, reusing the machinery of the language to do substitution, which is more efficient. We went over an example; I didn't catch which one because I was trying to help Yu get set up.
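To make the "identifiers become bindings in the expander's environment" idea concrete, here is a minimal sketch of my own (my-let1 is an invented name, and this illustrates the general idea rather than the paper's exact mechanism):

    ;; Inside the syntax-case clause, the pattern variables x, v, and body
    ;; are bound in the transformer's environment; the template #'(...)
    ;; looks them up the way ordinary variable references are resolved,
    ;; instead of re-interpreting the pattern on every use.
    (define-syntax (my-let1 stx)
      (syntax-case stx ()
        [(_ (x v) body)
         #'((lambda (x) body) v)]))

    (my-let1 (y 41) (add1 y)) ; => 42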
We talked about classes in Racket and how they also use the define keyword, and we went over an example of a class definition. If you use let instead, there are more parentheses, you indent further to the right (bad style), and you have to write an explicit lambda. This shows how clumsy things get when you don't use internal definitions; both points are sketched in the code below.

We then talked about how define binds inside of a package-begin. Having to use define* inside of package-begin is really annoying. We did an example where the id identifier was getting bound in a funny way, so we ended up having to use lambda expressions; it basically ended up behaving like a package. We then used define-package to do the same thing and showed that the define-package construct was more robust and elegant. We then showed the expansion of the package.

We talked about how you want to write something in a functional way but have it compile into an imperative program, and how the example seemed to do the opposite. We took up the question "Is it immoral to use define* in an imperative style?" First we recognized that imperative code is terse, and that terseness is useful. Then we pointed out that functional programming forces you to come up with a new name at every step. define* makes the definitions easier to write, but it's still functional under the hood, so you can reason about it functionally. Programs should stay functional until you've decided how the work is being split up.

We talked about the change in architectures from serial to parallel and how that means there's going to be a push toward functional programming to support parallel hardware. Fortran is popular because there are some things you can't do in it that you can do in C, which makes it easier to discover parallelization opportunities. People like it because it's weak from a systems perspective.
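Here are sketches of my own for both points (f, counter%, and stepwise are illustrative names, assuming #lang racket). First, let versus internal definitions:

    ;; With nested lets: more parentheses, rightward drift, and an
    ;; explicit lambda.
    (define f-with-let
      (lambda (x)
        (let ([a (* x x)])
          (let ([b (+ a 1)])
            (* a b)))))

    ;; With internal definitions: flat and readable.
    (define (f x)
      (define a (* x x))
      (define b (+ a 1))
      (* a b))

    ;; Class bodies use define the same way:
    (define counter%
      (class object%
        (super-new)
        (define count 0)                                 ; private state
        (define/public (inc!) (set! count (add1 count)))
        (define/public (get) count)))

Second, define* inside define-package (both come from racket/package): each define* of x shadows the previous one for the rest of the body, so the code reads imperatively while every step is still a fresh functional binding.

    (require racket/package)

    (define-package stepwise (result)
      (define* x 1)
      (define* x (+ x 10)) ; refers to the previous x; creates a new binding
      (define* x (* x 2))
      (define result x))

    (open-package stepwise)
    result ; => 22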
We started discussing the model. We discussed how primitives work: primitives are pointers into the implementation, and that is what gives primitives their uniqueness, since each one points to a different place in the implementation. We talked about how functions are evaluated, and about the make-syntax primitive. Syntax-e strips off the contextual information and gives you what's inside. The paper's definition of the parser isn't the normal one: we should think of there being a stage prior to parsing in which the expansion step happens; the parser then erases the contextual information that isn't needed.
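As a quick sketch of the stripping behavior in Racket itself (I take the model's make-syntax to correspond roughly to Racket's datum->syntax):

    ;; syntax-e strips one layer of contextual information and returns
    ;; what's inside; syntax->datum erases context recursively, much like
    ;; the parser's erasure step.
    (define stx #'(lambda (x) x))
    (syntax-e stx)      ; => a list of syntax objects for lambda, (x), and x
    (syntax->datum stx) ; => '(lambda (x) x)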
We talked about the expansion-driven parser. Every time the parser needs an identifier, it needs the resolve step: resolve takes an identifier (a syntax object with a symbol inside it) and maps it to the name that will be used for the variable.
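In Racket proper, identifier-binding plays a role similar to the model's resolve; this analogy is mine, not the paper's:

    ;; An identifier is a syntax object wrapping a symbol; resolving it
    ;; uses the attached context to find the binding it refers to.
    (identifier-binding #'car)                   ; => binding info from racket/base
    (identifier-binding (datum->syntax #f 'car)) ; => #f: no context, no binding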
We talked about expansion next. When you expand a lambda, you make fresh names for its bound variables so they don't conflict with other names. We then showed a brief example illustrating this. We then discussed macros and their bindings, stepped through how they're expanded, and discussed why you can't capture the compile-time environment when expanding.
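The standard illustration of that renaming (a generic hygiene sketch, not the specific example from class):

    ;; The macro's tmp is renamed during expansion, so it cannot collide
    ;; with a user variable that happens to share the name.
    (define-syntax-rule (swap! a b)
      (let ([tmp a])
        (set! a b)
        (set! b tmp)))

    (define x 1)
    (define tmp 2)
    (swap! x tmp)
    (list x tmp) ; => '(2 1)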
Something I learned: I discovered why Fortran is so popular. I've been wondering for a while why it's still used for scientific computation, and now I understand why.
Lingering Question: At the beginning of the discussion we pointed out that nothing fundamentally new had been done since 1992 in the realm of macros, but then we said that this paper does pattern matching differently. Does this paper actually match what Racket does in terms of pattern matching, or is that just how the authors chose to explain it?