scala-macros vs scala-virtualized

Ivan Todoroski

Feb 14, 2012, 10:23:18 AM
to scala-...@googlegroups.com
I am far from a Scala expert and I could be wrong, but to my naive eyes
it looks like two large projects are entering Scala which both
accomplish pretty much the same thing (or at least overlap to a very
large extent):

scala-macros and scala-virtualized

They are both mechanisms that allow you to write normal-looking Scala
code which does something else in the background, like generating SQL
queries and what not. In other words, both scala-macros and
scala-virtualized are mechanisms for creating domain specific languages
in Scala.

Having two distinct and complex mechanisms for doing essentially the
same thing seems wasteful and redundant. Not to mention the possible
unintended interactions between two such complex features.

Now, if you accept that the two mechanisms are somewhat redundant (which
you might disagree with of course), the question turns to which one is
better for Scala programmers...


1) Scala-virtualized is much more in the spirit of Scala

Even before scala-virtualized you could implement e.g. flatMap() in your
classes and have them participate in Scala's for/yield protocol, or you
could override various operators and such. Using methods to override
behavior of Scala's builtin constructs is nothing new. Scala-virtualized
merely expands that existing precedent to all language constructs and
brings it to its logical conclusion.
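
For anyone who hasn't used that protocol, a tiny made-up example of the
existing precedent: any class that supplies map and flatMap already
participates in for/yield, with no special compiler support.

// for (x <- a; y <- b) yield x + y desugars to a.flatMap(x => b.map(y => x + y))
case class Box[A](value: A) {
  def map[B](f: A => B): Box[B] = Box(f(value))
  def flatMap[B](f: A => Box[B]): Box[B] = f(value)
}

val sum = for { x <- Box(1); y <- Box(2) } yield x + y // Box(3)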

On the other hand, Scala had nothing like scala-macros before. It's a
completely new concept and somewhat alien in the Scala context. I've
used macros in Lisp to great effect, but Lisp is a language that is very
amenable to macros because Lisp code and data look the same. Scala code
and data look nothing alike, which brings me to the second point...


2) Scala-virtualized is much more intuitive to use for the library writer

With scala-virtualized you write your DSL implementations in
straightforward Scala code, similar to how you would write e.g. implicit
conversions to pimp a library or things of that nature.

With scala-macros you are forced to write this weird AST meta-language
that looks nothing like normal Scala code. It's almost like having to
learn another language on top of Scala.


3) How do you even debug scala-macros?

In scala-virtualized the overridden behavior sits in normal methods that
a debugger could interactively step through at run time, whenever you
hit an overridden construct as you step through your DSL.

With scala-macros, the code inside the macro is executed in some
precompilation phase before the code runs. When you actually run the
program it's no longer the macro code that gets executed, it's the
*result* produced by the AST manipulations in the macro code. So when
stepping interactively with a debugger you can't really step through the
original macro code that generated the code you are currently running.
This makes finding bugs in the macros much more difficult as your macro
codebase grows.


4) Scala-macros might be more powerful

Not to be completely one-sided, I will concede that scala-macros might
be strictly more powerful than scala-virtualized. I have no formal proof
of this, but it sounds like rewriting/generating ASTs from scala-macros
might enable some crazy things that might not be possible (or easy) in
scala-virtualized. But are these corner cases where scala-macros is more
powerful than scala-virtualized really something that you would often
use in practice?


In summary, compared to scala-virtualized, scala-macros look like a
low-level hackish solution in search of a problem. They are complex to
write, un-scalaish, difficult to debug, and will probably just add to
Scala's image of "too much complex inscrutable magic".


Finally, my question to the Scala Community is:

What *practical* problems can you solve with scala-macros that you
couldn't solve just as easily with scala-virtualized?

Paul Phillips

Feb 14, 2012, 10:36:30 AM
to Ivan Todoroski, scala-...@googlegroups.com
I'm not equipped to answer the question, but I applaud the asking of it.

Simon Ochsenreither

Feb 14, 2012, 10:43:01 AM
to scala-...@googlegroups.com
Great question.

I at least hope that whatever gets integrated in the future will make it possible to provide a better, type-provider-like way to access databases and other storage in a fully typed way. I think that is the killer feature that would be a huge advantage to a lot of people in Scala space.

Erik Osheim

Feb 14, 2012, 10:44:16 AM
to Ivan Todoroski, scala-...@googlegroups.com
Hi Ivan,

On Tue, Feb 14, 2012 at 04:23:18PM +0100, Ivan Todoroski wrote:
> What *practical* problems can you solve with scala-macros that you
> couldn't solve just as easily with scala-virtualized?

Like you, I am not an expert on either the macros work or on
virtualization. But given that macros are essentially "mini compiler
plugins" that do tree transformations, there are some problems I face
that I bet they could solve.

The big one is optimizing away intermediate objects which are created
through use of the implicit enrichment pattern, but which aren't
needed. I talked about this a bit on the scala-sips mailing list
recently but here's a summary in code form:

import scala.{specialized => spec}
import com.azavea.math.Numeric
import com.azavea.math.FastImplicits._

// code user writes
def foo1[@spec A:Numeric](x:A, y:A) = x + y

// code with sugar removed
def foo2[@spec A](x:A, y:A)(implicit ev:Numeric[A]) = x + y

// code after implicit resolution
def foo3[@spec A](x:A, y:A)(implicit ev:Numeric[A]) = infixOps(x)(ev).+(y)

// code with infixOps method inlined
def foo4[@spec A](x:A, y:A)(implicit ev:Numeric[A]) = new FastNumericOps(x)(ev).+(y)

// code i wish could get generated, basically inlining the
// implementation of FastNumericOps#+ without creating a new object.
def foo5[@spec A](x:A, y:A)(implicit ev:Numeric[A]) = ev.plus(x, y)

This might seem a bit esoteric but this pattern also gets used with
Ordering and other type classes. It's really the only performance
problem left with this pattern, after specialization.
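
To make the idea concrete, here is roughly what such a rewrite could look
like, sketched against the def-macro and quasiquote API that eventually
shipped in Scala 2.11 (not the SIP syntax being discussed in this thread),
and modelled loosely on the enrichment pattern above. The Numeric here is
scala.math.Numeric and all of the macro plumbing is hypothetical, not the
com.azavea.math code:

import scala.language.experimental.macros
import scala.math.Numeric
import scala.reflect.macros.blackbox

final class FastNumericOps[A](val lhs: A)(implicit val ev: Numeric[A]) {
  // a macro, so it expands at the call site and the wrapper object never survives
  def +(rhs: A): A = macro NumericMacros.plus[A]
}

object NumericMacros {
  def plus[A](c: blackbox.Context)(rhs: c.Expr[A]): c.Expr[A] = {
    import c.universe._
    // c.prefix is the enrichment expression, e.g. infixOps(x)(ev) or
    // new FastNumericOps(x)(ev); pull out the wrapped value and the evidence
    // and emit ev.plus(x, rhs) directly, with no intermediate allocation.
    val tree = c.prefix.tree match {
      case Apply(Apply(_, x :: Nil), ev :: Nil) => q"$ev.plus($x, ${rhs.tree})"
      case t => c.abort(t.pos, "unexpected enrichment shape: " + showCode(t))
    }
    c.Expr[A](tree)
  }
}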

There are other cases too (until recently, for-loops) where after
profiling you can identify certain places where a (small) tree
transformation will yield huge gains. Virtualization may help in the
case of loops, but my sense is that it isn't designed to handle this
general class of problem.

-- Erik

P.S. If you're curious the version of Numeric I'm using can be found at:
https://github.com/azavea/numeric.

Simon Ochsenreither

Feb 14, 2012, 10:45:35 AM
to scala-...@googlegroups.com
... in the Java space. Sorry :-)

Francois

Feb 14, 2012, 10:46:35 AM
to Paul Phillips, Ivan Todoroski, scala-...@googlegroups.com
On 14/02/2012 16:36, Paul Phillips wrote:
> I'm not equipped to answer the question, but I applaud the asking of it.
>


That is also something that has come up recently in discussions at Scala
User Groups (well, at least PSUG), where users are starting to ask when
Scala will start to *remove* features, not add (huge) ones like this -
let alone two as big as these, which sit rather far from the orthogonal
feature set that Scala advertises (you know, "deep not broad" - it is
going to become hard to explain what is not broad in Scala)

Thanks,

--
Francois ARMAND
http://fanf42.blogspot.com
http://www.normation.com

Daniel Sobral

Feb 14, 2012, 10:54:43 AM
to Ivan Todoroski, scala-...@googlegroups.com
Whereas you see them as accomplishing the same thing, I see them as
accomplishing very different things.

Like you, I might be wrong of course. I see scala-virtualized as
generating code at run-time, whereas scala-macros generates code at
compile-time. They are both code generators, which, together with
string interpolators and virtpatmat, makes for four different new code
generators in the next version of Scala!

Some use cases particular to macros are ad-hoc performance
optimizations, type-safe evaluation of strings (eg: formatting
strings, regex, json, and even xml if its literals are ever replaced
with string interpolation), and generation of code that needs to be
present as libraries.

--
Daniel C. Sobral

I travel to the future all the time.

Daniel Sobral

Feb 14, 2012, 10:57:02 AM
to Francois, Paul Phillips, Ivan Todoroski, scala-...@googlegroups.com
On Tue, Feb 14, 2012 at 13:46, Francois <fan...@gmail.com> wrote:
> On 14/02/2012 16:36, Paul Phillips wrote:
>>
>> I'm not equipped to answer the question, but I applaud the asking of it.
>>
>
>
> That is also something that has come up recently in discussions at Scala
> User Groups (well, at least PSUG), where users are starting to ask when
> Scala will start to *remove* features, not add (huge) ones like this -
> let alone two as big as these, which sit rather far from the orthogonal
> feature set that Scala advertises (you know, "deep not broad" - it is
> going to become hard to explain what is not broad in Scala)

Macros + Interpolation make it possible to remove XML while, at the
same time, providing it as a library. And, like XML, one could
likewise provide JSON, etc.

Paul Brauner

Feb 14, 2012, 10:58:51 AM
to Ivan Todoroski, scala-...@googlegroups.com
I was wondering too but I think the answer is that scala virtualized is limited to expressions (and thus does not allow for generating class declarations for instance).

Francois

Feb 14, 2012, 11:03:53 AM
to scala-...@googlegroups.com
On 14/02/2012 16:57, Daniel Sobral wrote:
[...]
Macros + Interpolation make it possible to remove XML while, at the same time, providing it as a library. And, like XML, one could likewise provide JSON, etc.


Yes, it's what I keep saying to myself :) That would be great indeed, let's just hope that it will be the main focus of Scala 2.11.

Ivan Todoroski

Feb 14, 2012, 11:21:02 AM
to Daniel Sobral, Francois, Paul Phillips, scala-...@googlegroups.com
On 14.02.2012 16:57, Daniel Sobral wrote:
> Macros + Interpolation make it possible to remove XML while, at the
> same time, providing it as a library. And, like XML, one could
> likewise provide JSON, etc.

Is string interpolation intimately tied with macros though? I thought
(perhaps wrongly) that they were orthogonal features. Would it be
possible to make XML an optional library by using the interpolation
support together with scala-virtualized or whatever other Scala features
aside from macros?

Ivan Todoroski

Feb 14, 2012, 11:24:12 AM
to Erik Osheim, scala-...@googlegroups.com
Isn't this something that the inline implicit classes proposal would
address (or value classes, whatever they are called now)?

Erik Osheim

Feb 14, 2012, 11:34:30 AM
to Ivan Todoroski, scala-...@googlegroups.com
On Tue, Feb 14, 2012 at 05:24:12PM +0100, Ivan Todoroski wrote:
> Isn't this something that the inline implicit classes proposal would
> address (or value classes, whatever they are called now)?

I'm glad you asked!

After discussing this on the scala-sips mailing list, we determined
that value classes would not support this feature. You may want to
revisit that thread for more information:

http://groups.google.com/group/scala-sips/browse_thread/thread/ad21c133a2d224b

-- Erik

Ivan Todoroski

Feb 14, 2012, 11:42:15 AM
to Daniel Sobral, scala-...@googlegroups.com
On 14.02.2012 16:54, Daniel Sobral wrote:
> I see scala-virtualized as
> generating code at run-time, whereas scala-macros generates code as
> compile-time.

I understand this, but I consider it a mere implementation detail. What
do I care when the code is generated?

Maybe I am approaching this from a different perspective than you. I am
not a compiler developer, I am just a Scala programmer who wants to use
the language productively.

I am interested in things like expressiveness, conciseness and readability.

I am interested in removing boiler plate as much as possible (which is
why I am interested in scala-virtualized and scala-macros).

Finally, I am interested in doing all this in a statically checked
typesafe manner that eliminates runtime errors as much as possible.

As long as I get these features, I don't care how they are accomplished.
Whether through compile-time or run-time code generation, I honestly
don't care as a regular Scala programmer.

What I do care about is the severely increased cognitive load of using
scala-macros. I need to learn a whole new AST sub-language just to
eliminate some boilerplate here and there. Plus all the other things
mentioned in my original email that started this thread.


> Some use cases particular to macros are ad-hoc performance
> optimizations

I believe low-level optimizations are something that the compiler and
libraries should take care of. I still try to write as optimal code as
possible without sacrificing readability too much, but expecting me to
write what amounts to mini compiler plugins to help the compiler with
low-level optimizations is not what I would consider a common use case
for a regular Scala programmer.

The optimization use case on its own is certainly not enough to justify
inclusion of a complex feature like scala-macros. In the rare occasions
where you really needed to optimize some critical bit of code and the
Scala compiler was letting you down, you could also write an actual
Scala compiler plugin, or just write it in C and call it from JNI.


> type-safe evaluation of strings (eg: formatting
> strings, regex, json, and even xml if its literals are ever replaced
> with string interpolation), and generation of code that needs to be
> present as libraries.

This is a very interesting point. I already asked you about the
relationship between interpolation and macros in another sub-thread, I
would appreciate if you or anyone else could shed some light on this.

Ivan Todoroski

Feb 14, 2012, 11:51:28 AM
to Paul Brauner, scala-...@googlegroups.com
On 14.02.2012 16:58, Paul Brauner wrote:
> I was wondering too but I think the answer is that scala virtualized is
> limited to expressions (and thus does not allow for generating class
> declarations for instance).

Macros can synthesize new class declarations from thin air rather than
just using existing classes you defined elsewhere? Now that does sound
pretty cool, but again I feel like I must insist on putting things in
perspective regarding the practical utility of all this.

What *practical* problem can be solved by synthesizing new class
declarations from a macro, that couldn't be solved using
scala-virtualized in a different yet still satisfactory way?

I am not trying to be contrary, I honestly wish to be educated on the
new possibilities that are opened by scala-macros and scala-virtualized,
and in particular the practical limits on what can be accomplished with
each of those features.

Daniel Sobral

Feb 14, 2012, 11:58:20 AM
to Ivan Todoroski, Francois, Paul Phillips, scala-...@googlegroups.com

No, they are not tied. Well, macros have a dependency on string
interpolation to provide quasi-quotations.

But while string interpolation will give you XML literals and
matching, it won't check syntax at compile time. Using macros for the
interpolator makes that possible.
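
To sketch that last point (using the def-macro and quasiquote API that
eventually shipped in Scala 2.11, not the SIP syntax under discussion; the
xml interpolator and its limitations are hypothetical, and it assumes
scala.xml is available on the classpath):

import scala.language.experimental.macros
import scala.reflect.macros.blackbox

object XmlInterpolation {
  implicit class XmlHelper(val sc: StringContext) {
    def xml(args: Any*): scala.xml.Elem = macro XmlMacros.xmlImpl
  }
}

object XmlMacros {
  def xmlImpl(c: blackbox.Context)(args: c.Tree*): c.Tree = {
    import c.universe._
    // Recover the literal parts from the prefix, i.e. XmlHelper(StringContext.apply("<a>", "</a>"))
    val parts = c.prefix.tree match {
      case Apply(_, List(Apply(_, rawParts))) =>
        rawParts.map { case Literal(Constant(s: String)) => s }
      case _ => c.abort(c.enclosingPosition, "xml can only be used on literal string contexts")
    }
    // Naive compile-time well-formedness check: splice a dummy token into each
    // hole and try to parse (this only handles holes in text position).
    try scala.xml.XML.loadString(parts.mkString("0"))
    catch {
      case e: org.xml.sax.SAXParseException =>
        c.abort(c.enclosingPosition, "malformed XML literal: " + e.getMessage)
    }
    // At run time, build the full string and parse it again.
    q"_root_.scala.xml.XML.loadString(_root_.scala.StringContext(..$parts).s(..$args))"
  }
}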

Rex Kerr

Feb 14, 2012, 12:05:17 PM
to Ivan Todoroski, Paul Brauner, scala-...@googlegroups.com
The scalamacros.org site has a bunch of desired features for macros:
  http://scalamacros.org/usecases/index.html

Most of these can't be achieved with scala-virtualized as it now stands.

  --Rex

Daniel Sobral

Feb 14, 2012, 12:22:05 PM
to Ivan Todoroski, scala-...@googlegroups.com
On Tue, Feb 14, 2012 at 14:42, Ivan Todoroski <grnch...@gmx.net> wrote:
> On 14.02.2012 16:54, Daniel Sobral wrote:
>>
>> I see scala-virtualized as
>> generating code at run-time, whereas scala-macros generates code as
>> compile-time.
>
> I understand this, but I consider it a mere implementation detail. What do I
> care when the code is generated?

Is it? Do you see no difference between a statically typed language
and a dynamically typed language? The only difference is when the type
check is made. Is it an implementation detail?

>> Some use cases particular to macros are ad-hoc performance
>> optimizations
>
> I believe low-level optimizations are something that the compiler and
> libraries should take care of. I still try to write as optimal code as
> possible without sacrificing readability too much, but expecting me to write
> what amounts to mini compiler plugins to help the compiler with low-level
> optimizations is not what I would consider a common use case for a regular
> Scala programmer.

That's where "ad-hoc" comes in, really. LIBRARIES CANNOT TAKE CARE OF
IT with the present support: they can't get rid of instance creation
and boxing. Yes, as a regular Scala programmer you won't be _creating_
macros, but if your for-loops suddenly become as fast as Java's, that
will be a library-provided macro. Neither can the compiler solve it,
because it doesn't have the knowledge that the library has. Or you
think Scala has the performance issues it has simply because no one
tried to fix them?

> The optimization use case on its own is certainly not enough to justify
> inclusion of a complex feature like scala-macros. In the rare occasions
> where you really needed to optimize some critical bit of code and the Scala
> compiler was letting you down, you could also write an actual Scala compiler
> plugin, or just write it in C and call it from JNI.

Talk to Yammer about that. If your "critical bit" means 80% of your
code, what use is it?

Now, compiler plugins suffer from three problems:

1. They are too hard to write.
2. They depend on compiler internals, which are too volatile.
3. You can't distribute them as a JAR.

So, what happens if you handle that? You get a macro. A macro is
really a specialized compiler plugin that doesn't suffer from those
three problems.

C and JNI? Unless your code spends most of its time in C, that won't
help you. The cost of calling JNI is often much higher than the gains
in performance that C will bring. But that's not a solution: you are
throwing away all of Scala's power when you do that. Consider:

Scala-with-macros:

def factorial(n: Int) = {
  var result = 1
  for (i <- 2 to n) result *= i
  result
}

Scala-without-macros:

def factorial(n: Int) = {
  // declare a JNI binding and call C code that computes the factorial
}

That is what you proposed.

>> type-safe evaluation of strings (eg: formatting
>> strings, regex, json, and even xml if its literals are ever replaced
>> with string interpolation), and generation of code that needs to be
>> present as libraries.
>
> This is a very interesting point. I already asked you about the relationship
> between interpolation and macros in another sub-thread, I would appreciate
> if you or anyone else could shed some light on this.

That is pretty simple, and it comes down to that "implementation
detail" you mentioned. Say you have this:

printf("%d: %f%n", a, b)

At run time you can check whether "a" is an Int and "b" is a Double,
and throw an exception if they are not. At compile time you can check
this and produce a compilation error if they are not. The compiler
can't handle this unless it has built-in knowledge of the string formats
used with "printf", and, even then, it won't be able to handle anything
*like* this.
A compiler plugin can handle this, but we are back to the point that a
macro is nothing more than a light-weight plugin that's easier to
create, maintain and distribute.
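
To make that concrete, here is a rough sketch of such a check written
against the def-macro and quasiquote API that eventually shipped in Scala
2.11 (not this SIP's syntax). The checkedPrintf name is made up and only
%d, %f and %s are handled:

import scala.language.experimental.macros
import scala.reflect.macros.blackbox

object CheckedPrintf {
  def checkedPrintf(format: String, args: Any*): Unit = macro impl

  def impl(c: blackbox.Context)(format: c.Tree, args: c.Tree*): c.Tree = {
    import c.universe._
    val fmt = format match {
      case Literal(Constant(s: String)) => s
      case _ => c.abort(format.pos, "format must be a string literal")
    }
    // pair each conversion in the format string with the type of its argument
    val conversions = "%[dfs]".r.findAllIn(fmt).toList
    if (conversions.length != args.length)
      c.abort(format.pos, s"${conversions.length} conversions but ${args.length} arguments")
    conversions.zip(args).foreach {
      case ("%d", a) if !(a.tpe.widen <:< typeOf[Int])    => c.abort(a.pos, s"%d expects an Int, got ${a.tpe}")
      case ("%f", a) if !(a.tpe.widen <:< typeOf[Double]) => c.abort(a.pos, s"%f expects a Double, got ${a.tpe}")
      case _ => // ok
    }
    // the generated code is just the ordinary runtime call
    q"_root_.scala.Predef.printf($format, ..$args)"
  }
}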

Of course, Paul wants compiler plugins to be easier to create,
maintain and distribute, and *that* would represent a more direct
competition to macros.

sreque

Feb 14, 2012, 12:46:24 PM
to scala-debate
One problem macros were supposed to solve is the boilerplate of
classes like tuples and functions. Instead of writing Tuple2,
Tuple3, etc. by hand, you could generate them with macros.

Basically any abstraction problem you have where the limitation is in
the syntax of your language can be solved by macros. A simple example
that lots of people complain about is Scala's lack of a good optimized
for loop. Scala internal mailing list threads suggest that optimizing
Scala's general for construct is incredibly difficult. With macros,
this becomes a simple syntax problem with a simple solution.

Other good examples come from already-implemented language features,
like case classes. We may see case classes as a wonderful thing, but
Scheme programmers would laugh at us for doing so, as in the end they
are just syntactic sugar that Scheme programmers can and have
implemented as a library (http://docs.racket-lang.org/reference/define-struct.html).

Now, you might say to yourself, "that's great, but case classes are
already implemented". Well, what if I want to add features to case
classes? What if I want to add a method to each case object that
returns a string version of the name of the object, which I don't
believe is available via reflection? What if, for a given Scala
enumeration, I want to automatically generate a Java version of that
enumeration for interop with Java libraries? What if I just want to
have an improved version of Scala enumerations, which most people are
currently unhappy with and don't use? With macros, all of these things
become feasible.
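
As a tiny taste of the "name as a string, computed at compile time" idea,
here is a sketch against the macro API that eventually shipped in Scala
2.11; the nameOf helper is hypothetical, not an existing library:

import scala.language.experimental.macros
import scala.reflect.macros.blackbox

object NameOf {
  def nameOf[A]: String = macro nameOfImpl[A]

  def nameOfImpl[A](c: blackbox.Context)(implicit tag: c.WeakTypeTag[A]): c.Tree = {
    import c.universe._
    // the type's simple name is baked into the program as a plain string literal
    q"${tag.tpe.typeSymbol.name.decodedName.toString}"
  }
}

// usage: NameOf.nameOf[Option[Int]] expands to the literal "Option"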

Once macros are part of your toolkit, you start to realize that lots
of abstraction problems that you previously accepted as a fact of life
suddenly become conquerable. I for one am very excited to see if Scala
can copy some of the success of Nemerle's macro system.

Daniel Sobral

Feb 14, 2012, 1:00:13 PM
to sreque, scala-debate
On Tue, Feb 14, 2012 at 15:46, sreque <sean...@yahoo.com> wrote:
>
> Basically any abstraction problem you have where the limitation is in
> the syntax of your language can be solved by macros. A simple example
> that lots of people complain about is Scala's lack of a good optimized
> for loop. Scala internal mailing list threads suggest that optimizing
> Scala's general for construct is incredibly difficult. With macros,
> this becomes a simple syntax problem with a simple solution.

That isn't entirely true. Paul did manage to get Range's foreach to
the same speed as Java's (after warm up, mind you), and the complexity
associated with it can't be gotten rid of even with macros, except in
cases where the indices are literals or known constants.

Erik Osheim

Feb 14, 2012, 1:09:16 PM
to Ivan Todoroski, Daniel Sobral, scala-...@googlegroups.com
On Tue, Feb 14, 2012 at 05:42:15PM +0100, Ivan Todoroski wrote:
> I believe low-level optimizations are something that the compiler
> and libraries should take care of. I still try to write as optimal
> code as possible without sacrificing readability too much, but
> expecting me to write what amounts to mini compiler plugins to help
> the compiler with low-level optimizations is not what I would
> consider a common use case for a regular Scala programmer.

Just to echo Daniel here, I am a library author who wants to do
optimizations that my users benefit from without having to see them. I
have written a compiler plugin, but it's a bit fragile and not as
advanced as I'd like. Also, many users try to avoid compiler plugins.

I certainly like the idea of being able to use macros instead, since
macros will probably be more "future-proof" than compiler plugins
currently are (or at least, have been).

That said, if the compiler API is stable, compiler plugins get more
advanced, and the community is ready to embrace them, then I agree that
maybe macros aren't needed.

-- Erik

Haoyi Li

Feb 14, 2012, 1:13:23 PM
to Rex Kerr, Ivan Todoroski, Paul Brauner, scala-...@googlegroups.com
I suppose part of the difference between generating code at compile
time and at run time is that compile-time code generation can be
statically checked, which is a pretty big thing.

Play! for example has a pretty extensive/complex custom compilation
process to compile your templates into class files, so your
controller->template calls can be verified at compile time, or even
before, during code completion! It also compiles your
LESS/Coffeescript. Being able to treat non-scala code as part of the
program (i.e. statically checked and verified) rather than as data
(dynamically load & pray) would be a pretty big plus.

Being able to statically check more things, e.g:

CSS files
HTML templates
Coffeescript/Javascript
XML Config Files
non-XML config files
Database schemas (à la F# Type Providers)
Regex-Literals

which are de-facto part of your "program" rather than the "data" would
be pretty awesome (naturally sometimes these things are dynamic and
need to be treated as such, but quite often they are not). One problem
with static languages is that when you integrate with external things
(config files, databases etc.) you lose all static checking and end up
getting lots of silly runtime-errors anyway. Being able to check at
least some of these things at compile-time would be sweet. You can do
all these as compiler plugins, and in a way macros are just a way of
making it easier and more regular.

-Haoyi
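
A sketch of the kind of compile-time configuration check Haoyi describes,
again using the macro API that eventually shipped in Scala 2.11; the
app.properties file name and the configKey helper are made up, and the
file has to exist wherever the compiler runs:

import scala.language.experimental.macros
import scala.reflect.macros.blackbox

object ConfigChecked {
  // looks up `key` in app.properties while the program is being compiled
  def configKey(key: String): String = macro impl

  def impl(c: blackbox.Context)(key: c.Tree): c.Tree = {
    import c.universe._
    val keyName = key match {
      case Literal(Constant(s: String)) => s
      case _                            => c.abort(key.pos, "configKey needs a literal key")
    }
    val props = new java.util.Properties
    val in = new java.io.FileInputStream("app.properties") // read at compile time
    try props.load(in) finally in.close()
    val value = props.getProperty(keyName)
    if (value == null)
      c.abort(key.pos, s"app.properties has no key '$keyName'")
    // splice the value straight into the program as a string literal
    // (a real design might only *check* the key here and still read it at run time)
    q"$value"
  }
}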

sreque

Feb 14, 2012, 1:22:55 PM
to scala-debate
Looking at the last thread on the subject, I see you were in charge of
making the benchmarking harness, Daniel, so you probably know what
you're talking about here. :-)

Still, the final impression I got from the thread at
http://groups.google.com/group/scala-internals/browse_thread/thread/1834f4a4239b4725/df1cea83328e6a6b?lnk=gst&q=optimized+for#df1cea83328e6a6b
was that there were still some sticky issues left and your best bet
was to stick with while loops if it really mattered.

Also, I don't know the limitations of Scala's macros, but presumably
with them I could easily write a construct like:


cFor(var i = 0; i < n; i += 1) { body ... }

that would compile down to:

{
  var i = 0
  while (i < n) {
    body ...
    i += 1
  }
}

Similarly, you could write for loop constructs that simulate Java's
foreach statement on iterable objects to avoid the allocation of
closures for the loop body. Of course, efficiently implementing a full-
fledged loop with multiple assignments, breaks, and continues is a
much bigger problem, but also hopefully doable. The point here is that
with macros your performance is going to be much more predictable and
less reliant on your JIT, the size of your method bodies, and the
current phase of the moon.
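
For what it's worth, here is roughly what such a cFor could look like once
quasiquotes exist, written against the def-macro API that eventually
shipped in Scala 2.11 (so the signature is Scala-style rather than C-style,
and everything here is a hypothetical sketch):

import scala.language.experimental.macros
import scala.reflect.macros.blackbox

object CFor {
  def cfor(init: Int)(test: Int => Boolean, next: Int => Int)(body: Int => Unit): Unit =
    macro cforImpl

  def cforImpl(c: blackbox.Context)(init: c.Tree)(test: c.Tree, next: c.Tree)(body: c.Tree): c.Tree = {
    import c.universe._
    val i = TermName(c.freshName("i"))
    // expand to a plain while loop; a production version would also inline
    // literal function bodies so that no closures are allocated at all
    q"""
      var $i = $init
      while ($test($i)) {
        $body($i)
        $i = $next($i)
      }
    """
  }
}

// usage: CFor.cfor(0)(_ < n, _ + 1) { i => ...body... }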

Another example of a use case I had recently had to do with pattern
matching on syntax. Basically, I need two versions of a function: one
that runs really fast and one that prints out helpful error messages
once all possible matches fail. Match failures are common and
acceptable as long as at least one possible match exists, but the
matching code runs in a tight enough loop that using something like
closures to control whether or not helpful error messages are
generated on match failures would still have a significant negative
impact on performance. Macros make it possible to generate both the
optimized version and the diagnostic version of a function from the
same source, and the diagnostic version could then be run when the
optimized version reports an error to find out the exact causes of the
error.


On Feb 14, 12:00 pm, Daniel Sobral <dcsob...@gmail.com> wrote:

Daniel Sobral

Feb 14, 2012, 2:25:07 PM
to sreque, scala-debate
On Tue, Feb 14, 2012 at 16:22, sreque <sean...@yahoo.com> wrote:
> Looking at the last thread on the subject, I see you were in charge of
> making the benchmarking harness Daniel, so you probably know what
> you're talking about here. :-)
>
> Still, the final impression I got from the thread at
> http://groups.google.com/group/scala-internals/browse_thread/thread/1834f4a4239b4725/df1cea83328e6a6b?lnk=gst&q=optimized+for#df1cea83328e6a6b
> was that there were still some sticky issues left and your best bet
> was to stick with while loops if it really mattered.

No, I haven't got back to it -- there was no Caliper artifact
available last time I tried to -- but my own code was while-loop-fast,
though buggy. Paulp said he also got there, and I believe him.
However, the commit message on the last change to Range indicated he
wasn't done yet. Maybe that's true, maybe not, but I haven't been able
to check how fast that is due to the problem I mentioned.

> Also, I don't know the limitations of Scala's macros, but presumably
> with them I could easily right a construct like:
>
>
> cFor(var i = 0; i < n; i += 1) { body ... }
>
> that would compile down to:
>
> {
>  var i = 0
>  while(i < n)
>  {
>    body ...
>    i += 1
>  }
> }

Well, the start index is constant, and the loop increment is 1, which
makes that code correct. But say, for example, that the increment is
2, and it suddenly stops being correct. Oh, I'll grant that it is
_probably_ correct, but only the programmer can know that, not the
compiler. Specifically, it would fail for n == Int.MaxValue. Now
change the test to <= instead of <, and keep the increment equal to 1,
and you'll get broken code again, for the same value of n.
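
To make the wrap-around concrete (a tiny, hypothetical illustration):

var i = Int.MaxValue - 1  // suppose the loop has reached this point
i += 2                    // overflows and wraps to Int.MinValue
println(i < Int.MaxValue) // true again, so a naive `while (i < n)` never stops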

Range has two problems: one, it doesn't know the step (1 to 5 by 1 --
the step isn't known at the time "1 to 5" is initialized), and, two, it
has to deal with boundary conditions. Macros can help with the former
if the step is known at compile time, which is not always the case
either. There's a LOT of code required to handle all these conditions,
and that's what gets in the way.

The JVM can optimize those away, however, which gets us back on the same
footing as simple while loops -- once the optimization has kicked in.

> much bigger problem, but also hopefully doable. The point here is that
> with macros your performance is going to be much more predictable and
> less reliant on your JIT, the size of your method bodies, and the
> current phase of the moon.

If you don't trust JIT, you can't rely on your performance. :-) Code
that looks pretty simple and fast can fail to be JITted if you don't
pay attention to it -- which is what happened to foreach in Scala 2.8
and 2.9.

Not that I don't agree with you on the general issue of macros and
performance, but it's not that simple either.

Ivan Todoroski

Feb 14, 2012, 2:48:49 PM
to Daniel Sobral, scala-...@googlegroups.com
Hi Daniel,

Thank you for taking the time to respond in depth.

On 14.02.2012 18:22, Daniel Sobral wrote:
>>> I see scala-virtualized as
>>> generating code at run-time, whereas scala-macros generates code as
>>> compile-time.
>> I understand this, but I consider it a mere implementation detail. What do I
>> care when the code is generated?
>
> Is it? Do you see no difference between a statically typed language
> and a dynamically typed language? The only difference is when the type
> check is made. Is it an implementation detail?

I was under the impression that DSLs written using scala-virtualized are
still statically type checked at compile time. Therefore the fact that
scala-virtualized "generates code at run time" has no bearing on the
type check. The difference between statically and dynamically typed
languages is a red herring; it has no bearing on this discussion.

(Of course, scala-virtualized doesn't really generate new code at
runtime, that's why I put it in quotes above, but I guess you were
making an analogy so it's close enough)


>>> Some use cases particular to macros are ad-hoc performance
>>> optimizations
>> I believe low-level optimizations are something that the compiler and
>> libraries should take care of.
>

> That's where "ad-hoc" comes in, really. LIBRARIES CANNOT TAKE CARE OF
> IT with the present support: they can't get rid of instance creation
> and boxing. Yes, as a regular Scala programmer you won't be _creating_
> macros, but if your for-loops suddenly become as fast as Java's, that
> will be a library-provided macro. Neither can the compiler solve it,
> because it doesn't have the knowledge that the library has. Or you
> think Scala has the performance issues it has simply because no one
> tried to fix them?

I don't understand... if a library has the knowledge of how to make
general for-loops as fast as Java's, why can't the compiler be enhanced
to use the same knowledge?


>> The optimization use case on its own is certainly not enough to justify
>> inclusion of a complex feature like scala-macros. In the rare occasions
>> where you really needed to optimize some critical bit of code and the Scala
>> compiler was letting you down, you could also write an actual Scala compiler
>> plugin, or just write it in C and call it from JNI.
>
> Talk to Yammer about that. If your "critical bit" means 80% of your
> code, what use is it?

So if macros were available to them, the Yammer programmers would have
spent their time writing complicated optimization macros instead of
working on their problem domain?

Writing code optimizers is a difficult task, especially on the JVM where
you have to second guess what the JVM might or might not do with your
code, and what shape of bytecode is most palatable to the JIT.

Don't get me wrong, I see what you are saying here, I understand that
macros can be used to optimize code specific to your problem, but is
that really the main use case for macros?

It boils down to saying "well, the compiler sucks and can't optimize
away instance creation or other bottlenecks I'm running into, so I am
going to use a complicated macro feature to work around the compiler's
deficiencies".

It's a valid use, but it somehow seems less satisfying as a rationale
for such a feature.

I would prefer to see features that enhance expressiveness and reduce
boiler plate, yet are still easily accessible to regular day-to-day
programmers without too much cognitive load.


> Now, compiler plugins suffer from three problems:
>
> 1. They are too hard to write.
> 2. They depend on compiler internals, which are too volatile.
> 3. You can't distribute them as a JAR.
>
> So, what happens if you handle that? You get a macro. A macro is
> really a specialized compiler plugin that doesn't suffer from those
> three problems.

If I understand you correctly, you are saying that macros are basically
compiler plugins that are easier to create and distribute. But it seems
they are still too difficult for general programmers to use to reduce
various boilerplate here and there.

Scala-virtualized seems a much lighter-weight feature for reducing
boilerplate and creating DSLs for regular programmers, without having
to learn arcane ASTs.


> C and JNI?

Yeah, I went a bit overboard with JNI, sorry about that.


>> This is a very interesting point. I already asked you about the relationship
>> between interpolation and macros in another sub-thread, I would appreciate
>> if you or anyone else could shed some light on this.
>
> That is pretty simple, and it comes down to that "implementation
> detail" you mentioned. Say you have this:
>
> printf("%d: %f%n", a, b)
>
> At run time you can check whether "a" is an Int and "b" is a Double,
> and throw an exception if they are not. At compile time you can check
> this and produce a compilation error if they are not. The compiler
> can't handle this unless it has knowledge of string formats used with
> "printf", and, then, it won't be able to handle anything *like* this.
> A compiler plugin can handle this, but we are back to the point that a
> macro is nothing more than a light-weight plugin that's easier to
> create, maintain and distribute.

But why do you need *macros* for this? This looks more like a job for
pluggable type providers, in combination with string interpolation.

It does come down to an "implementation detail", i.e. whether you
implement something like this with a specific feature designed for it
(pluggable type providers) or you use some general blunt tool (macros)
which seems like overkill.

Ivan Todoroski

Feb 14, 2012, 3:30:01 PM
to Rex Kerr, Paul Brauner, scala-...@googlegroups.com
Yes, I am aware of that page, that site is where I learned about
scala-macros in the first place.

The very first use case on that page, about Advanced DSLs, compares how
much more macros can achieve than Squeryl, for example, and they list
limitations of Squeryl such as its inability to use the == operator,
forcing === instead for its syntax (which is expected, since it's based
on plain Scala).

Yet they completely ignore the existence of Scala Integrated Query[1]
which is actually built on top of scala-virtualized and solves that
particular problem in Squeryl, along with many more. So that comparison
is misleading. I would love to see a more realistic comparison of SQL
generation by macros vs SIQ.

In the other use cases they mention type providers which are an
orthogonal concept not intrinsically tied to macros (even though their
current implementation might be tightly coupled to macros, I don't know).

Code generation for things like FunctionN and TupleN is certainly
compelling, but I can't help but feel that maybe a language feature for
abstracting over the number of method parameters might be a better
solution to that.


[1]
http://scala-integrated-query.googlecode.com/files/SIQ-Scala-Days-final.pdf


On 14.02.2012 18:05, Rex Kerr wrote:
> The scalamacros.org site has a bunch of desired

Daniel Sobral

Feb 14, 2012, 3:34:58 PM
to Ivan Todoroski, scala-...@googlegroups.com
On Tue, Feb 14, 2012 at 17:48, Ivan Todoroski <grnch...@gmx.net> wrote:
>
>> Is it? Do you see no difference between a statically typed language
>> and a dynamically typed language? The only difference is when the type
>> check is made. Is it an implementation detail?
>
>
> I was under the impression that DSLs written using scala-virtualized are
> still statically type checked at compile time. Therefore the fact that the
> scala-virtualized "generates code at run time" has no bearing on the type
> check. The difference between statically and dynamically typed language is a
> red herring, it has no bearing on this discussion.

Then you are under an incorrect assumption. By "generate code" I mean
"compile", with all that it entails, *including* typing.

Here's a simple comparison:

Scala-virtualized (rough example, based on vague memories from long ago):

val s = VString[Regex]("\d+$")

This will type check if Regex is a valid parameter to VString,
indicating it will be able to process the string in question. Also,
"s" will be known as a VString, and will be type-checked through-out.

Scala-macros:

val s = regex("\d+$")

The same guarantees made by Scala-virtualized are valid here, BUT the
code for "regex" will be run at compile time, and it will be able not
only to compile the regex itself at compile time, but also to verify
that the regex is indeed valid.
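
A rough sketch of that regex macro, written against the macro API that
eventually shipped in Scala 2.11 (the regex helper itself is hypothetical):

import scala.language.experimental.macros
import scala.reflect.macros.blackbox
import scala.util.matching.Regex

object RegexMacro {
  def regex(pattern: String): Regex = macro impl

  def impl(c: blackbox.Context)(pattern: c.Tree): c.Tree = {
    import c.universe._
    pattern match {
      case Literal(Constant(s: String)) =>
        // compile the pattern while compiling the program: a bad literal
        // becomes a compile error instead of a runtime exception
        try java.util.regex.Pattern.compile(s)
        catch {
          case e: java.util.regex.PatternSyntaxException =>
            c.abort(pattern.pos, "invalid regular expression: " + e.getMessage)
        }
      case _ => // not a literal, so nothing can be checked ahead of time
    }
    q"new _root_.scala.util.matching.Regex($pattern)"
  }
}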

>> That's where "ad-hoc" comes in, really. LIBRARIES CANNOT TAKE CARE OF
>> IT with the present support: they can't get rid of instance creation
>> and boxing. Yes, as a regular Scala programmer you won't be _creating_
>> macros, but if your for-loops suddenly become as fast as Java's, that
>> will be a library-provided macro. Neither can the compiler solve it,
>> because it doesn't have the knowledge that the library has. Or you
>> think Scala has the performance issues it has simply because no one
>> tried to fix them?
>
> I don't understand... if a library has the knowledge how to make general
> for-loops as fast as Java, why can't the compiler be enhanced to use the
> same knowledge?

Does Scala, the compiler, know what Scalaz does? Specs? ScalaTest?
Dispatch? BlueEyes? Anti-xml? The code you'll be writing tomorrow?
It's not that the library knows enough about how to make a fast
for-loop, it is that it knows enough about ITSELF, and that knowledge
would let it perform optimizations the compiler can't or shouldn't. Or
even make such optimizations _available_, so that the *user*, who has
way more information about the problem, may choose to do them.

Let's give an example to make this more clear. Let's pick Range's
foreach. One problem with it is that it can't do this:

while (i < n) {
  ...
  i += inc
}

That may wrap around. However, in most cases "inc" is 1, for which
this idiom, this translation, is ok. How could the compiler possibly
know such details about Range's implementation and how it could be
optimized?

>> Talk to Yammer about that. If your "critical bit" means 80% of your
>> code, what use is it?
>
> So if macros were available to them, the Yammer programmers would have spent
> their time writing complicated optimization macros instead of working on
> their problem domain?

Why do you assume macros are complicated? Here's a CFor:

macro def cfor(_this, start, cond, incr)(body) = c"""
  ${start}
  while (${cond}) {
    ${body}
    ${incr}
  }"""

cfor(val i = 0, i < 10, i++) {
  println(i)
}

Is that complicated? Would it waste Yammer's time? It would make it
possible for Yammer to keep the expressiveness of the language while
enjoying all the speed it could have, while at the same time avoiding
having to maintain complex compiler plugins.

Moreover, Yammer might not have to do that at all, because the
libraries themselves could take advantage of it, so that Yammer would
not have to go around the libraries. See their comments: they resorted
to Java collections to avoid performance problems with Scala
collections.

> Writing code optimizers is a difficult task, especially on the JVM where you
> have to second guess what the JVM might or might not do with your code, and
> what shape of bytecode is most palatable to the JIT.

Yes, writing code optimizers is a difficult task. Writing optimized
code isn't, which is what macros let you do.

> Don't get me wrong, I see what you are saying here, I understand that macros
> can be used to optimize code specific to your problem, but is that really
> the main use case for macros?

No, it is not the main case. It's one of them. If that was *all*
macros offered, I don't think they'd have much of a chance of getting
in the language. On the other hand, offering the other stuff they do
*and* making it faster, that's a real trick.

> It boils down to saying "well, the compiler sucks and can't optimize away
> instance creation or other bottlenecks I'm running into, so I am going to
> use a complicated macro feature to work around compiler's deficiencies".

It's not that the compiler sucks. The JIT is one of the most impressive
pieces of technology out there; it has the benefit of knowing what
happens at run-time, and *it can't do anything about it either*!
Honestly, there isn't much more that Scala knows that the JIT doesn't,
so it doesn't have much more opportunity to optimize than the JIT does.
Some, granted, but not all that much.

Optimizing Turing-complete code is hard. It's NP-hard. Ad-hoc
optimization -- optimizing individual cases -- is quite possible, but
you have to special-case them. In fact, the JIT does that a lot for
Java -- that's why Java's for-loops are so fast: the JIT special-cases
them. But a compiler, or a VM, can only work on the most common cases.
A library can handle what it knows about itself.

>> Now, compiler plugins suffer from three problems:
>>
>> 1. They are too hard to write.
>> 2. They depend on compiler internals, which are too volatile.
>> 3. You can't distribute them as a JAR.
>>
>> So, what happens if you handle that? You get a macro. A macro is
>> really a specialized compiler plugin that doesn't suffer from those
>> three problems.
>
> If I understand you correctly, you are saying that macros are basically
> compiler plugins that are easier to create and distribute. But it seems they
> are still too difficult for general programmers to use to reduce various
> boilerplate here and there.

And the full power of Scala's type system is difficult to use as well,
but one can grab Scalaz and Shapeless and make use of some very
advanced type trickery with little trouble. Not everything is targeted
at "Hello World".

> Scala-virtualized seems a much lighter weight feature for reducing boiler
> plate and creating DSLs by regular programmers, without having to learn
> arcane AST trees.

You are much more optimistic about Scala-virtualized than I am.

> But why do you need *macros* for this? This looks more like a job for
> pluggable type providers, in combination with string interpolation.

Pluggable type providers are limited-use macros. They don't reduce
their complexity -- which seems to be your main problem with macros --
in any way.

> It does come down to an "implementation detail", i.e. whether you implement
> something like this with a specific feature designed for it (pluggable type
> providers) or you use some general blunt tool (macros) which seems like
> overkill.

As I said above, I don't see pluggable type providers decreasing
complexity related to macros in any way.

Daniel Sobral

Feb 14, 2012, 3:44:22 PM
to Ivan Todoroski, Rex Kerr, Paul Brauner, scala-...@googlegroups.com
On Tue, Feb 14, 2012 at 18:30, Ivan Todoroski <grnch...@gmx.net> wrote:
>
> Yet they completely ignore the existence of Scala Integrated Query[1] which
> is actually built on top of scala-virtualized and solves that particular
> problem in Squeryl, along with many more. So that comparison is misleading.
> I would love to see a more realistic comparison of SQL generation by macros
> vs SIQ.

If macros get used (they might), the AST processing will be done at
compile time. Otherwise, it will be done at run-time.

Tiark Rompf

Feb 14, 2012, 3:56:59 PM
to sreque, scala-debate
Macros and virtualization have complementary strengths and uses:

- Macros are good for local, context-free rewrites on Scala ASTs that are possibly untyped and can be manipulated before type checking, and for generating boilerplate code at compile time which seamlessly integrates with the rest of the program (think type providers).

- Staging and virtualization are good for embedded DSLs that need modularity, sophisticated global analysis and compilation, computation at staging time, specialization to runtime data etc., in particular when the DSL expression trees do not exactly correspond to Scala trees.

The static/dynamic checking aspect is the other way round: Virtualization maintains type safety and relative evaluation order of expressions across the stage boundary, whereas macros allow free composition of untyped trees. Both are useful for different purposes.

Do the benefits of macros outweigh the cost of adding them to the language, in particular given the 'Scala is complex debate'? Nobody knows. The only way to find out is to go ahead and implement them ...

As with any new technology, there will certainly be many 'shiny new hammer, looking for nails' effects, so the challenge will be to provide guidance and empower users to pick the right tool for the job. Implementing an 'advanced' DSL exclusively using macros is almost certainly not a good idea, as experience in other languages shows. In the end I believe both technologies can benefit each other: macros can help reduce some of the boilerplate that is currently needed to define staged DSLs, and virtualization technology can help define more powerful macros; e.g. the infix_ methods from Scala-Virtualized would make it possible to redirect arbitrary method calls to macros within a local scope.

Cheers,
- Tiark

martin odersky

Feb 14, 2012, 3:56:48 PM
to Daniel Sobral, Ivan Todoroski, Rex Kerr, Paul Brauner, scala-...@googlegroups.com
Some perspective: First, scala-virtualized and Scala macros are both
experimental at the present stage. The discussion on this thread has
already worked out the main points of differences, so I won't go into
that.

Scala-virtualized is a research project, done at EPFL and Stanford. We
hope that some number of features will make it into main Scala.
Adriaan's pattern matcher looks like an excellent first candidate, and
other elements might follow. But there's no a priori intention to
migrate all elements of scala virtualized.

Macros have a shorter term horizon. There will be a SIP soon, and if
things go well we might see them soon in trunk, probably enabled by
some flag.

The intention of macros, and of Scala's language design in general, is
to simplify things. We have already managed to replace code lifting by
macros, and hopefully other features will follow. For instance, there
was a strong push to somehow eliminate the implicit parameter in

atomic { implicit transaction => ... }

and several other related situations. With macros this is trivial.
Without them, it requires a sophisticated dance with scoping rules.
Optimizations such as on Range.foreach are another use case. I believe
that in the long run macros can be a simplifying factor. So, in my
mind, there is enough evidence to try out the idea. But the
implementation is considered experimental at present, and we do
welcome critical discussion around the SIP once it appears.

Cheers

-- Martin

Lukas Rytz

Feb 14, 2012, 4:00:33 PM
to Erik Osheim, Ivan Todoroski, Daniel Sobral, scala-...@googlegroups.com


On Tuesday, 14. February 2012 at 19:09, Erik Osheim wrote:


That said, if the compiler API is stable, compiler plugins get more
advanced, and the community is ready to embrace them, then I agree that
maybe macros aren't needed.

An advantage of macros is that they feel much lighter than compiler plugins.

The barrier to including a new compiler plugin in Scala releases is higher than
that for writing some library function as a macro def: for instance, I have never
heard of plans to write a compiler plugin to optimize Range.foreach and ship it
with the Scala release. Also, enabling a plugin requires changes to the build
configuration, but people can use a macro without even knowing it.

Josh Suereth

Feb 14, 2012, 4:02:03 PM
to Daniel Sobral, Ivan Todoroski, Rex Kerr, Paul Brauner, scala-...@googlegroups.com
A few points:

  • Scala-virtualized is not about code generation, necessarily. Scala-virtualized is about *virtualizing* concepts in the Scala compiler so that the intermediate representation (or AST) can be targeted at different platforms.
  • *Lightweight modular staging* is the ability to "stage" Scala code and adapt it to a different backend at (potentially) runtime. *THIS* is what I think you're confusing with scala-virtualized.
  • Scala-virtualized and macros do not conflict at all. Macros are used to generate ASTs. Macros *may* conflict with compiler plugins, since they allow you to do similar work to a compiler plugin but inside your Scala code rather than as a compiler add-on. Scala macros should work well with Scala-virtualized.
  • Scala macros are doing a bit of good to simplify the manipulation of the type checker and the AST. Hopefully, if nothing else, Scala macros will make the compiler and compiler plugins more approachable for everyone.
So basically, I'm not really sure what points you're trying to make here... Macros solve a need that *could* be solved by compiler plugins (see Dick Wall's SubCut plugin for examples of where macros could have been used instead of a plugin).

LMS and scala-virtualized have little to do with macros. Macros are a front end for generating Scala code. LMS is about targeting a different backend with *THE SAME* Scala code (which could have been generated by a macro).
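
A drastically simplified illustration of the staging idea Josh describes
(nothing like the real LMS/scala-virtualized API, which is far richer;
every name below is made up): the "same Scala code" is written against a
small IR, and each backend then does something different with that IR.

sealed trait Rep[T] // a tiny expression IR
case class Const[T](value: T) extends Rep[T]
case class Plus(a: Rep[Int], b: Rep[Int]) extends Rep[Int]

// ordinary-looking Scala code, but it builds IR instead of computing
def addOne(x: Rep[Int]): Rep[Int] = Plus(x, Const(1))

// one backend evaluates the IR; others could emit JVM bytecode, C, CUDA, SQL, ...
def evalInt(r: Rep[Int]): Int = r match {
  case Const(v)   => v
  case Plus(a, b) => evalInt(a) + evalInt(b)
}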

Erik Osheim

Feb 14, 2012, 4:06:15 PM
to Lukas Rytz, Ivan Todoroski, Daniel Sobral, scala-...@googlegroups.com
On Tue, Feb 14, 2012 at 10:00:33PM +0100, Lukas Rytz wrote:
> An advantage of macros is that they feel much lighter than compiler plugins.
>
> The barrier to include a new compiler plugin into Scala releases is higher than
> writing some library function as a macro def: for instance, I never heard of plans
> to write a compiler-plugin to optimize Range.foreach and ship it with the Scala
> release. Also, enabling a plugin needs changes to the build configuration, but
> people can use a macro without even knowing it.

I totally agree! I was just trying to be totally explicit (in the
context of Ivan's skepticism that macros are needed) about what I felt
like they added over compiler plugins, and why I feel like I need them.

-- Erik

Paul Brauner

Feb 14, 2012, 7:12:58 PM
to Josh Suereth, Ivan Todoroski, scala-debate, Rex Kerr, Daniel Sobral

It seems to me it's not entirely true that staging and macros don't overlap. For instance, you can achieve in template haskell (macros) what can be done in meta ocaml (staging). Similarity you can achieve with lisp/scheme macros what can be done with quote/unquote/eval.

But I agree that this discussion overlooks the fact that multi staged programming is just a special case of scala virtualized (if I understand correctly, the LMS paper is a special application of scala virtualized).

Paul Brauner

Feb 14, 2012, 10:00:14 PM
to scala-...@googlegroups.com, Josh Suereth, Ivan Todoroski, Rex Kerr, Daniel Sobral
s/Similarity/Similarly/
Stupid phone.

Ivan Todoroski

Feb 15, 2012, 3:00:38 PM
to martin odersky, Daniel Sobral, Rex Kerr, Paul Brauner, scala-...@googlegroups.com
Hi Martin,

Who do you see as the target audience for macros?

a) regular rank-and-file developers looking to reduce repeated boiler
plate and create simple DSLs specific to their projects

b) hardcore library designers who in the absence of macros would have
resorted to compiler plugins for their advanced tricks, but will now use
macros for that purpose


If it's A, what is your view on the problems raised in the first 3
points of the original email that started this thread, namely having to
learn an arcane AST sub-language to create macros and the difficulty of
debugging macros?

This would seem to raise the threshold on the amount of boilerplate and
duplication a regular developer would tolerate before finally sitting
down to learn the vagaries of macros to reduce it.


If it's B, then what should the busy regular developer use for reducing
boiler plate and creating simple DSLs for their work, if macros remain
generally difficult for them to use?


I do appreciate your response regarding plans for inclusion of
scala-macros vs scala-virtualized, it clarifies the direction of Scala
somewhat.

martin odersky

Feb 15, 2012, 3:25:38 PM
to Ivan Todoroski, Daniel Sobral, Rex Kerr, Paul Brauner, scala-...@googlegroups.com
On Wed, Feb 15, 2012 at 9:00 PM, Ivan Todoroski <grnch...@gmx.net> wrote:
> Hi Martin,
>
> Who do you see as the target audience for macros?
>
> a) regular rank-and-file developers looking to reduce repeated boiler plate
> and create simple DSLs specific to their projects
>
> b) hardcore library designers who in the absence of macros would have
> resorted to compiler plugins for their advanced tricks, but will now use
> macros for that purpose
>
>
> If it's A, what is your view on the problems raised in the first 3 points of
> the original email that started this thread, namely having to learn an
> arcane AST sub-language to create macros and the difficulty of debugging
> macros?
>
> This would seem to raise the threshold on the amount of boilerplate and
> duplication a regular developer would tolerate before finally sitting down
> to learn the vagaries of macros to reduce it.
>
>
> If it's B, then what should the busy regular developer use for reducing
> boiler plate and creating simple DSLs for their work, if macros remain
> generally difficult for them to use?
>
I think it's rather B. And, Scala already has a lot of mechanisms that
reduce boilerplate even without resorting to macros and scala
virtualized. So I am not at all concerned that the bar for using
either is high. In fact, I'd prefer it that way.

I see macros as a useful middle road. Every language designer is faced
with a constant stream of suggestions to enlarge the language. And a
lot of these make sense. But following them would make the language
bigger and that's something I am very reluctant to do. If at all
possible, I'd like to make it smaller! Macros are a more stable and
systematic way to achieve more convenient syntax than compiler
plugins. That's why I think they are promising. The thing that scares
me deeply is that some people will inevitably go overboard with them
and misuse them.

It's a conundrum. Scala's philosophy is to make things possible, and
not erect walls against misuse. Trust the competency of your
programmers. But posts like this one make me realize the danger of
doing this:

http://yz.mit.edu/wp/true-scala-complexity/

It seems that everyone will at some point in their development as a
developer reach "peak abstraction",

http://lambda-the-ultimate.org/node/4442

and Scala's problem is that peak abstraction is so much higher (and
therefore more scary) than in other languages. The question I am
grappling with is, how can we get the advantages of macros without the
potential for misuse?

Cheers

-- Martin

--
Martin Odersky
Prof., EPFL and Chairman, Typesafe
PSED, 1015 Lausanne, Switzerland
Tel. EPFL: +41 21 693 6863
Tel. Typesafe: +41 21 691 4967

Ivan Todoroski

unread,
Feb 15, 2012, 3:35:32 PM2/15/12
to Daniel Sobral, scala-...@googlegroups.com
On 14.02.2012 21:34, Daniel Sobral wrote:
> Why do you assume macros are complicated? Here's a CFor:
>
> macro def cfor(_this, start, cond, incr)(body) = c"""
>   ${start}
>   while(${cond}) {
>     ${body}
>     ${incr}
>   }"""

Now that's interesting. At the time when I was researching scala-macros,
all the examples I could find were very similar to this one from the
Macros SIP:

class Queryable[T, Repr](query: Query) {
  macro def filter(p: T => Boolean)(implicit ctx: CompilerContext): Repr = {
    val ast = Block()
    val b = ctx.gensym(b, classOf[QueryableBuilder])
    ast += Assign(b, Call(newBuilder, this))
    ast += Call(query=, b, Create(classOf[Filter], Call(query, this), reify(p)))
    ast += Call(result, b)
    ast
  }
}

It even praises what a nice example of macros this is in the very next
sentence, which I barely read because my eyes were already bleeding. :)

Now I see in the Quasiquotations SIP this macro being rewritten as:

class Queryable[T, Repr](query: Query) {
  macro def filter(p: T => Boolean): Repr = c"""
    val b = $this.newBuilder
    b.query = Filter($this.query, $reify(p))
    b.result
  """
}

This certainly looks much nicer, but I still don't really understand it
fully and the rest of the SIPs are not much help there.
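
To check my own understanding, here is roughly what I *think* a call site
would turn into at compile time (the Person/people names below are made up
for illustration, they are not from any SIP):

// hypothetical caller code:
//   case class Person(name: String, age: Int)
//   val people: Queryable[Person, Seq[Person]] = ...
//   val adults = people.filter(_.age >= 18)
//
// which, if I read the quasiquote version correctly, would expand at
// compile time into roughly:
//   val b = people.newBuilder
//   b.query = Filter(people.query, <reified tree of `_.age >= 18`>)
//   b.result

If that reading is wrong, it only reinforces my point about documentation.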

Maybe it's just a matter of lack of documentation and communication
about what these shiny new things are and how to use them. Not so much
how they are implemented and how cool they are, but more about how a
busy developer who is not a Scala wizard can use them to solve practical
problems like refactoring, reducing boiler plate, and raising the
general level of abstractions.

On the other hand, I didn't have nearly as much trouble understanding
and following the scala-virtualized stuff. It felt easier and more
natural for someone who already understood existing Scala functionality
like implicits, pimping and operator overloading. It felt like a logical
extension of those concepts.

You can certainly dismiss me as not trying hard enough, but I am a Java
developer who really *really* likes the elegance of Scala and is trying
his best to learn about it in his very limited free time, and is even
already trying to use it in non-critical projects to see how it goes. I
think many developers are in a similar situation.

And for the record, I disagree 100% with the "Scala is too complex"
camp. I am not whining about complexity here, I am seriously trying to
learn how to use Scala to its fullest extent, but the cognitive load of
using advanced features is a very real factor in determining how much
boiler plate and code duplication one would tolerate before reaching for
an advanced Scala feature to eliminate it.

Ivan Todoroski

unread,
Feb 15, 2012, 3:42:01 PM2/15/12
to martin odersky, Daniel Sobral, Rex Kerr, Paul Brauner, scala-...@googlegroups.com
On 15.02.2012 21:25, martin odersky wrote:
> Scala already has a lot of mechanisms that
> reduce boilerplate even without resorting to macros and scala
> virtualized.

Oh absolutely! That is in fact the single biggest reason that drew me to
Scala. The level of abstraction achievable even with the current stable
version of Scala is astounding.

However there are still some warts here and there, such as the inability
to use things like == in DSLs and having to come up with ===, which I
hoped to be able to get around using something like scala-virtualized
that's relatively easy to use.
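
To make it concrete, this is the kind of thing I mean, as a minimal sketch
with made-up names (not any real library):

object DslSketch {
  case class Column(name: String)
  case class Predicate(sql: String)

  // the usual "pimp my library" workaround: == cannot be overridden to
  // return a DSL expression, so query DSLs introduce === instead
  class ColumnOps(col: Column) {
    def ===(value: Any): Predicate = Predicate(col.name + " = " + value)
  }
  implicit def columnOps(col: Column): ColumnOps = new ColumnOps(col)

  // Column("age") === 18   // Predicate("age = 18")
  // Column("age") == 18    // just Boolean false, with no way to intercept it
}

My hope was that something like scala-virtualized would let plain == inside
a DSL do the right thing instead.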

Still, I appreciate the difficulty faced by language designers and the
various trade offs that have to be made.

Rex Kerr

unread,
Feb 15, 2012, 3:57:08 PM2/15/12
to martin odersky, Ivan Todoroski, Daniel Sobral, Paul Brauner, scala-...@googlegroups.com
On Wed, Feb 15, 2012 at 3:25 PM, martin odersky <martin....@epfl.ch> wrote:
Macros are a more stable and
systematic way to achieve more convenient syntax than compiler
plugins. That's why I think they are promising. The thing that scares
me deeply is that some people will inevitably go overboard with them
and misuse them.

Code generators are even more scary, though, but right now the options are sometimes
  1. Have code that runs 30x slower than C++
  2. Repeat yourself a gazillion times
  3. Write a code generator
But picking (3) leaves one with generator code like


def opOverElement(meth: String, e: Entity, op: (String,String) => String) = {
  if (isScalar && e.isScalar) None
  else onlyIf (
    meth!=":" && e.isScalar && r==GenericType && e.r==GenericType &&
    ((t^e.t)==e.t || (t==FloatType && (e.t==IntType || e.t==LongType)))
  ) {

    "def "+meth+"("+e.paramary+") = " +
    Entity(n, m, ^(e), %(e)).create((i,j) => {
      Expr( op(\\(i,j,^(e)).text, e.\\(1,1,^(e),_+"0").text
    }) , ^(e) ).mathed)

  }
}

 
which doesn't exactly strike me as "not overboard", even though I wrote it to be as clear to me as possible.  With code generators I find myself in a constant three-way battle between flexibility, clarity, and compactness, and usually lose on at least one if not two counts.  (The above is an example of loss of clarity; the reason I sacrifice clarity for compactness is that when I need the generated code to do something nontrivial, it really helps if the generator is not spread out over several screens so I can actually see what's being produced in addition to all the machinery to produce it.)

Macros would provide a great fourth option even if they're not as general as an arbitrary code generator, since anything that can lower "repeat yourself a gazillion times" to a more moderate/manageable number will help avoid resorting to 1. or 3.

The question I am
grappling with is, how can we get the advantages of macros without the
potential for misuse?

Great examples that show off how to use it for clarity and performance (instead of wow-look-what-I-can-do) would be a start--sometimes one can fix these things culturally.

  --Rex

Simon Ochsenreither

unread,
Feb 15, 2012, 4:16:37 PM2/15/12
to scala-...@googlegroups.com, Ivan Todoroski, Daniel Sobral, Rex Kerr, Paul Brauner
Hi,

in my opinion the fear that people abuse macros would be substantially reduced if it ships with some real world libraries showing _when_ and _how_ to use them idiomatically.

The most unfortunate thing to do would be shipping it as a feature in search of a problem. This was imho the case with scala.Dynamic. Dynamic appeared with some interesting prototypes of "possible" things, but I haven't seen a single reasonable "real world" usage of it yet.

In the end I think it would be more reasonable to go ahead and release 2.10 without macros (or behind a compiler switch like it currently is) and officially ship macros when they are ready. (Ready == "there is a real library solving a real problem using it, which can be recommended to developers without having to worry")

The name "macro" is a 100% guarantee for a marketing nightmare, so it shouldn't be made worse by shipping it without any substantial example which can be explained to developers.

Thanks and bye,


Simon

Rex Kerr

unread,
Feb 15, 2012, 4:51:34 PM2/15/12
to Simon Ochsenreither, scala-...@googlegroups.com, Ivan Todoroski, Daniel Sobral, Paul Brauner
On Wed, Feb 15, 2012 at 4:16 PM, Simon Ochsenreither <simon.och...@googlemail.com> wrote:
In the end I think it would be more reasonable to go ahead and release 2.10 without macros (or behind a compiler switch like it currently is) and officially ship macros when they are ready. (Ready == "there is a real library solving a real problem using it, which can be recommended to developers without having to worry")

cfor and avoiding code generators for TupleN/FunctionN are obvious choices.

I'd happily contribute a Muple library if anyone was interested.  (Muples are mutable and specialized tuples, which I use in my code to speed certain types of fold by ~5x without losing expressive power or sacrificing safety *if* used properly.  I'm pretty sure that with macros I wouldn't even have to expose the mutable side of them, and could just have a high-performance fold that yields a tuple in the end...but I guess it will depend on the exact details of the implementation.)
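
To give a flavour of what I mean, here's a heavily simplified sketch with
made-up names (not the actual Muple code):

final class Muple2[@specialized(Int, Long, Double) A,
                   @specialized(Int, Long, Double) B](var _1: A, var _2: B)

object Muple2 {
  // folds with a mutable, specialized accumulator internally; only an
  // immutable tuple ever escapes to the caller
  def foldPairs(xs: Array[Double])(f: (Muple2[Double, Int], Double) => Unit): (Double, Int) = {
    val acc = new Muple2(0.0, 0)
    var i = 0
    while (i < xs.length) { f(acc, xs(i)); i += 1 }
    (acc._1, acc._2)
  }
}

// e.g. sum and count in a single pass, with no tuple allocated per element:
//   Muple2.foldPairs(data)((m, x) => { m._1 += x; m._2 += 1 })

With macros the idea is that even the mutable accumulator parameter could be
hidden from the caller entirely.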
 
The name "macro" is a 100% guarantee for a marketing nightmare, so it shouldn't be made worse by shipping it without any substantial example which can be explained to developers.

One could call it something else, though I don't know that any of the alternatives are less scary marketing-wise.

  --Rex

√iktor Ҡlang

unread,
Feb 15, 2012, 4:58:19 PM2/15/12
to Rex Kerr, Simon Ochsenreither, scala-...@googlegroups.com, Ivan Todoroski, Daniel Sobral, Paul Brauner
Higher-order sourcecode?
 


--
Viktor Klang

Akka Tech Lead
Typesafe - The software stack for applications that scale

Twitter: @viktorklang

Tiark Rompf

unread,
Feb 16, 2012, 6:04:16 AM2/16/12
to Rex Kerr, Ivan Todoroski, Daniel Sobral, Paul Brauner, scala-debate
On Feb 15, 2012, at 9:57 PM, Rex Kerr wrote:

On Wed, Feb 15, 2012 at 3:25 PM, martin odersky <martin....@epfl.ch> wrote:
Macros are a more stable and
systematic way to achieve more convenient syntax than compiler
plugins. That's why I think they are promising. The thing that scares
me deeply is that some people will inevitably go overboard with them
and misuse them.

Code generators are even more scary, though, but right now the options are sometimes
  1. Have code that runs 30x slower than C++
  2. Repeat yourself a gazillion times
  3. Write a code generator
But picking (3) leaves one with generator code like


It's 2012. You can write code generators like this:

val distances = Stream[Double](data.numRows, data.numRows){ (i,j) => dist(data(i),data(j)) }
val densities = DenseVector[Int](data.numRows, true)

for (row <- distances.rows) {
  if (densities(row.index) == 0) {
    val neighbors = row find { _ < apprxWidth }
    densities(neighbors) = row count { _ < kernelWidth }
  }
}

and emit Scala code that runs as fast as C [1]
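
For anyone unfamiliar with the approach: operations on staged values build an
intermediate representation instead of computing results directly, and
optimized code is generated from that IR afterwards. A toy sketch of the idea
only, nothing like the real OptiML implementation:

object StagingSketch {
  sealed trait Rep[T]
  case class Const[T](value: T) extends Rep[T]
  case class Plus(a: Rep[Double], b: Rep[Double]) extends Rep[Double]
  case class Times(a: Rep[Double], b: Rep[Double]) extends Rep[Double]

  // arithmetic on staged doubles builds IR nodes instead of computing
  class DoubleOps(a: Rep[Double]) {
    def +(b: Rep[Double]): Rep[Double] = Plus(a, b)
    def *(b: Rep[Double]): Rep[Double] = Times(a, b)
  }
  implicit def doubleOps(a: Rep[Double]): DoubleOps = new DoubleOps(a)

  // "user code" written against Rep values just builds an AST:
  //   Times(Const(2.0), Const(3.0)) + Const(1.0)
  //   == Plus(Times(Const(2.0), Const(3.0)), Const(1.0))
  // which the framework is then free to optimize, parallelize and compile.
}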

- Tiark

Rex Kerr

unread,
Feb 16, 2012, 10:43:15 AM2/16/12
to Tiark Rompf, scala-debate
On Thu, Feb 16, 2012 at 6:04 AM, Tiark Rompf <tiark...@epfl.ch> wrote:

On Feb 15, 2012, at 9:57 PM, Rex Kerr wrote:

On Wed, Feb 15, 2012 at 3:25 PM, martin odersky <martin....@epfl.ch> wrote:
Macros are a more stable and
systematic way to achieve more convenient syntax than compiler
plugins. That's why I think they are promising. The thing that scares
me deeply is that some people will inevitably go overboard with them
and misuse them.

Code generators are even more scary, though, but right now the options are sometimes
  1. Have code that runs 30x slower than C++
  2. Repeat yourself a gazillion times
  3. Write a code generator
But picking (3) leaves one with generator code like


It's 2012. You can write code generators like this:

val distances = Stream[Double](data.numRows, data.numRows){ (i,j) => dist(data(i),data(j)) }
val densities = DenseVector[Int](data.numRows, true)

for (row <- distances.rows) {
  if (densities(row.index) == 0) {
    val neighbors = row find { _ < apprxWidth }
    densities(neighbors) = row count { _ < kernelWidth }
  }
}

and emit Scala code that runs as fast as C [1]

- Tiark

That is very cool (impressive performance in your test case with OptiML, though the C++ doesn't look *quite* optimally tuned to me (e.g. it allocates memory every outer iteration instead of just once per thread in order to make the parallelization easier)), but doesn't abstract over number of arguments and such, which is almost invariably why I end up writing code generators.

For example, the opOverElement mess that I wrote generates element-wise binary operations distributed over matrices or vectors of fixed size while handling the type conversions in a high-performance but sensible way.

Anyway, I would certainly not turn down the performance advantages of virtualization if they were available in the standard Scala distribution, but that alone wouldn't solve the 1-3 problem (though it might make the 30x closer to 5x if it meant that I could specialize collections).

  --Rex

Adam Jorgensen

unread,
Feb 17, 2012, 8:31:53 AM2/17/12
to scala-...@googlegroups.com
I don't really see the issue with having 2 or more projects that tackle a similar problem or set of problems. More projects equals more options and that's a good thing in my opinion.

Alex Repain

unread,
Feb 17, 2012, 9:24:36 AM2/17/12
to Adam Jorgensen, scala-...@googlegroups.com


2012/2/17 Adam Jorgensen <adam.jor...@gmail.com>

I don't really see the issue with having 2 or more projects that tackle a similar problem or set of problems. More projects equals more options and that's a good thing in my opinion.

But most probably you don't want both features enabled at the same time when using the language... At least not until there is proof that they won't screw each other up.

Adam Jorgensen

unread,
Feb 18, 2012, 1:14:19 PM2/18/12
to scala-...@googlegroups.com
I suppose. I would like the freedom to shoot myself in the foot if I so choose though :-)

Reminds me of the name of a C++ book I once saw: "C++: Enough rope to shoot yourself in the foot". Still makes me laugh :-)

Bernd Johannes

unread,
Feb 18, 2012, 4:15:58 PM2/18/12
to scala-...@googlegroups.com, Simon Ochsenreither

Am Mittwoch, 15. Februar 2012, 22:16:37 schrieb Simon Ochsenreither:


> The name "macro" is a 100% guarantee for a marketing nightmare, so it

> shouldn't be made worse by shipping it without any substantial example

> which can be explained to developers.


I can already imagine the headlines:


"after the longest suicide note in history scala has finally managed to fully erupt in insanity by adding #MACROS"


Well... to be honest - the first thing I thought when I heard "#MACROS" was "Oh dear - they won't do that - will they?". Now I know that we are talking about some substantially different things than those macros I was thinking about (flashback in the 80's... shudder).


But to me "macro" surely has a real negative bias. And I think I am not alone. So maybe its really worth thinking about names here. It won't be the first time in history where something is shipped with a label that's not as clear and technical as it could be (but most often for good reason - marketing is also a part of a product).


Greetings

Bernd


Chapin, Peter @ VTC

unread,
Feb 18, 2012, 5:09:34 PM2/18/12
to scala-...@googlegroups.com

I think “macro” is fine, personally. Yes, there are some bad connotations associated with the word in some communities, but then there’s Lisp (Scheme, Clojure) and other languages that use the word to mean something quite different from what it means to C/C++ people. We would just be joining the ranks of those other languages. No problem.

 

Peter

√iktor Ҡlang

unread,
Feb 18, 2012, 5:20:20 PM2/18/12
to Chapin, Peter @ VTC, scala-...@googlegroups.com
"source transformers"?

Daniel Sobral

unread,
Feb 18, 2012, 6:55:02 PM2/18/12
to √iktor Ҡlang, Chapin, Peter @ VTC, scala-...@googlegroups.com
Don't bend over trying to appease people who will judge you based on
their misconceptions. If they decide to misjudge it, THEY will call it
macros, no matter what name we choose.


Lukas Rytz

unread,
Feb 18, 2012, 7:02:53 PM2/18/12
to √iktor Ҡlang, Chapin, Peter @ VTC, scala-...@googlegroups.com
"source transformer def foreach(...) = ..."?
macro sounds good to me.

On Saturday, February 18, 2012, √iktor Ҡlang wrote:
"source transformers"?

On Sat, Feb 18, 2012 at 11:09 PM, Chapin, Peter @ VTC <PCh...@vtc.vsc.edu> wrote:

I think “macro” is fine, personally. Yes, there are some bad connotations associated with the word in some communities, but then there’s Lisp (Scheme, Clojure) and other languages that use the word to mean something quite different from what it means to C/C++ people. We would just be joining the ranks of those other languages. No problem.

 

Peter

 

From: scala-...@googlegroups.com [mailto:scala-...@googlegroups.com] On Behalf Of Bernd Johannes
Sent: Saturday, February 18, 2012 16:16
To: scala-...@googlegroups.com
Cc: Simon Ochsenreither
Subject: Re: [scala-debate] scala-macros vs scala-virtualized

 

Am Mittwoch, 15. Februar 2012, 22:16:37 schrieb Simon Ochsenreither:

 

> The name "macro" is a 100% guarantee for a marketing nightmare, so it

> shouldn't be made worse by shipping it without any substantial example

> which can be explained to developers.

 

I can already imagine the headlines:

 

"after the longest suicide note in history scala has finally managed to fully erupt in insanity by adding #MACROS"

 

Well... to be honest - the first thing I thought when I heard "#MACROS" was "Oh dear - they won't do that - will they?". Now I know that we are talking about some substantially different things than those macros I was thinking about (flashback in the 80's... shudder).

 

But to me "macro" surely has a real negative bias. And I think I am not alone. So maybe its really worth thinking about names here. It won't be the first time in history where something is shipped with a label that's not as

--

Martin McNulty

unread,
Feb 19, 2012, 3:56:18 AM2/19/12
to scala-debate
I can see it now:

    S O U R C E   T R A N S F O R M E R S
             compiler plugins in disguise


Alex Cruise

unread,
Feb 20, 2012, 2:09:58 PM2/20/12
to scala-debate
I think "compile-time metaprogramming" will be effectively bikeshed-proof.  :)

-0xe1a

Simon Ochsenreither

unread,
Feb 20, 2012, 2:23:16 PM2/20/12
to scala-...@googlegroups.com

“type loaders and language-quotations.” (http://confluence.jetbrains.net/display/Kotlin/FAQ)

Maybe the name could focus more on the benefit for users and less on the implementation details.

Erik Osheim

unread,
Feb 20, 2012, 2:28:29 PM2/20/12
to Simon Ochsenreither, scala-...@googlegroups.com
On Mon, Feb 20, 2012 at 11:23:16AM -0800, Simon Ochsenreither wrote:
> Maybe the name could focus more on the benefit for users and less on the
> implementation details.

So is that a vote for transmogrifier [1] then? ;)

Sorry, couldn't resist.

-- Erik

[1] http://calvinandhobbes.wikia.com/wiki/Transmogrifier

Naftoli Gugenheim

unread,
Feb 20, 2012, 6:40:51 PM2/20/12
to Daniel Sobral, √iktor Ҡlang, Chapin, Peter @ VTC, scala-...@googlegroups.com
On Saturday, February 18, 2012, Daniel Sobral wrote:
Don't bend over trying to appease people who will judge you based on
their misconceptions. If they decide to misjudge it, THEY will call it
macros, no matter what name we choose.


If the issue is whether we should be offended by such misjudgements, or whether we should be friends with such people, or alternatively if the issue was only people who are purposely trying to scare people away from scala, then that would be a valid argument.

But there is another issue, probably a more important one. A name like "macro" has the potential to easily scare away many people who simply don't know better, people who don't have much of an impression of scala yet and will now have a negative one simply due to misunderstanding.

Also, the field of advertising is all about how things impact people subconsciously, so I can't really comment, but possibly that is relevant here. It could also be that most people, most of the time, don't think deeply into most things but walk away with whatever impression they happen to have formed; and when the conscious mind isn't deciding what impression to form, the subconscious mind is. Certainly in this case, where it's not a matter of subconscious vs. conscious but of a first-glance impression or a skimmed screenful vs. reading up on it properly, there could be many people out there who would pass over Scala due to simple misunderstanding.

Naftoli Gugenheim

unread,
Feb 20, 2012, 6:41:21 PM2/20/12
to Lukas Rytz, √iktor Ҡlang, Chapin, Peter @ VTC, scala-...@googlegroups.com
Perhaps we should keep macro as the official technical name, and keyword, but any "headlines," list of features, etc. (i.e., anything that doesn't require reading a full explanation to walk away with the message) shouldn't introduce it via the word "macro." For instance something like "Scala gets a lightweight alternative to compiler plugins (termed macros after their counterparts in languages like Nemerle, Lisp, etc.)" (I'm sure someone can word that better but hopefully the idea is clear).

Cédric Beust ♔

unread,
Feb 20, 2012, 6:48:49 PM2/20/12
to Naftoli Gugenheim, Daniel Sobral, √iktor Ҡlang, Chapin, Peter @ VTC, scala-...@googlegroups.com
On Mon, Feb 20, 2012 at 3:40 PM, Naftoli Gugenheim <nafto...@gmail.com> wrote:
But there is another issue, probably a more important one. A name like "macro" has the potential to easily scare away many people who simply don't know better, people who don't have much of an impression of scala yet and will now have a negative one simply due to misunderstanding.

Case in point: in their latest episode, the Java Posse was going through an old article giving several reasons why "Java sucks"; one of them was that it "doesn't have macros". Everybody on the Posse agreed that this was actually a good thing, that we really, really didn't want macros.

It took me a few seconds to realize that they were all (both the author of the article and the Posse dudes) talking about C-style macros, not Lisp-style ones.

For better or for worse, there are many, many more people who associate "macros" with #define and #ifdef than with Lisp macros.

-- 
Cédric

Josh Suereth

unread,
Feb 20, 2012, 7:04:17 PM2/20/12
to Naftoli Gugenheim, scala-...@googlegroups.com, √iktor Ҡlang, Chapin, Peter @ VTC, Lukas Rytz

I like the term "typesafe macros" which implies more magikz and awesomez than just "macro"

Dave Griffith

unread,
Feb 20, 2012, 7:31:15 PM2/20/12
to scala-debate

There's a real difference here between C-style macros and Lisp-style-
but-typesafe macros. C-style macros mostly cover use cases that Scala
handles with closures, by-name parameters, continuations, monads, or
the raw, throbbing power of the JVM JIT optimizer. They are mostly
remembered poorly, probably because most of the macros actually
created were pretty damn poor. I remember some exception-handling
macros back in the day, wrapping setjmp/longjmp. Gah, it was like
watching a dog try to play the trombone. If I was coming to Scala
anew and heard it had macros, I could easily assume it was because
macros were a workaround for the sort of functionality it was stuck
with. Not worth my time. OTOH, if I knew Lisp, and had heard that
some fancy new Lisp followers had made their macros type-safe, well
that's a horse of a different color.

"Compile-time type-safe metaprogramming". Make 'em interested, not
ill.

--Dave

On Feb 20, 7:04 pm, Josh Suereth <joshua.suer...@gmail.com> wrote:
> I like the term "typesafe macros" which implies more magikz and awesomez
> than just "macro"

Daniel Sobral

unread,
Feb 21, 2012, 8:23:21 AM2/21/12
to Cédric Beust ♔, Naftoli Gugenheim, √iktor Ҡlang, Chapin, Peter @ VTC, scala-...@googlegroups.com
2012/2/20 Cédric Beust ♔ <ced...@beust.com>:

Which is a bit ridiculous, since they are pre-processor directives.
C++ templates are much more like real macros.

Justin du coeur

unread,
Feb 21, 2012, 10:16:10 AM2/21/12
to scala-...@googlegroups.com
On Mon, Feb 20, 2012 at 7:04 PM, Josh Suereth <joshua....@gmail.com> wrote:

I like the term "typesafe macros" which implies more magikz and awesomez than just "macro"

Honestly, having done a fair number of years in the C++ coal mines, I don't react nearly as badly to macros as some people seem to.  They were far from wonderful, but they proved a thousand times to be critical to making the language usable, by cutting the boilerplate down to manageable levels.  An incredibly dangerous tool, and a pain in the ass when it failed, but not such a horror.

So to me, the phrase "typesafe macros" comes across as pretty intriguing.  Simply adding the word "typesafe" tells me that these aren't your father's macros.  And any power programmer knows that metaprogramming can be damned useful sometimes... 

Paul Brauner

unread,
Feb 21, 2012, 10:27:55 AM2/21/12
to Justin du coeur, scala-...@googlegroups.com
"typesafe macros" makes me immediately think of things like meta ocaml or lightweight multi staged scala: macros that will be executed at runtime but for which we have a proof they will generate well-typed (and valid) code. This is no surprise compile time macros are typesafe: they just generate code that will be fed to the typechecker. 

Justin du coeur

unread,
Feb 21, 2012, 10:34:03 AM2/21/12
to scala-...@googlegroups.com
On Tue, Feb 21, 2012 at 10:27 AM, Paul Brauner <polu...@gmail.com> wrote:
"typesafe macros" makes me immediately think of things like meta ocaml or lightweight multi staged scala: macros that will be executed at runtime but for which we have a proof they will generate well-typed (and valid) code. This is no surprise compile time macros are typesafe: they just generate code that will be fed to the typechecker. 

True, of course.  That said, Scala by its nature doesn't lay you anywhere near the sorts of traps that C++ does with its macro system.  (Where it is fairly easy to do things that are technically legal, and will compile okay, but don't actually make sense...)

√iktor Ҡlang

unread,
Feb 21, 2012, 10:34:28 AM2/21/12
to Paul Brauner, Justin du coeur, scala-...@googlegroups.com
I think they're fant-AST-ic

√iktor Ҡlang

unread,
Feb 21, 2012, 10:36:44 AM2/21/12
to Paul Brauner, Justin du coeur, scala-...@googlegroups.com
How about Compiler Pluglets


Naftoli Gugenheim

unread,
Feb 22, 2012, 11:17:54 PM2/22/12
to Paul Brauner, Justin du coeur, scala-...@googlegroups.com
Well C++ macros are compile-time, so doesn't "macro" convey that much?

Simon Ochsenreither

unread,
Feb 23, 2012, 7:30:58 AM2/23/12
to scala-...@googlegroups.com, Paul Brauner, Justin du coeur
Maybe macros don't need to be that prominently exposed if there were "real world" features using them, e.g. show off the features built on top of macros rather than the macros themselves?

ven

unread,
Feb 23, 2012, 8:53:41 AM2/23/12
to scala-debate
I think the name should correspond to the syntax, and for that a single
term is needed. What about "template" or "stencil"?



Bernd Johannes

unread,
Feb 24, 2012, 3:16:02 AM2/24/12
to scala-...@googlegroups.com, Daniel Sobral, Cédric Beust ♔, Naftoli Gugenheim

That's right. But since we're in debate I can add noise...


The point was raised that FUD preachers will call it "macro" no matter what Scala calls it.


But most of the people (supposed to be initially interested) will ask themselves "why don't they call it 'macro' if it is a macro?" and will try to find an answer.


Based on their prior knowledge they will either come to the conclusion "well, it's about macros", because they expect macros to be done the Scala way. Then they won't misjudge macros, because they don't mistake them for text preprocessing.


But if they think of macros as "source text mangling", they will discover (for the better) that Scala macros are something else entirely.


So we will have a faction which complains about "old wine in new bottles" - which is not too bad. And we will have a faction which is NOT scared away by a prejudice that would most likely never have been revisited anyway.


And we won't have to write another ton of blog posts on "why Scala macros are not C-style macros"... which would only receive responses like "why does Scala do everything differently, just to bewilder everyone".


Just my 5 cents

Bernd


Dennis Haupt

unread,
Feb 24, 2012, 4:07:47 AM2/24/12
to Bernd Johannes, scala-...@googlegroups.com, nafto...@gmail.com, ced...@beust.com, dcso...@gmail.com
just call it metaprogramming


Daniel Sobral

unread,
Feb 24, 2012, 5:35:56 AM2/24/12
to Josh Suereth, Naftoli Gugenheim, scala-...@googlegroups.com, √iktor Ҡlang, Chapin, Peter @ VTC, Lukas Rytz
Surely you meant hygienic macros? :-)

--

√iktor Ҡlang

unread,
Feb 24, 2012, 5:43:30 AM2/24/12
to Daniel Sobral, Josh Suereth, Naftoli Gugenheim, scala-...@googlegroups.com, Chapin, Peter @ VTC, Lukas Rytz


2012/2/24 Daniel Sobral <dcso...@gmail.com>

Surely you meant hygienic macros? :-)

Let's call em Hacros

Dennis Haupt

unread,
Feb 24, 2012, 5:46:46 AM2/24/12
to Daniel Sobral, joshua....@gmail.com, lukas...@epfl.ch, PCh...@vtc.vsc.edu, viktor...@gmail.com, scala-...@googlegroups.com, nafto...@gmail.com
custom code sugar
programmable code monkey
static macros
code generation
coding for lazy people
elegance enhancer
code mapper



Kevin Wright

unread,
Feb 24, 2012, 6:26:28 AM2/24/12
to Dennis Haupt, Daniel Sobral, joshua....@gmail.com, lukas...@epfl.ch, PCh...@vtc.vsc.edu, viktor...@gmail.com, scala-...@googlegroups.com, nafto...@gmail.com
How about "transformers"?

Alec Zorab

unread,
Feb 24, 2012, 7:10:34 AM2/24/12
to Kevin Wright, Dennis Haupt, Daniel Sobral, joshua....@gmail.com, lukas...@epfl.ch, PCh...@vtc.vsc.edu, viktor...@gmail.com, scala-...@googlegroups.com, nafto...@gmail.com
Macros in disguise?

Daniel Sobral

unread,
Feb 24, 2012, 7:47:28 AM2/24/12
to Alec Zorab, Kevin Wright, Dennis Haupt, joshua....@gmail.com, lukas...@epfl.ch, PCh...@vtc.vsc.edu, viktor...@gmail.com, scala-...@googlegroups.com, nafto...@gmail.com
Ignoring all this definition-muddling exercise, these macros _are_
hygienic, are they not?

Josh Suereth

unread,
Feb 24, 2012, 8:42:56 AM2/24/12
to Dennis Haupt, Bernd Johannes, scala-...@googlegroups.com, nafto...@gmail.com, ced...@beust.com, dcso...@gmail.com
Because Boost:mpl never scared anyone off.

Oleg Ilyenko

unread,
Feb 25, 2012, 7:59:28 AM2/25/12
to scala-debate
What do you think about the name "AST Transformations"? It nicely
captures the nature of Scala macros, which really just transform one AST
into another. There is also no legacy baggage around this term, so people
will not mistake it for something like C/C++ macros.

On the other hand, this term can discourage or scare normal Scala
developers from using the feature. I don't think that many people would
think this way when writing a normal application (not a library): "Good,
I've got this functionality... should I write a function... or maybe it
would be better to extract this code into some class or trait... no, I
definitely need an AST transformation here" :) I think for most people
(including myself) the term "AST" is connected to language/compiler
implementation, so they don't need to worry much about it, or about the
fact that they don't fully understand it or aren't familiar with it.

One good example of the term "AST Transformations" in use is Groovy. I
have not heard people complaining about it.

Of course, some nice prefixes can be added to it like: "Typesafe AST
Transformations" or "Compile-time AST Transformations".

Best regards,
Oleg

On Feb 15, 10:16 pm, Simon Ochsenreither
<simon.ochsenreit...@googlemail.com> wrote:
> Hi,
>
> in my opinion the fear that people abuse macros would be substantially
> reduced if it ships with some real world libraries showing _when_ and _how_
> to use them idiomatically.
>
> The most unfortunate thing to do would be shipping it as a feature in
> search of a problem. This was imho the case with scala.Dynamic. Dynamic
> appeared with some interesting prototypes of "possible" things, but I
> haven't seen a single reasonable "real world" usage of it yet.
>
> In the end I think it would be more reasonable to go ahead and release 2.10
> without macros (or behind a compiler switch like it currently is) and
> officially ship macros when they are ready. (Ready == "there is a real
> library solving a real problem using it, which can be recommended to
> developers without having to worry")
>
> The name "macro" is a 100% guarantee for a marketing nightmare, so it
> shouldn't be made worse by shipping it without any substantial example
> which can be explained to developers.
>
> Thanks and bye,
>
> Simon

Simon Ochsenreither

unread,
Feb 25, 2012, 8:11:22 AM2/25/12
to scala-...@googlegroups.com
Tree Transformations? Or maybe Tree Expressions? Expression Trees (like C#)?

John Nilsson

unread,
Feb 25, 2012, 12:34:12 PM2/25/12
to Simon Ochsenreither, scala-...@googlegroups.com

Expression Trees are a runtime thing though.

While on the topic of C# expressions: I really like having the feature available, but in my experience they have one major flaw (besides being severely limited) that contaminates the entire language, in that they look like ordinary code. This makes it easy to fall into the trap of assuming ordinary semantics and then being surprised when they don't apply. Unfortunately it means that you must be suspicious of any code you look at, which creates somewhat of a mental burden.

Now, Scala already has this problem with its uniform and free syntax, but adding this feature to the mix can potentially make the issue worse. It's like the operator overloading problem on steroids.

So some thought should be directed at mitigating this problem when introducing the feature. Either some syntactic delimiter or convention to signal "here be dragons", or maybe some IDE feature to highlight such code, perhaps with a change of font or letter spacing.

BR,
John

Alex Repain

unread,
Feb 25, 2012, 1:33:43 PM2/25/12
to John Nilsson, Simon Ochsenreither, scala-...@googlegroups.com


2012/2/25 John Nilsson <jo...@milsson.nu>

Expression Trees are a runtime thing though.

While on the topic of C# expressions: I really like having the feature available, but in my experience they have one major flaw (besides being severely limited) that contaminates the entire language, in that they look like ordinary code. This makes it easy to fall into the trap of assuming ordinary semantics and then being surprised when they don't apply. Unfortunately it means that you must be suspicious of any code you look at, which creates somewhat of a mental burden.

Now, Scala already has this problem with its uniform and free syntax, but adding this feature to the mix can potentially make the issue worse. It's like the operator overloading problem on steroids.

So some thought should be directed at mitigating this problem when introducing the feature. Either some syntactic delimiter or convention to signal "here be dragons", or maybe some IDE feature to highlight such code, perhaps with a change of font or letter spacing.


Agreed, but an IDE feature is not part of the language itself. Be aware that a lot of developers (or other programmers) don't use the so-called IDEs. I'd prefer to see the warning sign even when using vi or some other ultra-basic text editor. That leaves only language syntax or some annotation-based system.

As for the name, 'AST transformations' seems nicer to me than the other alternatives, although it also sounds like a pro-only feature. So my question is: will the 'basic' developer need macros that much? If not, it's okay to call it whatever we want, and just not over-advertise it, which could be counter-productive.

Chris Marshall

unread,
Feb 26, 2012, 10:46:42 PM2/26/12
to simon.och...@googlemail.com, scala-...@googlegroups.com
It's quite clear to me that they should be called recursionsplursions



Mirko Stocker

unread,
Feb 27, 2012, 2:34:29 AM2/27/12
to scala-...@googlegroups.com
On Saturday 25 February 2012 04:59:28 Oleg Ilyenko wrote:
> or "Compile-time AST Transformations".

CATs! Nobody could object to Scala supporting cats, right?

Cheers,

Mirko

--
Mirko Stocker | m...@misto.ch
Work: http://ifs.hsr.ch | http://infoq.com
Personal: http://misto.ch | http://twitter.com/m_st

Kevin Wright

unread,
Feb 27, 2012, 3:07:13 AM2/27/12
to Mirko Stocker, scala-...@googlegroups.com
Even better... If anyone wants to come along and criticise the feature, then the obvious insult is "kittens", which still sounds good! 

Stefan Zeiger

unread,
Feb 27, 2012, 3:55:19 AM2/27/12
to scala-...@googlegroups.com
On 2012-02-27 8:34, Mirko Stocker wrote:
> On Saturday 25 February 2012 04:59:28 Oleg Ilyenko wrote:
>> or "Compile-time AST Transformations".
> CATs! Nobody could object to Scala supporting cats, right?

That gives a whole new meaning to
http://spl.smugmug.com/Humor/Lambdacats/catt/960526154_fqpKB-O-1.jpg

-sz

Dave

unread,
Feb 27, 2012, 7:22:10 AM2/27/12
to scala-debate

Dave

unread,
Feb 27, 2012, 7:28:21 AM2/27/12
to scala-debate
And a catmacro <: image macro
and the name "macro" comes from computer science macros.
So that closes the circle.

http://en.wikipedia.org/wiki/Image_macro

"EtymologyThe term "image macro" was first used on the Something Awful
forums.[3] The name derived from the fact that the 'macros' were a
short bit of text a user could enter that the forum software would
automatically parse and expand into the code for a pre-defined image,
[3] relating to the computer science topic of a macro, defined as "a
rule or pattern that specifies how a certain input sequence (often a
sequence of characters) should be mapped to an output sequence (also
often a sequence of characters) according to a defined procedure.""



On 27 feb, 13:22, Dave <dave.mahabiers...@hotmail.com> wrote:
> lolcat = catmacro
> http://en.wikipedia.org/wiki/Lolcat

Dave

unread,
Feb 27, 2012, 11:09:15 AM2/27/12
to scala-debate
>LoL is also the title of a book about macros: http://letoverlambda.com/
>
>BR
> John
LoL = Let over Lambda
Thanks for the link. Didn't know that.

If lol cat = cat macro
then
lol = macro

Instead of

def macro tree(a: Any) = {
  val util = Util(_context); import util._
  reify(a)
}

def lol tree(a: Any) = {
  val util = Util(_context); import util._
  reify(a)
}

maybe as an easter egg.

Also

The LOLCODE programming language is proven Turing complete
(see: http://en.wikipedia.org/wiki/LOLCODE)
and has a sort of lolspeak syntax.

I find the error handling example the funniest:
HAI
CAN HAS STDIO?
PLZ OPEN FILE "LOLCATS.TXT"?
AWSUM THX
VISIBLE FILE
O NOES
INVISIBLE "ERROR!"
KTHXBYE





√iktor Ҡlang

unread,
Feb 27, 2012, 11:12:08 AM2/27/12
to John Nilsson, scala-debate
John, where did our LOLCODE Scala combinator parser go?
 

John Nilsson

unread,
Mar 1, 2012, 8:11:02 PM3/1/12
to √iktor Ҡlang, scala-debate
2012/2/27 √iktor Ҡlang <viktor...@gmail.com>

John, where did our LOLCODE Scala combinator parser go?

Well, it wasn't exactly "complete"... but here you go:

package lolz

import scala.util.parsing.combinator._

trait LolAST
trait Expression extends LolAST
trait Statement extends LolAST
case class Variable(name:String) extends Expression
case class StringLit(s:String) extends Expression
case class VarDecl(v:Variable) extends Statement
case class VarAssign(v:Variable,expr:Expression) extends Statement
case class Gimmeh(v:Variable) extends Statement
case class Visible(e:Expression) extends Statement

object LolcodeParser extends JavaTokenParsers with PackratParsers  {

    def pgm = "HAI"~>(((stmt)+)<~"KTHXBYE")
   
    def stmt:Parser[Statement] = varDecl | varAssign | gimmeh | visible
       
    def varDecl:Parser[VarDecl] = "I"~"HAS"~"A"~>variable ^^ VarDecl
   
    def varAssign:Parser[VarAssign] = variable~("R"~>expr) ^^ {
        case v~e => VarAssign(v,e)
    }
   
    def gimmeh:Parser[Gimmeh] = "GIMMEH"~>variable ^^ Gimmeh
    def visible:Parser[Visible] = "VISIBLE"~>expr ^^ Visible
   
    def expr:Parser[Expression] = (variable | stringLiteral) ^^ {
        case v:Variable => v
        case s:String => StringLit(s)
    }
   
    def variable:Parser[Variable] = ident ^^ Variable
   
    def parse(s:String) = parseAll(pgm,new java.io.StringReader(s))
}
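
It parses just enough for a demo; a rough check (output sketched from
memory, not verified):

//   LolcodeParser.parse("HAI I HAS A NAME GIMMEH NAME VISIBLE NAME KTHXBYE")
// should succeed with something like:
//   List(VarDecl(Variable(NAME)), Gimmeh(Variable(NAME)), Visible(Variable(NAME)))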

Shelby

unread,
Sep 27, 2013, 9:47:46 AM9/27/13
to scala-...@googlegroups.com, Ivan Todoroski, Daniel Sobral, Rex Kerr, Paul Brauner
On Thursday, February 16, 2012 4:25:38 AM UTC+8, Martin wrote:
On Wed, Feb 15, 2012 at 9:00 PM, Ivan Todoroski <grnch...@gmx.net> wrote:
> Hi Martin,
>
> Who do you see as the target audience for macros?
>
> a) regular rank-and-file developers looking to reduce repeated boiler plate
> and create simple DSLs specific to their projects
>
> b) hardcore library designers who in the absence of macros would have
> resorted to compiler plugins for their advanced tricks, but will now use
> macros for that purpose
>
>
> If it's A, what is your view on the problems raised in the first 3 points of
> the original email that started this thread, namely having to learn an
> arcane AST sub-language to create macros and the difficulty of debugging
> macros?
>
> This would seem to raise the threshold on the amount of boilerplate and
> duplication a regular developer would tolerate before finally sitting down
> to learn the vagaries of macros to reduce it.
>
>
> If it's B, then what should the busy regular developer use for reducing
> boiler plate and creating simple DSLs for their work, if macros remain
> generally difficult for them to use?
>
I think it's rather B. And, Scala already has a lot of mechanisms that
reduce boilerplate even without resorting to macros and scala
virtualized. So I am not at all concerned that the bar for using
either is high. In fact, I'd prefer it that way.

I see macros as a useful middle road. Every language designer is faced
with a constant stream of suggestions to enlarge the language. And a
lot of these make sense. But following them would make the language
bigger and that's something I am very reluctant to do. If at all
possible, I'd like to make it smaller! Macros are a more stable and
systematic way to achieve more convenient syntax than compiler
plugins. That's why I think they are promising. The thing that scares
me deeply is that some people will inevitably go overboard with them
and misuse them.

It's a conundrum. Scala's philosophy is to make things possible, and
not erect walls against misuse. Trust the competency of your
programmers. But posts like this one make me realize the danger of
doing this:

  http://yz.mit.edu/wp/true-scala-complexity/

It seems that everyone will at some point in their development as a
developer reach "peak abstraction",

  http://lambda-the-ultimate.org/node/4442

and Scala's problem is that peak abstraction is so much higher (and
therefore more scary) than in other languages. The question I am
grappling with is, how can we get the advantages of macros without the
potential for misuse?


If macros transform some Scala syntax to some other Scala syntax, and the IDE allows toggling to the transformed syntax and debugging it, then how can it be abused in any way that Scala syntax could not already be abused?

As it stands now, the output of the macro can't be seen or debugged as source code, so as far as I can estimate, that could make understanding and using macros difficult in some cases.

Are there any plans and/or ETA on such IDE support? Is the output of the AST transformation not always presentable in Scala syntax?
 

Cheers

 -- Martin

>
> I do appreciate your response regarding plans for inclusion of scala-macros
> vs scala-virtualized, it clarifies the direction of Scala somewhat.
>
>
>
>
> On 14.02.2012 21:56, martin odersky wrote:
>>
>> Some perspective: First, scala-virtualized and Scala macros are both
>> experimental at the present stage. The discussion on this thread has
>> already worked out the main points of differences, so I won't go into
>> that.
>>
>> Scala-virtualized is a research project, done at EPFL and Stanford. We
>> hope that some number of features will make it into main Scala.
>> Adriaan's pattern matcher looks like an excellent first candidate, and
>> other elements might follow. But there's no a priori intention to
>> migrate all elements of scala virtualized.
>>
>> Macros have a shorter term horizon. There will be a SIP soon, and if
>> things go well we might see them soon in trunk, probably enabled by
>> some flag.
>>
>> The intention of macros, and of Scala's language design in general, is
>> to simplify things. We have already managed to replace code lifting by
>> macros, and hopefully other features will follow. For instance, there
>> was a strong push to somehow eliminate the implicit parameter in
>>
>>  atomic { implicit transaction => ... }
>>
>> and several other related situations. With macros this is trivial.
>> Without them, it requires a sophisticated dance with scoping rules.
>> Optimizations such as on Range.foreach are another use case. I believe
>> that in the long run macros can be a simplifying factor. So, in my
>> mind, there is enough evidence to try out the idea. But the
>> implementation is considered experimental at present, and we do
>> welcome critical discussion around the SIP once it appears.
>>
>> Cheers
>>
>>  -- Martin
>>
>

--
Martin Odersky
Prof., EPFL and Chairman, Typesafe
PSED, 1015 Lausanne, Switzerland
Tel. EPFL: +41 21 693 6863
Tel. Typesafe: +41 21 691 4967

Eugene Burmako

unread,
Sep 30, 2013, 11:13:43 AM9/30/13
to scala-...@googlegroups.com, Ivan Todoroski, Daniel Sobral, Rex Kerr, Paul Brauner, Alexander Podkhalyuzin
There are plans to provide IDE support, but the ETA is not yet clear. There's a prototype debugger for Intellij [1], though I'm not sure what's going to happen with it in the future.
