Global has a macro definition

Grzegorz Kossakowski

Aug 7, 2013, 3:47:18 AM
to scala-internals
Hi,

I was just testing the improved incremental compilation algorithm with the Scala compiler code base.

In particular, I added some more debug logging to the incremental compiler, which results in this line when you compile the Scala compiler:

[debug] Public API is considered to be changed because /Users/grek/scala/scala-master/src/compiler/scala/tools/nsc/Global.scala contains a macro definition.

I was very confused because I thought we hadn't added any macros to the Scala compiler code base. It turns out that Global inherits macro definitions from SymbolTable, which inherits them from macros.Universe, which inherits them from reflect.api.Universe.

In reflect.api.Universe we have the following macro defined:

def reify[T](expr: T): Expr[T] = macro ???

Also, Universe inherits from Quasiquotes, which also has two macros defined. That's how Global ends up having macros defined.

Why is this important? Well, the line quoted above gives you a hint. If a given file has a macro definition then sbt will always assume that the API of that file has changed, and all files that depend on it have to be recompiled. Sbt has to assume that because we have no way to tell whether a macro implementation has changed in a way that affects its clients.
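The invalidation rule described above can be sketched as follows (a hypothetical simplification with invented names, not sbt's actual implementation):

```scala
// Hedged sketch of the rule described above: the names (SourceApi, apiChanged,
// hasMacroDefinition) are invented, and this is a toy simplification.
case class SourceApi(hash: Int, hasMacroDefinition: Boolean)

def apiChanged(before: SourceApi, after: SourceApi): Boolean =
  // A file with a macro definition is always treated as changed, because
  // there is no way to tell whether the macro's expansion behaviour changed.
  after.hasMacroDefinition || before.hash != after.hash
```

Under this rule, even an edit that leaves the API hash untouched invalidates every downstream file as long as the source contains a macro definition.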

This results in sbt recompiling all 290 source files upon a whitespace change in Global. Given that sbt doesn't track macro usage and it would be relatively difficult to add that functionality (I don't plan to do that), I wonder if we could restructure our code so that macros like reify do not end up being members of Global.

I also wonder if anybody was aware that Global defines macros. I was surprised.

Thoughts?

--
Grzegorz Kossakowski
Scalac hacker at Typesafe
twitter: @gkossakowski

Eugene Burmako

Aug 7, 2013, 4:02:32 AM
to scala-i...@googlegroups.com
Theoretically we could move both reify and quasiquotes away from the reflection cake, but in practice this hits a number of limitations of path-dependent type support in scalac.

For example, [1] shows what we have to go through with TreeCreators/TypeCreators that have to live outside the cake. And that's just EXPRmode, where we can work around a lot of warts by casting. There also exists PATTERNmode (important for quasiquotes), where our hands are basically tied.

Along with the solution that you've mentioned, there also exists the one that involves refactoring reflection out of the compiler's cake, but it's also a long shot.

[1] https://github.com/xeno-by/scala/blob/e9ccb416b307d853120411572a57cb57867a9afc/src/compiler/scala/reflect/reify/utils/Extractors.scala#L11

Jason Zaugg

Aug 7, 2013, 4:18:26 AM
to scala-i...@googlegroups.com
On Wed, Aug 7, 2013 at 9:47 AM, Grzegorz Kossakowski <grzegorz.k...@gmail.com> wrote:
This results in sbt recompiling all 290 source files upon a whitespace change in Global. Given that sbt doesn't track macro usage and it would be relatively difficult to add that functionality (I don't plan to do that)

How about we add an optimistic mode to sbt when faced with the opacity of macros? If a macro implementation changes, the user would need to clean manually. Even with the current pessimistic approach, that will sometimes be required, as the macro implementation can be in a different file from the macro def.

-jason

Denys Shabalin

Aug 7, 2013, 4:34:38 AM
to scala-i...@googlegroups.com
We originally tried having quasiquotes outside of the cake, but as it turned out, due to poor support for path-dependent types (you have to pass the universe around manually), we got:

1) an extremely verbose API with a significant amount of boilerplate that just about negated most of the simplicity of quasiquotes;
2) a lot of trouble with casting and making path-dependent prefixes line up.

The situation would change if handling of path-dependent types improved on the Scala side. There was a relevant proposal to introduce double-bracket parameters that would notify the compiler that the universe is a path-dependent prefix to be inferred:

def transform[[u: api.Universe]](tree: u.Tree): u.Tree = ...

This way the compiler will know that `u` should be inferred as the prefix of `tree` if you don't provide it manually.
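For contrast, here is a minimal sketch of the status quo the proposal would improve: without inferred prefixes, the universe has to be threaded through explicitly (toy classes, not the real reflection API):

```scala
// Toy model of the problem: without prefix inference, callers must pass the
// universe explicitly so the path-dependent types line up.
object api { class Universe { class Tree } }

// `u` must be an explicit first parameter just so `u.Tree` can be named.
def transform(u: api.Universe)(tree: u.Tree): u.Tree = tree

val u1 = new api.Universe
val t1 = new u1.Tree
val r: u1.Tree = transform(u1)(t1) // the universe is supplied by hand
```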


Grzegorz Kossakowski

Aug 7, 2013, 1:02:03 PM
to scala-internals
As a short-term solution this is probably a good idea. What would be the default for that switch?

However, I really dislike software switches. I believe it's poor software design if one has to configure software to work properly. It basically shifts the burden from the software to the user. I really don't expect people coding in Scala to realize that the incremental compiler is recompiling too much, read debug logs to see that macros they don't even see are involved, and then figure out some obscure option. I'd say that until the incremental compiler gets proper macro support, which involves analyzing macro bodies, we should not support macros in any special way in incremental compilation, and tell macro writers that they just have to clean all the time when they work on macro implementations. WDYT?

Josh Suereth

Aug 7, 2013, 1:12:51 PM
to scala-internals
I think this is kind of a poor choice. I like Jason's idea. I also think we should try to add special tracking for macros. That should probably be a top priority given the prevalence of macros everywhere (e.g. Play's JSON support). I'd hate to have people unable to parse their JSON because a macro was not re-run upon a change to a class file... (Perhaps we should first validate whether the current incremental compiler has issues here.)

The incremental compiler should, ideally, always just be an optimisation over running scalac on all sources. If we produce incorrect compilations we're only asking to tick off users. I'd rather have the incremental compiler recompile the world for a macro (and make people avoid them if they're so bad) than have stale code and hard-to-track-down bugs because the incremental compiler chose speed over correctness.


- Josh

Grzegorz Kossakowski

Aug 7, 2013, 1:17:31 PM
to scala-internals
On 7 August 2013 01:34, Denys Shabalin <denys.s...@typesafe.com> wrote:
We originally tried having quasiquotes outside of the cake, but as it turned out, due to poor support for path-dependent types (you have to pass the universe around manually), we got:

1) an extremely verbose API with a significant amount of boilerplate that just about negated most of the simplicity of quasiquotes;
2) a lot of trouble with casting and making path-dependent prefixes line up.

The situation would change if handling of path-dependent types improved on the Scala side. There was a relevant proposal to introduce double-bracket parameters that would notify the compiler that the universe is a path-dependent prefix to be inferred:

def transform[[u: api.Universe]](tree: u.Tree): u.Tree = ...

This way the compiler will know that `u` should be inferred as the prefix of `tree` if you don't provide it manually.

Could we factor out quasiquotes (at least the macros in them) and the reify macro so that they are defined in a class that is baked into the cake through composition and not inheritance?

The problem is not that those macros end up being in the cake in some manner. The problem is that they are inherited and end up being members of Global itself, just as 1562 other members do:

Welcome to Scala version 2.10.2 (Java HotSpot(TM) 64-Bit Server VM, Java 1.6.0_39).
Type in expressions to have them evaluated.
Type :help for more information.

scala> :power

scala> typeOf[scala.tools.nsc.Global]
res0: $r.intp.global.Type = scala.tools.nsc.Global

scala> res0.members.size
res1: Int = 1562

Paul Phillips

Aug 7, 2013, 1:20:38 PM
to scala-i...@googlegroups.com

On Wed, Aug 7, 2013 at 10:17 AM, Grzegorz Kossakowski <grzegorz.k...@gmail.com> wrote:
Could we factor out quasiquotes (at least the macros in them) and the reify macro so that they are defined in a class that is baked into the cake through composition and not inheritance?

Although one has to apply a low standard for what constitutes composition, that is of course what's going on with the dozens of objects in Global. Presumably, if it works for them, it will work here.

  // phaseName = "flatten"
  object flatten extends {
    val global: Global.this.type = Global.this
    val runsAfter = List("constructors")
    val runsRightAfter = None
  } with Flatten

  // phaseName = "mixin"
  object mixer extends {
    val global: Global.this.type = Global.this
    val runsAfter = List("flatten", "constructors")
    val runsRightAfter = None
  } with Mixin

  // phaseName = "cleanup"
  object cleanup extends {
    val global: Global.this.type = Global.this
    val runsAfter = List("mixin")
    val runsRightAfter = None
  } with CleanUp

  // phaseName = "icode"
  object genicode extends {
    val global: Global.this.type = Global.this
    val runsAfter = List("cleanup")
    val runsRightAfter = None
  } with GenICode

 // etc etc etc

Eugene Burmako

Aug 7, 2013, 1:21:47 PM
to scala-i...@googlegroups.com
You mean that users will then have to write `c.something.reify` or put an additional import to use quasiquotes?


Paul Phillips

Aug 7, 2013, 1:25:09 PM
to scala-i...@googlegroups.com

On Wed, Aug 7, 2013 at 10:21 AM, Eugene Burmako <eugene....@epfl.ch> wrote:
You mean that users will then have to write `c.something.reify` or put an additional import to use quasiquotes?

You might be able to implicit your way there, see:


Mark Harrah

Aug 7, 2013, 1:26:29 PM
to scala-i...@googlegroups.com
There is already special tracking for macros. It is what Jason describes as pessimistic (although, as he indicates, it is really partly optimistic and partly pessimistic). The current algorithm should work fine for macro consumers for the most part. The problem is when developing a macro in a multi-project build.

> The incremental compiler should, ideally, always just be an optimisation
> over running scalac on all sources.

Yes, this is the goal.

> If we lead to incorrect compilations
> we're only asking to tick of users. I'd rather have the incremental
> compiler recompile the world for a macro (and make people avoid them if
> they're so bad) than have stale code and hard-to-track down bugs because
> the incremental compiler chose speed over correctness.

We're mainly talking about incremental compilation when developing a macro, not when using one. Recompiling the world is unrealistic when developing a project with a macro, since incremental compilation almost disappears (usually unnecessarily). In practice, I have found the current algorithm to be good enough given the constraints. Yes, sometimes you have to touch the file with the macro definition to mark it as changed. Macros are an advanced, special feature and I think it is reasonable to have to know that when working on them.

-Mark

>
> - Josh

Grzegorz Kossakowski

Aug 7, 2013, 1:32:37 PM
to scala-internals
On 7 August 2013 10:12, Josh Suereth <joshua....@gmail.com> wrote:


I think this is kind of a poor choice. I like Jason's idea. I also think we should try to add special tracking for macros. That should probably be a top priority given the prevalence of macros everywhere (e.g. Play's JSON support). I'd hate to have people unable to parse their JSON because a macro was not re-run upon a change to a class file... (Perhaps we should first validate whether the current incremental compiler has issues here.)

As I mentioned, I'm OK with adding a switch (what Jason proposes) as a short-term solution, but do we agree that introducing more configuration options is not the way to go in general?

The current incremental compiler has issues, as Jason mentioned. It's enough to split the definition of a macro and its implementation between two source files to fool the incremental compiler. Also, a macro implementation might call methods from other files, and if the implementation of those methods changes, the incremental compiler won't detect it.

Also, arguments to macro invocations are handled properly, so Play users don't need to worry when they use some JSON macro and change the arguments to a macro invocation. Only changes to implementations of macros are not handled properly, and adding proper support for that to the incremental compiler is not a top priority.


The incremental compiler should, ideally, always just be an optimisation over running scalac on all sources. If we produce incorrect compilations we're only asking to tick off users. I'd rather have the incremental compiler recompile the world for a macro (and make people avoid them if they're so bad) than have stale code and hard-to-track-down bugs because the incremental compiler chose speed over correctness.

At the moment the incremental compiler recompiles the world for the wrong reasons, and it misses the case where it actually should recompile. I'm saying that we should get rid of the false positives, given that we have false negatives anyway.

Also, I agree that the incremental compiler should be just an optimisation over running scalac, and it is, as long as you don't work on macro implementations. As it stands today, macro definitions are really out of scope for incremental compilation due to lack of resources to support them properly.

Grzegorz Kossakowski

Aug 7, 2013, 1:35:14 PM
to scala-internals
On 7 August 2013 10:21, Eugene Burmako <eugene....@epfl.ch> wrote:
You mean that users will then have to write `c.something.reify` or put an additional import to use quasiquotes?

Yes.

Paul Phillips

Aug 7, 2013, 1:40:35 PM
to scala-i...@googlegroups.com

On Wed, Aug 7, 2013 at 10:32 AM, Grzegorz Kossakowski <grzegorz.k...@gmail.com> wrote:
As I mentioned, I'm ok with adding a switch (what Jason proposes) as a short term solution but do we agree that introducing more configuration options is not the way to go in general?

It is noble to resist unnecessary options. But when you are talking about a fundamental tradeoff which cannot be made for everyone (and "speed vs. correctness" is undoubtedly one such) then resisting configuration translates to insisting on deciding that tradeoff for everyone.

Having had years to enjoy the prioritization of correctness over speed - thus my "incremental compiler" is still ant+scalac+selective-rm-rf - I am not particularly keen on being protected from making the decision myself.

Eugene Burmako

Aug 7, 2013, 1:44:01 PM
to scala-i...@googlegroups.com
And that, if I understand correctly, will allow us to better support incremental compilation when developing scalac, right?


Grzegorz Kossakowski

Aug 7, 2013, 1:47:49 PM
to scala-internals
On 7 August 2013 10:44, Eugene Burmako <eugene....@epfl.ch> wrote:
And that, if I understand correctly, will allow us to better support incremental compilation when developing scalac, right?

Yes. The switch we are discussing is an alternative solution which would probably solve the problem as well.

There's just one other point: I'd really like Global to have fewer members rather than more, and macros as nested members of Global feel like one step too far.

Composition over inheritance should be used more in general (see my other thread on the roles of Global), because even without macros, inheritance is much more difficult to support in the incremental compiler.
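A toy illustration of the difference (invented names, not actual compiler code): with composition, helpers stay reachable behind a member object instead of becoming members of the enclosing class itself.

```scala
// Inheritance: every member of Helpers becomes a member of GlobalInherited,
// enlarging its API surface.
trait Helpers { def helper: Int = 42 }
class GlobalInherited extends Helpers

// Composition: helpers are reachable, but do not become members of the
// class's own API.
class GlobalComposed {
  object helpers extends Helpers
}

val g = new GlobalComposed
val h = g.helpers.helper // callers go through the member object
```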

Eugene Burmako

Aug 7, 2013, 1:55:43 PM
to scala-i...@googlegroups.com
I was asking because it doesn't feel right to sacrifice users' convenience for ours. I think it'd be preferable to explore alternative options.

I don't feel very strongly about reify, because with the arrival of quasiquotes its usefulness is severely limited. However, speaking of quasiquotes, writing macros is boilerplatey enough as it is without also requiring users to write an additional import.

However, let's first wait for Denys so that he can comment on whether it'd be possible to encapsulate quasiquotes one level deeper in Global without imposing the import tax.


Grzegorz Kossakowski

Aug 7, 2013, 2:10:47 PM
to scala-internals
On 7 August 2013 10:55, Eugene Burmako <eugene....@epfl.ch> wrote:
I was asking because it doesn't feel right to sacrifice users' convenience for ours. I think it'd be preferable to explore alternative options.

It's not only convenience. Do we really think having a class with 1500+ members in it is a sign of good software engineering and proper structure?

As I said, we are drifting away from the original issue: macros in Global causing problems for incremental compilation. It's most likely to be solved on the sbt side. However, my discovery points at a bigger problem: we have too much stuff jammed into Global. Quasiquotes are just one instance of that general problem.
 
I don't feel very strongly about reify, because with the arrival of quasiquotes its usefulness is severely limited. However, speaking of quasiquotes, writing macros is boilerplatey enough as it is without also requiring users to write an additional import.

However, let's first wait for Denys so that he can comment on whether it'd be possible to encapsulate quasiquotes one level deeper in Global without imposing the import tax.

I find explicit imports to be totally OK. Being explicit about what you depend on is probably a good idea.

Josh Suereth

Aug 7, 2013, 2:38:46 PM
to scala-internals
Yeah. I think software needs very good defaults for users who don't want to think, but also high configurability. This should be a configuration option. I'd like speed vs. correctness to be an opt-in decision. We know there are trade-offs no matter what we do. Let the advanced users make their own decisions, but provide a good default for the rest.

Paul Phillips

Aug 7, 2013, 3:38:37 PM
to scala-i...@googlegroups.com

On Wednesday, August 7, 2013, Grzegorz Kossakowski wrote:
I find explicit imports to be totally OK. Being explicit about what you depend on is probably a good idea.

If Global has 1500 members, and by far the most common usage is `import global._`, then this falls a bit short of explicit. Replacing that one line of boilerplate with five similar lines won't feel like progress, and won't be any more explicit. With no mechanism to abstract over imports, being saddled with more import targets is pure tax.

One's ambitions for explicitness should be well tempered by which facilities the language provides, and even more so by which it does not.

Eugene Burmako

Aug 7, 2013, 3:40:04 PM
to scala-i...@googlegroups.com
How would you abstract over imports?

Erik Osheim

Aug 7, 2013, 4:00:22 PM
to scala-i...@googlegroups.com
On Wed, Aug 07, 2013 at 09:40:04PM +0200, Eugene Burmako wrote:
> How would you abstract over imports?

In Spire we do this by putting relevant types/methods into traits, and
then creating various objects which extend some logical group of
them. For example:

trait DogRelated { ... }
trait CatRelated { ... }
trait ZebraRelated { ... }

object pets extends DogRelated with CatRelated
object zoo extends CatRelated with ZebraRelated
object animals extends DogRelated with CatRelated with ZebraRelated

This way, users can choose whether to import zoo._ or pets._ or
whatever, rather than having an ever-growing list of all the
subcategories we eventually have.
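Filled in with invented members (the bodies here are purely illustrative), the pattern runs like this:

```scala
// Concrete sketch of the grouping pattern described above; the member
// definitions are invented for illustration.
trait DogRelated { def bark: String = "woof" }
trait CatRelated { def meow: String = "meow" }
trait ZebraRelated { def stripes: Int = 26 }

object pets extends DogRelated with CatRelated
object zoo extends CatRelated with ZebraRelated
object animals extends DogRelated with CatRelated with ZebraRelated

// A user imports just the namespace they need:
import pets._
// bark and meow are now in scope; stripes is not.
```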

Maybe Paul means something similar?

-- Erik

Paul Phillips

Aug 7, 2013, 4:15:39 PM
to scala-i...@googlegroups.com

On Wed, Aug 7, 2013 at 12:40 PM, Eugene Burmako <eugene....@epfl.ch> wrote:
How would you abstract over imports?

There are many possibilities, but the minimum, which would make a huge difference, would be along the lines of 

val aggregatedImports = {
  import foo._
  import bar._
  import baz._
}

// elsewhere
import aggregatedImports._

There are details, but they are not particularly interesting.
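The sketch above is not valid Scala today; the closest one can get is an aggregator object that re-exports members by hand (the names `foo`, `bar`, and their members are invented):

```scala
// What we're stuck with today: manual re-export, one declaration per member.
object foo { val x = 1 }
object bar { def double(n: Int): Int = n * 2 }

object aggregatedImports {
  val x: Int = foo.x                       // re-exported value
  def double(n: Int): Int = bar.double(n)  // forwarded method
}

// elsewhere
import aggregatedImports._
// x and double are now in scope through a single import
```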

Paul Phillips

Aug 7, 2013, 4:16:41 PM
to scala-i...@googlegroups.com

On Wed, Aug 7, 2013 at 1:00 PM, Erik Osheim <er...@plastic-idolatry.com> wrote:
Maybe Paul means something similar?

Right, but without the spectacular conflation of concerns that takes place when you have to actually compose a bunch of traits into a class and have a lazily loaded object, when all you want to do is manage the namespace.

Jason Zaugg

Aug 7, 2013, 4:43:56 PM
to scala-i...@googlegroups.com
On Wed, Aug 7, 2013 at 7:12 PM, Josh Suereth <joshua....@gmail.com> wrote:
I think this is kind of a poor choice. I like Jason's idea. I also think we should try to add special tracking for macros. That should probably be a top priority given the prevalence of macros everywhere (e.g. Play's JSON support). I'd hate to have people unable to parse their JSON because a macro was not re-run upon a change to a class file... (Perhaps we should first validate whether the current incremental compiler has issues here.)

The incremental compiler should, ideally, always just be an optimisation over running scalac on all sources. If we produce incorrect compilations we're only asking to tick off users. I'd rather have the incremental compiler recompile the world for a macro (and make people avoid them if they're so bad) than have stale code and hard-to-track-down bugs because the incremental compiler chose speed over correctness.

These problems only crop up if you are simultaneously a macro author and a macro consumer. So users of Play/async/etc. don't need this.

-jason

Josh Suereth

Aug 7, 2013, 6:01:15 PM
to scala-internals
Ah, cool. So the pain is limited to advanced library authors regardless? That leans even more towards having configuration and trying to do the right thing as often as possible.

Miles Sabin

Aug 9, 2013, 12:12:24 PM
to scala-internals
On Wed, Aug 7, 2013 at 9:34 AM, Denys Shabalin
<denys.s...@typesafe.com> wrote:
> The situation would change if handling of path-dependent types improved on
> the Scala side. There was a relevant proposal to introduce double-bracket
> parameters that would notify the compiler that the universe is a
> path-dependent prefix to be inferred:
>
> def transform[[u: api.Universe]](tree: u.Tree): u.Tree = ...

Out of interest, have you explored the following option, which seems
to have all the right family-polymorphic characteristics?

scala> object api { class Universe { class Tree } }
defined module api

scala> val (u1, u2) = (new api.Universe, new api.Universe)
u1: api.Universe = api$Universe@4384d01f
u2: api.Universe = api$Universe@447bc5da

scala> val (t1, t2) = (new u1.Tree, new u2.Tree)
t1: u1.Tree = api$Universe$Tree@6f3ff160
t2: u2.Tree = api$Universe$Tree@4f000eaf

scala> def sameU(u: api.Universe)(t1: u.Tree, t2: u.Tree) = (t1, t2)
sameU: (u: api.Universe)(t1: u.Tree, t2: u.Tree)(u.Tree, u.Tree)

scala> sameU(u1)(t1, t1) // Should compile
res1: (u1.Tree, u1.Tree) =
(api$Universe$Tree@6f3ff160,api$Universe$Tree@6f3ff160)

scala> sameU(u1)(t1, t2) // Should not compile
<console>:13: error: type mismatch;
found : u2.Tree
required: u1.Tree
sameU(u1)(t1, t2) // Should not compile
^

scala> def transform[U <: api.Universe](tree: U#Tree): U#Tree = tree
transform: [U <: api.Universe](tree: U#Tree)U#Tree

scala> sameU(u1)(t1, transform(t1)) // Should compile
res3: (u1.Tree, u1.Tree) =
(api$Universe$Tree@6f3ff160,api$Universe$Tree@6f3ff160)

scala> sameU(u1)(t2, transform(t1)) // Should not compile
<console>:14: error: type mismatch;
found : u2.Tree
required: u1.Tree
sameU(u1)(t2, transform(t1)) // Should not compile
^

scala> sameU(u1)(transform(t1), transform(t1)) // Should compile
res5: (u1.Tree, u1.Tree) =
(api$Universe$Tree@6f3ff160,api$Universe$Tree@6f3ff160)

scala> sameU(u1)(transform(t1), transform(t2)) // Should not compile
<console>:14: error: type mismatch;
found : u2.Tree
required: u1.Tree
sameU(u1)(transform(t1), transform(t2)) // Should not compile
^

What scenarios do you have in mind that this doesn't cover?

Cheers,


Miles

--
Miles Sabin
tel: +44 7813 944 528
skype: milessabin
gtalk: mi...@milessabin.com
g+: http://www.milessabin.com
http://twitter.com/milessabin

Denys Shabalin

Aug 14, 2013, 11:22:30 AM
to scala-i...@googlegroups.com
Let's assume that quasiquotes are outside of the cake:

import scala.reflect.api.Quasiquote
import scala.reflect.runtime.universe._
val x: Tree = ...
q"foo($x)"

The quote will be desugared into:

StringContext("foo(", ")").q(x)

This will then trigger an implicit search and be transformed into:

Quasiquote(StringContext("foo(", ")")).q(x)

Let's assume, for simplicity, that quasiquotes only support splicing trees, and try to find a working signature for q. The very first approximation:

def q(trees: Tree*): Tree 

This can't work because Tree is path-dependent and we need its prefix (the universe):

def q(u: Universe)(trees: u.Tree*): u.Tree

But this won't work either because it no longer matches the string interpolation's desugaring. Here I would love Scala to automatically figure out that if you pass a runtime.universe.Tree then the prefix is runtime.universe. Hence the double-bracket proposal that tells the compiler that the prefix is meant to be inferred (i.e. so that you don't have to pass it around manually any more).

We can try type members:

def q[U <: Universe](trees: U#Tree): U#Tree 

but here we won't have a tree for the universe (and that's crucial for quasiquotes, as they desugar into calls to universe members)

Out of desperation we can even try implicits:

def q[U <: Universe](trees: U#Tree)(implicit u: U): U#Tree

But that will require users to manually define implicit vals that correctly correspond to the universe they are currently working in:

import scala.reflect.api.Quasiquote
import scala.reflect.runtime.universe._
implicit val u: Universe = scala.reflect.runtime.universe
val x: Tree = ...
q"foo($x)"

That's lots of boilerplate and manual labor, with the possibility of screwing yourself up (e.g. a mismatch between the universe you meant to use and the implicit you had in scope for some reason).



Miles Sabin

Aug 14, 2013, 11:49:11 AM
to scala-internals
On Wed, Aug 14, 2013 at 4:22 PM, Denys Shabalin
<denys.s...@typesafe.com> wrote:
> Let's assume that quasiquotes are outside of the cake:
>
> import scala.reflect.api.Quasiquote
> import scala.reflect.runtime.universe._
> val x: Tree = ...
> q"foo($x)"
<snip/>

> Out of desperation we can even try implicits:
>
> def q[U <: Universe](trees: U#Tree)(implicit u: U): U#Tree
>
> But that will require users to manually define implicit vals that correctly
> correspond to the universe they are currently working in:
>
> import scala.reflect.api.Quasiquote
> import scala.reflect.runtime.universe._
> implicit val u: Universe = scala.reflect.runtime.universe
> val x: Tree = ...
> q"foo($x)"
>
> That's lots of boilerplate and manual labor, with the possibility of
> screwing yourself up (e.g. a mismatch between the universe you meant to
> use and the implicit you had in scope for some reason)

Maybe I'm missing something, but it looks like you added only,

implicit val u: Universe = scala.reflect.runtime.universe

to what you started with in this case. And that looks like it could be a
standard definition, placed somewhere sensible, that doesn't need to be
explicitly imported (or at least only very infrequently explicitly
imported).

Eugene Burmako

Aug 14, 2013, 12:04:45 PM
to scala-i...@googlegroups.com
How do you then distinguish runtime and compile-time qqs?


Miles Sabin

Aug 14, 2013, 6:49:16 PM
to scala-internals

On 14 Aug 2013 17:04, "Eugene Burmako" <xen...@gmail.com> wrote:
>
> How do you then distinguish runtime and compile-time qqs?

I don't know ... show me some canonical examples of the two scenarios and I'll see what I can come up with.

Cheers,

Miles

Eugene Burmako

Aug 15, 2013, 2:08:31 AM
to <scala-internals@googlegroups.com>
I mean, if "implicit val u: Universe = scala.reflect.runtime.universe" becomes a standard implicit, then what do we do with quasiquotes in macros that require "implicit val u: Universe = c.universe"?

Paolo G. Giarrusso

Aug 16, 2013, 11:26:35 PM
to scala-i...@googlegroups.com
On Thursday, August 15, 2013 8:08:31 AM UTC+2, Eugene Burmako wrote:
I mean, if "implicit val u: Universe = scala.reflect.runtime.universe" becomes a standard implicit, then what do we do with quasiquotes in macros that require "implicit val u: Universe = c.universe"?

Can't you have both implicits?

Miles's examples suggest that U is inferred to a singleton type, at least in the context of a sameU call. In that case, U should also be specific enough to pick the right implicit - but it seems to fail in my testing:

implicit val iu1 = u1
def q[U <: api.Universe](tree: U#Tree)(implicit u: U): U#Tree = tree

> q(t1) //works
res10: api.Universe#Tree = api$Universe$Tree@1d77daaa

>implicit val iu2 = u2
iu2: api.Universe = api$Universe@21a437b6

> q(t1) //doesn't work any more
<console>:17: error: ambiguous implicit values:
 both value iu1 of type => api.Universe
 and value iu2 of type => api.Universe
 match expected type api.Universe
              q(t1)
                ^

However, the sameU context turns out to be required for U to be so specific - normally, it'd be api.Universe:

scala> transform(t1)
res0: api.Universe#Tree = api$Universe$Tree@1d77daaa

So, either you always have the right context, and then the right implicit will be chosen, or you don't have the right context, and then your problem is not the implicit. Actually, having two implicits will make the ambiguous call fail, which might be a good thing.

As you'd expect, the right context can be destroyed by de-inlining, since type inference is local. Let's extract a subexpression out of a sameU call which used to work:

scala> sameU(u1)(t1, transform(t1)) //works
res6: (u1.Tree, u1.Tree) = (api$Universe$Tree@1d77daaa,api$Universe$Tree@1d77daaa)

scala> val t11 = transform(t1) //Take this out
t11: api.Universe#Tree = api$Universe$Tree@1d77daaa

scala> sameU(u1)(t1, t11) //This used to work
<console>:15: error: type mismatch;
 found   : api.Universe#Tree
 required: u1.Tree
              sameU(u1)(t1, t11)
                            ^

Of course, the right type annotation can fix everything - we just need to write it explicitly:
scala> val t11: u1.Tree = transform(t1)
t11: u1.Tree = api$Universe$Tree@1d77daaa

scala> sameU(u1)(t1, t11)
res5: (u1.Tree, u1.Tree) = (api$Universe$Tree@1d77daaa,api$Universe$Tree@1d77daaa)

Cheers,
Paolo

Miles Sabin

Aug 17, 2013, 7:11:42 AM
to scala-internals
On Sat, Aug 17, 2013 at 4:26 AM, Paolo G. Giarrusso
<p.gia...@gmail.com> wrote:
> Can't you have both implicits?

You can.

> Miles's examples suggest that U is inferred to a singleton type, at least in
> the context of a sameU call.

That's correct.

> In that case, U should also be specific enough
> to pick the right implicit - but seems seems to fail in my testing:

Try this,

scala> object api { class Universe { class Tree { type U =
Universe.this.type } } }
defined module api

scala> val (u1, u2) = (new api.Universe, new api.Universe)
u1: api.Universe = api$Universe@46956438
u2: api.Universe = api$Universe@79e6bcbb

scala> val (t1, t2) = (new u1.Tree, new u2.Tree)
t1: u1.Tree = api$Universe$Tree@50ecb4ed
t2: u2.Tree = api$Universe$Tree@645a5d1c

scala> def q[U <: api.Universe](tree: U#Tree)(implicit u: tree.U): U#Tree = tree
q: [U <: api.Universe](tree: U#Tree)(implicit u: tree.U)U#Tree

scala> implicit val iu1: u1.type = u1
iu1: u1.type = api$Universe@46956438

scala> implicit val iu2: u2.type = u2
iu2: u2.type = api$Universe@79e6bcbb

scala> q(t1)
res1: api.Universe#Tree = api$Universe$Tree@50ecb4ed

scala> q(t2)
res2: api.Universe#Tree = api$Universe$Tree@645a5d1c
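
For readers following along, the trick above can be packaged as a self-contained sketch. `whichU` is a hypothetical helper (not from the thread) that returns the implicitly selected universe, so we can observe at runtime that the implicit parameter of type `tree.U` dispatches to the right implicit per call site:

```scala
object TreeUDemo {
  object api { class Universe { class Tree { type U = Universe.this.type } } }

  // The implicit parameter's type depends on the tree, so each call site
  // selects the implicit matching that tree's universe.
  def whichU[U <: api.Universe](tree: U#Tree)(implicit u: tree.U): api.Universe = u

  val u1 = new api.Universe
  val u2 = new api.Universe
  implicit val iu1: u1.type = u1
  implicit val iu2: u2.type = u2
  val t1 = new u1.Tree
  val t2 = new u2.Tree

  def main(args: Array[String]): Unit = {
    assert(whichU(t1) eq u1) // t1.U = u1.type, so iu1 is selected
    assert(whichU(t2) eq u2) // t2.U = u2.type, so iu2 is selected
    println("ok")
  }
}
```

Note that the two singleton-typed implicits don't clash: `u2.type` does not conform to `u1.type`, so only one candidate ever applies.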

Paolo Giarrusso

unread,
Aug 17, 2013, 9:55:01 PM8/17/13
to scala-i...@googlegroups.com
On Saturday, August 17, 2013 1:11:42 PM UTC+2, Miles Sabin wrote:
> On Sat, Aug 17, 2013 at 4:26 AM, Paolo G. Giarrusso
> <p.gia...@gmail.com> wrote:
> > Can't you have both implicits?
>
> You can.
>
> > Miles's examples suggest that U is inferred to a singleton type, at least in
> > the context of a sameU call.
>
> That's correct.
>
> > In that case, U should also be specific enough
> > to pick the right implicit - but it seems to fail in my testing:
>
> Try this,
>
> scala> object api { class Universe { class Tree { type U =
> Universe.this.type } } }
> defined module api
>
> scala> val (u1, u2) = (new api.Universe, new api.Universe)
> u1: api.Universe = api$Universe@46956438
> u2: api.Universe = api$Universe@79e6bcbb
>
> scala> val (t1, t2) = (new u1.Tree, new u2.Tree)
> t1: u1.Tree = api$Universe$Tree@50ecb4ed
> t2: u2.Tree = api$Universe$Tree@645a5d1c
>
> scala> def q[U <: api.Universe](tree: U#Tree)(implicit u: tree.U): U#Tree = tree
> q: [U <: api.Universe](tree: U#Tree)(implicit u: tree.U)U#Tree

Distracted readers (like me) should note the type of `u` is not `U` any more :-)

> scala> implicit val iu1: u1.type = u1
> iu1: u1.type = api$Universe@46956438
>
> scala> implicit val iu2: u2.type = u2
> iu2: u2.type = api$Universe@79e6bcbb

I should have known better there!

> scala> q(t1)
> res1: api.Universe#Tree = api$Universe$Tree@50ecb4ed
>
> scala> q(t2)
> res2: api.Universe#Tree = api$Universe$Tree@645a5d1c

That's a great idea! But it seems we can do even better. With little
overhead, we can encode `sameU` without the extra parameter.

However, just after I wrote to my surprise "Scalac seems extremely
robust up to now", I ran into what seems a soundness bug, which makes
the encoded `sameU` too imprecise in some situations. But I didn't
sleep enough, so I might be missing something.

Can somebody find further limitations of this idiom? I think it
probably solves the original problem, and once refined, it deserves to
be blogged about.

I'll send all details in a second (and longer) email.
--
Paolo G. Giarrusso - Ph.D. Student, Philipps-University Marburg
http://www.informatik.uni-marburg.de/~pgiarrusso/

Paolo Giarrusso

unread,
Aug 17, 2013, 9:59:04 PM8/17/13
to scala-i...@googlegroups.com
Time for the details. Don't let the length scare you: a lot of it is a
transcript, and the main point comes early, so you can read as much as
you like.

1) q still returns api.Universe#Tree, but we can make the interface
more specific:

def q[U <: api.Universe](tree: U#Tree)(implicit u: tree.U): u.Tree =
  tree.asInstanceOf[u.Tree]

Now we have a precise return type.
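
As a sanity check, here is the cast-based version as a runnable sketch (object and value names are illustrative only):

```scala
object CastDemo {
  object api { class Universe { class Tree { type U = Universe.this.type } } }

  // Point 1: the cast gives q a precise, implicit-dependent return type.
  def q[U <: api.Universe](tree: U#Tree)(implicit u: tree.U): u.Tree =
    tree.asInstanceOf[u.Tree]

  val u1 = new api.Universe
  implicit val iu1: u1.type = u1
  val t1 = new u1.Tree

  // The result is statically u1.Tree: no annotation or downcast at the call site.
  val r: u1.Tree = q(t1)

  def main(args: Array[String]): Unit = {
    assert(r eq t1) // q returns the very same tree
    println("ok")
  }
}
```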

2) We have an ugly cast. That cast shouldn't be needed, but here
Scalac doesn't know that tree in fact has type u.Tree. It knows this
inside the universe itself, so we can define a conversion method there
without a cast.

object api { class Universe { class Tree { type U = Universe.this.type; def toUTree(u: U): u.Tree = this } } }

def q[U <: api.Universe](tree: U#Tree)(implicit u: tree.U): u.Tree =
  tree.toUTree(u)

3) Moreover, we don't even need type inference anymore:

def q(tree: api.Universe#Tree)(implicit u: tree.U): u.Tree = tree.toUTree(u)

At the end there's a transcript showing that this works.

To show that we can take multiple parameters dependent on the same
implicit parameter, let's also encode sameU the same way:

def sameU2(t1: api.Universe#Tree, t2: api.Universe#Tree)(implicit u: t1.U with t2.U): (u.Tree, u.Tree) =
  (t1.toUTree(u), t2.toUTree(u))

Notice the with: we rely on A1 with A2 equaling A1 when A1 == A2, and
I dare say Scalac doesn't fail us. The only problem is when an
argument (t1 or t2) was upcast to api.Universe#Tree: in that case,
sameU2 incorrectly compiles, and toUTree allows casting t2.U to t1.U
without an explicit cast. That is, we have a soundness bug (a known
one?).

scala> sameU2(t1, qBadT2) //Does not fail either - doh!
res14: (iu1.Tree, iu1.Tree) =
(api$Universe$Tree@331d3edd,api$Universe$Tree@450a3962)
scala> (t1, t2)
res15: (u1.Tree, u2.Tree) =
(api$Universe$Tree@331d3edd,api$Universe$Tree@450a3962)

What the typechecker should do instead seems conceptually clear:

scala> iu1: t1.U with qBadT2.U //shouldn't compile
res16: t1.U with qBadT2.U = api$Universe@54405a01

scala> iu1: t1.U with t2.U //should fail, and it does
<error>

scala> val upcastT2 = t2: api.Universe#Tree

scala> iu1: t1.U with upcastT2.U //should fail, but it doesn't!
res21: t1.U with upcastT2.U = api$Universe@54405a01

The problem is that upcastT2.U seems simplified away or at least
ignored (maybe first to api.Universe?), but it should be treated like
an existential and not simplified: iu1: t1.U with upcastT2.U should
not be accepted.

Last notes:

I'd like to just declare two implicits at once by writing:

implicit val (u1, u2) = (new api.Universe, new api.Universe)

but it doesn't work - u1 and u2 aren't found by implicit resolution then.
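
A runnable sketch of this limitation (object name illustrative): values bound by tuple destructuring are not implicit, but re-exposing one with a singleton-typed implicit val, as in the transcript, restores resolution:

```scala
object TupleImplicitDemo {
  class Universe

  // Destructuring compiles, but u1 and u2 are NOT implicit values,
  // even if the val were marked implicit.
  val (u1, u2) = (new Universe, new Universe)

  // Workaround: re-expose with a singleton type.
  implicit val iu1: u1.type = u1

  def main(args: Array[String]): Unit = {
    assert(implicitly[u1.type] eq u1) // resolves via iu1
    println("ok")
  }
}
```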

Passing the universe to toUTree bugs me, but it seems Scalac doesn't
have enough reasoning power to avoid this:

scala> object api { class Universe { class Tree { type U =
Universe.this.type; def toUTree: Universe.this.Tree = this }}}
defined module api

scala> def q[U <: api.Universe](tree: U#Tree)(implicit u: tree.U):
u.Tree = tree.toUTree
<console>:41: error: type mismatch;
found : U#Tree
required: u.Tree
def q[U <: api.Universe](tree: U#Tree)(implicit u: tree.U):
u.Tree = tree.toUTree

^

A complete transcript follows, but probably you want to take a look at
https://gist.github.com/Blaisorblade/6259337 instead.

scala> object api { class Universe { class Tree { type U =
Universe.this.type; def toUTree(u: U): u.Tree = this } } }
defined module api

scala> def q(tree: api.Universe#Tree)(implicit u: tree.U): u.Tree =
tree.toUTree(u)
q: (tree: api.Universe#Tree)(implicit u: tree.U)u.Tree

scala> def sameU(u: api.Universe)(t1: u.Tree, t2: u.Tree) = (t1, t2)
sameU: (u: api.Universe)(t1: u.Tree, t2: u.Tree)(u.Tree, u.Tree)

scala> def sameU2(t1: api.Universe#Tree, t2:
api.Universe#Tree)(implicit u: t1.U with t2.U): (u.Tree, u.Tree) =
(t1.toUTree(u), t2.toUTree(u))
sameU2: (t1: api.Universe#Tree, t2: api.Universe#Tree)(implicit u:
t1.U with t2.U)(u.Tree, u.Tree)

scala> val (u1, u2) = (new api.Universe, new api.Universe)
u1: api.Universe = api$Universe@54405a01
u2: api.Universe = api$Universe@4cd522dd

scala> val (t1, t2) = (new u1.Tree, new u2.Tree)
t1: u1.Tree = api$Universe$Tree@331d3edd
t2: u2.Tree = api$Universe$Tree@450a3962

scala> implicit val iu1: u1.type = u1
iu1: u1.type = api$Universe@54405a01

scala> implicit val iu2: u2.type = u2
iu2: u2.type = api$Universe@4cd522dd

scala> //Test sameU2

scala> sameU2(t1, t2) //Fails
<console>:16: error: could not find implicit value for parameter u:
t1.U with t2.U
sameU2(t1, t2) //Fails
^

scala> sameU2(t1, t1)
res1: (iu1.Tree, iu1.Tree) =
(api$Universe$Tree@331d3edd,api$Universe$Tree@331d3edd)

scala> sameU2(t1, new u1.Tree)
res2: (iu1.Tree, iu1.Tree) =
(api$Universe$Tree@331d3edd,api$Universe$Tree@7b8d343a)

scala> sameU2(new u1.Tree, t1)
res3: (iu1.Tree, iu1.Tree) =
(api$Universe$Tree@42049d11,api$Universe$Tree@331d3edd)

scala> //This was transform:

scala> def qBad[U <: api.Universe](tree: U#Tree): U#Tree = tree
qBad: [U <: api.Universe](tree: U#Tree)U#Tree

scala> q(t1)
res4: iu1.Tree = api$Universe$Tree@331d3edd

scala> q(t2)
res5: iu2.Tree = api$Universe$Tree@450a3962

scala> //Call transform and q inline to help type inference:

scala> sameU(u1)(t1, qBad(t1))
res6: (u1.Tree, u1.Tree) =
(api$Universe$Tree@331d3edd,api$Universe$Tree@331d3edd)

scala> sameU(u1)(t1, q(t1))
res7: (u1.Tree, u1.Tree) =
(api$Universe$Tree@331d3edd,api$Universe$Tree@331d3edd)

scala> sameU2(t1, qBad(t1))
res8: (iu1.Tree, iu1.Tree) =
(api$Universe$Tree@331d3edd,api$Universe$Tree@331d3edd)

scala> sameU2(t1, q(t1))
res9: (iu1.Tree, iu1.Tree) =
(api$Universe$Tree@331d3edd,api$Universe$Tree@331d3edd)

scala>

scala> //Call transform and q *not* inline, to avoid helping type inference too much:

scala>

scala> val qBadT1 = qBad(t1)
qBadT1: api.Universe#Tree = api$Universe$Tree@331d3edd

scala> val qBadT2 = qBad(t2)
qBadT2: api.Universe#Tree = api$Universe$Tree@450a3962

scala> val qT1 = q(t1)
qT1: iu1.Tree = api$Universe$Tree@331d3edd

scala> sameU(u1)(t1, qBadT1) //Fails, although somewhat spuriously.
<console>:17: error: type mismatch;
found : api.Universe#Tree
required: u1.Tree
sameU(u1)(t1, qBadT1) //Fails, although somewhat spuriously.
^

scala> sameU(u1)(t1, qT1) //Succeeds! Yeah!
res11: (u1.Tree, u1.Tree) =
(api$Universe$Tree@331d3edd,api$Universe$Tree@331d3edd)

scala> sameU2(t1, qBadT1) //Does not fail, even though qBadT1 was upcast. Is this good?
res12: (iu1.Tree, iu1.Tree) =
(api$Universe$Tree@331d3edd,api$Universe$Tree@331d3edd)

scala> sameU2(t1, qT1)
res13: (iu1.Tree, iu1.Tree) =
(api$Universe$Tree@331d3edd,api$Universe$Tree@331d3edd)

scala> sameU2(t1, qBadT2) //Does not fail either - doh!
res14: (iu1.Tree, iu1.Tree) =
(api$Universe$Tree@331d3edd,api$Universe$Tree@450a3962)

Eugene Burmako

unread,
Aug 18, 2013, 1:46:03 AM8/18/13
to <scala-internals@googlegroups.com>
1) Am I right in thinking that for this method:


  def q(tree: api.Universe#Tree)(implicit u: tree.U): u.Tree = tree.toUTree(u)

It's possible to provide a tree from any universe as a parameter?

2) Would path-dependent tricks also work for extractors? Quasiquotes also need to expand in pattern-matching positions, e.g. as in `case q"$x.$y" => x`.


Miles Sabin

unread,
Aug 18, 2013, 6:19:51 AM8/18/13
to scala-internals
On Sun, Aug 18, 2013 at 6:46 AM, Eugene Burmako <eugene....@epfl.ch> wrote:
> 1) Am I right in thinking that for this method:
>
>
> def q(tree: api.Universe#Tree)(implicit u: tree.U): u.Tree =
> tree.toUTree(u)
>
> It's possible to provide a tree from any universe as a parameter?

Yes.

> 2) Would path-dependent tricks also work for extractors? Quasiquotes also
> need to expand in pattern-matching positions, e.g. as in `case q"$x.$y" =>
> x`.

If they don't currently work in extractors it'd be *very* nice to make
them work :-)

Eugene Burmako

unread,
Aug 18, 2013, 7:42:54 AM8/18/13
to <scala-internals@googlegroups.com>
1) that's actually undesired. Is it possible to avoid that?

2) actually I don't know whether they do, just have suspicions :)

Miles Sabin

unread,
Aug 18, 2013, 8:03:31 AM8/18/13
to scala-internals
On Sun, Aug 18, 2013 at 12:42 PM, Eugene Burmako <xen...@gmail.com> wrote:
> 1) that's actually undesired. Is it possible to avoid that?

Well, I should have qualified that: it accepts any tree whose universe
is implicitly in scope. So the way to prevent a given universe's
trees from being accepted is to make sure not to publish that universe
implicitly.
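
A sketch of that policy (names illustrative): publish one universe implicitly and keep the other out of implicit scope; then q accepts only the published universe's trees:

```scala
object ScopedUniverseDemo {
  object api { class Universe { class Tree {
    type U = Universe.this.type
    def toUTree(u: U): u.Tree = this
  } } }

  // q's implicit parameter doubles as a capability: no implicit
  // universe, no accepted tree.
  def q(tree: api.Universe#Tree)(implicit u: tree.U): u.Tree = tree.toUTree(u)

  val published = new api.Universe
  val hidden    = new api.Universe
  implicit val iu: published.type = published // only `published` is in implicit scope

  val t  = new published.Tree
  val ok = q(t) // compiles: an implicit of type t.U (= published.type) exists
  // q(new hidden.Tree) // would not compile: no implicit of type hidden.type

  def main(args: Array[String]): Unit = {
    assert(ok eq t) // toUTree returns the same tree
    println("ok")
  }
}
```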