Max Arity is not Max Azria


Som Snytt

Jan 11, 2013, 11:06:11 PM
to scala-internals

This is a stale email about whether it's supposed to be easy to increase max tuplicity.

The argument against it being easy is that if someone wants it, they have to buy a support contract.

My question is whether it was intentional to make it harder in 2.10, and whether anyone could possibly care (besides the OP).

Caveat: I haven't tried it recently.

----

Someone asked on SO if it's possible to bump MaxTupleArity and friends.
http://stackoverflow.com/questions/12599927/how-does-one-compile-scala-with-bigger-tuples/

(I know, you're thinking why are you wasting time on SO when you could be contributing in some small way to Scala?  It's the rep, man.)

I modified genprod to emit phased implementations of Tuple26 and friends (for example) so that you can just copy them in (a few times) and recompile.

Is that useful? OK, it's not useful, but is it a beautiful symmetry of infrastructure?

The one change is that case classes should care about MaxProductArity; if you exceed MaxFunctionArity, just stop supporting Function in the companion module.  (Right now, it silently and usefully truncates the param list.)

(The open question is the constant 22 in ScalaRunTime.)

Also, if this is a fun feature, maybe it should be a plugin.
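
For concreteness, this is roughly the shape genprod emits for each arity (Tuple3 shown, body abbreviated); raising MaxTupleArity mostly means emitting the same shape for 23, 24, and so on, plus the matching ProductN (and, where wanted, FunctionN) pieces, which is what the MaxProductArity/MaxFunctionArity distinction above is about:

  // roughly what genprod emits per arity (abbreviated)
  case class Tuple3[+T1, +T2, +T3](_1: T1, _2: T2, _3: T3)
    extends Product3[T1, T2, T3] {
    override def toString(): String = "(" + _1 + "," + _2 + "," + _3 + ")"
  }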


Eugene Burmako

Jan 12, 2013, 1:31:19 AM
to <scala-internals@googlegroups.com>
With type macros, some changes to Dynamic and optional changes to when Dynamic lookup is triggered, we could implement on-demand generation of tuples/products/etc with arbitrary arities. Does this sound interesting?

Simon Ochsenreither

Jan 12, 2013, 6:47:23 AM
to scala-i...@googlegroups.com

With type macros, some changes to Dynamic and optional changes to when Dynamic lookup is triggered, we could implement on-demand generation of tuples/products/etc with arbitrary arities. Does this sound interesting?

The issue that comes up when I think about it is how this works in a world with separate compilation. E.g. library A uses a Tuple2, so it gets bundled with that library's JAR file; library B also uses a Tuple2, so it ends up there as well. I guess it will work as long as the classes are completely identical, but I fear what happens when people start playing with classloaders. (Assuming on-demand means on-demand at compile time ... emitting the code at runtime would probably have a lot of additional issues, like performance.)

Do you have any insights into this?

Eugene Burmako

Jan 12, 2013, 6:54:13 AM
to <scala-internals@googlegroups.com>, Josh Suereth
That's what I also wondered about. How are custom classloaders going to cause trouble? And what about OSGi?

Paul Phillips

Jan 12, 2013, 11:46:51 AM
to scala-i...@googlegroups.com

On Fri, Jan 11, 2013 at 8:06 PM, Som Snytt <som....@gmail.com> wrote:
This is a stale email about whether it's supposed to be easy to increase max tuplicity.

You got me wondering about how far we could get without macros. Imagine if we granted ourselves the literal singleton types which presently hover just outside the building, awaiting a kind word. Then the code enclosed at the end could look more like

  type Nat = x.type forSome { type X <: Int with Singleton ; val x: X ; x > 0 }
  trait VType[Arity <: Nat] extends BetterDynamic

Here is an instance of it:

  val myTuple: VType[3.type] { type Cons = (Int, (Int, (Double, Nothing))) }

We need to enhance Dynamic (BetterDynamic, here) to offer a compile-time check (this is in effect an extremely constrained macro) like:

  def selectDynamicValidate(name: String): Boolean // returns false to cause compilation error

That way we can exploit the fact that we are parameterized on an integer literal to perform a compile-time check restricting the "dynamic" calls on a VType[n.type] to

  _1, _2, _3, _4, ... _n

while _(n+1) is a compile-time error.

My sense is that these modest changes (which only generalize existing features - arguably they make things simpler) would allow these classes to be fully arity-generic without requiring macros at all. Even FunctionN could work this way (I ignore performance for now) by having a single method under the hood:

  def applyImpl(args: Any*): Any

Then the type system and "BetterDynamic" would be tasked with enforcing that the arguments conform to the encoded parameter types. Think how much easier this would make the composition and decomposition of function parameter lists and tuples.

But I don't really have time to pursue it.

scala> trait VType { outer =>
     |   type Cons <: ((_, _))
     |   def apply[T] = new VType {
     |     type Cons = ((T, outer.Cons))
     |   }
     | }
defined trait VType

scala> val nil = new VType { type Cons = Nothing }
nil: VType{type Cons = Nothing} = $anon$1@2723a4c6

scala> nil[List[Int]][Set[Int]][Double][Int][Int]
res0: VType{type Cons = (Int, (Int, (Double, (scala.collection.immutable.Set[Int], (List[Int], Nothing)))))} = VType$$anon$1@4b4a0be0
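
A minimal runtime-only sketch of the selector side (a throwaway DynTuple class for illustration, using nothing beyond today's Dynamic; BetterDynamic and selectDynamicValidate are the hypothetical parts, so an out-of-range selector here fails at runtime rather than at compile time):

  import scala.language.dynamics

  // arity-generic "tuple" backed by an array, _1 ... _n resolved via Dynamic
  class DynTuple(private val elems: Array[Any]) extends Dynamic {
    def arity: Int = elems.length
    def selectDynamic(name: String): Any = {
      require(name.matches("""_\d+"""), s"no such member: $name")
      val i = name.drop(1).toInt
      require(i >= 1 && i <= arity, s"$name is out of range for arity $arity")
      elems(i - 1)
    }
  }

  object DynTuple {
    def apply(elems: Any*): DynTuple = new DynTuple(elems.toArray)
  }

  // DynTuple(1, 2, 3.0)._3   // 3.0, but statically typed as Any
  // DynTuple(1, 2)._5        // compiles, fails at runtime

The point of parameterizing on the literal arity is to make that last line a compile-time error and to recover precise result types.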

Miles Sabin

Jan 12, 2013, 1:24:06 PM
to scala-i...@googlegroups.com
On Sat, Jan 12, 2013 at 4:46 PM, Paul Phillips <pa...@improving.org> wrote:
> type Nat = x.type forSome { type X <: Int with Singleton ; val x: X ; x > 0 }
                                                                        ^^^^^
At this point you've helped yourself to full-spectrum dependent types
(hurrah!) so I think we can safely say we're done :-)

Cheers,


Miles

--
Miles Sabin
tel: +44 7813 944 528
skype: milessabin
gtalk: mi...@milessabin.com
g+: http://www.milessabin.com
http://twitter.com/milessabin

Paolo G. Giarrusso

Jan 12, 2013, 10:22:24 PM
to scala-i...@googlegroups.com
On Saturday, January 12, 2013 at 7:24:06 PM UTC+1, Miles Sabin wrote:
On Sat, Jan 12, 2013 at 4:46 PM, Paul Phillips <pa...@improving.org> wrote:
> type Nat = x.type forSome { type X <: Int with Singleton ; val x: X ; x > 0 }
                                                                        ^^^^^
At this point you've helped yourself to full-spectrum dependent types
(hurrah!) so I think we can safely say we're done :-)
 
Scala's very limited dependent typing is sometimes rather annoying, so I see your point, but adding dependent types is never that trivial.

Stefan Zeiger

Jan 14, 2013, 7:23:13 AM
to scala-i...@googlegroups.com
On 2013-01-12 7:31, Eugene Burmako wrote:
> With type macros, some changes to Dynamic and optional changes to when
> Dynamic lookup is triggered, we could implement on-demand generation
> of tuples/products/etc with arbitrary arities. Does this sound
> interesting?

I don't think we should generate tuple classes. An advantage of that
would be ad-hoc specialization for large tuples (currently limited up to
arity 3 because it's not ad-hoc, so the number of required classes grows
exponentially). The disadvantage is the mess caused by having multiple
copies of the same class in different JARs or even different classloaders.

My current plan goes like this:

- Start with an HArray encoding that works without any implicits or
other hidden runtime overhead:
https://github.com/szeiger/ErasedTypes/blob/master/src/main/scala/com/novocode/erased/HArray.scala
All methods which abstract over arity are defined in HArray. (There's
not much to see in this HArray implementation, but the HList used for
typing it has a lot more methods, e.g. head, tail, consing,
concatenation, all of which can be typechecked without runtime
overhead; a minimal sketch of that style of encoding follows the plan
below.)

- Have arity-specialized tuple classes up to a certain size (initially
22, can be reduced later) which extend HArray. Tuple2 and Tuple3 would
be kept fully specialized. For larger arities, use an array-based HArray
implementation (HArrayA in the code above). This can be made a value
class in 2.10 so it would erase to a raw array in many cases.

- The fixed _1 ... _n accessor methods could be added to HArrayA with
smoke and mirrors (a.k.a. Dynamic and macros). Problem: Auto-complete in
IDEs won't work out of the box.

- The desugaring in the parser changes tuple syntax to the most specific
implementation (i.e. (1,2) becomes Tuple2(1,2), and (1,2,...,23) becomes
HArrayA(Array[Any](1,2,...,23))). Once you abstract over arity (e.g.
concatenating two tuples), you only get an HArray, even if the
implementation is (or could be) specialized.
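
A minimal sketch of the HList-style encoding referred to above (just the generic cons-list shape, not Stefan's actual HArray/HList code): head, tail and consing are precisely typed, and the only runtime representation is the cons cells themselves.

  object HListDemo {
    sealed trait HList
    final case class ::[+H, +T <: HList](head: H, tail: T) extends HList
    sealed trait HNil extends HList
    case object HNil extends HNil

    // cons syntax: 1 :: "two" :: HNil is typed as Int :: String :: HNil
    implicit class HListOps[L <: HList](l: L) {
      def ::[H](h: H): H :: L = new ::(h, l)
    }

    val xs: Int :: String :: Double :: HNil = 1 :: "two" :: 3.0 :: HNil
    val first: Int = xs.head                    // precisely typed, no casts
    val rest: String :: Double :: HNil = xs.tail
  }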

To do:

- Can Nat be a value class? (Doesn't work in 2.10 because you cannot get
a singleton type from a value class)

- What about functions?

-sz

Alois Cochard

Jan 14, 2013, 9:24:06 AM
to scala-i...@googlegroups.com
Hi Stefan,

Sounds quite cool! For some time I've been hoping to see something like this land in the standard lib :)

Should we expect a SIP about this coming soon?