lazy def == memoization ??

Bjorn Regnell

Jan 26, 2013, 9:56:02 AM1/26/13
to scala...@googlegroups.com
/*
Hi,

I recently gave a graduate course in Scala at Lund University, and the following idea about a possible generalization of "lazy" not only for "val" but also for "def" came up in our course discussion. The module lazyDef below shows one interpretation of what we could mean by a "lazy def".
The idea is that IF you know that your def has no side effects and is a "pure" function, in the sense that every argument maps to exactly one result, then you could mark it lazy, and each result would be evaluated only once and then cached for subsequent accesses.
This memoization semantics is not exactly lazy in the Haskell sense, but it is "lazy" in the sense that it only does the work of calculating each result once. It could also be seen as a generalization of lazy val, since a lazy def with no parameters behaves much like a lazy val under the proposed semantics below.
The proposal means that
  lazy def fib(n: Int): BigInt = if (n <= 2) 1 else fib(n-1) + fib(n-2)
would expand to:
  import lazyDef._ // module lazyDef defined below
  val fib: Int => BigInt = memoize {
    (n: Int) => if (n <= 2) BigInt(1) else fib(n-1) + fib(n-2)
  }
which runs much faster for a large n than this (try e.g. with n = 42):
  def fibSlow(n: Int): BigInt = if (n <= 2) 1 else fibSlow(n-1) + fibSlow(n-2)
 
***Do you think this would be a feasible/sensible/useful generalization of lazy to def in Scala?

*/

object lazyDef { //proposed memoization semantics of "lazy def"
  class CachedFunction0[+R](f: () => R) extends Function0[R] {
    private[this] lazy val result = f()
    def apply(): R = result
  }
  class CachedFunction1[-T1,+R](f: T1 => R) extends Function1[T1,R] {
    private[this] val cache = collection.mutable.Map.empty[T1,R]
    def apply(p1: T1): R = cache.getOrElseUpdate(p1, f(p1))
  }
  class CachedFunction2[-T1,-T2,+R](f: (T1,T2) => R) extends Function2[T1,T2,R] {
    private[this] val cache = collection.mutable.Map.empty[(T1, T2),R]
    def apply(p1: T1, p2: T2) : R = cache.getOrElseUpdate((p1, p2), f(p1, p2))
  }
  class CachedFunction3[-T1,-T2,-T3,+R](f: (T1,T2,T3) => R) extends Function3[T1,T2,T3,R] {
    private[this] val cache = collection.mutable.Map.empty[(T1,T2,T3),R]
    def apply(p1: T1, p2: T2, p3: T3) : R = cache.getOrElseUpdate((p1, p2, p3), f(p1, p2, p3))
  } 
  //... etc.

  def memoize[R](f: () => R): CachedFunction0[R] =
    new CachedFunction0[R](f)
  def memoize[T1,R](f: T1 => R): CachedFunction1[T1,R] =
    new CachedFunction1[T1,R](f)
  def memoize[T1,T2,R](f: (T1,T2) => R): CachedFunction2[T1,T2,R] =
    new CachedFunction2[T1,T2,R](f)
  def memoize[T1,T2,T3,R](f: (T1,T2,T3) => R): CachedFunction3[T1,T2,T3,R] =
    new CachedFunction3[T1,T2,T3,R](f)
  //... etc.
}

import lazyDef._
val fib: Int => BigInt = memoize {
  (n: Int) => if (n <= 2) BigInt(1) else fib(n-1) + fib(n-2)
}
def fibSlow(n: Int): BigInt =
  if (n <= 2) 1 else fibSlow(n-1) + fibSlow(n-2)
// scala> fibSlow(42)
// scala> fib(42)

/*
lazy def const() = 42                 //expands to: val const = memoize { () => 42 }  //cf. lazy val
lazy def twice(x: Int) = 2 * x        //expands to: val twice = memoize { (x: Int) => 2*x } 
lazy def add(x: Int, y: Int) = x + y  //expands to: val add = memoize { (x: Int, y: Int) => x + y }
*/
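/*
For completeness, a small usage sketch of the expansions above, assuming the lazyDef module defined earlier is in scope (const, twice and add are the names from the comment block above):
*/
import lazyDef._
val const = memoize { () => 42 }                  // cf. lazy val: the body is evaluated at most once
val twice = memoize { (x: Int) => 2 * x }         // each distinct argument is computed once, then cached
val add   = memoize { (x: Int, y: Int) => x + y } // keys are (x, y) tuples in the Function2 cache
// scala> const()      // 42
// scala> twice(21)    // 42, computed on the first call; a second twice(21) hits the cache
// scala> add(20, 22)  // 42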



Paul Phillips

Jan 26, 2013, 10:02:24 AM1/26/13
to scala...@googlegroups.com

On Sat, Jan 26, 2013 at 6:56 AM, Bjorn Regnell <bjorn....@cs.lth.se> wrote:
I recently gave a graduate course in Scala at Lund University, and the following idea about a possible generalization of "lazy" not only for "val" but also for "def" came up in our course discussion. The module lazyDef below shows one interpretation of what we could mean by a "lazy def".

Previously: https://groups.google.com/forum/#!topic/scala-debate/MGaNlWadIzc


Bjorn Regnell

Jan 26, 2013, 10:33:35 AM1/26/13
to scala...@googlegroups.com
Thanks for the pointer! Sorry for having overlooked that debate.
I see that the thread includes some nice arguments for memoization language support, and lazy def would mean no extra keywords have to be added... so I am voting 'yes' to lazy def :-)
Could the compiler help me check that my def is a "pure" function if I mark it lazy?

Chris Marshall

Jan 27, 2013, 5:20:47 AM1/27/13
to scala...@googlegroups.com, scala...@googlegroups.com
I vote yes for this too. Admittedly the default behaviour might not suit all cases, but it would be fine for the overwhelming majority.

Aside: in scalaz, you can memoize any F1 using !
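(For reference, a rough sketch of that scalaz route, assuming scalaz 7's Memo; fibMemo is just an illustrative name:)

import scalaz.Memo

// a sketch assuming scalaz 7's Memo; mutableHashMapMemo is one of several caching strategies it offers.
// As with the fib example earlier in the thread, the self-reference relies on fibMemo being a member
// (e.g. defined in the REPL or an object body), not a local val.
val fibMemo: Int => BigInt = Memo.mutableHashMapMemo[Int, BigInt] {
  n => if (n <= 2) BigInt(1) else fibMemo(n - 1) + fibMemo(n - 2)
}
// fibMemo(42) returns quickly because intermediate results are cached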

Daniel Sobral

Feb 1, 2013, 1:15:47 AM2/1/13
to scala...@googlegroups.com
Actually, one of the most important counterarguments (given by Odersky himself, IIRC) is that memoization needs too much customization to avoid memory leaks, etc. However, if macro annotations become part of the language, then any library can provide memoization of methods through such an annotation, which seems much better than setting in stone how memoization in Scala works.
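
To make the memory-leak point concrete, here is a hedged sketch of one kind of customization an unbounded cache forces on you: a size-bounded (LRU) variant. The names boundedMemoize and maxSize are purely illustrative, not part of any proposal.

// illustrative only: a size-bounded memoizer that evicts least-recently-used entries,
// built on java.util.LinkedHashMap's access-order mode (not thread-safe)
def boundedMemoize[A, B](maxSize: Int)(f: A => B): A => B = {
  val cache = new java.util.LinkedHashMap[A, B](16, 0.75f, true) {
    override def removeEldestEntry(eldest: java.util.Map.Entry[A, B]): Boolean =
      size() > maxSize // evict once the cache grows past maxSize
  }
  (a: A) => {
    if (!cache.containsKey(a)) cache.put(a, f(a))
    cache.get(a)
  }
}
// e.g. val fibBounded: Int => BigInt = boundedMemoize(1000) { ... }
// only the 1000 most recently used results are retained, so the cache cannot grow without bound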


--
Daniel C. Sobral

I travel to the future all the time.

Paul Phillips

Feb 1, 2013, 1:34:07 AM2/1/13
to scala...@googlegroups.com

On Thu, Jan 31, 2013 at 10:15 PM, Daniel Sobral <dcso...@gmail.com> wrote:
Actually, one of the most important counterarguments (given by Odersky himself, IIRC) is that memoization needs too much customization to avoid memory leaks, etc. However, if macro annotations become part of the language, then any library can provide memoization of methods through such an annotation, which seems much better than setting in stone how memoization in Scala works.

Yes, we should be going the other direction: lazy val assumes too much and excludes important uses.

Chris Marshall

Feb 1, 2013, 1:05:46 PM2/1/13
to scala...@googlegroups.com
But the simple case works (like lazy val) for the overwhelming majority of use cases. Why block a perfectly valid feature which would make everyone's lives simpler because for a small proportion of people it's not flexible enough? Funnily enough, one of the arguments I was going to make in favour was the fact that "lazy val" exists in the language: Scala would be worse without it.

Chris



Paul Phillips

Feb 1, 2013, 1:08:05 PM2/1/13
to scala...@googlegroups.com

On Fri, Feb 1, 2013 at 10:05 AM, Chris Marshall <oxbow...@gmail.com> wrote:
Why block a perfectly valid feature which would make everyone's lives simpler because for a small proportion of people it's not flexible enough?

Because I am a sadist.

Daniel Sobral

Feb 3, 2013, 3:02:57 AM2/3/13
to scala...@googlegroups.com
On 1 February 2013 16:05, Chris Marshall <oxbow...@gmail.com> wrote:
But the simple case works (like lazy val) for the overwhelming majority of use cases. Why block a perfectly valid feature which would make everyone's lives simpler because for a small proportion of people it's not flexible enough? Funnily enough, one of the arguments I was going to make in favour was the fact that "lazy val" exists in the language: Scala would be worse without it.

Look, I like the idea of having memoization in the language, and I have thought of "lazy def" as the way to do it before. There's probably some discussion where I suggest it (I have seen other suggestions that were definitely not to my liking). But I find the argument about memory leaks compelling. Any non-trivial use would be subject to that, which means it must be handled somehow. Add thread safety and good performance, and you have the makings of a very complex feature, which would add a considerable maintenance load and possibly become a hindrance to other work on the compiler (such as macros :).
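
To illustrate the thread-safety part of that, here is a sketch of what a concurrent variant of the earlier CachedFunction1 might look like, using the lock-free TrieMap that has been in the standard library since 2.10 (ConcurrentCachedFunction1 is just an illustrative name).

import scala.collection.concurrent.TrieMap

// sketch only: a thread-safe counterpart to CachedFunction1 from the original post;
// note that under contention f(p1) may still be evaluated more than once for the same key
class ConcurrentCachedFunction1[-T1, +R](f: T1 => R) extends (T1 => R) {
  private[this] val cache = TrieMap.empty[T1, R]
  def apply(p1: T1): R = cache.getOrElseUpdate(p1, f(p1))
}

Even then, memory growth is still unbounded: caching strategy, eviction and atomicity are exactly the policy decisions a built-in language feature would have to fix once and for all.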

I would still be favorable to it all the same IF an alternative weren't in sight, which is macro annotations. Granted, that one is coming last on the list of possible macros, and macros are not yet a sure thing in the language, but I'd rather have the Scala core team tackle macro annotations than memoization.

For that matter, it's perfectly possible to implement memoization with the existing macros! Not as clean, granted, and suffering from parameter-arity problems, but consider this proof of concept:

import scala.reflect.macros.Context
val map = collection.mutable.Map[(Long, Long), Long]()

def memoImpl(c: Context)(a: c.Expr[Long], b: c.Expr[Long])(f: c.Expr[(Long, Long) => Long]) = {
  import c.universe._
  reify {
    map.getOrElseUpdate((a.splice, b.splice), f.splice.apply(a.splice, b.splice))
  }
}
def memo(a: Long, b: Long)(f: (Long, Long) => Long) = macro memoImpl

def ackermann(m: Long, n: Long): Long = memo(m, n) {
  case (0, _) => n + 1
  case (_, 0) => ackermann(m - 1, 1)
  case _      => ackermann(m - 1, ackermann(m, n - 1))
}

Doing it through libraries has an added advantage: it would greatly decrease the time to introduce such a feature. Imagine how long it would get discussed if it were added directly to the language!

 


Seth Tisue

Feb 3, 2013, 10:22:45 AM2/3/13
to scala...@googlegroups.com
On Sun, Feb 3, 2013 at 3:02 AM, Daniel Sobral <dcso...@gmail.com> wrote:
> I'd rather for the scala core team to tackle macro annotations
> than memoization.

+1

--
Seth Tisue - se...@tisue.net - http://tisue.net