Hi, I've encountered a problem (a bug?) while using STM.
Given a parallel collection, say l = (0 until n).par,
when I call
l.map(i => func(i))
and in func() I update a large vector inside an atomic {} block, which looks like:
atomic { implicit txn =>
  A() = A() + M_i // takes time: A and M_i are high-dimensional vectors (~5M elements), and the addition allocates a fresh vector
}
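For concreteness, here is a minimal self-contained sketch of my setup (dim, M, and the zip-based element-wise addition are stand-ins I made up for this post; the real A holds roughly 5M doubles):

import scala.concurrent.stm._

object Repro {
  val n   = 8
  val dim = 5000000 // ~5M elements, as in my real code

  // Shared accumulator vector, guarded by STM.
  val A: Ref[Vector[Double]] = Ref(Vector.fill(dim)(0.0))

  // Hypothetical M_i: a same-sized vector derived from i.
  def M(i: Int): Vector[Double] = Vector.fill(dim)(i.toDouble)

  def func(i: Int): Unit =
    atomic { implicit txn =>
      // Every attempt allocates a fresh dim-element result vector;
      // a conflicting commit forces a retry and yet another allocation.
      A() = A().zip(M(i)).map { case (a, b) => a + b }
    }

  def main(args: Array[String]): Unit = {
    val l = (0 until n).par // on Scala 2.13+, .par needs the scala-parallel-collections module
    l.map(i => func(i))     // this is the call that runs out of memory for me
  }
}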
It throws an OutOfMemoryError as follows (I am sure I have enough heap memory for these operations, unless STM makes extra copies for the atomic block):
        java.lang.OutOfMemoryError ...
        ...
        at scala.concurrent.stm.ccstm.InTxnImpl.runBlock(InTxnImpl.scala:560)
        at scala.concurrent.stm.ccstm.InTxnImpl.topLevelAttempt(InTxnImpl.scala:516)
        at scala.concurrent.stm.ccstm.InTxnImpl.topLevelAtomicImpl(InTxnImpl.scala:387)
        at scala.concurrent.stm.ccstm.InTxnImpl.atomic(InTxnImpl.scala:248)
        at scala.concurrent.stm.ccstm.CCSTMExecutor.apply(CCSTMExecutor.scala:24)
        ...
But if I do it this way, it becomes memory-safe:
l.map(i => {
  func(i)
  0.0
}).reduce(_ + _)
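For completeness, here is the same workaround written against the hypothetical sketch above (the only difference from the failing call is that each element maps to a plain Double, which is then reduced):

val total = l.map { i =>
  func(i)
  0.0
}.reduce(_ + _)
// A.single() reads the Ref outside any transaction.
println(s"finished: total = $total, A(0) = ${A.single()(0)}")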
Could someone explain this to me?