Curiosity: why doesn't (count (doall (range 1000000000))) exhaust memory?

Jason

Dec 21, 2008, 6:44:17 PM
to Clojure
Yet another question, this time just a curiosity. Sorry for the
plethora of posts, but I'm trying to make sure I understand lazy seqs
properly.

Why doesn't (count (doall (range 10000000000))) cause an out-of-memory
error? doall says it causes the entire seq to reside in memory at one
time, yet:

user> (count (range 100000000)) ; uses no memory, as expected
100000000
user> (count (doall (range 100000000))) ; still uses no memory!?
100000000
user> (count (doall (range 500000000))) ; even bigger, still no heap growth
500000000
user> (count (doall (map identity (range 100000000)))) ; now as expected
; Out of memory

Thanks,
Jason

Stephen C. Gilardi

Dec 21, 2008, 7:02:20 PM
to clo...@googlegroups.com

On Dec 21, 2008, at 6:44 PM, Jason wrote:

Why doesn't (count (doall (range 10000000000))) cause an out-of-memory
error? doall says it causes the entire seq to reside in memory at one
time, yet:

(range n) produces an object that is itself a seq, not just one that's seq-able. Its "rest" operation is not implemented with lazy-cons; instead it returns an object that implements the rest of the seq in a self-contained way: a Range object that starts one increment higher. (See clojure.lang.Range, specifically its implementations of first and rest.)
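
To make that concrete, here is a rough sketch of the idea (using today's reify syntax, not the actual clojure.lang.Range source; step-range is a made-up name). The point is that "rest" just builds a fresh, tiny object one step further along; nothing is consed up or cached:

(defn step-range
  "A hypothetical self-contained range over [start, end)."
  [start end]
  (when (< start end)
    (reify clojure.lang.ISeq
      (first [_] start)
      ;; each next/more is a brand-new object, not a cached cell
      (next  [_] (step-range (inc start) end))
      (more  [_] (or (step-range (inc start) end) ()))
      (seq   [this] this)
      (count [_] (- end start))
      (cons  [this o] (clojure.lang.Cons. o this))
      (empty [_] ())
      ;; equality kept trivial for this sketch
      (equiv [this o] (identical? this o)))))

(count (doall (step-range 0 1000000))) walks a million of these throwaway objects, but at any moment only the current one (and the head you are holding) is reachable.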

Holding onto the head in this case does not keep a realized chain of objects in memory; it holds only the first Range. The subsequent "rests" are generated one by one during the doall and then discarded.
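
For reference, doall is essentially a walk to the end of the seq followed by returning the head. Roughly (the one-argument case, with hypothetical names so as not to shadow clojure.core):

(defn my-dorun [coll]
  (when-let [s (seq coll)]
    (recur (next s))))

(defn my-doall [coll]
  (my-dorun coll)
  coll)

In the Range case, each s in that loop is a throwaway Range object; once the loop recurs past it nothing references it any longer, so it can be collected. Only coll, the head, survives to the end.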

Your map example turns this into a chain of lazy-cons cells, with the associated much greater memory use.
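
For contrast, a simplified map (ignoring chunking, and written with the later lazy-seq rather than this era's lazy-cons, though the retention behavior is the same) looks roughly like:

(defn my-map [f coll]
  (lazy-seq
    (when-let [s (seq coll)]
      (cons (f (first s)) (my-map f (rest s))))))

Each realized element ends up in a cons cell that its predecessor caches once forced, so the head transitively reaches everything doall has realized; that is why the map version blows the heap while the plain range version does not.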

The doc for doall should probably be updated to say something along the lines of "it may cause the sequence to reside in memory all at once".

--Steve
