> Last night, I ran into an issue with clojure.contrib.sql:
>
> If you have a very large dataset, you will run out of memory because,
> by default, it pulls the entire result set into memory (even though it
> creates a lazy seq using resultset-seq).
>
> This issue has been discussed previously here:
> http://groups.google.com/group/clojure/browse_thread/thread/7b0c250e0ba6c9eb/fb9001522b49c20a
>
> The fix is pretty simple: just call (.setFetchSize stmt 1). But
> clojure.contrib.sql/with-query-results doesn't currently give you a way
> to do that. So I've created a patch that supports setting any of the
> statement's attributes in the call to with-query-results. Here's the
> code:
>
> http://github.com/ninjudd/clojure-contrib/commit/332fe019c864fcd2f052a8bd13340c0ec259e5c4
>
> Now you can do this:
>
> (with-connection {…}
>   (.setAutoCommit (sql/connection) false) ;; needed for postgres
>   (with-query-results results ["SELECT id, data FROM nodes"]
>                               {:fetch-size 1000}
>     (doseq [r results]
>       …)))
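(For reference, without the patch the same streaming behavior can be reached by dropping to raw JDBC interop inside with-connection. This is only a sketch, assuming a live database connection; `db-spec` and `process` are hypothetical placeholders, and `resultset-seq` is the standard clojure.core function:)

```clojure
(sql/with-connection db-spec
  ;; PostgreSQL only uses a cursor (and honors the fetch size)
  ;; when autocommit is off
  (.setAutoCommit (sql/connection) false)
  (with-open [stmt (.prepareStatement (sql/connection)
                                      "SELECT id, data FROM nodes")]
    (.setFetchSize stmt 1000) ; stream rows in batches of 1000
    (with-open [rs (.executeQuery stmt)]
      (doseq [r (resultset-seq rs)]
        ;; each row is consumed lazily; the full result set
        ;; is never held in memory at once
        (process r)))))
```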
Please do open an issue. I'll take a look at the fix. Do you have a contributor agreement in place?
--Steve