Chunked responses in constant space

Michael Xavier

May 16, 2014, 12:52:27 PM
to httpar...@googlegroups.com
Hi,

At work we're using HTTParty for all of our HTTP client needs, and one of the use cases that came up is streaming large files to disk. These files are large enough that reading the whole response into memory isn't a great idea; we instead want to stream the response to a file on disk and then read from there in constant space. After digging into the source, I found that you can have chunks of the response yielded to you:

File.open("download.tmp", "wb") do |f|
  HTTParty.get(url) do |chunk|
    f.write(chunk)
  end
end

Which is perfect! The problem is that HTTParty appears to collect these chunks in memory, concatenate them into a string, and return that, so memory usage isn't constant but instead grows with the file size. I'm curious about the reasoning behind this; it seems to defeat the purpose of chunking, but maybe I'm misunderstanding something.
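
For comparison, here is a rough sketch of what constant-space streaming looks like if you drop down to Net::HTTP, which HTTParty wraps; the URL and output file name below are just placeholders:

require "net/http"
require "uri"

# Placeholder URL and output path for illustration.
uri = URI("https://example.com/large-file.bin")

Net::HTTP.start(uri.host, uri.port, use_ssl: uri.scheme == "https") do |http|
  http.request(Net::HTTP::Get.new(uri)) do |response|
    File.open("large-file.bin", "wb") do |f|
      # read_body with a block yields each chunk as it arrives and never
      # buffers the whole body in memory.
      response.read_body { |chunk| f.write(chunk) }
    end
  end
end

When read_body is given a block, the body is never assembled into a single string, so memory stays flat regardless of the file size. That's the behavior I was hoping to get out of the HTTParty block form above.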