Streaming large response bodies to slow clients


vik...@ibiblio.org

Feb 21, 2017, 7:10:45 PM
to sinatrarb
Hello everybody,

I'm playing with various frameworks and techniques to stream large, dynamically generated response bodies to (slow) clients, on a server with limited memory resources. In other words, I have to handle back pressure.

While it has been said that EventMachine did not have "a proper API" for this, I managed to hook into the event loop with Thin, and the results are not bad at all (half the CPU time and half the memory usage of the Reel equivalent). But I don't much like that I'm actively generating content, then actively watching Thin's queue and "actively" waiting (my code is still called on each EM tick).
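To make the "actively waiting" pattern concrete, here is a minimal plain-Ruby simulation (no EventMachine, and the `MAX_BACKLOG` threshold is my own illustrative choice, not Thin's): the producer polls the outbound queue's backlog on every "tick" and only generates the next chunk once the slow consumer has drained enough.

```ruby
# Simulation of polling-based back pressure: the producer keeps checking
# the write queue's size and busy-waits (with a short sleep) whenever the
# backlog is full -- analogous to being called on every EM tick.
MAX_BACKLOG = 4
TOTAL_CHUNKS = 10

queue = Queue.new            # stands in for the server's outbound write queue
producer = Thread.new do
  produced = 0
  while produced < TOTAL_CHUNKS
    if queue.size < MAX_BACKLOG
      queue << "chunk #{produced}"   # generate the next chunk
      produced += 1
    else
      sleep 0.001                    # "actively" waiting: poll again next tick
    end
  end
  queue << :done
end

# A slow client draining one chunk at a time.
chunks = []
loop do
  item = queue.pop
  break if item == :done
  chunks << item
  sleep 0.002
end
producer.join
```

Note that the producer burns cycles polling even while the consumer is the bottleneck, which is exactly the inelegance described above.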

The ideal scenario would be to provide an Enumerator (a "generator function") as the response body and have the web server consume it at the same speed as the HTTP client consumes the data. But am I right that this is not how Rack works? When I give it an object responding to #each, it seems that Rack calls it all at once, independently of how fast the client consumes the data.
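For reference, this is the shape of body I mean (the class name and chunk contents are just illustrative). Per the Rack spec the body only needs to respond to #each and yield Strings; whether the server pauses between chunks when the socket is congested is entirely up to the server, not Rack:

```ruby
# A minimal Rack-style streaming body: chunks are generated lazily
# inside #each, one at a time, rather than buffered up front.
class ChunkedBody
  def initialize(total_chunks, chunk_size)
    @total_chunks = total_chunks
    @chunk_size  = chunk_size
  end

  # The server calls #each and writes each yielded chunk to the socket.
  def each
    @total_chunks.times do |i|
      yield "chunk #{i}: #{'x' * @chunk_size}\n"
    end
  end
end

# A Rack app returning the usual [status, headers, body] triple:
app = lambda do |env|
  [200, { 'Content-Type' => 'text/plain' }, ChunkedBody.new(3, 8)]
end
```

The memory win only materializes if the server interleaves writes with the #each iteration instead of draining it eagerly into a buffer.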

I had a look at Sinatra::Streaming::FileStreamer, which uses this technique. Its doc says "Sends the file by streaming it 8192 bytes at a time. This way the whole file doesn’t need to be read into memory at once. This makes it feasible to send even large files." - but it is only feasible when the client consumes the data quickly. Am I missing something?
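A sketch of that 8192-bytes-at-a-time pattern (my own simplified version, not Sinatra's actual implementation): the file is read lazily inside #each, so at most one chunk is in memory at once, but as noted, that only helps if the server pulls from #each no faster than the client drains the socket.

```ruby
# Streams a file in fixed-size chunks via #each, so the whole file is
# never read into memory at once.
class FileBody
  CHUNK_SIZE = 8192

  def initialize(path)
    @path = path
  end

  def each
    File.open(@path, 'rb') do |file|
      # #read returns nil at EOF, ending the loop.
      while (chunk = file.read(CHUNK_SIZE))
        yield chunk
      end
    end
  end
end
```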

Thank you for reading,
Viktor.

