10 MB is the default value of IOStream's max_buffer_size parameter.
I think the best way to override this is to subclass HTTPServer and override handle_stream(), like so (not tested):
from tornado.httpserver import HTTPServer

newsize = 200 * 1024 * 1024  # pick a limit larger than your biggest expected upload

class LargeFileHTTPServer(HTTPServer):
    def handle_stream(self, stream, address):
        # stream is the IOStream for this connection; raise its limit before reading
        stream.max_buffer_size = newsize
        super(LargeFileHTTPServer, self).handle_stream(stream, address)
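For context, here is a minimal sketch of how that subclass might be wired up; the UploadHandler, the /upload route, and the port are just assumptions for illustration:

import tornado.ioloop
import tornado.web

class UploadHandler(tornado.web.RequestHandler):
    def post(self):
        # request.files holds the parsed multipart upload(s)
        fileinfo = self.request.files["file"][0]
        self.write("received %d bytes" % len(fileinfo["body"]))

application = tornado.web.Application([(r"/upload", UploadHandler)])
server = LargeFileHTTPServer(application)
server.listen(8888)
tornado.ioloop.IOLoop.instance().start()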
If you're expecting heavy loads, you might consider using nginx to handle uploads; it's better at it.
Hth.
Hi All,

I was trying to upload a file of 10MB to my tornado server and I am seeing these errors:

ERROR:root:Reached maximum read buffer size
ERROR:root:Error in connection callback
Traceback (most recent call last):
  File "/Library/Python/2.7/site-packages/tornado-2.4-py2.7.egg/tornado/netutil.py", line 235, in _handle_connection
    self.handle_stream(stream, address)
  File "/Library/Python/2.7/site-packages/tornado-2.4-py2.7.egg/tornado/httpserver.py", line 146, in handle_stream
    self.no_keep_alive, self.xheaders)
  File "/Library/Python/2.7/site-packages/tornado-2.4-py2.7.egg/tornado/httpserver.py", line 172, in __init__
    self.stream.read_until(b("\r\n\r\n"), self._header_callback)
  File "/Library/Python/2.7/site-packages/tornado-2.4-py2.7.egg/tornado/iostream.py", line 153, in read_until
    self._try_inline_read()
  File "/Library/Python/2.7/site-packages/tornado-2.4-py2.7.egg/tornado/iostream.py", line 387, in _try_inline_read
    if self._read_to_buffer() == 0:
  File "/Library/Python/2.7/site-packages/tornado-2.4-py2.7.egg/tornado/iostream.py", line 436, in _read_to_buffer
    raise IOError("Reached maximum read buffer size")
IOError: Reached maximum read buffer size

Am I missing something?

Thanks,
Shrikar
Sorry, missed that extra zero.