creating a streaming handler


Andrew Gwozdziewycz

Oct 16, 2009, 5:48:10 PM
to python-...@googlegroups.com
Hey All,

I'm experimenting with the idea of building a streaming handler for
tornado, akin to the streaming apis that Twitter provides. For those
that aren't familiar, basically when you connect to the streaming
APIs, you get a "never-ending" HTTP response with the following
headers:

HTTP/1.1 200 OK
Content-Type: application/json
Transfer-Encoding: chunked
Server: Jetty(6.1.17)

And then a stream of JSON or XML that you can consume as it arrives. It
seems that Tornado currently has no way to do this, and I'm wondering
whether I just haven't discovered it yet or whether it's by design.
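For reference, the "Transfer-Encoding: chunked" header above is what lets the response go on indefinitely: each piece of the body is framed with its own hex-encoded length, and a zero-length chunk terminates the stream. A minimal sketch of the wire framing (standard HTTP/1.1 chunked coding; the function name is just illustrative):

```python
def encode_chunk(data):
    # One chunk of a chunked HTTP body: hex size, CRLF, payload, CRLF.
    return "%x\r\n%s\r\n" % (len(data), data)

# A stream is just a series of chunks; the empty chunk "0\r\n\r\n" ends it.
stream = encode_chunk('{"id": 1}') + encode_chunk('{"id": 2}') + encode_chunk("")
```

Because the client reads chunk by chunk, the server can keep the connection open and emit a new chunk whenever it has something to say.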

Sorry for the rough example, but it should illustrate the problem I'm
having.

#!/usr/bin/env python
import logging, httplib
import tornado.httpserver
import tornado.ioloop    # needed for IOLoop.instance() below
import tornado.options   # needed for parse_command_line() below
import tornado.web
from tornado.options import define, options

define("port", default=8888, help="run on the given port", type=int)

class StreamHandler(tornado.web.RequestHandler):
    id = 1
    waiters = {}

    def __init__(self, *args, **kwargs):
        tornado.web.RequestHandler.__init__(self, *args, **kwargs)
        self.id = StreamHandler.id
        StreamHandler.id += 1

    @tornado.web.asynchronous
    def get(self):
        # Register this connection, then broadcast to all open ones.
        self.wait(self.async_callback(self._on_message))
        self.new_message("%d %s" % (self.id,
                                    self.get_argument('msg', '<NO MESSAGE>')))

    def wait(self, callback):
        self.waiters[self.id] = callback

    def _on_message(self, msg):
        if not self.request.connection.stream.closed():
            self.write("{%d: %s}" % (self.id, repr(msg)))
            self.flush()
        else:
            del StreamHandler.waiters[self.id]

    def new_message(self, msg):
        for callback in StreamHandler.waiters.values():
            try:
                callback(msg)
            except Exception, e:
                logging.exception(e)

    def _generate_headers(self):
        # Override to force a chunked response with no Content-Length.
        lines = [self.request.version + " " + str(self._status_code) + " " +
                 httplib.responses[self._status_code]]
        lines.append("Content-Type: text/plain")
        lines.append("Transfer-Encoding: chunked")
        return "\r\n".join(lines) + "\r\n\r\n"

class Application(tornado.web.Application):
    def __init__(self):
        handlers = [
            (r"/stream", StreamHandler),
        ]
        settings = dict()
        tornado.web.Application.__init__(self, handlers, **settings)

def main():
    tornado.options.parse_command_line()
    http_server = tornado.httpserver.HTTPServer(Application())
    http_server.listen(options.port)
    tornado.ioloop.IOLoop.instance().start()

if __name__ == '__main__':
    main()


Basically, what ends up happening is that the headers get written but no
content. I've been looking for a way to flush the buffer and write it
out immediately, but haven't found any clues. My initial intuition was
obviously wrong, as handler.write(); handler.flush() doesn't do what I
want. Does anyone have any ideas?

Thanks,

Andrew

--
http://www.apgwoz.com

chris

Oct 21, 2009, 3:34:22 PM
to Tornado Web Server
Hi Andrew,


I wouldn't mind figuring out how to do a streaming handler w/ tornado
myself.

So, I want to send data to the client, but still hold the connection
open after I send data. I would have guessed self.flush() would do the
job, too.

Were you able to figure it out?

Does anyone else know if this is possible w/ tornado? It's not very
different from the functionality that tornado currently offers; all we
need to be able to do is flush data to the client at will.


Best,
Chris

Andrew Gwozdziewycz

Oct 21, 2009, 4:04:44 PM
to python-...@googlegroups.com
I haven't gone any further with my tests, and haven't had enough time
to make modifications to get it to work either.

--
http://www.apgwoz.com

alberto

Oct 30, 2009, 5:39:50 AM
to Tornado Web Server
I've made this stupid example and it seems to work:

python: http://nopaste.com/p/aYITGaPAx
html+js: http://nopaste.com/p/a5sXo2I7D


what do you think?


chris

Nov 10, 2009, 11:52:07 AM
to Tornado Web Server
Alberto,

could you repost that? The nopaste.com site is down.


Thank you very much,
Best,
Chris

alberto

Nov 13, 2009, 4:10:03 AM
to Tornado Web Server
python: http://nopaste.info/7bcf0b20ff.html
html+js: http://nopaste.info/91d62e9f9e.html

Damn IE... I think in a few days I will implement another cross-browser
solution using a continuous iframe.

alberto

Nov 13, 2009, 5:05:24 AM
to Tornado Web Server
client updated: http://nopaste.info/fb3368ed0b.html

It's a stupid chat:
the input labeled "u" accepts a single username for a private message
(no username = message for everybody)
the input labeled "m" accepts the message

IMPORTANT: access the application by passing the username as a GET
parameter, like this:
http://localhost:8888/?username=alberto

chris

Nov 16, 2009, 12:08:11 PM
to Tornado Web Server
Alberto,


Thank you very much. I was able to make streaming work with your
example.

For anyone reading this, there's one gotcha that you might be running
into if you run Tornado behind Nginx. Make sure your nginx
configuration file has:
proxy_buffering off;

Proxy buffering needs to be off so that each time Tornado sends some
bytes over the connection, nginx sends it out to the client directly.
If it's on, nginx buffers the data until the connection closes.
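A minimal sketch of what that looks like in the nginx config (the location path and upstream address are placeholders for your own setup; only the proxy_buffering directive is the point here):

```nginx
location /stream {
    proxy_pass http://127.0.0.1:8888;
    # Without this, nginx buffers the whole upstream response and the
    # client sees nothing until the connection closes.
    proxy_buffering off;
}
```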



Best,
Rob