Sorry for the long text and for my bad English :)
gls
On May 25, 5:33 pm, Giuseppe Luca Scrofani <glsdes...@gmail.com>
wrote:
My first post here; I just started working with web2py on a social
site. Great work, Massimo! Batteries included, but still light.
Gevent: now with support for Postgres, probably the fastest out there
Eventlet: used at Lindenlab / Second Life
Concurrence: with handy async mysql interface
Tornado: full async webserver in Python
Massimo: what do you think of an asynchronous model for web2py? It'd
be great to have asynchronous capabilities. I am writing an app
that will require quite a bit of client-initiated background
processing (sending emails, resizing images) which I would rather hand
off to a green thread than block one of the web2py threads. Curious
about your thoughts.
BTW - my first post here. I started using web2py for a community
site and enjoy working in it a lot! Great work.
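The hand-off described above, returning from the request while slow work (mail, image resizing) runs elsewhere, can be sketched without gevent at all: gevent.spawn() gives the same pattern with green threads, but the stdlib version below runs anywhere. All names here are illustrative, not web2py API.

```python
# Sketch of handing slow work off the request thread.  gevent.spawn()
# offers the same hand-off with green threads; this version uses stdlib
# worker threads so it is runnable as-is.
from concurrent.futures import ThreadPoolExecutor

background = ThreadPoolExecutor(max_workers=4)  # workers for slow jobs

def send_welcome_email(address):
    # stand-in for a real SMTP call or an image resize
    return "sent to %s" % address

def signup(address):
    # the request handler submits the job and returns immediately;
    # it does not wait for the email to go out
    return background.submit(send_welcome_email, address)

future = signup("user@example.com")
print(future.result())  # blocking on the result is only for demonstration
```

In a real handler you would not call `result()`; the point is that the request thread is free as soon as `submit` returns.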
On May 25, 9:39 pm, Candid <roman.bat...@gmail.com> wrote:
> Massimo, does web2py use a threadpool
> under the hood? For comet you would then quickly run out of threads.
I have never used Comet, but I do not see any major problem. The web
server creates a thread pool; for stand-alone web2py that would be
Rocket. You do not run out of threads any more than in any other web
app.
> If you'd try to do this with a thread per connection things would get
> out of hand pretty quickly so the best way is doing the work
> asynchronously like Orbited. Alternatives would be using one of the
> contemporary Python asynchronous libraries. These libraries provide
> monkey patching of synchronous calls like your url fetching. Some
> suggestions:
>
> Gevent: now with support for Postgres, probably the fastest out there
> Eventlet: used at Lindenlab / Second Life
> Concurrence: with handy async mysql interface
> Tornado: full async webserver in Python
>
> Massimo: what do you think of an asynchronous model for web2py? It'd
> be great to have asynchronous capabilities. I am writing an app
> that will require quite a bit of client-initiated background
> processing (sending emails, resizing images) which I would rather hand
> off to a green thread than block one of the web2py threads. Curious
> about your thoughts.
I do not think we can use async IO with web2py; as far as I
understand, async IO would require a different programming style.
Anyway, if you have a working proof of concept I would like to see it.
Massimo
The idea of Comet is to keep the connection to the client open and
stream data to it as it becomes available:
http://en.wikipedia.org/wiki/Comet_%28programming%29
It saves the overhead of a client polling at intervals and
establishing the connection each time. In a thread-per-connection
model you need to keep a thread available for each client, which gets
expensive quickly and does not scale nicely: after a few hundred
connections most servers slow down dramatically because of thread
context switching. See also:
http://www.kegel.com/c10k.html
For most web apps a thread per connection (from a thread pool) won't
be a problem, but for things like Ajax email applications or chat/IM
it does get troublesome.
My plan is to require basic auth, put the client ID into the session,
push any data on subscribed channels down that connection, and just
use long-polling to keep it open.
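A minimal sketch of that long-polling plan, independent of web2py: each client gets a queue (a stand-in for the "subscribed channels"), and the handler blocks until a message arrives or a timeout expires. The names `channels`, `publish`, `long_poll` and the timeout value are assumptions, not anything web2py provides.

```python
# Long-polling sketch: the handler holds the connection open by blocking
# on the client's queue until data arrives or the poll times out.
# All names here are hypothetical, not web2py API.
import queue

channels = {}  # client_id -> Queue of pending messages

def publish(client_id, message):
    # anything in the app can drop a message on a client's channel
    channels.setdefault(client_id, queue.Queue()).put(message)

def long_poll(client_id, timeout=30):
    q = channels.setdefault(client_id, queue.Queue())
    try:
        return q.get(timeout=timeout)  # blocks: this is the "long" part
    except queue.Empty:
        return None  # nothing arrived; the client simply polls again

publish("alice", "hello")
print(long_poll("alice", timeout=1))  # prints: hello
```

On timeout the handler returns empty and the client re-issues the poll, so the connection is re-established only every `timeout` seconds instead of on every poll interval.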
Since there's no real way for a web2py app to be notified of internal
state changes, I'm not sure, long term, how I would watch for anything
to send out over the long poll. Though I've had some thoughts about
writing a scheduler for web2py with a granularity of a second or so.
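That scheduler idea could start as small as a loop that wakes every second and runs whatever is due. Everything below (the class, its methods) is a hypothetical sketch of the one-second-granularity approach, not an existing web2py facility.

```python
# Hypothetical scheduler sketch with ~1 s granularity: a background
# thread would call run(), which ticks once per second and fires any
# task whose next run time has passed.
import threading
import time

class SimpleScheduler:
    def __init__(self):
        self.tasks = []              # each task: [next_run, interval, fn]
        self.lock = threading.Lock()

    def every(self, seconds, fn):
        # register fn to run every `seconds` seconds
        with self.lock:
            self.tasks.append([time.time() + seconds, seconds, fn])

    def tick(self):
        # run everything that is due, then reschedule it
        now = time.time()
        with self.lock:
            for task in self.tasks:
                if task[0] <= now:
                    task[2]()
                    task[0] = now + task[1]

    def run(self):
        # the actual loop; in web2py this would live in a worker thread
        while True:
            self.tick()
            time.sleep(1)

# demonstration: force a single tick instead of starting the loop
hits = []
s = SimpleScheduler()
s.every(0, lambda: hits.append("ran"))
s.tick()
print(hits)  # the due task fired once
```

A tick like this could also be the place that publishes pending data to long-poll channels, which is the "looking for anything to send out" part mentioned above.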