Just the simple Hello World program from the Tornado page, hit with
ApacheBench. After a lot of requests the resident memory (RES in top)
rises and rises.
It was fun to watch the speed get better and better as the JIT warmed
up, but it seems like Tornado on PyPy needs a restart from time to
time before it eats all the RAM there is.
Is this a known problem?
I remember reading something on the web2py list about files that
weren't closed in old versions of web2py. Is there perhaps something
similar in Tornado? Files that CPython closes as soon as the last
reference to them is gone, but which PyPy's garbage collector only
closes at the next collection cycle.
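To illustrate the difference with a minimal sketch (generic Python, not Tornado code, with a hypothetical `Handle` class standing in for a file-like object): CPython's reference counting runs a finalizer the moment the last reference disappears, while PyPy's GC only runs it at the next collection cycle, so anything cleaned up in `__del__` lingers until then.

```python
import gc

closed = []

class Handle:
    """Stand-in for a file-like object that frees a resource in __del__."""
    def __del__(self):
        closed.append(True)

h = Handle()
del h          # CPython: __del__ runs right here (refcount hits zero).
               # PyPy: nothing happens yet - the object is merely garbage.
gc.collect()   # PyPy: the finalizer only runs now, at the next GC cycle.
assert closed == [True]
```

Under steady load a server never idles long enough for those deferred finalizers to matter little, which would match the slow RES growth described above.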
And yes, I tested with normal Python 2.6: No problems.
Could be a PyPy bug.
Some more information:
RAM (column RES in top) at start: 24m
After 1,000,000 requests (concurrency 10): 166m
After 2,000,000 requests: 277m
After 3,000,000 requests: 389m
That's roughly 110m per million requests, i.e. on the order of 100
bytes retained per request.
With "--jit off" the same problem shows up, just less extreme. So it's
not just a JIT thing.
web2py had some problems with PyPy, too. Here's a patch with a
description and links:
http://code.google.com/p/web2py/issues/detail?id=288
What I've learned: PyPy isn't a simple drop-in replacement for
CPython. Beware of the garbage collection!
And it's not Tornado's fault. Just a nice-to-have for the future.
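The portable fix, on either interpreter, is not to rely on `__del__` at all but to close resources deterministically, e.g. with a `with` statement. A generic sketch (plain file I/O, not Tornado-specific; the temp file is just to keep it self-contained):

```python
import os
import tempfile

# Create a throwaway file so the example is self-contained.
fd, path = tempfile.mkstemp()
os.close(fd)

# Relying on garbage collection to close the file is what breaks on PyPy:
#   f = open(path)
#   data = f.read()    # f stays open until some later GC cycle
#
# Deterministic cleanup works the same on CPython and PyPy:
with open(path) as f:
    data = f.read()
assert f.closed        # closed the instant the with-block exits

os.remove(path)
```

The same reasoning applies to sockets and any other object whose cleanup a framework leaves to the interpreter.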