Memory leak with pypy 1.6?


Stefan Scholl

Aug 23, 2011, 7:40:54 AM
to python-...@googlegroups.com
I'm testing Tornado 2.0 with PyPy 1.6 and it burns RAM like hell.

Just the simple Hello World program from the Tornado page,
benchmarked with ApacheBench. After a lot of requests the used
RAM (RES in top) rises and rises.
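
For reference, here is the program, essentially the example from
the Tornado page (port 8888 is the one the example uses):

    import tornado.ioloop
    import tornado.web

    class MainHandler(tornado.web.RequestHandler):
        def get(self):
            self.write("Hello, world")

    application = tornado.web.Application([
        (r"/", MainHandler),
    ])

    if __name__ == "__main__":
        application.listen(8888)
        tornado.ioloop.IOLoop.instance().start()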


It was fun to watch the speed get better and better as the JIT
warmed up, but it seems like Tornado on PyPy would need a restart
from time to time before it eats all the RAM there is.


Is this a known problem?
I remember reading something on the web2py list about files that
weren't closed in old versions of web2py. Is there perhaps
something similar in Tornado? I mean files that would be closed
automatically in normal Python as soon as nothing references
them anymore.
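
Something like this hypothetical pattern (not actual Tornado
code, just to illustrate what I mean):

    # Relies on CPython's reference counting: the file is closed as
    # soon as the last reference disappears. On PyPy it stays open
    # until the GC eventually collects it.
    def read_file(path):
        return open(path).read()

    # Deterministic on both interpreters:
    def read_file_safe(path):
        with open(path) as f:
            return f.read()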

And yes, I tested with normal Python 2.6: No problems.


Could be a PyPy bug.

Stefan Scholl

Aug 25, 2011, 5:15:28 AM
to python-...@googlegroups.com
Stefan Scholl <ste...@no-spoon.de> wrote:
> I'm testing Tornado 2.0 with PyPy 1.6 and it burns RAM like hell.
>
> Just the simple Hello World program from the Tornado page,
> benchmarked with ApacheBench. After a lot of requests the used
> RAM (RES in top) rises and rises.

Some more information:

RAM (column RES in top) at start: 24m

After 1,000,000 requests (concurrency 10): 166m
After 2,000,000 requests: 277m
After 3,000,000 requests: 389m

With "--jit off" the same problem, but less extreme. So it's not
just a JIT thing.
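
For completeness, the commands look roughly like this (hello.py
being whatever the Hello World above is saved as):

    # start the server, with and without the JIT
    pypy hello.py
    pypy --jit off hello.py

    # hammer it: 1,000,000 requests, concurrency 10
    ab -n 1000000 -c 10 http://127.0.0.1:8888/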


web2py had some problems with PyPy, too. Here's a patch with
description + links:
http://code.google.com/p/web2py/issues/detail?id=288


--
Web (en): http://www.no-spoon.de/ -*- Web (de): http://www.frell.de/
<!--[if IE 6]><script>for(x in document.open);</script><![endif]-->

Stefan Scholl

Aug 26, 2011, 4:19:15 PM
to python-...@googlegroups.com
And some comments on Reddit:
http://www.reddit.com/r/Python/comments/juwja/how_does_quora_deal_with_tornados_memory_leaks/


What I've learned: PyPy isn't a simple drop-in replacement for
CPython. Beware of the differences in garbage collection!
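
A minimal way to see the difference, using nothing but the
standard library:

    import gc

    class Noisy(object):
        def __del__(self):
            print "collected"

    Noisy()       # CPython: refcount drops to zero, prints immediately
    gc.collect()  # PyPy: finalizers usually run only when the GC runs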

And it's not Tornado's fault. Just a nice-to-have for the future.

Ben Darnell

Aug 28, 2011, 3:27:41 AM
to python-...@googlegroups.com
Interesting.  PYPY_GC_MAX does *something*, since if I set it too low I get out-of-memory errors, but other than that it doesn't seem to have much of an effect on the process's growth. It looks like the growth is not in Python objects, but in something that PyPy's GC can't see (heap fragmentation, perhaps).
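
In case anyone wants to experiment, this is the kind of thing I
was trying (the 400MB cap is arbitrary):

    # PYPY_GC_MAX caps the heap of objects PyPy's GC manages; set it
    # too low and you get MemoryError, but it can't bound memory the
    # GC doesn't see.
    PYPY_GC_MAX=400MB pypy hello.py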

FWIW, I believe tornado already explicitly closes all its file/socket objects.  If it didn't, you'd hit limits on the number of open file descriptors way before memory became an issue.
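
To see the limit I mean (a shell builtin; the default is commonly
around 1024 on Linux):

    # per-process limit on open file descriptors
    ulimit -n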

-Ben

Peter Bengtsson

Nov 21, 2011, 3:38:08 PM
to python-...@googlegroups.com
For the record (in case people get here from a search), there's now this in the PyPy 1.7 release announcement:
"The memory footprint of some of our RPython modules has been drastically improved. This should impact any applications using for example cryptography, like tornado."