bug #333

Teo Tei

Aug 29, 2015, 7:11:59 PM
to XCache
Do I understand correctly how this bug was "fixed"?
http://xcache.lighttpd.net/ticket/333

I don't have enough knowledge to understand the source code, hence the question:

I see this comment in the commit:
"reduce memory usage for small or empty files"

REDUCE? SMALL?
Does that mean that include()ing the same file N times still consumes O(N) memory, even if much less than before?
Does that mean that the memory leak for large files is not affected?

If either of these is true, then the bug is NOT fixed at all; its impact is merely mitigated.

Note that without xcache, this code:
for ($i = 0; $i < 10000; $i++) {
  include "somefile.php";
}

doesn't consume any more memory than

for ($i = 0; $i < 10000; $i++) {
  // paste the contents of somefile.php directly here
}

at least as far as reaching the memory limit is concerned.
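
A rough way to check this (my own sketch, not something from the ticket) is to compare memory_get_usage() before and after the loop, once with XCache loaded and once without. "somefile.php" is just a stand-in here, and I'm assuming it contains plain code only (no function or class definitions that would collide on a second include):

<?php
// Baseline measurement before the include loop.
$before = memory_get_usage();

for ($i = 0; $i < 10000; $i++) {
    include "somefile.php";
}

// If the same file is really cached only once, the growth here should
// stay roughly constant instead of scaling with the iteration count.
$after = memory_get_usage();
printf("growth: %d bytes, peak: %d bytes\n",
       $after - $before, memory_get_peak_usage());

If the growth with XCache stays roughly flat rather than scaling with the number of includes, I would consider the leak actually fixed rather than mitigated.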

If you are including the SAME file, there's no reason any additional memory should be consumed on repeated includes. At the very least, the memory for caching the file should be allocated only once, and whatever extra memory each include needs should be released when that include finishes.

Or is it just that the commit summary is misleading, and the memory leak is, in fact, eliminated completely?


P.S. I would post this as a comment to the bug report, but since trac is broken and doesn't send me the account verification email, I cannot comment there.

Xuefer[mOo]

Aug 30, 2015, 12:56:53 AM
to XCache
I'm not sure what causes the memory to stack up.

Maybe XG(gc_op_arrays) gets destroyed too late. Please set xcache.readonly_protection=on, and make sure it is actually in effect (by using xcache.mmap_path=/tmp/something instead of /dev/zero), then try again to see if the problem still reproduces.
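
For reference, something like this in php.ini should match what's being suggested; the mmap path is just the placeholder from the message above, any writable location should do:

; enable readonly protection as suggested
xcache.readonly_protection = On
; readonly protection only takes effect with a file-backed mmap path,
; not with /dev/zero
xcache.mmap_path = /tmp/something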