On Sun, Jun 21, 2009 at 04:31:10PM +0200, Robin Smidsrød wrote:
>
> I've been discussing memory profiling as of late, and one person (Chris
> Prather) suggested that maybe I should talk to you about including
> something like this in NYTProf before I go about trying to make my own.
>
> How hard would it be to add measurement of memory used by the process in
> NYTProf?
>
> My article,
> http://blog.robin.smidsrod.no/index.php/2009/05/26/memory-footprint-of-popular-cpan-modules,
> shows a simple run of memory profiling for some modules, and I'd like to
> try and extend it to a profiling module.
>
> Is NYTProf made extensible to support this kind of thing, or is memory
> profiling something that wouldn't work with its current architecture?

Fairly simple to do something fairly simple, but tricky to do something
worthwhile.

Copy-on-write, for example, is a significant issue. Then there's memory
allocated to lexical pads that were only used once, and memory allocated
to the perl process but not currently used by it.

Your perlbloat.pl script, for example, doesn't account for modules that
use lots of memory *while loading* but then free much of it (back to
perl's own storage pool), so their 'bloat' would be reported too high.
When the next module is loaded it'll reuse some of that freed memory,
so its bloat would be reported too low.
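
To make that accounting error concrete, here's a toy model of the effect. This is purely illustrative (in Python, and none of the names come from NYTProf or perlbloat.pl): it models an allocator that, like perl's, returns freed memory to a process-level pool rather than to the OS, and shows how before/after process-size deltas then mis-report per-module cost in both directions.

```python
# Toy model: memory freed by perl code goes back to perl's own pool,
# not to the OS, so the process size (what a before/after measurement
# sees) never shrinks. All names and numbers here are illustrative.

class PoolModel:
    def __init__(self):
        self.process_size = 0   # what the OS would report (e.g. RSS)
        self.free_pool = 0      # freed memory retained by the process

    def allocate(self, n):
        reused = min(n, self.free_pool)     # reuse pooled memory first
        self.free_pool -= reused
        self.process_size += n - reused     # only the shortfall grows RSS

    def release(self, n):
        self.free_pool += n                 # back to the pool, not the OS

def naive_bloat(pool, load):
    """perlbloat.pl-style measurement: process-size delta across a load."""
    before = pool.process_size
    load(pool)
    return pool.process_size - before

pool = PoolModel()

# Module A needs 10 units while loading but frees 6 afterwards: its
# steady-state cost is only 4, yet the delta reports the full 10.
bloat_a = naive_bloat(pool, lambda p: (p.allocate(10), p.release(6)))

# Module B genuinely keeps 5 units, but it fits inside A's freed
# memory, so the process never grows and the delta reports 0.
bloat_b = naive_bloat(pool, lambda p: p.allocate(5))

print(bloat_a, bloat_b)  # prints: 10 0
```

So A is over-reported (10 instead of 4) and B is under-reported (0 instead of 5), which is exactly why a worthwhile profiler can't rely on raw process-size deltas alone.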

There's a lot of tricky detail here, in terms of what can be usefully
measured and what it actually means. Plenty of thought would also need
to be given to how to report it in a meaningful way.

Having said all that, I am interested in extending NYTProf in this
direction.

> PS: Are you ever on irc.perl.org? If so, under which name?

timbunce, but rarely.

Tim.