in reply to Memory usage of a "sub" in mod_perl

It is normal for a process to remain at the maximum size it has reached. Because you undef $page, that memory should be available to perl for other uses. If you see the process continue to grow, that's when you need to worry.
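If you want to see this for yourself, here's a rough test you can run from the shell (a sketch, assuming a Unix-like ps; the exact numbers depend on how your perl's malloc behaves):

    perl -le '
        sub rss { (`ps -o rss= -p $$` =~ /(\d+)/)[0] }
        print "start:         ", rss(), " KB";
        @big = ("x" x 1024) x 50_000;            # ~50 MB in small strings
        print "after build:   ", rss(), " KB";
        undef @big;
        print "after undef:   ", rss(), " KB";   # usually stays large
        @big = ("x" x 1024) x 50_000;            # perl reuses that memory
        print "after rebuild: ", rss(), " KB";   # little or no new growth
    '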

I don't understand how 109K of HTML causes your process to grow 58MB. Do you actually have strings that take up that much space in a single request?


Re^2: Memory usage of a "sub" in mod_perl
by diego_de_lima (Beadle) on May 23, 2008 at 19:33 UTC
    Thanks for all the answers, they were quite useful.

    I had many circular refs, and I've now weakened them, which reduced the memory usage slightly. But the process still grows and doesn't free its memory.
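    In case it helps anyone, the weakening looks roughly like this (names made up for illustration, not my real code):

        use Scalar::Util qw(weaken);

        my $parent = { name => 'parent' };
        my $child  = { name => 'child'  };
        $parent->{child} = $child;
        $child->{parent} = $parent;    # circular: parent <-> child
        weaken($child->{parent});      # the back-link no longer holds a
                                       # refcount, so both can be freed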

    100K of HTML isn't a big problem, but sometimes I have a 10 MB or 30 MB structure in memory, which then generates 3 MB of HTML. And that's the problem.

    I'm sure there aren't circular refs anymore, but the system doesn't reclaim the memory. I think this is the normal behavior, and Apache2::SizeLimit is the only solution for this. Am I right?
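    The setup I have in mind follows the module docs, something like this in httpd.conf (a sketch; the 150 MB limit is only an example):

        <Perl>
            use Apache2::SizeLimit;
            # sizes are in KB: kill the child after the current request
            # once the process grows past ~150 MB
            Apache2::SizeLimit->set_max_process_size(150_000);
        </Perl>
        PerlCleanupHandler Apache2::SizeLimit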

    But I still can't understand why it says the sub "end" is eating all this memory! I would understand if it said there was a global variable or something, but why a sub? For example, the perl-status "Memory Usage" report for the Html::page package says:
    128:       undef PV    44 bytes html
    129:       undef NULL  24 bytes 0xa7f4cb8
    130:       undef PV   37740 bytes Content-type: text/html
    Cache-Control: no-cache,no-store,max-age=0
    Pragma: no-cache
    
    <html>... ALL MY PAGE HERE 2 MB OF DATA!!!
    

    Why is it still around? I've done a $page->{html} = ''; in the DESTROY method and I'm sure it was invoked. And, by the way, it has nothing to do with the sub "end". "end" just parses $page->{html}, then flushes it to the browser and then undefs $page->{html}. Shouldn't "undef" be enough?
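    Roughly sketched (simplified, and $r here stands in for the mod_perl request object; not my real code):

        sub end {
            my ($page, $r) = @_;
            # ... parse $page->{html} ...
            $r->print( $page->{html} );  # flush the page to the browser
            undef $page->{html};         # memory goes back to perl's
                                         # allocator, not back to the OS
        }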

    Thanks monks!

    Diego de Lima
      and then undef $page->{html}. Shouldn't "undef" be enough?

      That depends upon what that hash entry actually is. If it is a reference (or something containing a reference) to data held elsewhere, you just clear the reference from that slot, but not the referenced data.

      Consider:

      perl -le '$foo = "bar"; $h->{foo} = \$foo; undef $h->{foo}; print "foo :$foo:"'
      foo :bar:

      which is different to

      perl -le '$foo = "bar"; $h->{foo} = \$foo; undef ${$h->{foo}}; print "foo :$foo:"'
      foo ::

      where the referenced data is cleared.

      --shmem

        In my case, $page->{html} is a simple string, like:
        $page->{html} = '...ALL HTML HERE 2 MB OF PLAIN TEXT...';
        
        So, isn't undef $page->{html} enough?

        Diego de Lima
      If you run the exact same request multiple times and the process size grows every time, you have a leak, probably a circular reference. If it stops growing but remains at the larger size, that is completely normal and the only way to avoid it is to structure your code so that it doesn't need to have all the data in memory at once.
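      For example, emitting the page piece by piece keeps the peak at one chunk instead of the whole document. A sketch (render_chunk and the {chunks} list are hypothetical names):

        sub end {
            my ($page, $r) = @_;
            for my $chunk (@{ $page->{chunks} }) {
                $r->print( render_chunk($chunk) );   # one chunk at a time
            }
        }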