Sewi has asked for the wisdom of the Perl Monks concerning the following question:

Dear Monks,
It seems that some file being run via mod_perl is eating up my memory. Some Apache child processes grow to around 350MB, and as time goes by, more children reach this value.
Since the problems started (some time ago) when I began using mod_perl, I suspect there is a Perl file that is wasting RAM, and that every time a URL handled by this file is requested, the serving Apache child grows.
I tried to measure the memory usage difference between the start and end of the handler() subs, but none of them appeared to be responsible.
Any hints would be helpful.

Replies are listed 'Best First'.
Re: mod_perl memory leak
by Anonymous Monk on Sep 09, 2009 at 07:27 UTC
    Create a test that mimics normal interaction with your program, and run it in an infinite loop while monitoring memory growth before/after important function calls.

    Perform a code review.
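
    A minimal sketch of such a monitor, assuming Linux (it reads the resident set size from /proc/self/statm; the sub names and the workload are illustrative, not from the original program):

    ```perl
    use strict;
    use warnings;

    # Return this process's resident set size in KB.
    # Linux-specific: the second field of /proc/self/statm is RSS in
    # pages; we assume 4KB pages. Returns 0 where /proc is unavailable.
    sub rss_kb {
        open my $fh, '<', '/proc/self/statm' or return 0;
        my (undef, $rss) = split ' ', scalar <$fh>;
        return $rss * 4;
    }

    # Wrap a suspect call and report how much memory it grew the process by.
    sub measure {
        my ($label, $code) = @_;
        my $before = rss_kb();
        $code->();
        my $after = rss_kb();
        printf "%-20s grew by %d KB (now %d KB)\n",
            $label, $after - $before, $after;
    }

    # In the infinite-loop test, steady growth across iterations suggests
    # a leak; a one-time jump that plateaus suggests a hungry algorithm.
    measure("build_big_list", sub {
        my @list = (1) x 1_000_000;    # stand-in for the real workload
    });
    ```
    
    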

Re: mod_perl memory leak
by Anonymous Monk on Sep 09, 2009 at 12:32 UTC
    Some Apache child processes grow to around 350MB, and as time goes by, more children reach this value.


    They grow to 350MB and then stop growing?
    That sounds like an inefficient algorithm, not a memory leak.
    Memory leaks grow to infinity, whereas an algorithm that needs 350MB will use it, and be able to reuse it later without grabbing more. Of course, each child that hits that algorithm will need its own chunk of 350MB.

    Making the algorithm more space-efficient is the way to go here. Don't let it slurp huge files; make it read line by line.
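
    For example, a sketch of the line-by-line style (reading from the __DATA__ handle here as a stand-in for a huge file, since the original code isn't shown):

    ```perl
    use strict;
    use warnings;

    # Slurping (my @lines = <$fh>) would hold the entire file in RAM.
    # Reading one line at a time keeps memory flat no matter how large
    # the file grows.
    my $count = 0;
    while (my $line = <DATA>) {
        $count++ if $line =~ /ERROR/;
    }
    print "$count matching lines\n";

    __DATA__
    2009-09-09 INFO  starting up
    2009-09-09 ERROR disk full
    2009-09-09 ERROR retrying
    ```
    
    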
      You're right, it's not a memory leak, but I still have no clue how to find out where the memory is being eaten up :-(
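      One pure-core way to get a hint: under mod_perl, package variables and caches persist between requests, so a cache that is never pruned looks exactly like a leak. Logging a rough byte count for each suspect structure after every request will show which one keeps growing. A sketch (the cache and the simulated requests are hypothetical; Devel::Size from CPAN gives exact numbers if you have it installed):

      ```perl
      use strict;
      use warnings;

      my %cache;    # stand-in for a long-lived per-process cache

      # Rough size estimate: sum of key and value string lengths.
      sub approx_bytes {
          my ($hash) = @_;
          my $bytes = 0;
          $bytes += length($_) + length($hash->{$_}) for keys %$hash;
          return $bytes;
      }

      # Simulate requests that each add to a cache that is never pruned:
      # the logged total grows on every request, flagging the culprit.
      for my $req (1 .. 3) {
          $cache{"req$req"} = 'x' x 1024;
          printf "after request %d: ~%d bytes cached\n",
              $req, approx_bytes(\%cache);
      }
      ```
      
      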