We're running into the problem where we've got 20 Apache child processes running which keep getting larger and larger as Perl outputs our dynamic pages...
Then you've got a memory leak. This usually happens in Perl when there is a circular reference to a variable, so it never gets properly refcounted and freed. It can also happen when a global variable keeps accumulating data (imagine an array that has data pushed onto it on every request).
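Both patterns look something like this (a minimal sketch; the package and variable names are hypothetical):

    package My::Handler;       # hypothetical mod_perl handler
    use strict;

    my @request_log;           # file-scoped: persists across requests in a
                               # mod_perl child, so it grows on every hit

    sub handler {
        push @request_log, scalar localtime;   # leak: never cleared

        my $node = {};
        $node->{self} = $node; # circular reference: the refcount never
                               # drops to zero, so this is never freed

        return 0;              # i.e. Apache::Constants::OK
    }

    1;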
As a quick fix, lower MaxRequestsPerChild to a small number, maybe 100... or even 10. Otherwise, you're going to have to find the memory leak.
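In httpd.conf that's a one-liner (the value is just a starting point to experiment with):

    # Recycle each child after it has served 100 requests, so any
    # leaked memory is returned to the OS when the child exits.
    MaxRequestsPerChild 100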
In general, if all of a program's variables are properly scoped, then the memory footprint of the program will not continually grow. It will quickly grow to the amount of memory it needs to perform its task, and then level off.
All of the code is stored in RAM, which is allocated to each Apache process.
If you load code and data into mod_perl before Apache forks its children, then the memory will be shared. How big are the files that make up the application? I can't imagine this part being too much of an issue.
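A minimal startup.pl sketch (the application module name is hypothetical, and Apache::DBI only matters if the app uses DBI):

    # startup.pl -- pulled in with "PerlRequire /path/to/startup.pl"
    # in httpd.conf, so the code is compiled once in the parent and
    # those pages are shared copy-on-write by every child.
    use strict;
    use Apache::DBI ();    # optional: preload if the app uses DBI
    use My::App ();        # hypothetical: your application's modules

    1;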
Would the size of the file be stored in memory along with the compiled code?
How big are the .pdf files? If they are particularly large, you may want to figure out a way for the web app to hand off the .pdf generation to a different, short-lived process that runs in its own memory space and returns the memory to the OS after it finishes. After it runs, do an HTTP redirect to have Apache serve the file directly.
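One way to wire that up under mod_perl 1 might look like this (a sketch only; generate_pdf.pl, the paths, and the URL scheme are all hypothetical):

    package My::PDF::Handler;   # hypothetical handler
    use strict;
    use Apache::Constants qw(REDIRECT SERVER_ERROR);

    sub handler {
        my $r  = shift;
        my $id = $r->args;      # raw query string; real code would parse it

        # Do the heavy lifting in a separate short-lived process;
        # its memory goes back to the OS when it exits.
        system('/usr/local/bin/generate_pdf.pl', $id) == 0
            or return SERVER_ERROR;

        # Redirect so Apache serves the finished .pdf directly.
        $r->header_out(Location => "/pdf/$id.pdf");
        return REDIRECT;
    }

    1;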
But you definitely have a memory leak if the child httpds are continually growing and growing in memory size.
In reply to Re: Perl and Apache 1.3 by trwww, in thread Perl and Apache 1.3 by Heffstar