in reply to mod_perl memory issues...

There is quite a bit of writing on how to reduce your memory footprint with mod_perl. It mostly involves pre-loading modules at startup so they will be shared. Perl does not provide a way to effectively remove compiled code from memory, so the answer to your question is no. If you have an extreme case (for example, you very occasionally use LWP, which takes up a lot of memory), you can deal with it by waiting to load the module until you actually need it (with a require call) and telling the current apache process to exit when the request is over ($r->child_terminate). Apache::SizeLimit will do this for you if you load a module that pushes the current size over the limit.
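The deferred-load-then-exit approach described above might look something like this inside a mod_perl 1 handler (a sketch only — the handler name, the use of LWP, and the response handling are illustrative, not from the original post):

```perl
# Hypothetical mod_perl 1 handler sketch: load a heavy module lazily,
# then retire this child process once the request is done.
sub handler {
    my $r = shift;

    # require at runtime instead of use at compile time, so LWP stays
    # out of memory except for the rare request that needs it.
    require LWP::UserAgent;
    my $ua  = LWP::UserAgent->new;
    my $res = $ua->get('http://example.com/');

    # ... generate the response from $res ...

    # This process has now grown by the size of LWP, so tell Apache
    # to terminate it after the current request completes.
    $r->child_terminate;

    return 0;    # Apache::Constants::OK
}
```

This can't run outside a configured Apache/mod_perl server, so treat it as a shape to follow rather than drop-in code.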

Replies are listed 'Best First'.
Re: Re: mod_perl memory issues...
by etcshadow (Priest) on Oct 16, 2003 at 02:15 UTC
    Just to expound on what perrin said... Take all of your frequently used perl modules and "preload" them. That is, for each one, add a line like this to your apache conf file:
    PerlModule <module-name>
    The basic idea is that the root apache process's perl interpreter reads in and compiles those perl modules before it forks off children for handling HTTP requests. Because of the way fork() works (copy-on-write), the compiled representations of all of these perl modules end up residing in memory shared among the children, rather than in each process's individual memory.
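Concretely, the preloading might look like this in httpd.conf (the module names below are just examples, not taken from the thread):

```apache
# httpd.conf -- compile common modules once, in the parent process,
# before any children are forked
PerlModule DBI
PerlModule CGI

# or pull everything in from a startup script instead:
PerlRequire /etc/apache/startup.pl
```

A startup.pl is usually preferred once the list grows, since it's ordinary Perl and can set up database handles, library paths, and so on in one place.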

    Say you've got 15M worth of modules that your processes typically use... you end up spending that 15M only once, in total, rather than once per apache process.

    The trade-off is that apache takes slightly longer to start up initially, because it has to compile all of your perl code before the server starts answering requests. Also, if you are developing against a server that preloads modules, bear in mind that for your changes to take effect you need to fully stop and then start apache (rather than simply being able to HUP it).

    Anyway, another thing you can do here is use the MaxRequestsPerChild apache config directive to tell your mod_perl processes to die off and respawn after serving a set number of requests. The point is that if your mod_perl processes steadily accumulate compiled perl modules over time, then every so many requests you simply reset the process's memory. Of course, this works by wiping the process and starting over from scratch, rather than by selectively unloading modules. But still, it's better than nothing.
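In config terms this is a one-liner (the value shown is just the starting point suggested later in this post, to be tuned for your own workload):

```apache
# httpd.conf -- recycle each child after it has served this many
# requests, releasing whatever memory it accumulated along the way
MaxRequestsPerChild 100
```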

    The trade-off here is that you'll waste a certain amount of processor time (not a huge amount) on exiting, re-forking, and re-compiling. However, if done in conjunction with preloading modules, forking a new child is fairly cheap, because it doesn't have to re-compile your commonly used perl modules.

    Overall, I'd recommend a combination of both: preload commonly used modules, and have your processes exit after a set number of requests (find an appropriate number for yourself... I'd suggest 100 to start with) so that infrequently used modules don't stay around forever.


    ------------
    :Wq
    Not an editor command: Wq
      Apache::SizeLimit is actually a better approach than MaxRequestsPerChild, since it only makes a process exit if it has actually been growing. There's no reason for a process to exit after 100 requests if it hasn't grown significantly, while a process that served only 5 requests but grew a lot (maybe it slurped a big file or downloaded something big with LWP...) should exit right away.
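A typical Apache::SizeLimit setup for mod_perl 1 looks roughly like the following (the thresholds are made-up examples, and you should check the module's docs for the exact variable names and units on your platform):

```perl
# startup.pl -- configure Apache::SizeLimit (sizes are in KB)
use Apache::SizeLimit;
$Apache::SizeLimit::MAX_PROCESS_SIZE       = 30000;  # kill children over ~30MB
$Apache::SizeLimit::CHECK_EVERY_N_REQUESTS = 5;      # don't check every request
```

Then the handler is hooked in from httpd.conf, e.g. with `PerlFixupHandler Apache::SizeLimit`, so the size check runs as part of each request and a child that has grown past the limit exits cleanly after serving its response.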
Re: Re: mod_perl memory issues...
by Chester K (Initiate) on Oct 17, 2003 at 23:37 UTC
    Perl does not provide a way to effectively remove compiled code from memory

    Symbol::delete_package() can remove a package from memory, including the compiled subroutines in that package. However, it doesn't do any of the other cleanup you'd expect (such as removing the corresponding entries from %INC, or unloading XS code), and it wouldn't be straightforward to use it with Apache::Registry in any case.
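A small self-contained demonstration of what delete_package does (the package name is a throwaway invented for the example):

```perl
use strict;
use warnings;
use Symbol qw(delete_package);

# Define a throwaway package with a compiled sub.
{
    package My::Scratch;
    sub greet { return "hello" }
}

print My::Scratch::greet(), "\n";    # prints "hello"

# Wipe the package's symbol table, discarding the compiled sub.
delete_package('My::Scratch');

print defined &My::Scratch::greet ? "still here\n" : "gone\n";    # prints "gone"
```

Note that if My::Scratch had been loaded via require, `$INC{'My/Scratch.pm'}` would still be set afterward, so a later require would silently do nothing — that's the %INC cleanup mentioned above that you'd have to do by hand.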