
finding all modules used from a script

by d_i_r_t_y (Monk)
on Oct 31, 2000 at 05:12 UTC ( #39224=perlquestion )

d_i_r_t_y has asked for the wisdom of the Perl Monks concerning the following question:

Is there a way of obtaining/enumerating the names of all modules included by a script?

The idea is to call a polymorphic Flush_All_Caches-like method on a hierarchy of related objects whose implementing modules are dynamically included at runtime via eval "require ThisModule";. These modules are used by multiple scripts and run under mod_perl, so the basic idea was to include modules only as they are needed, in order to save valuable memory (and take a small speed hit...). i.e.:

foreach my $class ( enumerate_loaded_classes() ) {
    $class->can('clean_up') || next;
    $class->clean_up();
}

Ideally, it would be great to be able to dynamically unload large modules after they have served their purpose. The closest solution to this I know of is to set Apache up with a moderately low MaxRequestsPerChild, to expire the oldest and fattest Apache children.
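For context, here is a minimal sketch of the load-on-demand pattern the question describes; load_class is a hypothetical helper name, not something from the post:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical helper: load a class on demand via string eval,
# as in the eval "require ThisModule" idiom from the question.
sub load_class {
    my ($class) = @_;
    (my $file = $class) =~ s!::!/!g;
    $file .= '.pm';
    return 1 if $INC{$file};            # already loaded
    eval "require $class; 1" or do {
        warn "could not load $class: $@";
        return 0;
    };
    return 1;
}

load_class('File::Basename') or die "load failed";
print File::Basename::basename('/tmp/foo.txt'), "\n";   # prints "foo.txt"
```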


Replies are listed 'Best First'.
Re: finding all modules used from a script
by btrott (Parson) on Oct 31, 2000 at 05:49 UTC
    Look at the special variable %INC. Take a look at Apache::StatINC and Apache::Reload to see some examples of messing about w/ %INC, in a mod_perl context.

    The keys in the hash are in the form Foo/Bar.pm, but you can turn those into class names with a simple regex. Then you can do your cleanup. For example:

    for my $file (keys %INC) {
        (my $class = $file) =~ s!/!::!g;
        $class =~ s/\.pm$//;
        if ($class->can('clean_up')) {
            $class->clean_up;
        }
    }
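    For reference, the trick Apache::StatINC turns on is essentially that deleting a module's entry from %INC makes the next require re-read it from disk. A minimal sketch of that idea (not the module's actual code):

```perl
use strict;
use warnings;

# Sketch of the Apache::StatINC idea: deleting a module's %INC entry
# makes the next require() re-read and re-compile the file from disk.
sub force_reload {
    my ($class) = @_;
    (my $file = $class) =~ s!::!/!g;
    $file .= '.pm';
    delete $INC{$file};     # forget that it was ever loaded
    require $file;          # runtime require with a filename string
    return $INC{$file};     # path it was reloaded from, or undef
}

use File::Spec;             # load it once, the normal way
my $path = force_reload('File::Spec');
print "reloaded File::Spec from $path\n";
```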
Re: finding all modules used from a script
by chromatic (Archbishop) on Oct 31, 2000 at 06:13 UTC
    Looking in %INC and using can will help you here, but I'm not sure this approach will work as well as the MaxRequestsPerChild limit.

    I'm not aware of any Perl implementation that releases freed memory back to the OS. If you're really aggressive, you may be able to keep reusing the same pool of memory for big operations, but if you have one uncommon operation that requires a couple of megs (say, IO::Socket), you won't gain much. The conventional wisdom I've seen recommends reaping mod_perl processes somewhere between every 30 and 50 requests.
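    In configuration terms, the reaping approach above is just the stock Apache directive, with a value in the suggested 30-50 range:

```apache
# httpd.conf: recycle each mod_perl child after it has served
# 40 requests, so the OS reclaims whatever the child has grown to.
MaxRequestsPerChild 40
```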

    Let us know your results -- I'd be interested to see a comparison.

      Actually, Mac OS's Memory Manager is pretty adept at detecting low memory (or memory to spare) and can return it to the OS at runtime. Of course, this has an effect on how one uses pointers, but it's not overly complicated. This is built into Mac OS and is taken advantage of by MacPerl. Just a note!

      Update: So it's true what everyone else says: you need to fork to actually be able to release memory back to the OS, which is what Apache does in handling clients (and which is configurable). Basically, the UN*X memory model is to eat as much memory as allowed until crash or exit() (maximum memory usage is configurable via hard and soft limits and compile-time options).

      AgentM Systems nor Nasca Enterprises nor Bone::Easy nor Macperl is responsible for the comments made by AgentM. Remember, you can build any logical system with NOR.
Re: finding all modules used from a script
by btrott (Parson) on Oct 31, 2000 at 06:30 UTC
    As a partial followup to chromatic's note: the mod_perl guide suggests using Apache::GTopLimit instead of (or in conjunction with) MaxRequestsPerChild. The reason is that it provides even more fine-grained control over when your webserver child exits: it lets you set memory/process limits rather than playing guessing games based on how many requests a child has served.
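    The suggestion above boils down to a few lines in startup.pl and httpd.conf. The variable and handler names below are recalled from the Apache::GTopLimit docs and should be treated as assumptions; check the installed module's documentation before using them:

```perl
# startup.pl -- sketch only; names assumed from Apache::GTopLimit docs
use Apache::GTopLimit;

# Kill the child after the current request if its total process
# size exceeds ~10MB (the value is in KB).
$Apache::GTopLimit::MAX_PROCESS_SIZE = 10000;

# httpd.conf would then register the handler, e.g.:
#   PerlFixupHandler Apache::GTopLimit
```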

    Check out the Improving Performance by Prevention section of the guide.

RE: finding all modules used from a script
by extremely (Priest) on Oct 31, 2000 at 07:30 UTC
    Unless you are only rarely using these particular modules, you aren't going to see any gain from this. Maybe you can fork a child, have it load the large module and do the work, and then pipe the result back to the parent perl process.
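    The fork-and-pipe idea can be sketched like this (POSIX stands in here for a "large" module; the parent never loads it):

```perl
use strict;
use warnings;

# Sketch of the suggestion above: do the module-heavy work in a
# short-lived child and pipe the result back, so the parent process
# never grows. open '-|' forks and wires the child's STDOUT to $fh.
my $pid = open(my $fh, '-|');
die "fork failed: $!" unless defined $pid;

if ($pid == 0) {            # child
    require POSIX;          # the heavyweight module, loaded only here
    print POSIX::floor(2.7), "\n";
    exit 0;                 # child exits; the OS reclaims its memory
}

chomp(my $result = <$fh>);  # parent reads the one-line answer
close $fh;                  # close also reaps the child
print "child computed: $result\n";
```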

    Or, (insert random deity here) forbid, just use a plain CGI for the module-intensive, only-occasionally-run code. What you want to do with memory just can't be done. I've seen this same struggle over the past three years with mod_perl (admittedly you've gone a lot further than most... =), but in the end neither perl nor the most common OSes make it easy to return memory.

    Ideally, use mod_perl for code that needs the extra speed of already being loaded in the server, code that needs access to the request at early stages, and code that needs special persistence. If the code isn't being run more or less continually, you are likely wasting your time mod_perl'ing it.

    $you = new YOU;
    honk() if $you->love(perl)
