I'm using many Perl modules in my script, and my concern is that while it compiles, the process's memory keeps growing drastically.
I want to undef (unload) a module once I'm done with it.
I'm trying to understand the problem before I suggest a solution. Are you saying while the script's compiling, and before it starts to run, it uses a lot of memory? Or are you saying that when it runs (and compiles first) it uses a lot of memory?
Is this a long-running script, or a daemon? Is it processing large files that are getting read into memory? Is it processing large XML files? Is this a real performance problem? Are you running this script on a machine that's underpowered? Can you get more memory for the machine?
If you can explain what's going on in a little more detail, perhaps we can make better suggestions.
It's possible (see Symbol::delete_package), but not very useful.
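For the record, a minimal sketch of how that would look (My::Heavy::Module is just a placeholder name):
use Symbol qw(delete_package);

use My::Heavy::Module;
# ... work with the module ...

# Wipe the package's symbol table. Note that this frees memory back
# to perl's allocator, not to the operating system, so the process
# size will not necessarily shrink.
delete_package('My::Heavy::Module');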
I don't know if it helps, but have you tried simply doing the following?
use MyLib1;
use MyLib2;
# ... my code here ...
no MyLib1;
# ... my code here ...
no MyLib2;
# ... more code here ...
Regards,
turo
perl -Te 'print map { chr((ord)-((10,20,2,7)[$i++])) } split //,"turo"'
I'm unaware of any modules where that would help. no calls the given module's unimport() method, which (in theory) could free a lot of memory, but in practice the savings are likely to be minimal (usually unimport just toggles some lexical hints at compile time), and whatever is freed goes back to Perl's allocator, not to the OS.
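To illustrate (a sketch only; MyLib1 here is hypothetical), a typical pragma-style module does little more than this, so there is not much memory to give back:
package MyLib1;

# "use MyLib1" calls import() at compile time; "no MyLib1" calls unimport().
sub import   { $^H{'MyLib1/enabled'} = 1 }     # set a lexical hint
sub unimport { delete $^H{'MyLib1/enabled'} }  # clear it again

1;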
I'm working on a project right now that uses a lot of modules. What I did was to tuck the modules that aren't used constantly into subs, so they're loaded and called only when you need them, and you won't run into out-of-memory problems up front.
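Something like this, roughly (Heavy::Module is just a stand-in name):
sub do_heavy_stuff {
    # Loaded the first time this sub runs, not at compile time.
    require Heavy::Module;
    Heavy::Module->import();           # only if you need its exports
    return Heavy::Module::crunch(@_);  # crunch() is a made-up function
}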
As for undef, each module will probably have its own procedure for cleanup or exit, so check that module's documentation to see how it's done. A lot of modules don't have a method for this, but if you're loading them inside subs, you may not need to undef them at all.