PerlMonks
Re: How to manage the transfer of large Perl environments from one network to another
by swl (Parson) on Mar 29, 2019 at 22:37 UTC ( [id://1231885] )
If you have the entire environment then you could try a reductionist approach. Find every lib dir that contains a .pm file and add them all to $ENV{PERL5LIB}. That should make everything available so the process runs to completion. Then add an END {} block to the master script to print or dump @INC (and maybe also %INC) to a file. That will give you the set of dirs you actually need.

Some caveats:

- If the process runs then you might not care that you have too many entries in @INC, in which case there is no need to reduce the set.
- This does not account for the order of libs, so it might load incorrect versions and lead to subtle bugs. A check for .pm file uniqueness across the dirs would be of benefit here.
- It doesn't fix the underlying problem of using so many dirs in PERL5LIB in the first place, but that sounds like a second job.
- If the full run takes days then it might not be worth the wait...
- The END block will not list all the used dirs if some scripts are called using system, backticks and so forth.

An alternative, given the last caveat, might be to instrument the lib package to record each addition to @INC in a log file somewhere. Others can hopefully advise as to whether this is a reasonable idea and how it could be done.
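The first step above could be sketched along these lines with File::Find. This is a rough sketch, not a drop-in tool: $root defaulting to the current dir is an assumption, and in practice you'd point it at the top of the transferred environment.

```perl
#!/usr/bin/perl
# Sketch: collect every directory under $root that contains at least
# one .pm file, then join them into a PERL5LIB-style string.
use strict;
use warnings;
use File::Find;
use Config;

my $root = shift // '.';    # assumed root of the transferred tree

my %lib_dirs;
find(
    sub {
        # $File::Find::dir is the directory currently being walked;
        # record it once if it holds any .pm file.
        $lib_dirs{$File::Find::dir} = 1 if /\.pm\z/;
    },
    $root
);

# $Config{path_sep} is ':' on Unix, ';' on Windows, so this stays portable.
my $perl5lib = join $Config{path_sep}, sort keys %lib_dirs;
print "$perl5lib\n";
```

The result can then be exported as PERL5LIB (or assigned to $ENV{PERL5LIB} in a wrapper script) before launching the master process.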
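The END {} idea might look like the following, added to the master script. The output filename inc_dump.txt is an assumption; %INC is useful here because it maps each loaded module file to the path it was actually loaded from, which pinpoints the dirs that were really used.

```perl
use strict;
use warnings;

# Write @INC and %INC to a file; called from END so it runs at exit.
sub dump_inc {
    my ($path) = @_;
    open my $fh, '>', $path or die "Cannot write $path: $!";
    print {$fh} "\@INC at exit:\n";
    print {$fh} "  $_\n" for @INC;
    print {$fh} "\n%INC (module file => loaded-from path):\n";
    print {$fh} "  $_ => $INC{$_}\n" for sort keys %INC;
    close $fh or die "Cannot close $path: $!";
}

END { dump_inc('inc_dump.txt') }
```

The distinct dirs actually exercised can then be recovered by running the %INC values through File::Basename::dirname and deduplicating.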
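As for instrumenting the lib package, one way it could be done is to wrap lib::import so every `use lib ...` is logged before being applied. Everything here is a sketch and open to better ideas: the module name LibLogger and the log path lib_additions.log are assumptions. If it were loaded via PERL5OPT (e.g. PERL5OPT=-MLibLogger, with the module findable via PERL5LIB), child processes started with system or backticks would inherit it too, which addresses the last caveat.

```perl
# LibLogger.pm - sketch: log every directory added to @INC via `use lib`.
package LibLogger;
use strict;
use warnings;
use lib ();    # load lib.pm itself without adding anything to @INC

my $orig_import = \&lib::import;
{
    no warnings 'redefine';
    # Wrap lib::import: append the caller's dirs to a log, then delegate.
    *lib::import = sub {
        my ($class, @dirs) = @_;
        if (open my $fh, '>>', 'lib_additions.log') {
            print {$fh} "$0 ($$): @dirs\n";    # script name, pid, dirs
            close $fh;
        }
        $orig_import->($class, @dirs);
    };
}

1;
```

This only catches additions made through the lib pragma; direct pushes onto @INC would still be missed, so the END {} dump remains the cross-check.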
In Section: Seekers of Perl Wisdom