fiddler42 has asked for the wisdom of the Perl Monks concerning the following question:
I am trying to install a customer's Perl environment on my employer's network. When said customer runs our non-Perl applications, a lot of Perl code gets exercised here and there throughout their (chip-building) flow. During the archive process at the customer site, a significant amount of Perl content (scripts, modules, etc.) gets collected. By the time everything is unpacked on my employer's network, there is a lot to sort out; for example, the PERLLIB environment variable captured at the customer site has over 430 directory entries.
Not surprisingly, my question is more of a system-administration one: what is the best way to verify that the Perl environment captured at the customer site unpacks and runs cleanly on my employer's network? I realize this is a loaded question, but, in essence, getting the 430+ entries in the PERLLIB environment variable ordered just right is proving difficult.
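To make the ordering problem concrete: the order of those 430+ entries only really matters for modules that exist in more than one of them. Something along these lines (an untested sketch of what I have in mind, assuming a Unix-style colon-separated PERLLIB whose entries are plain directories) would list the modules that are shadowed:

```perl
#!/usr/bin/env perl
use strict;
use warnings;
use File::Find;

# Split the captured PERLLIB and keep only entries that actually exist.
my @libdirs = grep { -d } split /:/, ( $ENV{PERLLIB} // '' );

my %providers;    # Module::Name => [ dirs that contain it ]
for my $dir (@libdirs) {
    find( sub {
        return unless /\.pm\z/;
        ( my $rel = $File::Find::name ) =~ s{^\Q$dir\E/?}{};
        ( my $mod = $rel ) =~ s{/}{::}g;
        $mod =~ s/\.pm\z//;
        push @{ $providers{$mod} }, $dir;
    }, $dir );
}

# Only modules provided by more than one directory are order-sensitive.
for my $mod ( sort keys %providers ) {
    next unless @{ $providers{$mod} } > 1;
    print "$mod:\n";
    print "    $_\n" for @{ $providers{$mod} };
}
```

The idea is that anything not in that report can sit anywhere in PERLLIB, so the list of directories I actually have to order carefully should be much shorter than 430.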
After unpacking my customer's environment, I print out the contents of @INC whenever a Perl script fails because it cannot find a specific module (this is BY FAR the most common failure). I then grep/find where the module lives in the extracted directory structure and bump its containing directory to the front of @INC. It is also not uncommon to see 'use lib' lines in a .pm that point at something in the customer's local directory tree, so I am sure you can imagine how much fun it is to untangle those.
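The grep/find step is roughly this (again just a sketch, with a hypothetical script name; run it with the captured PERLLIB in effect so @INC matches the failing environment). It reports every @INC entry that contains the module, with the first hit being the one perl would actually load:

```perl
#!/usr/bin/env perl
use strict;
use warnings;

# Report every @INC entry that provides a given module.
my $mod = shift or die "usage: $0 Module::Name\n";
( my $rel = "$mod.pm" ) =~ s{::}{/}g;    # Foo::Bar -> Foo/Bar.pm

my $found;
for my $dir (@INC) {
    next if ref $dir;                    # skip @INC hooks (coderefs etc.)
    my $path = "$dir/$rel";
    next unless -f $path;
    print $found ? "shadowed  $path\n" : "loaded    $path\n";
    $found = 1;
}
warn "$mod not found anywhere in \@INC\n" unless $found;
```

Invoked as, say, `PERLLIB=<captured value> perl which_inc.pl Some::Module`, it tells me in one shot whether the module is missing entirely or merely being picked up from the wrong directory.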
I apologize if this is not the best platform to post such a question, but I wasn't sure where else to go for something this complex. (Actually, I posted this Q on Stack Overflow and the first response was to post here. :-)