in reply to Need Help accessing perl modules installed on shared location

Don't use a local, uncontrolled perl with a shared, controlled module library path. You're asking for headaches, especially when combining modules with XS in them across the 5.9 development boundary (5.8 is not binary compatible with 5.10; there are probably other such boundaries, too).

Instead, compile perl for your platform (linux/x86-64, it appears), and tell it that it will be installed to the shared filesystem. Make sure, too, that this shared filesystem is mounted at the same path on all machines (or that symlinks make it seem that way). For example, /share/perl/5.8.8/{bin,lib}. Well, you should use something newer than 5.8.8 if at all possible.

Then, use the shared perl instead of the local perl. Instead of #!/usr/bin/perl at the top of your scripts, use #!/share/perl/5.8.8/bin/perl.

Mind you, this only really works if you have one platform. If you have multiple platforms, e.g., Linux and AIX or Sun or HP, or even Linux on x86-64 and Linux on x86-32, you'll probably want to install to platform-specific directories, have a shared library path, and then rely on PATH being set up properly to find the right perl.
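A build along those lines might look like this. This is only a sketch: the prefix mirrors the example path above, and you'd adjust it (and the version) to your own share. It's a build recipe, so it takes a while and needs the perl source tree.

```shell
# Sketch: build perl so everything installs under the shared mount.
# /share/perl/5.8.8 follows the example above; use your own path/version.
./Configure -des -Dprefix=/share/perl/5.8.8
make
make test
make install    # bin/ and lib/ end up under /share/perl/5.8.8
```

After this, #!/share/perl/5.8.8/bin/perl works from any machine that mounts the share at the same path.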

Hope that helps.


Re: Need Help accessing perl modules installed on shared location
by jainprithviraj (Initiate) on Jul 19, 2011 at 14:03 UTC

    Thanks for the valuable inputs.

    If I wish to run the perl programs on multiple platforms, would that mean I need to create servers (a common server with perl and modules shared) for each of the different platforms?

    Is it safe to share the perl? Suppose I wish to run a script xyz.pl from 10 different systems accessing the shared perl and modules; would it create any problems?

      We've done this at work. Well, someone else did it, I'm just passing it on ;-)

      Consider:

      /share/linux86-64/bin
      /share/linux86/bin
      /share/linuxia64/bin
      /share/linuxppc64/bin
      /share/hprisc/bin
      /share/hpia64/bin
      /share/sunsparc/bin
      /share/sun86-64/bin
      /share/aix/bin
      # etc.
      /share/common/perl/lib
      Now, when you compile perl, if you tell it the appropriate bin directory and the common perl lib directory (for where to put executables and libraries, respectively), you'll have a single share with everything you need. I assume you don't need all of the above, but you may.
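For example, run on each platform in turn, something like this (a sketch; the paths follow the layout above, and you'd substitute the right bin directory per platform). Arch-specific files, including compiled XS, normally land under an $archname subdirectory, which is what keeps the per-platform builds from stepping on each other in the shared lib tree.

```shell
# Sketch: one build per platform, all sharing the common lib tree.
# Run on the linux86-64 box; swap the bin path on each other platform.
./Configure -des \
    -Dbin=/share/linux86-64/bin \
    -Dprivlib=/share/common/perl/lib
make
make install
```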

      The next requirement for this is that anyone on, say, HP/ia64 (Itanium) will have to add the correct bin directory to their $PATH, e.g., PATH=/share/hpia64/bin:$PATH. This is so that the new perl is found first.
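The effect of that PATH change is easy to see in a tiny simulation. Here a temporary directory stands in for the share, and the "perl" in it is a hypothetical stand-in script, not a real interpreter:

```shell
# Simulate the shared bin directory with a temp dir and a stand-in perl.
bindir=$(mktemp -d)
printf '#!/bin/sh\necho "shared perl"\n' > "$bindir/perl"
chmod +x "$bindir/perl"

# Prepend it, just as PATH=/share/hpia64/bin:$PATH would on the real share.
PATH="$bindir:$PATH"
command -v perl    # now resolves to the stand-in, not /usr/bin/perl
```

Whichever directory comes first in PATH wins, which is the whole trick.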

      Finally, your xyz.pl script would have to start like this:

      #!/bin/sh
      eval 'exec perl -S "$0" "$@"'
          if 0; # this line keeps it from being seen by perl.
      This will load the shell, which will evaluate the string, which will cause it to exec (replace itself with) perl with the name of the script and any parameters passed along. Perl will load the script, ignore the first line (the #! line doesn't have "perl" in it), start executing with the second, see the eval over two lines, see the if 0, and the compiler should optimise it away. And then it will merrily go on to the rest of your code.

      Now, all this said, when you compile perl, you should have the opportunity to hard-code some extra paths into @INC. My recommendation? Do so. Hard-code in a location for your OWN modules (not the ones you're installing from CPAN). For example, /share/common/perl/locallib. And then you can put your modules here. I say this largely because I'm of the opinion that every(*) perl script that is intended to last more than the day it's coded on should look like this:

      #!/bin/sh
      eval 'exec perl -S "$0" "$@"'
          if 0; # this line keeps it from being seen by perl.

      use lib ...; # if necessary.
      use My::App; # or whatever it's called.

      my $app = My::App->new();
      $app->parse_args(@ARGV); # passing in @ARGV is recommended but not required
      exit $app->run();
      Basically, load the module that has the real code, create the app object, tell the object to parse the arguments, and then tell the object to run, exiting with whatever return code it returns. The reason for this is simple: it makes it easier to write unit tests for your code. It also makes it easier to embed your app within another one, but that's the same as writing unit tests, since unit tests will generally embed the app within the .t file (by doing the same as the above).

      (*) Ok, there are other exceptions, too, but it's a general rule for me.

      BTW - this is an NFS share with hundreds(!!) of developers using it. Our build environment is built in perl (using make underneath), so even our C/C++ or Java devs use perl without knowing it. The share also has the Windows perl on it, and is shared via samba. While I'm sure the machine has Gigabit networking, I don't think caching has been an issue; by default NFS (and, I assume, samba) already does some caching anyway.

      Is it safe to share the perl? Suppose I wish to run a script xyz.pl from 10 different systems accessing the shared perl and modules; would it create any problems?

      No. It might create lots of traffic on your share, so look into caching :)