forkboy has asked for the wisdom of the Perl Monks concerning the following question:

Hi,
Wondering about methods of automating the deployment of (in-house) Perl modules (created using h2xs) to multiple machines.

Currently we are doing it the hard way, e.g.:
make tardist
scp + ssh to each production server
make test + make install on each production server.
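
Scripted, the steps above might look something like this (hostnames and the module name are placeholders; this dry run only prints the commands it would run):

```shell
# Dry-run sketch of the manual workflow above.  HOSTS and TARBALL are
# placeholders -- drop the echoes to actually deploy.
HOSTS="prod1 prod2"
TARBALL="My-Module-0.01.tar.gz"
DIR="${TARBALL%.tar.gz}"          # e.g. My-Module-0.01

deploy_commands() {
    for host in $HOSTS; do
        # push the tardist out, then build/test/install remotely
        echo "scp $TARBALL $host:/tmp/"
        echo "ssh $host 'cd /tmp && tar xzf $TARBALL && cd $DIR && perl Makefile.PL && make test && make install'"
    done
}
deploy_commands      # prints one scp and one ssh line per host
```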

I'm thinking surely there must be a simpler way. The options which spring to mind are:

  1. Install our modules onto an NFS drive and set the PERLLIB environment variable.
    Simple, but requires redeploying (or at least moving) all our existing code.
  2. Build a deployment tool.

What are other people's approaches to this? Are there any existing tools for simplifying "mass" deployments?

janitored by ybiC: Formatted using HTML markup instead of <pre> tags

Replies are listed 'Best First'.
Re: Module deployment across servers
by jpeg (Chaplain) on Mar 04, 2005 at 02:07 UTC
    How about rsync? This is exactly what rsync was invented for.
Re: Module deployment across servers
by Fletch (Bishop) on Mar 04, 2005 at 03:26 UTC

    You might look into whatever your OS's native packaging system provides (e.g. dpkg, rpm, pkg_add, whatever the SysV pkg-whoozits that Solaris uses). Some of those (dpkg and rpm for sure) provide methods for setting up some sort of network server to hold local modules (apt-get and yum, respectively).
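    For example (assuming the tarball carries an RPM spec file, or that the dh-make-perl helper is installed on a Debian box -- both names below are illustrative):

```shell
# RPM: build straight from a tarball that contains a .spec file
rpmbuild -ta My-Module-0.01.tar.gz

# Debian: dh-make-perl can debianize and build an unpacked module directory
dh-make-perl --build My-Module-0.01/
```

    Either way you end up with a package your network repository can serve.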

Re: Module deployment across servers
by InfiniteLoop (Hermit) on Mar 04, 2005 at 05:57 UTC
    If you are using RedHat Linux, try the up2date tool. As mentioned here, you could use yum/apt also. See this link on how to set up a yum repository.
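    A yum repository can be as simple as a directory of RPMs plus metadata; a sketch with invented paths and hostnames:

```shell
# On the web server that will host the repository:
mkdir -p /var/www/html/yum/local
cp My-Module-0.01-1.noarch.rpm /var/www/html/yum/local/
createrepo /var/www/html/yum/local        # generate the repo metadata

# On each client, drop in a repo definition (hostname is an example):
cat > /etc/yum.repos.d/local.repo <<'EOF'
[local]
name=Local Perl modules
baseurl=http://yumhost.example.com/yum/local
enabled=1
EOF
```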
Re: Module deployment across servers
by martinvi (Monk) on Mar 04, 2005 at 08:44 UTC

    3. Use an existing deployment tool.

    Have a look at cfengine. Within the Unix universe -- I've never tested cfengine with a non-Unix OS -- the tool is pretty much independent of vendor, flavor, architecture and even philosophy.

Re: Module deployment across servers
by Anonymous Monk on Mar 04, 2005 at 09:57 UTC
    Well, what a good way is depends on your infrastructure. Using NFS is great if you have a homogeneous environment. Note there's no need to set PERL5LIB. For instance, you might want to share your entire /usr/local, and have perl installed in /usr/local. Or you install perl in /net/server/pkg/perl5, and replace your binary /usr/bin/perl with a symlink to /net/server/pkg/perl5/bin/perl.

    Now, such an action may require moving or reinstalling all your code. But you do it once, and from then on each deployment needs to be done only once.

    Even in an environment with different platforms, sharing over NFS will work, although you'll need to link to different binaries. But the libraries can be shared, and deployment of new modules needs to be done once for pure-Perl modules, and once per platform for XS modules. And if it's just the local stuff you worry about, you could just share your site_lib directory.
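    The symlink arrangement described here might look like this on each client (the server name and paths are invented, and this is a one-time setup):

```shell
# One-time setup per client; /net/server/pkg/perl5 is a hypothetical
# NFS-mounted perl installation.
mv /usr/bin/perl /usr/bin/perl.local             # keep the old binary around
ln -s /net/server/pkg/perl5/bin/perl /usr/bin/perl
```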

    But if you don't go the NFS way, there's no need to do a make on every box you deploy on. Make it once for each platform, and distribute the created files. The distribution is fairly easy to automate, for instance by scp, or even using rsync and a cron job. (So you just build/install once for each platform, and the cron job takes care of syncing - a big advantage of using a cron job is that even machines that are down when you deploy will eventually get updated, without you having to remember to redo them - works great for laptops too).
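    The cron-plus-rsync variant could be a single crontab entry on each client (hostname and paths are invented):

```shell
# Pull the shared site_perl tree from the build host every night at 03:15.
15 3 * * * rsync -az builds.example.com:/usr/local/lib/perl5/site_perl/ /usr/local/lib/perl5/site_perl/
```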

    I've used both techniques in the past successfully - and not just for Perl.

Re: Module deployment across servers
by Ultra (Hermit) on Mar 04, 2005 at 10:51 UTC
    I assume you can use any of the suggestions the monks gave you.
    The most natural for you would be to automate what you _always_ manually do:

    1) one script on local machine for make tardist
    2) one script on local machine for scp + ssh to each production server
    3) one script on the remote machine, run by the second script, to
    make test + make install on each production server.
    4) and finally one collector script that runs the first and second scripts and mails you the results ;-)
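
    A dry-run sketch of the collector idea (hosts, mail address, and script names are all placeholders; it only prints what it would do):

```shell
# Placeholder dry run of steps 1-4 above; replace the echoes with the
# real commands for your site.
HOSTS="prod1 prod2"
MAILTO="you@example.com"

collector() {
    echo "make tardist"                                    # step 1
    for host in $HOSTS; do
        echo "scp My-Module-0.01.tar.gz $host:/tmp/"       # step 2
        echo "ssh $host sh /tmp/remote-install.sh"         # step 3
    done
    echo "mail -s 'deploy results' $MAILTO < deploy.log"   # step 4
}
collector
```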
    Dodge This!