tobyclemson has asked for the wisdom of the Perl Monks concerning the following question:

Hi,

I'm currently writing a script to update the live servers for multiple domains from the dev server. I would like to be able to use SSH to connect to each of the servers and run some CVS commands on each one.

The commands will need to iterate over each domain that needs action, so it would make sense to use an object-oriented approach with a persistent connection to each of the servers. All of the domains are hosted on the same two (identical) servers, and there is only one dev server, so only three connections would be needed.

My question really is what is the easiest way to have multiple SSH connections from one script? Also how do I deal with user authentication etc. - is it possible to use shared keys or something?

On the CVS side of things, for each domain, the working directory on the dev server will need to be checked in to the CVS repository, then checked out onto the 2 live servers. Is it possible to have CVS 'objects' that represent the same command and then just CVSobject->execute() them for each domain?

Thanks in advance,
Toby

Replies are listed 'Best First'.
Re: SSH to multiple servers
by zentara (Cardinal) on Jun 26, 2006 at 15:52 UTC
    #!/usr/bin/perl
    use strict;
    use warnings;
    use Net::SSH::Perl;

    # If the error is:
    #   Can't locate object method "blocking" via package "IO::Handle"
    #   at /usr/lib/perl5/site_perl/5.8.6/Net/SSH/Perl.pm line 212, <GEN0> line 1.
    # then use "protocol => 2".

    my %hostdata = (
        'localhost' => {
            user      => "z",
            password  => "qumquat",
            cmdtorun  => "ls -la",
            misc_data => [],
        },
        'zentara.zentara.net' => {
            user      => "z",
            password  => "aardvark",
            cmdtorun  => "/usr/bin/uptime",
            misc_data => [],
        },
    );

    foreach my $host (keys %hostdata) {
        my $ssh = Net::SSH::Perl->new(
            $host,
            port     => 22,
            debug    => 1,
            protocol => '2,1',    # try SSH-2 first, fall back to SSH-1
        );
        $ssh->login( $hostdata{$host}{user}, $hostdata{$host}{password} );
        my ($out) = $ssh->cmd( $hostdata{$host}{cmdtorun} );
        print "$out\n";
    }

    I'm not really a human, but I play one on earth. flash japh
Re: SSH to multiple servers
by shmem (Chancellor) on Jun 26, 2006 at 15:04 UTC
    Look at Net::SSH. The module works with ssh keys, not with passwords.

    It provides

    • ssh_cmd for single connections providing a command (and feeding something to STDIN)
    • sshopen2 which you call with 2 FileHandles - $reader, $writer
    • sshopen3 which you call with 3 FileHandles - $reader, $writer, $error

    The latter two would be your candidates: open multiple connections and use select() to read from / write to multiple filehandles. See IO::Select or POE for that. If that doesn't suit, here's a discussion about Waiting for multiple filehandles. In that thread there's a note from BrowserUk on how to do that with threads. ;)
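    A minimal sketch of the sshopen2-plus-IO::Select approach. The hostnames and the remote command are placeholders, and key-based authentication is assumed to be set up already (Net::SSH shells out to ssh, so it will prompt otherwise):

    ```perl
    #!/usr/bin/perl
    use strict;
    use warnings;
    use Net::SSH qw(sshopen2);
    use IO::Select;

    # Placeholder hosts; replace with your live servers.
    my @hosts = ('user@live1.example.com', 'user@live2.example.com');

    my $sel = IO::Select->new;
    my %host_for;    # map each reader filehandle back to its host

    for my $host (@hosts) {
        my ($reader, $writer);
        sshopen2($host, $reader, $writer, 'uptime')
            or die "ssh to $host failed: $!";
        $sel->add($reader);
        $host_for{$reader} = $host;
    }

    # Multiplex: read from whichever connection has output ready.
    while ($sel->count) {
        for my $fh ($sel->can_read) {
            if (defined(my $line = <$fh>)) {
                print "$host_for{$fh}: $line";
            }
            else {    # EOF - remote command finished
                $sel->remove($fh);
                close $fh;
            }
        }
    }
    ```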

    --shmem

    _($_=" "x(1<<5)."?\n".q·/)Oo.  G°\        /
                                  /\_¯/(q    /
    ----------------------------  \__(m.====·.(_("always off the crowd"))."·
    ");sub _{s,/,($e="'Itrs `mnsgdq Gdbj O`qkdq")=~y/"-y/#-z/;$e,e && print}
    
Re: SSH to multiple servers
by Herkum (Parson) on Jun 26, 2006 at 14:52 UTC

    I have used Net::SSH::Perl with success. The basic functionality worked fine; however, when I switched over to using id_dsa key files (so that I did not have to rely on storing a password in a file), there were some undocumented quirks in the module. The module has a mailing list on SourceForge where I got my answers, but the undocumented features :) did make it more difficult than it had to be.

    So the answer is, Net::SSH::Perl can work for you, though it will take a little work to get where it needs to be...

Re: SSH to multiple servers
by dsheroh (Monsignor) on Jun 26, 2006 at 15:15 UTC
    Although I'm sure that Net::SSH is a grand module (I admittedly haven't used it), it appears from its documentation that it may not provide as robust a model for interaction with the remote system as Expect or Expect::Simple. Both Expect modules encapsulate connections as objects, so dealing with multiple servers concurrently is fairly trivial.
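    A rough sketch of the object-per-connection style with Expect. The hostnames and the prompt regex are placeholders, and passwordless key auth is assumed so no password prompt needs handling:

    ```perl
    #!/usr/bin/perl
    use strict;
    use warnings;
    use Expect;

    # Placeholder hosts; one Expect object per persistent connection.
    my @hosts = ('live1.example.com', 'live2.example.com');
    my %session;

    for my $host (@hosts) {
        my $exp = Expect->spawn('ssh', $host)
            or die "Cannot spawn ssh to $host: $!";
        # Wait for a shell prompt; adjust the regex to your prompt.
        $exp->expect(10, '-re', '\$\s*$')
            or die "No prompt from $host";
        $session{$host} = $exp;
    }

    # Reuse each open session for as many commands as needed.
    for my $host (@hosts) {
        my $exp = $session{$host};
        $exp->send("uptime\n");
        $exp->expect(10, '-re', '\$\s*$');
        print "$host: ", $exp->before, "\n";    # output before the prompt
    }

    $_->send("exit\n") for values %session;
    $_->soft_close     for values %session;
    ```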

    As for user auth, setting up RSA/DSA key pairs to allow passwordless login is the way to go. Especially since the alternative is to store your passwords in a file. Just set a good passphrase on the private key(s) so they can't be trivially lifted from the filesystem.

      But how do you automate once you've set a passphrase? Aren't you back to putting a password/passphrase in a file?
        You can use ssh-agent and ssh-add to allow you to manually enter the passphrase once, then have the keys available to all shells/programs you run under the agent, including your ssh-to-multiple-servers app. (Most Linux distros run ssh-agent by default when you log in under X, at least; if it's not active, ssh-agent bash will open a new shell with an active agent.)

        So the worst-case process would be:

        1. Log in
        2. Run ssh-agent bash
        3. Run ssh-add, which prompts for your passphrase
        4. Enter passphrase
        5. Run the multiple-ssh program
        Step 2 may not be necessary if you're running under an ssh-agent by default. If you completely trust the system you're running on, steps 3 and 4 can be skipped by using an empty passphrase on the private key, but I wouldn't recommend doing so unless you need it to be able to run unattended (e.g., from cron), since that does go back to putting the complete login credentials into a file.
Re: SSH to multiple servers
by madbombX (Hermit) on Jun 26, 2006 at 17:48 UTC
    Hello,

    I personally still use system("ssh -i $key $host ...") in my perl scripts, although the other ways specified (Net::SSH::Perl) might work better for you. I have found that I still like shelling out to ssh because I am a fan of passwordless SSH keys, given the way I have the security of my network laid out (and I am sure that a lot of people would argue against my use of this, but this isn't the time or the place for that).

    However, when I need to have multiple connections or do several time-intensive things simultaneously, I use Parallel::ForkManager. It forks off child processes and lets you set the maximum number of processes allowed to run concurrently.

    use Parallel::ForkManager;

    # Begin ForkManager
    my $_max_procs = 5;
    my $_pm = Parallel::ForkManager->new($_max_procs);

    # Log at process fork
    $_pm->run_on_start(
        sub {
            my ($pid, $host) = @_;
            print "Forking process PID: $pid\n";
        }
    );

    # Log at process completion
    $_pm->run_on_finish(
        sub {
            my ($pid, $exit_code, $host) = @_;
            print "Finishing up process PID: $pid\n";
        }
    );

    $_pm->run_on_wait(
        sub { print "Waiting for children to finish.\n" },
        5.0
    );

    foreach my $ssh_host (keys %{$ssh_host_list}) {
        # Fork off the children and get going on the queries
        my $pid = $_pm->start($ssh_host) and next;

        # do script stuff here

        # Close out the forked process
        $_pm->finish;
    }

    # Ensure all children have finished
    $_pm->wait_all_children;
    exit(0);