scotchie has asked for the wisdom of the Perl Monks concerning the following question:

Can anyone please help with the following question?

Net::OpenSSH creates a named socket for each connection, and I've noticed that it conveniently removes the named socket when the parent process dies.

I'm trying to use Net::OpenSSH and fork() in combination to perform the same operations on multiple target servers, in parallel. For example,

# Open connections
for my $server (@targets) {
    $ssh{$server} = Net::OpenSSH->new($server, ...);
}

# Do stuff in parallel
for my $server (@targets) {
    my $pid = fork();
    if ($pid) {
        $pid{$server} = $pid;
    }
    else {
        # In child process. Do stuff
        $ssh{$server}->system( stuff );
        ...
        exit 0;
    }
}

waitpid($pid{$_}, 0) for @targets;

# Do more stuff ...
for my $server (@targets) {
    $ssh{$server}->system( more stuff );
}

The problem I'm encountering is that whenever any of the children exits, Net::OpenSSH seems to remove all of the named sockets. Essentially, it disconnects all connections that were opened in the parent process. I can work around the problem by reconnecting after the waitpid calls, but reconnecting takes additional runtime.

Is there any way I can tell Net::OpenSSH not to clean up the named sockets when the child processes exit, but only when the parent exits?

Re: Net::OpenSSH and fork()
by sflitman (Hermit) on Jul 23, 2010 at 04:14 UTC
    Based on my reading of the docs, Net::OpenSSH already forks off separate processes, so I'm not sure you gain anything by running the child processes in parallel yourself; why not fire off each connection in a loop? Net::OpenSSH appears to support this directly (taken from its POD; I added an example for $cmd and tested it):
    #!/usr/bin/perl
    use strict;
    use warnings;
    use Net::OpenSSH;

    my @hosts = ( 'user@server1.com', 'user@server2.com', 'user@server3.com' );
    my $cmd   = 'uptime';

    my %conn = map { $_ => Net::OpenSSH->new($_) } @hosts;
    my @pid;
    for my $host (@hosts) {
        open my ($fh), '>', "/tmp/out-$host.txt"
            or die "unable to create file: $!";
        push @pid, $conn{$host}->spawn({stdout_fh => $fh}, $cmd);
    }
    waitpid($_, 0) for @pid;
    exit;
    The spawn method runs each remote host session directly and asynchronously. I'm always leery of fork; it is probably copying all the named sockets into each child's process space, which is why you lose them when one child closes. (Hmm, isn't there reference counting?)

    HTH,
    SSF

Re: Net::OpenSSH and fork()
by Proclus (Beadle) on Jul 23, 2010 at 06:49 UTC
    You might want to look at POE::Component::OpenSSH instead of battling with processes.
Re: Net::OpenSSH and fork()
by salva (Canon) on Jul 23, 2010 at 09:34 UTC
    What is happening is that the ssh master process is killed by the DESTROY method, which gets called in every forked process when it exits. The next release of Net::OpenSSH will have code to detect that condition and only kill the master from the same process where it was launched.

    Anyway, when forking in Perl it is usually a good idea to exit from the child processes with POSIX::_exit($code), which will NOT execute any cleanup code such as destructors or END blocks.
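    For illustration, here is a minimal sketch of that suggestion applied to the OP's forking loop; the command string and the %ssh/%pid hashes are just placeholders carried over from the question:

    use POSIX ();   # for POSIX::_exit

    for my $server (@targets) {
        my $pid = fork();
        die "fork failed: $!" unless defined $pid;
        if ($pid) {
            $pid{$server} = $pid;
        }
        else {
            # In the child: do the remote work...
            $ssh{$server}->system('some command');
            # ...then leave without running DESTROY methods or END blocks
            # inherited from the parent, so the master connections survive.
            POSIX::_exit(0);
        }
    }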

    Besides that, this is not the right way to use Net::OpenSSH, which has built-in support for running remote operations asynchronously (search the docs for async and/or, as sflitman has already pointed out, use the spawn method).

    Finally, there is also Net::OpenSSH::Parallel:

    use Net::OpenSSH::Parallel;

    my $pssh = Net::OpenSSH::Parallel->new;
    for my $server (@targets) {
        $pssh->add_host($server);
    }

    $pssh->push('*', command => @stuff);
    $pssh->push('*', command => @more_stuff);
    $pssh->run;

    # ...

    $pssh->push('*', command => @even_more_stuff);
    $pssh->run;

      Thanks all for the help and suggestions!

      Net::OpenSSH::Parallel looked promising, but at the time I started the project the CPAN documentation noted that it was still an alpha version, so I did not install it.

      My example used the fork() function, but I actually use forks.pm. I like the Perl threads feature, but my binary is not compiled with support for it. Among other things, I need to run other tasks in parallel besides just SSH commands, and I also need to do some IPC between the processes. The Perl threads feature is a lot easier to use for IPC than pipes and other pure-UNIX solutions.
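      For readers unfamiliar with it, a minimal sketch of that approach (assuming forks and forks::shared are installed; the host list and per-host work below are made-up placeholders): forks.pm emulates the threads API with processes, so threads-style shared variables work even without an ithreads-enabled perl.

      use forks;            # must be loaded before forks::shared
      use forks::shared;    # threads::shared-compatible shared variables

      my @targets = ('hostA', 'hostB');   # hypothetical host list
      my %result : shared;                # results visible across workers

      my @workers = map {
          my $server = $_;
          threads->create(sub {
              # hypothetical per-host work; results flow back via the shared hash
              $result{$server} = 'done';
          });
      } @targets;

      $_->join for @workers;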

      I just downloaded version 0.48 of the module and modified the DESTROY method, from

      if ($pid) {

      to

      if ($pid && $perl_pid == $$) {

      This solves the problem.
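      For anyone hitting the same issue, the general pattern behind that change looks roughly like this (an illustrative sketch only, not the module's actual source; the class and field names are made up):

      package My::Connection;   # hypothetical class name

      sub new {
          my ($class, @args) = @_;
          # Remember which process created the object.
          return bless { creator_pid => $$, args => \@args }, $class;
      }

      sub DESTROY {
          my $self = shift;
          # Forked children also run DESTROY when they exit; only the
          # creating process should tear down the shared resource
          # (e.g. kill the ssh master and remove its named socket).
          return unless $self->{creator_pid} == $$;
          # ... real cleanup would go here ...
      }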

      Best Regards,
      Scott
        the CPAN documentation had a note that it was still an alpha version

        Yes, it still says so. I am probably being too conservative there; I should move it to beta!