thanos1983 has asked for the wisdom of the Perl Monks concerning the following question:

Dear Monks,

This question is a follow-up to Net::OpenSSH::Parallel with sudo commands. Although that question has been answered, I am facing a new problem that I can not overcome.

The script is operating perfectly: it adds the registered hosts and applies the set of commands. The problem is that I can not figure out how to capture and process the STDOUT.

I have found a way to capture the output and write it to a file, but I am trying to process the output before it is written. I want to arrange the output into a hash, keyed by the device that executed the commands.

Sample of current output:

* Stopping NTP server ntpd ...done.
* Stopping NTP server ntpd ...done.
ntpd: time slew +0.000740s
ntpd: time slew +0.013857s
* Starting NTP server ntpd
* Starting NTP server ntpd ...done.
...done.

The desired output would be something like:

IP1 => {ntpd: time slew +0.000740s}
IP2 => {ntpd: time slew +0.013857s}

Sample of conf.ini file.

[MP 101]
host = 127.0.0.1
user = username
psw = password
port = 22

[MP 100]
host = 127.0.0.1
user = username
psw = password
port = 22

Sample of code with successful parallel SSH connections running multiple sudo commands:

#!/usr/bin/perl
use strict;
use warnings;
use Data::Dumper;
use Fcntl qw(:flock);
use Config::IniFiles;
use Net::OpenSSH::Parallel;

select STDOUT; $| = 1;
select STDERR; $| = 1;

my %dev_data       = ();
my %sudo_passwords = ();

sub devices {
    my $path = 'conf.ini';

    open my $fh, '<', $path
        or die "Could not open file: $path - $!\n";
    flock( $fh, LOCK_SH )
        or die "Could not lock '$fh' - $!\n";
    tie my %ini, 'Config::IniFiles', ( -file => $path )
        or die "Error: IniFiles->new: @Config::IniFiles::errors";
    close($fh)
        or die "Could not close '$fh' - $!\n";

    my @mps                   = keys %ini;
    my $maximum_workers       = @mps;
    my $maximum_connections   = 2 * $maximum_workers;
    my $maximum_reconnections = 3;

    my %opts = (
        workers       => $maximum_workers,
        connections   => $maximum_connections,
        reconnections => $maximum_reconnections,
    );

    my $pssh = Net::OpenSSH::Parallel->new(%opts);

    #open my $stdout_fh, '>>', 'test.log' or die $!;
    foreach my $hash ( @mps ) {
        $pssh->add_host( $ini{$hash}{host},
                         user     => $ini{$hash}{user},
                         port     => $ini{$hash}{port},
                         password => $ini{$hash}{psw} );
                         #default_stdout_fh => $stdout_fh );
        $sudo_passwords{ $ini{$hash}{host} } = $ini{$hash}{psw};
    }

    my @cmd = ( "service ntp stop", "ntpd -gq", "service ntp start" );

    sub sudo {
        my ( $label, $ssh, @cmd ) = @_;
        foreach my $c (@cmd) {
            $ssh->system( { stdin_data => "$sudo_passwords{$label}\n" },
                          'sudo', '-Skp', '', '--', split " ", $c );
            # XXX: $_ is never set by $ssh->system, so this capture
            # attempt does not work -- this is the problem being asked about.
            ( $dev_data{$label} = $_ ) if ( $_ !~ /slew/ );
        }
    }

    $pssh->push( '*', parsub => \&sudo, @cmd );
    $pssh->run;

    #print Dumper(\%dev_data);
    return %dev_data;
}    # end sub devices

my %results = devices();

Does anybody know how to do that?

Seeking for Perl wisdom...on the process of learning...not there...yet!

Re: How to capture process and redirect STDOUT in a hash
by graff (Chancellor) on Jan 02, 2015 at 04:49 UTC
    I haven't tried to use Net::OpenSSH(::Parallel), but the docs for Net::OpenSSH say that the $ssh->system() method accepts hash-style parameters for "default_stdout_file" and "default_stderr_file" (storing these outputs to named files), or "default_stdout_fh" and "default_stderr_fh" (passing file handles to receive these outputs).

    So one approach might be, for each process in your queue, save stdout and stderr to distinct files (using different names for each process), and then read those files back when the ssh->system calls are all done.

    I expect you could also open file handles that write to in-memory scalar variables (see description of open(FH,'>',\$variable) in the man page for open), pass those file handles to ssh->system() as default stdout/stderr, and then just do regex matches on those variables when the processes are done.
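
    For example, here is a minimal, self-contained sketch of that in-memory file handle idea (the variable names are mine, purely for illustration):

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Open a file handle that writes into a scalar instead of a disk file.
    my $captured = '';
    open( my $mem_fh, '>', \$captured ) or die "open in-memory fh: $!";

    print {$mem_fh} "ntpd: time slew +0.000740s\n";
    close($mem_fh);

    # The "file" contents are now an ordinary string to run regexes against.
    print "slew was: $1\n" if $captured =~ /time slew (\S+)/;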

    I think you'll want to use two separate outputs for each process (separating stderr and stdout), because each output of each process might operate asynchronously, and if more than one stream goes to a single file handle, the data might get interleaved in ways you wouldn't expect or want (e.g. a stderr message in the middle of a stdout line).

      Hello graff,

      Thank you for your time and effort reading and replying to my question. I was reading about that too, and I have actually managed to create a solution that writes the data into a file; I then open that file and process the data according to my needs.

      What I was hoping to achieve is to avoid that round trip: opening one file to read, opening another, processing the data, and writing it out again. I was hoping there is a way to capture the STDOUT and process it before it is stored in the file.

      Writing to the file is easily done through the parameters you mentioned at the beginning. I am posting the solution just in case someone is interested in it in the future.

      A sample of the code is provided below:

      open my $stdout_fh, '>>', 'test.log' or die $!;

      foreach my $hash ( @mps ) {
          $pssh->add_host( $ini{$hash}{host},
                           user              => $ini{$hash}{user},
                           port              => $ini{$hash}{port},
                           password          => $ini{$hash}{psw},
                           default_stdout_fh => $stdout_fh );
      }

      It is also possible to store the STDOUT in different files, one per device. That could be done simply by adding another parameter to my conf.ini file and using it in the foreach loop where the devices to probe are added. By doing that you get a separate STDOUT file for each device. Of course, you then also need to open a file handle per device in that same foreach loop and, as a final step after everything has run, close the files in another loop.

      It might sound complicated, but in reality it is extremely simple; a sketch of the idea follows.
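
      A minimal sketch of that per-device approach (the extra logfile key in conf.ini and the %log_fhs hash are my own illustration, not part of the configuration shown above):

      # Sketch only: assumes each [MP ...] section of conf.ini gains a
      # hypothetical extra key, e.g.  logfile = mp101.log
      my %log_fhs;

      foreach my $hash ( @mps ) {
          open $log_fhs{$hash}, '>>', $ini{$hash}{logfile}
              or die "Could not open '$ini{$hash}{logfile}' - $!\n";

          $pssh->add_host( $ini{$hash}{host},
                           user              => $ini{$hash}{user},
                           port              => $ini{$hash}{port},
                           password          => $ini{$hash}{psw},
                           default_stdout_fh => $log_fhs{$hash} );
      }

      $pssh->push( '*', parsub => \&sudo, @cmd );
      $pssh->run;

      # Final step: close every per-device log file in one loop.
      close $log_fhs{$_} or warn "close failed: $!" for keys %log_fhs;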

      Seeking for Perl wisdom...on the process of learning...not there...yet!
        Here's a simple demonstration that uses variables as the storage for output file handles. You should be able to set up this sort of HoH (or AoH?) to keep track of the distinct "output file handles" of the various child processes, and handle the resulting output data in a simple, comprehensive way.

        For this example, I'm just using some random time stamps as keys for each log, but you could use anything that makes sense for your app. Again, I'd be inclined to use separate variables for stderr and stdout of each child, but maybe that's not necessary in your case.

        #!/usr/bin/perl
        use strict;
        use warnings;

        my %logs;
        for ( 0 .. 2 ) {
            my $id = time();
            open( $logs{$id}{fh}, '>', \$logs{$id}{var} )
                or die "open failed on # $_: $!\n";
            sleep int(rand(3)) + 1;    # (i.e. for a small but variable number of seconds)
        }
        printf "log-file ids are: %s\n\n", join( " ", sort keys %logs );

        for ( 1 .. 12 ) {
            my $id = ( keys %logs )[ int(rand(3)) ];
            print "Sending entry # $_ to log $id\n";
            print {$logs{$id}{fh}} "this is log event # $_\n";
        }
        print "\n";

        # How many entries per log?
        for my $id ( sort keys %logs ) {
            my @entries = split( /\n/, $logs{$id}{var} );
            printf "log_id %s got %d entries\n", $id, scalar @entries;
        }
        print "\n";

        # Which log got entry #4?
        for my $id ( sort keys %logs ) {
            next unless ( $logs{$id}{var} =~ /4/ );
            print "Here is the log for $id, containing the fourth entry:\n$logs{$id}{var}\n";
        }
        (Minor update: I changed the numeric range in the second "for" loop, so that entry # 4 is also the fourth entry.)
Re: How to capture process and redirect STDOUT in a hash
by salva (Canon) on Jan 02, 2015 at 11:28 UTC
    Net::OpenSSH::Parallel does not support capturing the output from the remote processes on the fly. You have to save it into files and then read it back later.

    You can use the variable expansion feature to create a different file for every host easily, for instance:

    $pssh->push('*', cmd => { stdout_file => "$path/%LABEL%.out" }, @cmd);
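
    Once $pssh->run has finished, those per-host files can be read back into the hash structure the OP wants. A minimal sketch, assuming the same $path and that the labels are the host addresses collected in %sudo_passwords in the OP's code:

    # After $pssh->run: slurp each per-host output file back into a
    # hash keyed by the label used with add_host.
    my %dev_data;
    for my $label ( keys %sudo_passwords ) {
        my $file = "$path/$label.out";
        open my $fh, '<', $file or die "Could not open '$file' - $!\n";
        my $out = do { local $/; <$fh> };    # slurp the whole file
        close $fh;
        ( $dev_data{$label} ) = $out =~ /(ntpd: time slew \S+)/;
    }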
Re: How to capture process and redirect STDOUT in a hash
by sundialsvc4 (Abbot) on Jan 02, 2015 at 16:16 UTC

    Could you maybe redirect the output to a pipe?