
Hello graff,

Thank you for your time and effort in reading and replying to my question. I was reading about that too, and I have actually managed to create a solution that writes the data into a file; I then open that file and process the data according to my needs.

What I was hoping to achieve was to avoid having to open this file for reading, open another file, process the data, and write it out. I was hoping that there is a way to capture the STDOUT and process it before it is stored in the file.

Writing to the file is easily done through the parameters that you set at the beginning. I am posting the solution just in case someone is interested in it in the future.

A sample of the code is provided below:

# One shared log file for all hosts, opened in append mode
open my $stdout_fh, '>>', 'test.log' or die $!;

foreach my $hash ( @mps ) {
    # Every host gets the same handle as its default STDOUT
    $pssh->add_host(
        $ini{$hash}{host},
        user              => $ini{$hash}{user},
        port              => $ini{$hash}{port},
        password          => $ini{$hash}{psw},
        default_stdout_fh => $stdout_fh,
    );
}
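Since the handle is opened with '>>', the output of every host (and of successive runs of the script) accumulates in the same test.log; if you want a fresh file per run, opening with '>' would truncate it first.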

It is also possible to store the STDOUT in different files. It is extremely easy: just add another parameter to my conf.ini file and use that hash value in the foreach loop where you add the devices to probe. By doing that you can send the STDOUT of each device to its own file. Of course, you then also need to open a file handle keyed on the hash in the same foreach loop, and, as a final step after the writes are done, close the files with a loop again.

It might sound complicated, but in reality it is extremely simple.
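For illustration, here is a minimal, untested sketch of that per-device variant, reusing %ini, @mps, and $pssh from the snippet above. The logfile key and the %fh_for hash are made-up names for the extra conf.ini parameter and the handle store:

my %fh_for;    # one output file handle per device

foreach my $hash ( @mps ) {
    # 'logfile' stands for the extra per-device parameter in conf.ini
    open $fh_for{$hash}, '>>', $ini{$hash}{logfile}
        or die "Cannot open $ini{$hash}{logfile}: $!";

    $pssh->add_host(
        $ini{$hash}{host},
        user              => $ini{$hash}{user},
        port              => $ini{$hash}{port},
        password          => $ini{$hash}{psw},
        default_stdout_fh => $fh_for{$hash},
    );
}

# ... run the jobs, then close every per-device log in one final loop
close $fh_for{$_} or warn "Cannot close log for $_: $!" for keys %fh_for;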

Seeking for Perl wisdom...on the process of learning...not there...yet!

Re^3: How to capture process and redirect STDOUT in a hash
by graff (Chancellor) on Jan 02, 2015 at 18:17 UTC
    Here's a simple demonstration that uses variables as the storage for output file handles. You should be able to set up this sort of HoH (or AoH?) to keep track of the distinct "output file handles" of the various child processes, and handle the resulting output data in a simple, comprehensive way.

    For this example, I'm just using some random time stamps as keys for each log, but you could use anything that makes sense for your app. Again, I'd be inclined to use separate variables for stderr and stdout of each child, but maybe that's not necessary in your case.

    #!/usr/bin/perl
    use strict;
    use warnings;

    my %logs;

    for ( 0 .. 2 ) {
        my $id = time();
        open( $logs{$id}{fh}, '>', \$logs{$id}{var} )
            or die "open failed on # $_: $!\n";
        sleep int(rand(3)) + 1;   # (i.e. for a small but variable number of seconds)
    }

    printf "log-file ids are: %s\n\n", join( " ", sort keys %logs );

    for ( 1 .. 12 ) {
        my $id = ( keys %logs )[ int(rand(3)) ];
        print "Sending entry # $_ to log $id\n";
        print {$logs{$id}{fh}} "this is log event # $_\n";
    }
    print "\n";

    # How many entries per log?
    for my $id ( sort keys %logs ) {
        my @entries = split( /\n/, $logs{$id}{var} );
        printf "log_id %s got %d entries\n", $id, scalar @entries;
    }
    print "\n";

    # Which log got entry #4?
    for my $id ( sort keys %logs ) {
        next unless ( $logs{$id}{var} =~ /4/ );
        print "Here is the log for $id, containing the fourth entry:\n$logs{$id}{var}\n";
    }
    (Minor update: I changed the numeric range in the second "for" loop, so that entry # 4 is also the fourth entry.)