in reply to Pipe two child processes
This is actually fairly straightforward if you use files as the destination of STDOUT/STDERR, turn on autoflush, and perhaps throw in a little File::Tail, which is designed for reading from continuously updated files.
open (SAVEOUT, ">&STDOUT") or die "Unable to copy STDOUT : $!";
open (STDOUT, ">stdout") or die "Unable to open new STDOUT : $!";
select STDOUT; $| = 1;

open (SAVEERR, ">&STDERR") or die "Unable to copy STDERR : $!";
open (STDERR, ">stderr") or die "Unable to open new STDERR : $!";
select STDERR; $| = 1;
select STDOUT;

open (STDINCLONE, "<stdout") or die "Unable to open first process's STDOUT : $!";
open (ERRORS, "<stderr") or die "Unable to open first process's STDERR : $!";

Of course, if you need STDOUT going both to a terminal AND to a file you can read from, you can still do everything I just described here with IO::Tee.
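To make the moving pieces concrete, here is a minimal sketch of the whole round trip - redirect STDOUT to a file, fork a child that prints through the redirected handle, then read the child's output back from that file in the parent. The filename 'stdout' matches the snippet above; everything else (the one-line child, restoring STDOUT afterwards) is my own illustration, not part of the original post.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Save the real STDOUT, then point STDOUT at a file with autoflush on.
open my $saveout, '>&', \*STDOUT or die "Unable to copy STDOUT : $!";
open STDOUT, '>', 'stdout'       or die "Unable to open new STDOUT : $!";
select STDOUT; $| = 1;

defined(my $pid = fork) or die "Unable to fork : $!";
if ($pid == 0) {
    # Child: inherits the redirected STDOUT, so this lands in the file.
    print "hello from the child\n";
    exit 0;
}
waitpid $pid, 0;

# Parent: restore the terminal STDOUT, then read what the child wrote.
open STDOUT, '>&', $saveout or die "Unable to restore STDOUT : $!";
open my $clone, '<', 'stdout' or die "Unable to open child's STDOUT : $!";
print "parent read: $_" while <$clone>;
```

Here the parent waits for the child to exit before reading; with File::Tail (below in the assumptions) you could instead follow the file while the child is still running.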
Note: The following is assumed:
1. The child processes are properly fork'd - see perldoc -f fork if needed
2. The code will be modified appropriately to incorporate the use of File::Tail if required
3. The two forked child processes are Perl scripts.
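Regarding assumption 2, a sketch of what the File::Tail side might look like - following the 'stderr' file the child keeps appending to, much like tail -f. File::Tail is a CPAN module, not core, and the filename and poll interval here are just illustrative:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Tail;   # CPAN module; install it first

# Follow a continuously updated file; read() blocks until a new
# line arrives, polling at least once a second.
my $tail = File::Tail->new(
    name        => 'stderr',
    maxinterval => 1,
);
while (defined(my $line = $tail->read)) {
    print "child STDERR: $line";
}
```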
If these are non-Perl programs whose source you can't modify, you can still redirect STDOUT using the shell's > syntax. I am not sure about redirecting STDERR on anything other than a *nix machine, but there 2>stderr would work. As far as turning on autoflush for a non-Perl program - I don't know.
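For that unmodifiable-child case, one hedged sketch is to let the shell do the redirection when you exec the child. Passing exec a single string containing shell metacharacters makes Perl hand it to /bin/sh, so 2> works on *nix; some_program is a placeholder for whatever you are actually running:

```perl
#!/usr/bin/perl
use strict;
use warnings;

defined(my $pid = fork) or die "Unable to fork : $!";
if ($pid == 0) {
    # Single-string exec with > and 2> goes through the shell (*nix).
    exec 'some_program >stdout 2>stderr' or die "Unable to exec : $!";
}
waitpid $pid, 0;
```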
Cheers - L~R