smimp has asked for the wisdom of the Perl Monks concerning the following question:

We have a Perl script that basically runs commands that are read in from a config file. The script captures the stdout/stderr and return code of each command. We'd like to run a couple of the commands simultaneously but are struggling to figure out how to capture stdout/stderr from the forked commands. Any help is much appreciated.

Re: fork and stdout/stderr
by tachyon (Chancellor) on Feb 25, 2003 at 02:08 UTC

    When you fork, each program (parent and child) has its own copy of the STDOUT and STDERR handles. You capture the output in the usual way:

    my @cmds = qw( cmd1 cmd2 );
    my $pid = fork();
    defined $pid or die "fork: $!\n";
    if ( $pid ) {
        my $parent_captures = `$cmds[0] 2>&1`;   # parent runs cmd1
    }
    else {
        my $kid_captures = `$cmds[1] 2>&1`;      # child runs cmd2
        exit;                                    # child must not fall through
    }

    You can probably open STDERR onto STDOUT before the fork and then just capture STDOUT as well, but I don't do it that way.

    cheers

    tachyon

    s&&rsenoyhcatreve&&&s&n.+t&"$'$`$\"$\&"&ee&&y&srve&&d&&print

Re: fork and stdout/stderr
by isotope (Deacon) on Feb 25, 2003 at 05:26 UTC
    Ack! Why does everybody resort to shell voodoo to do this? You can do the job in pure Perl. Seems like I had some fun with this a long time ago.
    #!/usr/bin/perl -wT
    use strict;

    my $pid1 = fork();
    unless ($pid1) {
        # First child
        my $stdout_file = 'child1_stdout.log';
        my $stderr_file = 'child1_stderr.log';
        local(*STDERR);
        local(*STDOUT);
        open(STDOUT1, '>' . $stdout_file) or die $stdout_file . ': ' . $!;
        open(STDERR1, '>' . $stderr_file) or die $stderr_file . ': ' . $!;
        open(STDOUT, ">&STDOUT1") or die "Couldn't redir stdout: $!";
        open(STDERR, ">&STDERR1") or die "Couldn't redir stderr: $!";
        system('cmd1');
        close(STDOUT1);
        close(STDERR1);
        exit();
    }
    my $pid2 = fork();
    unless ($pid2) {
        # Same thing but call cmd2 instead of cmd1
        exit();
    }
    # Don't forget to wait for each child
    wait();
    wait();
    Update: (sigh) Ok, the STDOUT/STDERR redirection won't carry over to the system() call. If you weren't calling an external program, this would work. Using IPC::Open3 is probably the way to go. Using sockets to pass the data back to the parent will probably work better than anything involving the filesystem.
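    One variation that does work for external programs: re-open the real STDOUT and STDERR in the child instead of localizing the globs (the local(*STDOUT) above detaches the handle from file descriptor 1, which is why the exec'd program never sees the redirection). A minimal sketch, with cmd1 as a placeholder command:

        use strict;
        use warnings;

        my $pid = fork();
        die "fork: $!" unless defined $pid;
        unless ($pid) {
            # Re-opening the real handles re-points fd 1 and fd 2, so the
            # external program inherits the redirection.
            open STDOUT, '>', 'child1_stdout.log' or die "stdout: $!";
            open STDERR, '>', 'child1_stderr.log' or die "stderr: $!";
            exec 'cmd1' or die "exec cmd1: $!";
        }
        waitpid($pid, 0);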

    --isotope
    http://www.skylab.org/~isotope/
Re: fork and stdout/stderr
by Cabrion (Friar) on Feb 25, 2003 at 01:52 UTC
    exec "cmd1 >/tmp/cmdone.log 2>&1"; #combine stdout/stdin exec "cmd2 >/tmp/cmdtwo.out 2>/tmp/cmdtwo.err";# split them up
    Then read back in your log files. Fork or whatever along the way; a fuller sketch follows.
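    A minimal sketch along those lines, assuming placeholder commands cmd1 and cmd2 and the /tmp paths above: fork once per command, exec with the shell handling the redirection, wait, then read the files back.

        use strict;
        use warnings;

        my @pids;
        for my $cmd ('cmd1 >/tmp/cmdone.log 2>&1',
                     'cmd2 >/tmp/cmdtwo.out 2>/tmp/cmdtwo.err') {
            my $pid = fork();
            die "fork: $!" unless defined $pid;
            unless ($pid) {
                exec $cmd;        # shell metacharacters => runs via sh -c
                die "exec: $!";
            }
            push @pids, $pid;
        }
        waitpid($_, 0) for @pids; # wait for both children

        open my $log, '<', '/tmp/cmdone.log' or die "cmdone.log: $!";
        my $captured = do { local $/; <$log> };   # slurp the whole file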
Re: fork and stdout/stderr
by MarkM (Curate) on Feb 25, 2003 at 05:26 UTC

    Various ways of accomplishing the effect that you are looking for:

    1. Use ">/tmp/command.out 2>/tmp/command.err" in the shell command string. Read the files after the command completes, and remove the files.
    2. Use IPC::Open3() and select() to monitor all pipes, reading data from pipes that have data ready (see the sketch after this list).
    3. Spawn off a thread to read data from each subprocess pipe. The thread would update a shared variable as data arrived on the pipe, and would exit when the pipe was closed.

    Which method works best depends on several factors, including how much complexity is acceptable in the final product, how efficient it needs to be, the purpose for capturing stdout/stderr, and the host environment (UNIX vs WIN32).
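    A minimal sketch of option 2, for a single placeholder command cmd1 (Symbol::gensym supplies the glob that open3 requires for the stderr handle; select on pipes is a UNIX-ism and will not work on WIN32):

        use strict;
        use warnings;
        use IPC::Open3;
        use IO::Select;
        use Symbol 'gensym';

        my $err = gensym;
        my $pid = open3(my $in, my $out, $err, 'cmd1');
        close $in;                       # nothing to send to the child's stdin

        my $sel = IO::Select->new($out, $err);
        my %captured;
        while (my @ready = $sel->can_read) {
            for my $fh (@ready) {
                if (sysread($fh, my $chunk, 4096)) {
                    $captured{ $fh == $out ? 'stdout' : 'stderr' } .= $chunk;
                }
                else {
                    $sel->remove($fh);   # EOF on this pipe
                }
            }
        }
        waitpid($pid, 0);
        my $exit_status = $? >> 8;       # the command's return value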

Re: fork and stdout/stderr
by dpuu (Chaplain) on Feb 25, 2003 at 02:55 UTC
    Investigate the full power of the open function:
    open OUT1, "echo world |"; open OUT2, "echo hello, |" print "output is ...\n", <OUT2>, <OUT1>;
    Getting both stdout and stderr is possible, but I'll leave that as an exercise for the reader (one answer sketched below).
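    A minimal sketch of that exercise, merging stderr into the pipe via the shell (cmd1 is a placeholder command):

        open my $fh, "cmd1 2>&1 |" or die "open cmd1: $!";
        my @merged = <$fh>;    # stdout and stderr, interleaved
        close $fh;

    --Dave.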