gw1500se has asked for the wisdom of the Perl Monks concerning the following question:

I would like to run a command into which I will feed data via a pipe (sysopen). However, at the same time I would like to read the output from that command. I cannot seem to find an example or a good explanation of how to sysopen a command to write to its STDIN and read from its STDOUT and STDERR. Can someone tell me how to do this? Thanks.

Replies are listed 'Best First'.
Re: Setting up 2 way pipe for a command
by shmem (Chancellor) on Jun 29, 2008 at 15:01 UTC

    See IPC::Open2, or IPC::Open3 for reading STDERR as well.

    --shmem

      Thanks. I was looking for a built-in Perl function. I forgot to check for modules.
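The modules shmem mentions can be used like this — a minimal sketch on a Unix-like system, with 'cat' standing in for the real command (IPC::Open3 works the same way but also gives you a handle on the child's STDERR):

```perl
#!/usr/bin/perl
# Minimal IPC::Open2 sketch (core module): write to a command's STDIN
# and read its STDOUT. 'cat' is a stand-in for the real command.
use strict;
use warnings;
use IPC::Open2;

my $pid = open2(my $rdr, my $wtr, 'cat');
print $wtr "hello\n";
close $wtr;                 # signal EOF so the child can finish
my $line = <$rdr>;
print $line;                # prints "hello"
waitpid($pid, 0);
```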
Re: Setting up 2 way pipe for a command
by gloryhack (Deacon) on Jun 29, 2008 at 21:03 UTC
    When it's up to me to implement this, I almost always reach for IPC::Run. The documentation is a bit unwieldy, but once you work through it the implementation is about as straightforward as it can be.
      Thanks for the suggestion. I may have to switch to IPC::Run, as I cannot get open2 to work: it blocks when I try to read the output. I tried setting autoflush on the output handle, but it doesn't seem to help. I should make it clear that I am trying to make this interactive, so I need the output for each input (if there is one) while the command is running. I believe there is a 1:1 correspondence between a line into the command and a line out. The line in does what it is supposed to, but at that point the script hangs waiting for a line on the output side, so the next input line is never sent. It is possible that there is not a 1:1 correspondence, but until I can manage the blocking I won't really know. It looks like pump might do what I need, but I have to read up on it more and see some examples.
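For reference, line-by-line interaction with open2 can work when the child flushes one line of output per line of input — a hedged sketch, again using 'cat' as a stand-in. The usual cause of the hang described above is that many programs switch to block-buffering when their STDOUT is a pipe, so the parent's read blocks even though the child has "printed" its reply:

```perl
#!/usr/bin/perl
# Sketch of interactive, line-by-line use of IPC::Open2. This assumes
# the child emits one line out per line in and does not block-buffer
# its STDOUT (true for 'cat'; not true for many programs writing to a
# pipe, in which case the read below would hang).
use strict;
use warnings;
use IO::Handle;
use IPC::Open2;

my $pid = open2(my $rdr, my $wtr, 'cat');
$wtr->autoflush(1);           # flush our writes immediately

for my $word (qw(alpha beta gamma)) {
    print $wtr "$word\n";
    my $reply = <$rdr>;       # blocks until the child emits a line
    print "got: $reply";
}
close $wtr;
waitpid($pid, 0);
```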
      I switched to IPC::Run but have not fared any better. I keep getting a premature end error even though there is input data. Here is the relevant code segment I have now, after many iterations:
      my $h = start(\@cmd, \$in, \$out, \$err);
      while (!eof(FILELIST)) {
          $thisline = readline(FILELIST);
          ($thisfile, $thisfilesize) = split(/\t/, $thisline);
          $in = "$thisfile\n";
          $h->pump;
          while ($h->pumpable) {
              $h->pump;
          }
          if (length($out) > 0) {
              putmsg($out);
          }
          $total += $thisfilesize;
          $complete = sprintf("%.2f", $total / $dumpsize);
          putmsg("$complete%\n");
      }
      finish $h;
      In one of my iterations I avoided the premature end message; however, the script simply quits on the first call to pump. No error message, nothing. Does anyone see anything obvious? Thanks.
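For comparison, a pump loop following the idioms in the IPC::Run documentation might look like this — a hedged sketch using 'cat' in place of the real command (and skipping gracefully if IPC::Run, which is not a core module, is not installed). The key differences from the snippet above: pump while input is still pending, pump until a full line of output has arrived, and reset $out each time around:

```perl
#!/usr/bin/perl
# Hedged sketch of an interactive IPC::Run pump loop, with 'cat' as a
# stand-in child. Assumes the child emits one line out per line in.
use strict;
use warnings;

BEGIN {
    eval { require IPC::Run; IPC::Run->import(qw(start pump finish)); 1 }
        or do { print "ok\n"; exit 0 };   # IPC::Run not installed; skip
}

my ($in, $out, $err) = ('', '', '');
my $h = start(['cat'], \$in, \$out, \$err);

for my $file (qw(a.txt b.txt)) {
    $in = "$file\n";
    pump($h) while length $in;     # push this line into the child
    pump($h) until $out =~ /\n/;   # wait for one line of output back
    # ... use $out here ...
    $out = '';                     # reset for the next iteration
}
finish($h);                        # close the child's STDIN and reap it
print "ok\n";
```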
        Uh... "premature end error"? As in an HTTP server error response? If that is indeed the case, try running the thing from the command line, or at least printing a text/plain header somewhere ahead of the block above.

        Maybe on that iteration that didn't puke, the application being run via IPC::Run didn't output anything?