Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

I'm using a forked pipe [ open(FILE, '|-') ] and all appears to be well until I try to close the file handle, which blocks. I understand that close waits for the child process to exit before returning, but the child's read also appears to block even after the close has been initiated. This seems to create a deadlock: the child process waits for the read to return, and the close waits for the child to exit. The following test case may help to explain.
our @handles;
our $message = 'xxxxxxxx';

sub child {
    my $buffer;
    while (sysread(STDIN, $buffer, 8)) {
        print("$$: $buffer\n");
    }
}

for ($i = 0; $i < 4; $i++) {
    if (!open($handles[$i], '|-')) {
        # child process
        child();
        exit 0;
    }
}

for ($i = 0; $i < 4; $i++) {
    syswrite($handles[$i], $message);
    syswrite($handles[$i], $message);
    syswrite($handles[$i], $message);
}

for ($i = 0; $i < 4; $i++) {
    close($handles[$i]);
}

This produces the following output, then freezes:

18241: xxxxxxxx
18241: xxxxxxxx
18241: xxxxxxxx
18242: xxxxxxxx
18242: xxxxxxxx
18242: xxxxxxxx
18243: xxxxxxxx
18243: xxxxxxxx
18243: xxxxxxxx
18244: xxxxxxxx
18244: xxxxxxxx
18244: xxxxxxxx
If I use an external command such as open(FILE, '| cat') it works fine, but deadlocks with the forked pipe version. Any ideas?

Replies are listed 'Best First'.
Re: Forked pipe deadlock
by pc88mxer (Vicar) on May 25, 2008 at 20:50 UTC
    Your other child processes are keeping the pipe open for the first one. To demonstrate this, just reverse the order in which you close the pipes:
    for (my $i = 3; $i >= 0; $i--) { close($handles[$i]); }
    That is, when the parent forks child #2, it will inherit the parent's pipe to child #1, and that is what is keeping child #1 from seeing EOF.

    One solution is to explicitly close those handles in the new children:

    if (!open($handles[$i], '|-')) {
        # close the pipes inherited from earlier iterations
        close($handles[$_]) for 0 .. $i - 1;
        child();
        exit 0;
    }
    Update: Corrected index bounds - thanks psini!
      for (my $i = 3; $i >= 0; $i--) { close($handles[$i]); }

      In Perl, I'd prefer something like this instead of the C style for loop (YMMV of course):

      close $handles[$_] for reverse 0..3;

      (the explicit reverse emphasizes that it's essential here to do something in reverse order of how you'd do it normally)

      Ehm, you probably meant:

      for (my $i = 3; $i >= 0; $i--) { close($handles[$i]); }

      Rule One: Do not act incautiously when confronting a little bald wrinkly smiling man.

      Thanks - that fixed it. Changed the code to:
      if (!open($handles[$i], '|-')) {
          # child process
          foreach $handle (@handles) {
              defined($handle) && close($handle);
          }
          child();
          exit 0;
      }
      I wouldn't have thought of that, so many thanks - very much appreciated.
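      For reference, a complete version of the test case with this fix applied might look like the sketch below (trimmed to one write per pipe, but otherwise the same shape as the original; this is an illustration, not the poster's exact code):

```perl
use strict;
use warnings;

my @handles;

sub child {
    my $buffer;
    while (sysread(STDIN, $buffer, 8)) {
        print "$$: $buffer\n";
    }
}

for my $i (0 .. 3) {
    if (!open($handles[$i], '|-')) {
        # Child: close the write ends inherited from earlier iterations;
        # otherwise those pipes never see EOF.
        close($handles[$_]) for 0 .. $i - 1;
        child();
        exit 0;
    }
}

syswrite($handles[$_], 'xxxxxxxx') for 0 .. 3;

# With the inherited copies closed, the close order no longer matters
# and each close returns once its own child exits.
close($handles[$_]) for 0 .. 3;
```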
Re: Forked pipe deadlock
by pc88mxer (Vicar) on May 25, 2008 at 21:31 UTC
    Keeping track of the file handles to close (or share) is probably the only way to solve the problem. Unix has a close-on-exec flag, but there's no 'close-on-fork' flag.

    For more on this issue, see the thread Practice of using fork() in comp.unix.programmer.

    It also includes a good discussion about what library routines might not be safe to call after using fork.
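    To make the distinction concrete, here is a small sketch of controlling the close-on-exec flag from Perl with the standard Fcntl constants (note that Perl itself already sets FD_CLOEXEC on descriptors above $^F, so on most setups the flag will be on before the fcntl call; the point is that no comparable flag exists for fork):

```perl
use strict;
use warnings;
use Fcntl qw(F_GETFD F_SETFD FD_CLOEXEC);

# A pipe to experiment on.
pipe(my $rd, my $wr) or die "pipe: $!";

# Mark the write end close-on-exec: the descriptor is closed automatically
# across exec(), but it still survives fork() -- there is no close-on-fork
# flag, so forked children must close unwanted handles by hand.
my $flags = fcntl($wr, F_GETFD, 0) or die "fcntl F_GETFD: $!";
fcntl($wr, F_SETFD, $flags | FD_CLOEXEC) or die "fcntl F_SETFD: $!";
```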

Re: Forked pipe deadlock
by psini (Deacon) on May 25, 2008 at 21:20 UTC

    Another approach could be having the children close the inherited output pipes. If you substitute your "children" code with:

    if (!open($handles[$i], '|-')) {
        # child process
        for (my $j = 0; $j < $i; $j++) {
            close($handles[$j]);
        }
        child();
        exit 0;
    }

    each child closes the pipes to its siblings before entering the child() sub. This allows the parent process to close the pipes in any order, because no close is left waiting on a pipe still held open by another child.

    This method works until you try to fork a new child after having closed one of the older pipes... I'm sure there must be a cleaner solution
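    One way around that (a sketch building on the defined() check used elsewhere in this thread; spawn and close_pipe are hypothetical helper names, not from the original code) is to undef each slot as it is closed, so children forked later simply skip it:

```perl
use strict;
use warnings;

my @handles;

sub child {
    my $buffer;
    print "$$: $buffer\n" while sysread(STDIN, $buffer, 8);
}

sub spawn {
    my ($i) = @_;
    return if open($handles[$i], '|-');   # parent (same open idiom as the thread)
    # Child: close every other slot that is still open; undef slots
    # (already-closed pipes) are skipped, so forking after a close is safe.
    for my $j (0 .. $#handles) {
        next if $j == $i or !defined $handles[$j];
        close($handles[$j]);
    }
    child();
    exit 0;
}

sub close_pipe {
    my ($i) = @_;
    close($handles[$i]);
    $handles[$i] = undef;   # mark closed so later children skip it
}

spawn(0);
spawn(1);
syswrite($handles[0], 'xxxxxxxx');
close_pipe(0);        # close an older pipe first...
spawn(2);             # ...then fork again: child 2 skips the undef slot
syswrite($handles[$_], 'xxxxxxxx') for 1, 2;
close_pipe($_) for 1, 2;
```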

    Rule One: Do not act incautiously when confronting a little bald wrinkly smiling man.