yaralian has asked for the wisdom of the Perl Monks concerning the following question:

Hi, I am trying to run a separate asynchronous process from a Perl script. I use a filehandle, but rather than opening a file I open a piped command. That works fine on its own, but inside a loop the handle is the same on the second iteration, and the command doesn't run. To get around this I decided to use anonymous handles, but nothing seems to work. Does anyone know what I'm doing wrong?
# This is oversimplified but should work.
# The $command string is generated each time.
# The $handle is supposed to be an anon file handle.
use IO::File;

for ($f = 0; $f < 10; $f++) {
    my $handle = IO::File->new();
    open($handle, "$command |");
}
Help please!!! Thanks in advance John

Replies are listed 'Best First'.
Re: Asynchronous processes and anonymous handles
by moritz (Cardinal) on May 16, 2008 at 07:12 UTC
    $handle goes out of scope after each iteration, so an implicit close is called. If you want to actually use your file handles, you have to store them somewhere.
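
    Something like this, for example (a minimal sketch; $command here is just a hypothetical stand-in for whatever string you generate each time):

    use strict;
    use warnings;

    my $command = 'some_long_running_program';   # hypothetical placeholder

    my @handles;                   # declared outside the loop, so the
    for my $f (0 .. 9) {           # handles survive every iteration
        open my $fh, "$command |"
            or die "Cannot start '$command': $!";
        push @handles, $fh;        # store the handle somewhere
    }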
Re: Asynchronous processes and anonymous handles
by waba (Monk) on May 16, 2008 at 19:06 UTC

    I'll add that assigning a new IO::File to $handle before calling open probably doesn't do what you'd expect. From perldoc -f open:

    If FILEHANDLE is an undefined scalar variable (or array or hash element) the variable is assigned a reference to a new anonymous filehandle, otherwise if FILEHANDLE is an expression, its value is used as the name of the real filehandle wanted. (This is considered a symbolic reference, so "use strict 'refs'" should not be in effect.)

    So a declared $handle is all you need. It can even be written as open my $handle, "$command |".

    Combining with moritz's advice,

    my @handles;
    foreach my $f ( 0 .. 9 ) {
        open $handles[$f], "$command |";
    }
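
    And if you later want to drain those pipes, it might look like this (a sketch; closing a piped handle also waits for that child to finish):

    for my $fh (@handles) {
        print while <$fh>;   # read whatever the command printed
        close $fh;           # waits for the child process to exit
    }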

    Of course, this assumes that you actually want to read the standard output of $command. If all you want to do is fire and forget the process, there are lighter alternatives (system).
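
    For instance (a sketch; the trailing & asks the shell to background the job, so system returns right away and no output is captured):

    system("$command &") == 0
        or warn "Could not launch '$command': $?";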