hacker has asked for the wisdom of the Perl Monks concerning the following question:
Basically I'm trying to consecutively open a file for each forked child, store the results of the forked process in it, and close it. The filename is incremented for each child. Below is a small sample that exhibits this behavior:
use strict;
use Parallel::ForkManager;

my @ary   = (1 .. 100);
my $count = 1;
my $pm    = new Parallel::ForkManager(30);

foreach my $val (@ary) {
    # Without 'and next' it works, but is it forking?
    # $pm->start;
    $pm->start and next;
    print "Working on $val (count: $count)\n";
    open(ARY, ">$count") or die $!;
    print ARY "$val, $count";
    close ARY;
    $count++;
    $pm->finish;
}
One theory: since I have initialized the pool to 30 children, does each child enter the foreach() with its own copy of the loop state, so that $count is always '1' in every child?
I notice that if I use '$$' instead of $count, each file does get a distinct pid, so the children really are running in their own processes, which would seem to confirm that.
Is there any way to apply an incrementing counter here, one per child, in the order they were started? And does removing the 'and next' still fork?
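The behavior above follows from fork() semantics: each child gets its own copy of the parent's variables, so a $count++ in a child never reaches the parent or its siblings. A minimal sketch with plain fork (no Parallel::ForkManager needed) to demonstrate:

```perl
use strict;
use warnings;

my $count = 1;

my $pid = fork();
die "fork failed: $!" unless defined $pid;

if ($pid == 0) {
    # Child: this increments the child's private copy of $count only.
    $count++;
    exit 0;
}

waitpid($pid, 0);
# Parent still sees the original value; the child's increment was lost.
print "parent count: $count\n";   # prints "parent count: 1"
```

This is why every child in the original loop writes to file '1': each one inherits $count before any sibling's increment can be seen.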
Update: As edan suggested, the run_on_start method was exactly what I needed here. Working code below:
use strict;
use Parallel::ForkManager;

my @ary   = (1 .. 100);
my $count = 1;
my $pm    = new Parallel::ForkManager(30);

# run_on_start callbacks execute in the parent process, so $count
# lives in one place and increments once per child started.
$pm->run_on_start(
    sub {
        my ($pid, $ident) = @_;
        open(ARY, ">$count") or die $!;
        print ARY "$count";
        close ARY;
        $count++;
    }
);

foreach my $val (@ary) {
    $pm->start and next;
    print "Working on $val (count: $count)\n";
    $pm->finish;
}
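The reason run_on_start fixes the counter is that the callback runs in the parent, where there is only one $count. The same idea can be sketched with plain fork, in case Parallel::ForkManager isn't to hand: do the per-child bookkeeping in the parent before forking, then hand each child its snapshot of the value.

```perl
use strict;
use warnings;

# Parent-side bookkeeping sketch: increment the counter before each
# fork, so the sequence 1, 2, 3, ... is assigned in start order.
my $count = 1;
my @ary   = (1 .. 5);

for my $val (@ary) {
    my $n = $count++;              # runs in the parent: always sequential
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        # Child sees a stable snapshot of $n taken before the fork.
        print "child $$ working on $val (count: $n)\n";
        exit 0;
    }
}
1 while wait() != -1;              # reap all children
```

Parallel::ForkManager's run_on_start is just a structured hook for this parent-side step.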
Replies are listed 'Best First'.
Re: Odd Parallel::ForkManager behavior
by broquaint (Abbot) on May 19, 2003 at 11:25 UTC
Re: Odd Parallel::ForkManager behavior
by edan (Curate) on May 19, 2003 at 11:43 UTC
Re: Odd Parallel::ForkManager behavior
by mod_alex (Beadle) on May 20, 2003 at 06:38 UTC