hacker has asked for the wisdom of the Perl Monks concerning the following question:

I'm working with Parallel::ForkManager and I've found what may be a slight 'flaw' in the design (or in my understanding of it) that I need to grasp.

Basically I'm trying to consecutively open a file for each forked child, store the results of the forked process in it, and close it. The filename is incremented for each child. Below is a small sample that exhibits this behavior:

use strict;
use Parallel::ForkManager;

my @ary   = (1..100);
my $count = 1;
my $pm    = new Parallel::ForkManager(30);

foreach my $val (@ary) {
    # Without 'and next' works, but is it forking?
    # $pm->start;
    $pm->start and next;
    print "Working on $val (count: $count)\n";
    open(ARY, ">$count") or die $!;
    print ARY "$val, $count";
    close ARY;
    $count++;
    $pm->finish;
}

If I apply one theory: if I have initialized the manager to 30 children, do they all get their first pass through the foreach() here, and hence is the count always '1'?

I notice that if I use '$$' instead of $count, they are indeed each running under their own pid, which would seem to validate that.

Is there any way to apply an incrementing counter here, one for each child, in the order they were processed? Does removing the 'and next' still fork?
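The behavior described above can be demonstrated with plain core fork(), with no module involved; this is a minimal sketch (not the original program) showing that a child's increment touches only the child's copy of the variable, which is why $count never advances in the parent:

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $count = 1;

my $pid = fork();
die "fork failed: $!" unless defined $pid;

if ($pid == 0) {
    # Child process: this increments the child's private copy only.
    $count++;
    exit 0;
}

# Parent process: wait for the child, then inspect our copy.
waitpid($pid, 0);
print "parent still sees count = $count\n";    # prints 1, not 2
```

Since Parallel::ForkManager's start() is a wrapper around fork(), the same copy-on-fork semantics apply to the $count++ inside the loop body.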

Update: As 3dan suggested, the run_on_start method was exactly what I needed here. Working pseudocode below:

use strict;
use Parallel::ForkManager;

my @ary   = (1..100);
my $count = 1;
my $pm    = new Parallel::ForkManager(30);

$pm->run_on_start(sub {
    my ($pid, $ident) = @_;
    open(ARY, ">$count") or die $!;
    print ARY "$count";
    close ARY;
    $count++;
});

foreach my $val (@ary) {
    $pm->start and next;
    print "Working on $val (count: $count)\n";
    $pm->finish;
}

Replies are listed 'Best First'.
Re: Odd Parallel::ForkManager behavior
by broquaint (Abbot) on May 19, 2003 at 11:25 UTC
Because the $count variable is incremented in the child, it won't affect the parent (as they're two separate processes). So you can either increment before forking and hope the children are processed in the order they were forked (quite likely in your case), or get the child processes to communicate the incrementing of $count back to the parent process (or even have the CHLD handler increment it). See perlipc for more info on the latter option and kill for passing signals around.
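    One way the "communicate back to the parent" option could look, sketched here with a plain core pipe rather than signals (a hypothetical illustration, not the poster's program): each child writes a line down a shared pipe, and the parent counts the messages after reaping the children, so a parent-side counter actually advances.

```perl
#!/usr/bin/perl
use strict;
use warnings;

pipe(my $reader, my $writer) or die "pipe failed: $!";

my @kids;
for my $val (1 .. 5) {
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        close $reader;                 # child only writes
        print {$writer} "$val\n";      # report back to the parent
        close $writer;
        exit 0;
    }
    push @kids, $pid;
}

close $writer;                         # parent keeps only the read end
waitpid($_, 0) for @kids;

my $count = 0;
while (my $line = <$reader>) {
    $count++;                          # this increment lives in the parent
}
close $reader;

print "parent counted $count children\n";
```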

    As for the 'and next', the docs say:

    The "and next" skips the internal loop in the parent process.
    So without it you don't move on to the next child process (if my thinking is correct).
    HTH

    _________
    broquaint

Re: Odd Parallel::ForkManager behavior
by edan (Curate) on May 19, 2003 at 11:43 UTC
    You could look into using the run_on_start or run_on_finish methods, and increment the counter there... might that easily solve your 'in the order they were processed' requirement?

    --
    3dan
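    A sketch of the run_on_finish half of this suggestion (assuming the callback's first two arguments are the child's pid and exit code, per the module's docs): the callback runs in the parent, so a counter incremented there is visible across iterations, and it counts children in the order they actually finished.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Parallel::ForkManager;

my $pm       = new Parallel::ForkManager(30);
my $finished = 0;

$pm->run_on_finish(sub {
    my ($pid, $exit_code) = @_;
    $finished++;                  # runs in the parent, so this sticks
});

foreach my $val (1 .. 100) {
    $pm->start and next;
    # ... child work would go here ...
    $pm->finish;
}

$pm->wait_all_children;
print "$finished children finished\n";
```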
Re: Odd Parallel::ForkManager behavior
by mod_alex (Beadle) on May 20, 2003 at 06:38 UTC
    Dear Hacker
    I don't quite understand your problems with ForkManager

    Leave "... and next;" as is. It is needed so that the child code is not executed in the parent process.
    Here is a working example:
#! c:/perl/bin/perl.exe -w
use strict;
use Parallel::ForkManager;

my @ary   = (1..100);
my $count = 1;
my $pm    = new Parallel::ForkManager(30);

foreach my $val (@ary) {
    $count++;
    $pm->start and next;
    print "Working on $val (count: $count)\n";
    open(ARY, ">child$count") or die $!;
    print ARY "$val, $count";
    close ARY;
    $pm->finish;
}