Otogi has asked for the wisdom of the Perl Monks concerning the following question:

I wrote a quick script using Parallel::ForkManager that forks a maximum of 10 processes; each child process gets data from a specific device. I have more than 10 devices, and it only forks for the first 10 and ignores the rest (the foreach loop keeps looping, but nothing more is forked because the maximum is reached). The forking is done inside a foreach loop that pulls the device name and connection info from a hash. I am going to rewrite the whole thing since I'm not using forking efficiently; however, I am curious whether there is a way in Perl to wait for a number of processes to finish inside the loop before forking more.

update Found out that the code I used to handle zombies, $SIG{CHLD} = 'IGNORE', left over from before I used Parallel::ForkManager, was the culprit. Without it, the module does wait before forking more.
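For anyone hitting the same problem: with $SIG{CHLD} = 'IGNORE' in effect, the kernel reaps exited children itself, so the blocking waitpid() inside $pm->start can no longer see them exit and the module loses track of free slots. A minimal sketch of the working setup (the device hash, the limit of 10, and the get_data routine are placeholders, not the original code):

```perl
use strict;
use warnings;
use Parallel::ForkManager;

# Do NOT set $SIG{CHLD} = 'IGNORE' anywhere: Parallel::ForkManager
# needs to reap its own children to know when a slot frees up.

my $pm = Parallel::ForkManager->new(10);    # at most 10 children at once

my %devices = map { "device$_" => "host$_" } 1 .. 25;   # placeholder data

my $done = 0;
$pm->run_on_finish( sub { $done++ } );      # runs in the parent per reaped child

for my $device ( sort keys %devices ) {
    $pm->start and next;    # parent: blocks here while 10 children are live
    # --- child code ---
    get_data( $device, $devices{$device} ); # hypothetical fetch routine
    $pm->finish;            # child exits; parent reaps it and frees a slot
}
$pm->wait_all_children;     # block until every remaining child is done

print "finished $done children\n";

sub get_data { }            # stub so the sketch runs standalone
```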

Replies are listed 'Best First'.
Re: wait before forking more
by graff (Chancellor) on Feb 17, 2006 at 01:25 UTC
    I think you might want to look at waitpid -- there should be a way to get what you want using that.
Re: wait before forking more
by salva (Canon) on Feb 17, 2006 at 09:35 UTC
    I am curious and want to know if there is a way in perl to wait for a number of processes to finish inside the for loop before forking more

    That is exactly what Parallel::ForkManager does! When the forking limit is reached it waits for some child to exit before forking a new one again.

      Parallel::ForkManager does not seem to do this, or I am using it incorrectly; this is essentially the structure I am using:
      foreach $device (keys %devices) { $pm->start and next; ..... $pm->finish; }
      I don't use $pm->wait_all_children since I want it to continuously get input while children are processing.
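      For what it's worth, wait_all_children only makes the parent block at the end; the throttling itself happens inside $pm->start. A small self-contained check of that (the limit of 3 and the 20 dummy jobs are arbitrary) which counts concurrency from the parent's callbacks:

```perl
use strict;
use warnings;
use Parallel::ForkManager;

my $pm = Parallel::ForkManager->new(3);   # arbitrary small limit

my ( $live, $max_live ) = ( 0, 0 );
$pm->run_on_start(  sub { $live++; $max_live = $live if $live > $max_live } );
$pm->run_on_finish( sub { $live-- } );    # fires when the parent reaps a child

for my $job ( 1 .. 20 ) {                 # dummy jobs
    $pm->start and next;                  # blocks here once 3 are running
    select undef, undef, undef, 0.02;     # child "works" for 20 ms
    $pm->finish;
}
$pm->wait_all_children;

print "max concurrent children: $max_live\n";   # never exceeds the limit of 3
```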