Here's an example using Parallel::ForkManager that has come in very handy lately for these types of questions. It runs at most $max_forks children at a time. In the for loop I've set the number of clients to 33, so it'll run 10 at a time, calling do_something() for each one until all 33 have been processed.
#!/usr/bin/perl

use warnings;
use strict;

use Parallel::ForkManager;

my $max_forks = 10;
my $fork = Parallel::ForkManager->new($max_forks);

# on start callback
$fork->run_on_start(
    sub {
        my $pid = shift;
    }
);

# on finish callback
$fork->run_on_finish(
    sub {
        my ($pid, $exit, $ident, $signal, $core) = @_;
        if ($core){
            print "PID $pid core dumped.\n";
        }
    }
);

# forking code
for my $client (1..33){
    $fork->start and next;
    do_something($client);
    sleep(2);
    $fork->finish;
}

$fork->wait_all_children;

sub do_something {
    my $client = shift;
    print "$client\n";
}
-stevieb
UPDATE: After removing the sleep statement, the output hints that new children are started as soon as slots open up, rather than waiting for a whole batch of 10 to finish, so long as no more than 10 are ever running at once. I'm not 100% sure of this though.
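One way to sanity-check that is to timestamp the start/finish callbacks. This is just a rough sketch (not part of the code above); the random sleep is only there to make the children finish at different times:

#!/usr/bin/perl

use warnings;
use strict;

use Parallel::ForkManager;

my $fork = Parallel::ForkManager->new(10);

# print when each child starts...
$fork->run_on_start(
    sub {
        my ($pid, $ident) = @_;
        printf "child %d started client %d at %d\n", $pid, $ident, time;
    }
);

# ...and when each child finishes
$fork->run_on_finish(
    sub {
        my ($pid, $exit, $ident) = @_;
        printf "child %d finished client %d at %d\n", $pid, $ident, time;
    }
);

for my $client (1..33){
    # pass $client as the identifier so the callbacks can report it
    $fork->start($client) and next;
    sleep 1 + int rand 3;    # simulate uneven amounts of work
    $fork->finish;
}

$fork->wait_all_children;

If new "started" lines show up within a second or so of each "finished" line, rather than in distinct batches of 10, that backs up the "top up the pool as slots free" behaviour.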
UPDATE 2: According to the Parallel::ForkManager docs, it does indeed kick off another proc as soon as one finishes. The number of free slots to wait for is configurable:
wait_for_available_procs( $n )
    Wait until $n available process slots are available. If $n is not given, defaults to 1.
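For what it's worth, here's a minimal sketch of how that call could be used (assuming a Parallel::ForkManager recent enough to provide wait_for_available_procs); it holds off dispatching until at least 3 of the 10 slots are free:

#!/usr/bin/perl

use warnings;
use strict;

use Parallel::ForkManager;

my $fork = Parallel::ForkManager->new(10);

for my $client (1..33){
    # block here until at least 3 of the 10 slots are free
    $fork->wait_for_available_procs(3);

    $fork->start and next;
    print "$client\n";
    sleep(2);
    $fork->finish;
}

$fork->wait_all_children;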