in reply to Using perl to speed up a series of bash commands by transforming them into a single command that will run everything in parallel.

Perl has finer control over forking than any shell; it rivals the system C libraries in that regard. You may enjoy comparing your code against Perl's native system calls for the job.

    my %kid;
    for (@commands) {
        defined(my $cpid = fork) or sleep 1, redo;
        $cpid and $kid{$cpid} = 1, next;   # parent
        %kid = ();                         # child
        exec '/bin/bash', '-c', $_;        # thanks, ikegami
        exit 1;
    }
    delete $kid{+wait} while %kid;
    print "@{[times]}\n";

After Compline,
Zaxo


Re^2: Using perl to speed up a series of bash commands by transforming them into a single command that will run everything in parallel.
by salva (Canon) on Jun 09, 2006 at 20:04 UTC
    Or using some handy module...
        use Proc::Queue qw(system_back all_exit_ok),
            size => 8;  # this ensures that, at most, 8
                        # child processes run at any time

        my @pids = map { system_back $_ } @commands;
        all_exit_ok(@pids) or warn "some processes failed\n";