in reply to Re: Using perl to speed up a series of bash commands by transforming them into a single command that will run everything in parallel.
in thread Using perl to speed up a series of bash commands by transforming them into a single command that will run everything in parallel.

A very good tip. I thought I was defending against, for example, passing in a hashref by mistake. But it appears that Perl will throw an error in that case anyway. Yay, Perl!
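As a minimal sketch of what happens (the hashref here is just for illustration): dereferencing a hashref as if it were an arrayref is a runtime error, which eval can capture.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Dereferencing a hashref with @{ ... } is a runtime error:
my $hashref = { a => 'sleep 2' };
my @list = eval { @{ $hashref } };
print "Error: $@";   # prints "Error: Not an ARRAY reference at ..."
```

So the wrong argument type never silently produces garbage commands; the script dies before anything runs.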

However, I had to modify the sub you proposed to get it to work:

 return join '', map { "( ( $_ ) & );" } @{ $_[0] };
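With that modification, passing an arrayref produces the combined command string as expected (the command list here is a made-up example):

```perl
use strict;
use warnings;

sub parallelize_bash_commands {
    # Wrap each command in a backgrounded subshell and concatenate:
    return join '', map { "( ( $_ ) & );" } @{ $_[0] };
}

my $cmd = parallelize_bash_commands([ 'echo a', 'echo b' ]);
print "$cmd\n";   # ( ( echo a ) & );( ( echo b ) & );
```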

Here's the full code; the call with a hashref throws a nice error.

 #!/usr/bin/perl
 use strict;
 use warnings;
 use Carp qw(confess);

 my @commands = map { "echo $_; sleep 2" } qw(a b c d e);

 # does the first, waits two seconds, does the second, waits two
 # seconds, etc. (should take about ten seconds)
 #for my $bash_command ( @commands ) {
 #    time_bash_command($bash_command);
 #}

 # does the commands in parallel (should take about two seconds)
 my $parallel_running_command = parallelize_bash_commands([@commands]);

 # dies with "Not an ARRAY reference":
 $parallel_running_command = parallelize_bash_commands(
     { a => 'sleep 2', b => 'sleep 2', c => 'sleep 2' }
 );

 time_bash_command($parallel_running_command);

 sub parallelize_bash_commands {
     return join '', map { "( ( $_ ) & );" } @{ $_[0] };
     # alternative -- also works, and checks if you forgot to pass in a var:
     #return join '', map { "( ( $_ ) & );" } @{ my $input = shift or die "no input" };
 }

 sub time_bash_command {
     my $command = shift or die "no command";
     $command = "time `$command`";
     print "$command\n";
     print `$command`;
 }