You are missing the point.
So you saved 6.5 seconds. Let's assume that, as the actual script is more complex and the number of files is larger, you'd manage to shave 30x as much off its runtime. That's 195 seconds, a little over three minutes. Your code probably does that job 60x or maybe even 100x faster than the original script.
These numbers, by any standard, are impressive.
Unfortunately, they kind of pale in comparison to the 2-hour runtime the script currently takes…
Is it worth going to any lengths to take 3 minutes off the runtime of a 2-hour job? Hardly.
But if you can arrange for four parallel downloads (and one doesn't have to go Perl for that; job control is almost the shell's raison d'être), then even considering all the other work the script has to do, runtime would drop to something over half an hour. Maybe 45 minutes.
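A minimal sketch of what that could look like, assuming the URLs sit one per line in a file (called urls.txt here) and that wget is the downloader; both names are illustrative, not taken from the original script:

    i=0
    while read -r url; do
        wget -q "$url" &              # launch each download in the background
        (( ++i % 4 == 0 )) && wait    # after every fourth launch, let the batch finish
    done < urls.txt
    wait                              # catch the last, possibly partial, batch

Waiting in batches of four is slightly cruder than keeping exactly four downloads in flight at all times, but it's a dozen lines of plain shell and already buys most of the win.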
Now, which of the two options seems more worth pursuing?
Makeshifts last the longest.