I wrote a script to demonstrate the difference between running a series of five simple two-second commands "serially" (one after the other) versus running them in parallel. Serially, it takes ten seconds. In parallel, it takes two... touch a; ls -l a; sleep 2
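The quoted demo can be reconstructed as a minimal sketch. The five two-second jobs here are plain sleeps standing in for the original commands; everything else (variable names, timing via date +%s) is illustrative, not taken from the thread.

```shell
# Serial: each sleep must finish before the next starts (~10s total).
serial_start=$(date +%s)
for i in 1 2 3 4 5; do sleep 2; done
serial_elapsed=$(( $(date +%s) - serial_start ))

# Parallel: "&" backgrounds each job; "wait" blocks until all finish (~2s).
parallel_start=$(date +%s)
for i in 1 2 3 4 5; do sleep 2 & done
wait
parallel_elapsed=$(( $(date +%s) - parallel_start ))

echo "serial: ${serial_elapsed}s  parallel: ${parallel_elapsed}s"
```

With jobs that do nothing but sleep, the parallel run collapses to roughly the length of the single longest job, which is exactly why the demo looks so favorable.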
I think your choice of commands for demonstration is a bit too simple -- to the extent that the results may be misleading.
If you parallelize any heavy processing on a single machine, you will of course see a slowdown in the execution time of any single instance of the process, relative to how long it would take if it weren't running in parallel with other heavy processes.
Given the nature of multi-processing, there will be a trade-off point somewhere: some number N such that running N processes in parallel is faster than running them serially, but running N+1 all at once is slower than splitting the work -- say, running (N+1)/2 in parallel, then the remainder in parallel once those finish.
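One common way to get that "N at a time, then the rest" behavior without hand-rolled batching is xargs with -P, which caps the number of simultaneous processes. The job list, the cap of 4, and the sleep 1 workload below are all illustrative, not from the thread.

```shell
# Run 8 jobs, at most 4 at a time; xargs starts a new one as each slot frees up.
# -I{} substitutes each input line into the command string.
out=$(printf '%s\n' 1 2 3 4 5 6 7 8 | xargs -P4 -I{} sh -c 'sleep 1; echo "done {}"')
echo "$out"
```

Eight one-second jobs capped at 4-way parallelism finish in roughly two seconds instead of eight, and the cap N can be tuned to the trade-off point described above.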
Mileage will vary depending on how heavy the processing is and which resources it needs most: memory-bound, CPU-bound, and I/O-bound jobs will show different trade-offs, depending on how you combine them and what your hardware happens to be.
In reply to "Re: Using perl to speed up a series of bash commands by transforming them into a single command that will run everything in parallel." by graff,
in thread "Using perl to speed up a series of bash commands by transforming them into a single command that will run everything in parallel." by tphyahoo