1-wget approx. 5000 rar files total from 2 different servers
2-unrar the archives into approx. 16000 files of 4 different types
3-move/rename the files into different directories based on file type
4-run one group of files through a series of sed filters, producing a modified set of files (see the sketch after this list)
5-copy all files (now approx. 20000) into a matching directory structure on a Windows box via smbclient
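
As an aside, step 4 at least maps naturally onto Perl, since a chain of sed filters is just a chain of s/// substitutions. Here's a minimal sketch of that idea; the glob pattern and the substitutions are hypothetical stand-ins, not the real filters:

    use strict;
    use warnings;

    # Rewrite each file in place, replacing the chain of sed filters.
    # The glob pattern and the substitutions are hypothetical placeholders.
    for my $file (glob 'filtered/*.txt') {
        local @ARGV = ($file);
        local $^I   = '';          # in-place edit, no backup copy
        while (<>) {
            s/foo/bar/g;           # stand-ins for the real sed expressions
            s/\r$//;
            print;
        }
    }

That's the same mechanism as a perl -pi -e one-liner run over the file list; it wouldn't touch the step 3 bottleneck, though.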
This currently takes about 2 hours to run on a reasonably tricked-out Sun E-250. Most of the time is taken up in step 3, simultaneously moving and renaming the files.
My question is this: since Perl merely acts as a front end to mv and other UNIX command-line utilities, would I realize a significant performance increase by rewriting this script in Perl? The script works fine as is. I'm not interested in adding functionality; I'm concerned only with execution speed.
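
For what it's worth, step 3 is the one place where the "front end to mv" assumption breaks down: Perl's built-in rename() is a direct system call, not a fork/exec of /bin/mv for every file. Here's a minimal sketch of how the move/rename step might look, assuming the files can be dispatched on extension; the extension-to-directory map and the directory names are made up:

    use strict;
    use warnings;
    use File::Basename;

    # Hypothetical extension-to-directory map; the real naming
    # scheme and target directories will differ.
    my %dir_for = (
        txt => 'text',
        dat => 'data',
        img => 'images',
        idx => 'indexes',
    );

    for my $file (glob 'incoming/*') {
        my ($ext) = $file =~ /\.(\w+)\z/ or next;
        my $dest  = $dir_for{$ext}       or next;
        my $newname = basename($file);   # the real script renames here too
        # rename() is one syscall per file; no fork/exec of /bin/mv,
        # but it only works within a single filesystem.
        rename $file, "$dest/$newname"
            or warn "can't move $file: $!\n";
    }

The caveat is that rename() only works within one filesystem; if the target directories live on another disk, the script has to fall back to copy-and-delete (File::Copy's move() does this), and the fork/exec savings shrink accordingly.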
Thanks,
Jack