Well, as I was dreaming last night, I felt bad about not answering better. So I thought up a typical real-world problem that one might face.

The problem:

1. Search thru a list of subdirs whose names match a specific alphanumeric pattern.
2. In each of those subdirs, locate a specific file (matching another alphanumeric pattern) and search thru it for one or more names given on the command line as options, with logic to account for abbreviation of the middle initial/name.
3. When you find those names, extract the account numbers and save them.
4. Next, go to another list of subdirs (matching yet another alphanumeric pattern) and search all the files for those account numbers.
5. Wherever an account number is found, add the file to a backup tarball, then delete the account number from the file and save the resulting file to another tarball, in a directory named after the account number.
6. The tarballs cannot exceed 500 megs and must be named sequentially. When you are done with one in the sequence, transfer it thru SFTP to another computer and email the account manager.
:-)
Now, I'm sure that this can be done with a multi-line shell script full of pipes, sed, % signs, echo, and grep; but you would need one heck of a lot of shell expertise, and most people would not be able to follow its logic, with all the pipes. In Perl, however, a clear script could be written, along the lines of the rough sketches below, with intermediate arrays available for debugging printouts, etc. I will concede that shell can be faster and simpler for some simple operations, BUT in general it is hard to follow, and full of hard-to-decipher option lists and pipes, for most real-world tasks. Sure, just recursively tarring up a directory is simple with the shell, but it's the other things that are usually needed that cast shadows over shell and make Perl shine.
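To make that concrete, here is a rough Perl sketch of just the first stage: walking subdirs that match a pattern, scanning the right files for the names, and collecting the account numbers into an array you can dump at any point. The subdir pattern (acct_NNNN), the file pattern (ledger*.txt), and the record format (account: 12345) are all made up for illustration; the real ones would come from the actual problem.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Getopt::Long;

    # Names come in on the command line, e.g. --name 'John Q. Public' --name 'Jane Doe'
    my @names;
    GetOptions( 'name=s' => \@names ) or die "usage: $0 --name 'First M. Last' ...\n";

    # Build one regex per name, allowing the middle name to be abbreviated or dropped.
    my @name_res;
    for my $n (@names) {
        my ( $first, $middle, $last ) = split ' ', $n;
        if ( defined $last ) {
            my $mi = substr $middle, 0, 1;      # "Quincy" also matches "Q" or "Q."
            push @name_res, qr/\b\Q$first\E\s+\Q$mi\E\w*\.?\s+\Q$last\E\b/i;
        }
        else {                                  # only "First Last" was given
            push @name_res, qr/\b\Q$first\E\s+(?:\w+\.?\s+)?\Q$middle\E\b/i;
        }
    }

    # Intermediate arrays are kept around so each stage can be printed for debugging.
    my @subdirs = grep { -d } glob 'acct_[0-9][0-9][0-9][0-9]';    # made-up subdir pattern
    my @accounts;

    for my $dir (@subdirs) {
        for my $file ( glob "$dir/ledger*.txt" ) {                 # made-up file pattern
            open my $fh, '<', $file or do { warn "can't read $file: $!"; next };
            while ( my $line = <$fh> ) {
                next unless grep { $line =~ $_ } @name_res;
                push @accounts, $1 if $line =~ /account:\s*(\d+)/i;    # made-up record format
            }
            close $fh;
        }
    }

    print "found accounts: @accounts\n";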
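And here is an equally rough sketch of the later stage: rolling files into sequentially numbered tarballs that stay under a size cap. The 500 meg limit is the only number taken from the problem; the backup_NNN naming and the stubbed-out SFTP/mail step (Net::SFTP::Foreign and MIME::Lite would be the obvious candidates) are placeholders.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Archive::Tar;

    my $cap     = 500 * 1024 * 1024;    # the 500 meg ceiling per tarball
    my $seq     = 1;
    my $running = 0;                    # bytes added to the current tarball so far
    my $tar     = Archive::Tar->new;

    # Write out the current tarball, hand it off, and start a fresh one.
    sub roll_over {
        my $name = sprintf 'backup_%03d.tar', $seq++;
        $tar->write($name);
        # This is where you would push $name to the other box (Net::SFTP::Foreign)
        # and email the account manager (MIME::Lite or similar).
        $tar     = Archive::Tar->new;
        $running = 0;
    }

    for my $file (@ARGV) {              # files already known to contain the account number
        my $size = -s $file or next;    # skip anything missing or empty
        # The cap is tracked by raw file size only, ignoring tar header overhead.
        roll_over() if $running && $running + $size > $cap;
        $tar->add_files($file);
        $running += $size;
    }
    roll_over() if $running;            # flush the last, partially filled tarball

Neither sketch is the whole job, but each stage stays readable on its own, with named variables you can print out anywhere, which is exactly the point.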