I was curious about different people's approach to this. I write fairly simple scripts (I am a simple sysadmin).
My longest script is about 300 lines, 100 of which is perldoc. That one was a "generic" FTP script that I had hoped would
do all of the sorts of FTP things we do (get, put, mget, mput, rename, etc.) in one script, driven by a control file.
Of course, every time we came up with a new FTP batch job, it brought along slightly new requirements, so this script grew
and grew to its present size. One of our architects has chided me for not breaking it up into smaller pieces and using
them in concert.
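For context, the control-file approach looks roughly like this. This is just a hypothetical sketch, not my actual script: %dispatch and run_line are illustrative names, and the real thing obviously does logins, error handling, logging, and so on.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical sketch: each control-file line is a verb plus arguments,
# e.g. "get remote.txt local.txt". A dispatch table maps each verb to a
# code ref that calls the matching method on a Net::FTP-style object,
# instead of one big if/elsif chain per verb.
my %dispatch = (
    get    => sub { my ($ftp, $remote, $local) = @_; $ftp->get($remote, $local) },
    put    => sub { my ($ftp, $local, $remote) = @_; $ftp->put($local, $remote) },
    rename => sub { my ($ftp, $old, $new)      = @_; $ftp->rename($old, $new) },
);

# Parse one control-file line and run the corresponding FTP operation.
sub run_line {
    my ($ftp, $line) = @_;
    my ($verb, @args) = split ' ', $line;
    my $handler = $dispatch{$verb}
        or die "unknown verb '$verb' in control file\n";
    return $handler->($ftp, @args);
}
```

Adding a new verb is one more entry in %dispatch, which is part of why the one-script version kept growing instead of splitting.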
What are the feelings here? These pieces are not large enough or general enough to be thought of as OO,
so that's not what I'm asking. More like: if you could write one 200-line script that worked for 12 different batch jobs,
or 6 smaller scripts (say 30 lines each) to do the same, which approach would you take?
Thanks,
Rich