What I need are a few more ideas, besides the one mentioned above, for suitable problems. Other than dealing with huge, incomprehensible log files on a daily basis, my cow-orkers also have to grub about in rather large databases and tweak data. In addition, they use a complex build process that is controlled by setting dozens of environment variables.
Not exactly the kind of thing your cow-orkers seem to be dealing with on a daily basis, but may I suggest two tasks from my early Perl days, which I had formerly addressed with rather complex and clumsy shell scripts:
- Pick up files from a complex directory hierarchy and move them into a simpler one, with new filenames corresponding to the whole old path, somewhat massaged for compatibility with M$ filesystems and ISO 9660 ones (yes, I know 'bout Rock Ridge before someone mentions it). Since the number of files is very large, move them to numbered subdirectories of a given target directory, each subdirectory holding at most 1000 files, all from corresponding original subdirectories, or, in the remote event that one of them contains more than 1000, split it intelligently. (A minimal sketch follows this list.)
- Generate some offline HTML docs and/or reports by first rolling your own mini-templating language in shell, then in Perl; keep adding features until it starts to become a mess, and then move to a real templating system. (A toy example follows the list.)
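
For the first task, here's a minimal sketch of the kind of thing I mean, not my original script. The source/target paths, the 1000-file limit and the character whitelist are just assumptions for illustration, and it simply fills buckets sequentially rather than splitting oversized subdirectories "intelligently":

    #!/usr/bin/perl
    # Flatten a directory tree: new name = whole old path, massaged for
    # MS / ISO 9660 filesystems; at most $limit files per numbered bucket.
    use strict;
    use warnings;
    use File::Find;
    use File::Copy qw(move);
    use File::Path qw(make_path);

    my $src    = shift // 'old_tree';    # hypothetical source root
    my $target = shift // 'flat_tree';   # hypothetical target root
    my $limit  = 1000;                   # max files per numbered subdirectory

    my ($bucket, $count) = (0, 0);

    find({ no_chdir => 1, wanted => sub {
        my $path = $File::Find::name;
        return unless -f $path;                      # files only
        (my $rel = $path) =~ s{^\Q$src\E/?}{};       # path relative to the source root
        (my $flat = $rel) =~ s{/}{_}g;               # whole old path becomes the new name
        $flat =~ s/[^A-Za-z0-9._-]/_/g;              # crude massage for awkward characters

        if ($count >= $limit) { $bucket++; $count = 0 }   # start a new numbered bucket
        my $dir = sprintf '%s/%04d', $target, $bucket;
        make_path($dir) unless -d $dir;

        move($path, "$dir/$flat") or warn "cannot move $path: $!";
        $count++;
    } }, $src);

Even this simplistic version already shows where Perl beats the shell: one pass over the tree, no quoting headaches, and the renaming rules live in two readable regexes.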
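For the second task, a toy stand-in for the "roll your own templating, then outgrow it" exercise. The placeholder syntax (<% name %>) and the report fields are made up here; the point is that the whole engine is one substitution, which is exactly why it eventually becomes a mess and you reach for a real templating system such as Template Toolkit or HTML::Template:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Replace every <% name %> in the template with the matching hash value.
    sub fill_template {
        my ($template, $vars) = @_;
        (my $out = $template) =~ s/<%\s*(\w+)\s*%>/$vars->{$1} \/\/ ''/ge;
        return $out;
    }

    my $template = <<'HTML';
    <html><body>
      <h1><% title %></h1>
      <p>Generated on <% date %> from <% lines %> log lines.</p>
    </body></html>
    HTML

    print fill_template($template, {
        title => 'Nightly report',
        date  => scalar localtime,
        lines => 12345,
    });

Adding loops, conditionals and includes to this is where the fun (and the mess) starts.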