I know this is PerlMonks, but if you are adamant about not writing this thing in its entirety, why not cron plus a short shell script running wget? wget offers many switches that can make your life easier, such as checking timestamps and file sizes before downloading. And if you can do it via SSH + RSA keys, there is rsync, which is even better at downloading only new or modified files. Another possibility I can think of is CFEngine: it can download and execute files across countless systems. Hopefully they've fixed some of its file-transfer problems since I last used it.
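As a minimal sketch of the cron + wget idea (the schedule, URL, and paths below are placeholders, and the rsync line assumes SSH keys are already set up):

    # Fetch nightly at 02:15; -N (timestamping) skips files that haven't
    # changed on the server, -q keeps cron mail quiet, -P sets the target dir.
    15 2 * * * wget -q -N -P /data/incoming http://example.com/exports/datafile.txt

    # Or, with SSH + RSA keys in place, rsync transfers only new/changed files:
    15 2 * * * rsync -az -e ssh user@remotehost:/exports/ /data/incoming/

If you have several files to pull, stick the wget calls in a short shell script and point the crontab entry at that instead.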
As for slicing-n-dicing(tm) your data into your database: unless you already have a program to do this, you will probably need to write a script yourself. Use DBI, or, if you are unfamiliar with it and in a hurry, just have your script dump the required SQL to a file or pipe it to the SQL client directly.
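A rough DBI sketch along those lines (the DSN, credentials, table name, columns, and the tab-delimited input format are all assumptions; adjust them to your actual data):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;

    # Hypothetical connection details and schema -- change to suit.
    my $dbh = DBI->connect('dbi:mysql:database=mydata;host=localhost',
                           'user', 'password',
                           { RaiseError => 1, AutoCommit => 0 });

    my $sth = $dbh->prepare(
        'INSERT INTO measurements (sample_id, value, taken_on) VALUES (?, ?, ?)'
    );

    open my $fh, '<', '/data/incoming/datafile.txt' or die "Can't open file: $!";
    while (my $line = <$fh>) {
        chomp $line;
        next if $line =~ /^\s*(#|$)/;    # skip comments and blank lines
        my ($sample_id, $value, $taken_on) = split /\t/, $line;
        $sth->execute($sample_id, $value, $taken_on);
    }
    close $fh;

    $dbh->commit;    # load the whole file in one transaction
    $dbh->disconnect;

If you'd rather skip DBI for now, the same loop can simply print INSERT statements to a file that you then feed to your SQL client.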
Bottom line is, and you basically said it yourself: the systems out there are overkill for your needs because software tends to be generalized to meet the needs of many. Since your needs are narrow and specialized, it's tough to find a system that fits, and you either have to wrangle one of these "big" systems to your needs or put together your own.
Good Luck
In reply to Re: cron plus? by pzbagel in thread cron plus? by glwtta