glwtta has asked for the wisdom of the Perl Monks concerning the following question:
It seems I need a system that will do the following (I would think this is rather common): at scheduled times, check certain online resources; if there are changes, then download, parse, index and/or insert the data into a database, either alongside the old versions or superseding them.
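The "check for changes" step can be done without any framework at all. As a minimal sketch (the URL and file names are placeholders, not anything from a real setup), a shell script can fetch the resource, compare its checksum against the last one seen, and signal "changed" via its exit status so downstream stages know whether to run:

```shell
#!/bin/sh
# check_changed.sh -- hypothetical change check.
# Downloads the resource, compares its checksum to the one recorded
# on the previous run, and exits 0 (changed) or 1 (unchanged).
URL=http://example.com/data.xml   # placeholder URL
STAMP=last.md5                    # file holding the previous checksum

NEW=$(curl -s "$URL" | md5sum | cut -d' ' -f1)
OLD=$(cat "$STAMP" 2>/dev/null)   # empty on the first run

if [ "$NEW" != "$OLD" ]; then
    echo "$NEW" >"$STAMP"         # remember what we saw this time
    exit 0                        # changed: downstream stages should run
fi
exit 1                            # unchanged: nothing to do
```

If the server sends reliable `Last-Modified` headers, `curl -z localfile` (`--time-cond`) can skip the download entirely when nothing is newer, which is cheaper for large resources.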
So basically, what I need is a system that will easily let me schedule tasks composed of (usually) three arbitrary scripts: check for new data / prepare data / integrate data. It would need to have some rudimentary dependency checking and very basic reporting (and failure notification).
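For the basic case, the dependency checking described above (each stage runs only if the previous one succeeded) falls out of shell short-circuiting, assuming each stage is a standalone script that exits non-zero on failure. A sketch of a cron-driven wrapper (the script names, log path, and mail address are all hypothetical placeholders):

```shell
#!/bin/sh
# run_pipeline.sh -- hypothetical wrapper around the three stages.
# Each stage must exit 0 on success; && stops the chain at the first
# failure, and the || branch provides the basic failure notification.
LOG=/var/log/pipeline.log

{
    ./check_for_new.pl &&
    ./prepare_data.pl  &&
    ./integrate_data.pl
} >>"$LOG" 2>&1 || mail -s "pipeline failed" admin@example.com <"$LOG"
```

Scheduling is then a single crontab entry, e.g. `0 2 * * * /path/to/run_pipeline.sh` to run nightly at 02:00. This gives no web interface or multi-machine view, but it covers sequencing, logging, and failure mail with nothing running except cron.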
I simply don't have the time to write this myself, and so far all I could find in online searches are large systems that seem to be targeted at scheduling high volumes of tasks; they look too "big" and scary, usually involve separate daemons, and generally seem to be overkill for what I need.
Any suggestions for a system which is simple and, well, doesn't include the kitchen sink? In an ideal world I would want something with a single web interface that lets me schedule the same tasks on different machines (with different parameters), but that's certainly not a requirement for getting something functional.
I know this is probably a fairly common need, but I don't think I can articulate it well enough to search for it successfully without the help of knowledgeable individuals such as yourselves :)
Replies are listed 'Best First'.

Re: cron plus?
by LameNerd (Hermit) on Apr 29, 2003 at 17:54 UTC

Re: cron plus?
by pzbagel (Chaplain) on Apr 29, 2003 at 20:16 UTC

Re: cron plus?
by Fletch (Bishop) on Apr 29, 2003 at 18:31 UTC
    by glwtta (Hermit) on Apr 29, 2003 at 19:29 UTC
    by Fletch (Bishop) on Apr 29, 2003 at 21:07 UTC