muba has asked for the wisdom of the Perl Monks concerning the following question:

Intro

Alright. So I am setting up this web site where any two users could sort of interact with each other at any given time. Let's just call any such pair of users Alice and Bob. :-)

Now, one of the interactions consists of Alice filing a report about an event and then requiring Bob to either confirm or deny that event within 24 hours. If he doesn't respond, it is implied that he agrees with the report and confirms it.

If the event is confirmed, then some DB changes need to be made and a bunch of e-mails need to be sent around.

If Bob objects to the report or denies the event (same thing), only a small DB change is needed (an Objection flag needs to be set) and nothing else is required until an administrator sets out to review the case.

What I need

So, basically, what I need is this:

When Alice files the report, a task needs to be scheduled. The task should be executed either after 24 hours or whenever Bob confirms the report, whichever comes first. But the task shouldn't be executed when Bob makes an objection.

The most logical thing, it seems, is that the scheduled task itself checks whether it still needs to be executed. So, before the scheduled task does anything, it simply checks the DB. Has Bob confirmed or objected? Don't do anything. Has Bob not responded? Execute the task.
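
A rough sketch of that check (the table and column names, and the connect_to_db and apply_confirmation helpers, are made up for illustration):

    use strict;
    use warnings;
    use DBI;

    # called when the 24 hours are up (or earlier, when Bob confirms)
    sub process_report {
        my ($report_id) = @_;
        my $dbh = connect_to_db();    # hypothetical helper returning a DBI handle

        my ($status) = $dbh->selectrow_array(
            'SELECT status FROM reports WHERE id = ?',
            undef, $report_id,
        );

        # Bob already confirmed or objected: nothing left to do
        return if $status eq 'confirmed' || $status eq 'objected';

        # no response within 24 hours: implied confirmation
        apply_confirmation($dbh, $report_id);    # hypothetical: DB changes + e-mails
    }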

A problem

Unfortunately, I lack access to cron- or at-like services, so I'll need to build something myself. The easiest thing to do, it seems, would be to make a Scheduler-like module that checks for queued tasks on every hit (simply use MyScheduler; in every script of the website). Easy enough, but maybe a bit heavy resource-wise. Another problem I see would be when nobody visits the website for more than 24 hours: MyScheduler won't be activated and scheduled tasks won't run in time. :-(
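
The per-hit variant would boil down to something like this at the top of every script (MyScheduler being a module I'd still have to write):

    use MyScheduler;                 # hypothetical module
    MyScheduler::run_due_tasks();    # look at the queue, run anything overdue

    # ... the rest of the CGI script as usual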

So I could start a background process per report that does sleep 60*60*24;, then checks whether it still needs to run and performs its job if necessary, but that will soon get messy when lots of users start to file reports.
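
As a sketch, that would mean forking something like this for every report that gets filed:

    use strict;
    use warnings;
    use POSIX qw(setsid);

    sub schedule_check {
        my ($report_id) = @_;

        defined(my $pid = fork) or die "fork failed: $!";
        return if $pid;              # parent (the CGI script) carries on

        # child: detach from the web server and wait out the 24 hours
        setsid();
        close STDIN; close STDOUT; close STDERR;
        sleep 60 * 60 * 24;

        process_report($report_id);  # the self-checking task sketched above
        exit 0;
    }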

And finally: my question

What better solutions are out there?

I have been given a few answers in the CB, which I will include here for completeness' sake. Please, comment on them or provide additional answers.

[tye] Best option is probably a long-running daemon that checks for stuff to do on a reasonable schedule and also does exec($^X,$0,@ARGV) every day or few

[muba] hm. I like the daemon idea. I should basically start it just with ./myschedulerd &, right?

[ysth] nohup. or even Net::Daemon [...] no, it's not N::D I was thinking of.[...] maybe Daemon::Generic
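
For reference, a bare-bones version of what tye describes might look like this (started with nohup ./myschedulerd &; everything here is guesswork on my part):

    #!/usr/bin/perl
    use strict;
    use warnings;

    my $started = time;

    while (1) {
        run_due_tasks();    # hypothetical: the DB check described above

        sleep 5 * 60;       # poll every five minutes

        # re-exec once a day, as tye suggests, so the process stays fresh
        exec($^X, $0, @ARGV) if time - $started > 24 * 60 * 60;
    }

    sub run_due_tasks {
        # find reports older than 24 hours without a response and
        # process them, as in the sketch above
    }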

Replies are listed 'Best First'.
Re: Scheduled tasks from CGI website. But no cron or at.
by Corion (Patriarch) on Sep 24, 2009 at 06:56 UTC

    On Windows, I'm using Schedule::Cron¹ to keep a long-running cron daemon going. If you want to avoid long-running tasks (or can't launch them because you don't have shell access), maybe a cron-like loop that gets triggered every minute by an HTTP request from the outside could help.

    ¹ Actually, I'm using Schedule::Cron::Nofork, but nowadays Schedule::Cron has the nofork option itself and I should retire/deprecate Schedule::Cron::Nofork.
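
    A minimal sketch of that setup, assuming Schedule::Cron with its nofork option (the schedule and the task are placeholders):

        use strict;
        use warnings;
        use Schedule::Cron;

        # default dispatcher for entries that don't carry their own coderef
        my $cron = Schedule::Cron->new(sub { });

        # every ten minutes, look for reports that are due
        $cron->add_entry('0-59/10 * * * *', \&run_due_tasks);

        # nofork keeps everything inside this one process
        $cron->run(nofork => 1);

        sub run_due_tasks {
            # hypothetical: the same DB check as described in the question
        }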

Re: Scheduled tasks from CGI website. But no cron or at.
by hangon (Deacon) on Sep 24, 2009 at 06:58 UTC

    I'll agree with the long-running daemon option if you are allowed to run it. I once wrote a scheduler daemon under similar circumstances, but unfortunately it would die after a day or two for no apparent reason. After much hair-pulling, it turned out that the web host was periodically killing off "unauthorized" long-running processes. It seems that running your own daemon violated their terms of service.

    My solution was to write an externally triggered scheduler as a CGI program, and a separate triggering daemon running on a computer that I controlled. The daemon would send an HTTP request to the scheduler once an hour. On receiving the request, the scheduler would check a file for queued jobs, run them, and exit.
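
    A rough sketch of the triggering side, assuming LWP::Simple and a made-up URL for the scheduler CGI:

        use strict;
        use warnings;
        use LWP::Simple qw(get);

        while (1) {
            # hit the scheduler; it checks its queue of jobs, runs whatever
            # is due, and exits
            get('http://www.example.com/cgi-bin/scheduler.cgi');

            sleep 60 * 60;    # once an hour
        }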