perlrush2011 has asked for the wisdom of the Perl Monks concerning the following question:

Hi,

I need a perl daemon to look into a MySQL table and grab all the jobs which need to be executed. New jobs can be added to this table at any time, so I need the daemon to be constantly checking the table.

Once it has found the jobs, it needs to run another perl script with the $jobID found from the table. The child script has a sleep() call whose duration comes from user input stored in the MySQL table.

This is what I have currently, but the problem is that only one job can be executed at a time: each job has to wait until the previous job's sleep() has finished before the next one can run.

Code Snippet

# Enter loop to do work
for (;;) {
    my $sth = $sql_query{"sel_ext_master_jobs"}->execute()
        or die "can't execute the query";
    my $results = $sql_query{"sel_ext_master_jobs"}->fetchall_hashref('job_id')
        or die $sth->err;

    foreach my $job_id (keys %$results) {
        #print "JobID $job_id is available to be executed...\n";
        system("perl -w script.pl $job_id");
    }
}

startDaemon();

sub startDaemon {
    # Daemonize
    eval { Proc::Daemon::Init(); };
    if ($@) {
        print "Unable to start daemon: $@";
    }

    # If already running, then exit
    if (Proc::PID::File->running()) {
        dienice("Already running!");
        exit(0);
    }
}

Is there a way I can make this script run the sub-scripts simultaneously, without having to wait for the sleep() in each sub-script to finish?

I'm using Proc::Daemon

TIA

Replies are listed 'Best First'.
Re: Perl Daemon
by Eliya (Vicar) on Feb 15, 2012 at 12:18 UTC
    Is there a way I can make this script run sub_scripts simultaneously

    Yes, use Parallel::ForkManager to fork off multiple processes.

    Something like this (adapted from the docs):

    use Parallel::ForkManager;
    ...

    my $pm = new Parallel::ForkManager(10);   # max 10 processes simultaneously

    foreach my $job_id (keys %$results) {
        $pm->start and next;                  # do the fork
        system("perl -w script.pl $job_id");
        $pm->finish;                          # do the exit in the child process
    }
    $pm->wait_all_children;

    (Note: due to the usage of system() in the child processes, you'd in fact be running twice the number of processes.  If you want to avoid this (and aren't doing anything with system()'s return status anyway), you could instead use exec("perl -w script.pl $job_id");, in which case the $pm->finish would not be required (though it does no harm to leave it there, in case the exec fails for some reason).)
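    To make the system()-vs-exec() distinction concrete, here is a minimal sketch using only core fork/exec (no Parallel::ForkManager, so it runs standalone): the job IDs are made up, and the child exec's a trivial perl one-liner in place of the thread's script.pl. With system() the child would fork a second process to run the command; exec() makes the child *become* the command, so there is one process per job.

    ```perl
    use strict;
    use warnings;

    # Hypothetical stand-in for the job IDs fetched from the MySQL table
    my @job_ids = (101, 102, 103);

    my @pids;
    for my $job_id (@job_ids) {
        my $pid = fork();
        die "fork failed: $!" unless defined $pid;
        if ($pid == 0) {
            # Child: exec() replaces this process outright, so no extra
            # interpreter lingers as it would with system().  In the real
            # daemon this line would be:
            #     exec("perl", "-w", "script.pl", $job_id);
            exec($^X, "-e", "exit 0") or die "exec failed: $!";
        }
        push @pids, $pid;    # parent moves on to the next job immediately
    }

    # Reap every child so none are left as zombies
    waitpid($_, 0) for @pids;
    ```

    The parent never blocks on any one job's sleep(); it only waits at the end, which is essentially what $pm->wait_all_children does for you.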

    Update: note to whoever downvoted (which I suppose was because of the comment on using exec):  if you have doubts, do try it before you downvote :)  It does work just fine, because in the above case, finish simply calls CORE::exit.  If you have evidence to the contrary, show it!