crammed_in has asked for the wisdom of the Perl Monks concerning the following question:

Hi all,

I have a perl script that schedules user-defined jobs. It kicks off the same CGI on several servers at different times of the day (to spread the processing among them). All the CGI does is start a perl script on its server (which may then spend a few hours processing).

The problem is that my perl scheduler waits for the CGI to exit before executing the same CGI with another job. I really need an exit status or something so the scheduler can move on to the other jobs.

I tried the following:
- made the CGI run the processing script on the server with a call to system('command &') to put it in the background
- made the CGI and processing script into one big script with a fork, then exit the parent

But in either case the scheduler still waits for it to exit. I do not have the 'at' command on the server to try that out. Any suggestions would be greatly appreciated!

Re: running another script without waiting
by Roger (Parson) on Oct 08, 2003 at 06:10 UTC
    Is your perl scheduler (not your CGI script) in the form of -
    ...
    defined(my $pid = fork()) or die "Can not fork!";
    if ($pid == 0) {
        # in child
        http->get("machine1:port1/process.cgi?arg1=val1&...");
        exit(0);
    }
    # in parent, kick off another process
    defined($pid = fork()) or die "Can not fork!";
    if ($pid == 0) {
        # in child
        http->get("machine2:port2/process.cgi?arg2=val2&...");
        exit(0);
    }
    ... and so on...
    Please post a bit more information in the future including a snip of your code so the monks can have a look.

      Thank you for the reply. I am using this command in the scheduler:

      foreach $url ( @scheduled_jobs ) {
          $req = new HTTP::Request GET => $url;
          $res = $ua->request($req);
      }

      The scheduler is a daemon forked into the background. I am running everything on FreeBSD.

      Your post is interesting - making grandchildren. I could give that a try.
        I see that you are using LWP::UserAgent module. That's where your problem lies, because I believe the user agent is a single threaded process that waits for the GET to complete before processing the next request.

        So if you change your scheduler to the following, it should HTTP get all your scheduled jobs without waiting for them to complete:

        foreach $url ( @scheduled_jobs ) {
            my $ua  = new LWP::UserAgent;
            my $req = new HTTP::Request GET => $url;
            defined(my $pid = fork()) or die "Can not fork!";
            if ($pid == 0) {
                # in the child
                my $res = $ua->request($req);
                exit(0);
            }
            # in the parent, carry on with next job...
        }
        I haven't tested the code, but I believe if you do something along those lines, it should work.
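The fork-per-job dispatch above can be sketched and sanity-checked without the network by stubbing the hours-long HTTP GET with a short sleep; in the real scheduler the child would run the LWP request instead. The `$SIG{CHLD} = 'IGNORE'` line is an addition here, so the exited children are reaped automatically rather than lingering as zombies:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Ask the kernel to reap exited children automatically, so the
# fork-per-job loop does not accumulate zombie processes.
$SIG{CHLD} = 'IGNORE';

my @scheduled_jobs = ('job1', 'job2', 'job3');   # stand-ins for the CGI URLs
my $start = time;

foreach my $job (@scheduled_jobs) {
    defined(my $pid = fork()) or die "Can not fork: $!";
    if ($pid == 0) {
        # Child: stand-in for the blocking $ua->request($req) call.
        sleep 2;
        exit(0);
    }
    # Parent: falls through and dispatches the next job immediately.
}

# The loop finishes long before any child finishes its 2-second "job".
print 'dispatched ', scalar(@scheduled_jobs), ' jobs in ', time - $start, " second(s)\n";
```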

Re: running another script without waiting
by BrowserUk (Patriarch) on Oct 08, 2003 at 06:17 UTC

    I assume from your mention of 'at' that you are running on Win32 systems, in which case using '&' on the end of your command line does not background a task.

    Use this instead.

    system( 'start /b yourcommand 1>nul 2>&1' );

    The call to system will return immediately and the command will run in the background.

    Note the redirection of the output though. If there is any possibility of the command producing any output then you should redirect it somewhere safe.

    I've redirected to the nul device by way of example, but you probably want to change that to go to a log file somewhere.


    Examine what is said, not who speaks.
    "Efficiency is intelligent laziness." -David Dunham
    "Think for yourself!" - Abigail

      Thank you for the reply. The system is actually FreeBSD but I did a system-wide find for 'at' - not available on it.

        In that case, you should probably be looking for cron, and hopefully someone will leap in and explain why your 'command &' isn't doing what you think it should do.

        Please ignore the rest of my previous post...and possibly this one too, cos for all I really know, FreeBSD might actually have an AT command.



        The system is actually FreeBSD but I did a system-wide find for 'at' - not available on it.

        Hmm that's odd, at is a standard *nix command (try which at, should be in /usr/bin). In any case, you wouldn't really want an at job for this anyway. An at job is useful for when you want to run a command once at some point in the future. It sounds like you want to run your driver program at regular intervals. That's what a cron job is for.
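If cron does turn out to be available, a crontab entry along these lines would run the dispatcher at a regular interval instead of keeping it alive as a hand-rolled daemon. Both paths are placeholders; `crontab -e` edits the current user's table:

```shell
# Run the job dispatcher at the top of every hour (paths are placeholders).
0 * * * * /usr/local/bin/scheduler.pl >> /var/log/scheduler.log 2>&1
```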

        man at
        man cron
        man crontab
        -- vek --
Re: running another script without waiting
by DrHyde (Prior) on Oct 08, 2003 at 08:41 UTC
    Try system('nohup command &').
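A fuller version of the same detach, done in Perl from inside the CGI rather than via the shell: fork, start a new session with POSIX::setsid so the job survives the CGI's exit, repoint the standard handles (a backgrounded child that keeps the CGI's STDOUT pipe open is one common reason the web server appears to hang), then exec the long-running script. This is only a sketch; `launch_detached` and the commented-out path are made-up names:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use POSIX qw(setsid);

# Fork and fully detach a command, returning to the caller at once.
sub launch_detached {
    my @cmd = @_;
    defined(my $pid = fork()) or die "fork failed: $!";
    return if $pid;                          # parent (the CGI) carries on

    # Child: new session, no controlling terminal, so the CGI's exit
    # (and its SIGHUP) can no longer touch us.
    setsid() or die "setsid failed: $!";

    # Drop the inherited CGI handles; keeping STDOUT open would make
    # the web server keep waiting on us. Point STDOUT at a log file
    # instead of /dev/null if you want the job's output.
    open STDIN,  '<',  '/dev/null' or die "reopen STDIN: $!";
    open STDOUT, '>',  '/dev/null' or die "reopen STDOUT: $!";
    open STDERR, '>&', \*STDOUT    or die "reopen STDERR: $!";

    exec @cmd or die "exec failed: $!";
}

# launch_detached('/usr/local/bin/process.pl');   # placeholder path
```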
Re: running another script without waiting
by vek (Prior) on Oct 08, 2003 at 14:31 UTC

    made the CGI run the processing script on the server with a call to system('command &') to put it in the background

    The & will indeed place the processing script in the background. Unfortunately as soon as the CGI process ends, so will your processing script as it will receive a SIGHUP. To prevent this, you can use the nohup command to ensure your processing script is immune to hangups (man nohup).
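Put together, the launch line inside the CGI might look something like this (the script and log paths are placeholders). The output redirection matters too: a backgrounded child that inherits the CGI's STDOUT keeps the web server waiting on that pipe, which matches the hang described above:

```perl
# nohup: survive the SIGHUP sent when the CGI exits.
# Redirections: don't leave the CGI's output pipe held open.
# '&': let system() return without waiting for the job.
system('nohup /path/to/process.pl > /tmp/process.log 2>&1 &');
```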

    -- vek --