in reply to running another script without waiting

Is your Perl scheduler (not your CGI script) in the form of:
    ...
    # note: test fork() for undef -- a plain "fork() or die" would kill
    # the child, since fork() returns 0 there
    defined(my $pid = fork()) or die "Cannot fork: $!";
    if ($pid == 0) {
        # in child
        $http->get("machine1:port1/process.cgi?arg1=val1&...");
        exit(0);
    }
    # in parent, kick off another process
    defined($pid = fork()) or die "Cannot fork: $!";
    if ($pid == 0) {
        # in child
        $http->get("machine2:port2/process.cgi?arg2=val2&...");
        exit(0);
    }
    ... and so on...
Please post a bit more information in the future including a snip of your code so the monks can have a look.

Replies are listed 'Best First'.
Re: Re: running another script without waiting
by crammed_in (Initiate) on Oct 08, 2003 at 06:33 UTC
    Thank you for the reply. I am using this command in the scheduler:

    foreach my $url ( @scheduled_jobs ) {
        my $req = HTTP::Request->new( GET => $url );
        my $res = $ua->request($req);
    }

    The scheduler is a daemon forked into the background. I am running everything on FreeBSD.

    Your suggestion of making grandchildren is interesting; I could give that a try.
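    The grandchild idea can be sketched like this (my own illustrative code, not from the thread; `spawn_detached` and the `sleep` job are made-up stand-ins for the real HTTP GET):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Double-fork ("grandchild") sketch: the scheduler forks a child, the
# child immediately forks a grandchild to do the slow work and exits,
# the scheduler reaps the short-lived child right away, and init adopts
# the orphaned grandchild -- so no zombies and no waiting.
sub spawn_detached {
    my ($job) = @_;                       # $job: code ref doing the real work
    defined(my $pid = fork()) or die "Cannot fork: $!";
    if ($pid == 0) {                      # child
        defined(my $gpid = fork()) or die "Cannot fork: $!";
        if ($gpid == 0) {                 # grandchild: do the slow work
            $job->();
            exit 0;
        }
        exit 0;                           # child exits at once
    }
    waitpid($pid, 0);                     # parent reaps the child immediately
}

spawn_detached(sub { sleep 1 });          # the HTTP GET would go here
print "scheduler continues immediately\n";
```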
      I see that you are using the LWP::UserAgent module. That is where your problem lies: I believe $ua->request() is a blocking (synchronous) call, so your loop waits for each GET to complete before issuing the next request.

      So if you change your scheduler to the following, it should issue the HTTP GET for each scheduled job without waiting for it to complete:

      foreach my $url ( @scheduled_jobs ) {
          my $ua  = LWP::UserAgent->new;
          my $req = HTTP::Request->new( GET => $url );
          defined(my $pid = fork()) or die "Cannot fork: $!";   # guard against fork failure
          if ($pid == 0) {
              # in the child: the blocking request happens here
              my $res = $ua->request($req);
              exit(0);
          }
          # in the parent, carry on with the next job...
      }
      I haven't tested the code, but I believe if you do something along these lines, it should work.
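      One caveat about the forking loop above (my own untested addition): the parent never calls wait(), so each finished child lingers as a zombie until the scheduler exits. On FreeBSD and other POSIX systems you can ask the kernel to auto-reap children instead; the job list here is a placeholder:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Auto-reap children so the fork-per-URL loop leaves no zombies.
$SIG{CHLD} = 'IGNORE';

my @scheduled_jobs = ('http://machine1:port1/process.cgi?arg1=val1');  # placeholder

foreach my $url (@scheduled_jobs) {
    defined(my $pid = fork()) or die "Cannot fork: $!";
    if ($pid == 0) {
        # child: the blocking LWP::UserAgent request would go here
        exit 0;
    }
    # parent: dispatch the next job without waiting
}
print "all jobs dispatched\n";
```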