TheOtherGuy has asked for the wisdom of the Perl Monks concerning the following question:

I would like to execute a process in the background from a CGI script and have the CGI script return immediately. Using fork() and exec(), both the parent and the child process write data back to the web page, and the page doesn't finish loading as long as the child process is running (the child keeps running even if you back out of the page). Executing the script with nohup and in the background does not resolve the problem.

Has anyone seen this before? Suggestions on how to resolve?
  • Comment on Running a process in the background from CGI scripts...

Replies are listed 'Best First'.
Re: Running a process in the background from CGI scripts...
by cephas (Pilgrim) on Dec 09, 2000 at 02:30 UTC
    First, you'll need to unbuffer STDOUT, and then you'll need to close STDOUT in the child (if you don't unbuffer, when you close STDOUT in the child, the buffer will be flushed and you may get some duplicate output.) Also your parent should just exit, and then the child will be left to do its thing in the background.

    Something like this should work

    local $| = 1;
    if ($pid = fork()) {
        # Do some parent stuff
        exit;
    } else {
        # Child
        close(STDOUT);
        # Do child stuff
    }

    Hope that helps.

    cephas
Re: Running a process in the background from CGI scripts...
by Fastolfe (Vicar) on Dec 09, 2000 at 01:51 UTC
    Usually, if the parent forks and then exits, leaving the child to exec some external program, the browser should never see the output of the child process, because the parent's exit signals the end of the request as far as the web server is concerned.

    The exception to this is Windows, where fork is emulated. In this case, the parent's call to exit will not cause the parent to exit until the child exits first, allowing the child's output to appear in the browser. At least that's my suspicion.

    As another poster mentioned, the "fix" to this is just to keep the child from printing anything. Close STDOUT (and maybe STDERR) before exec'ing, for instance.
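    A minimal sketch of that approach, wrapped in a helper sub for clarity (the sub name `detach_and_run` and the command passed to it are just illustrative, not anything from the original posts):

    ```perl
    use strict;
    use warnings;

    # detach_and_run: fork, close the child's stdio so the web server
    # stops waiting on the request, then exec the background command.
    # The parent gets the child's pid back and can exit immediately.
    sub detach_and_run {
        my @cmd = @_;
        local $| = 1;                     # flush anything already buffered
        defined(my $pid = fork()) or die "fork failed: $!";
        if ($pid == 0) {                  # child
            close STDOUT;                 # server no longer sees our output
            close STDERR;
            exec @cmd or exit 1;          # exec never returns on success
        }
        return $pid;                      # parent: return (or exit) right away
    }
    ```

    In a CGI script the parent would print its response and exit right after this call; here the child is left to run whatever long job you hand it.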

Re: Running a process in the background from CGI scripts...
by elwarren (Priest) on Dec 09, 2000 at 04:09 UTC
    You might be better off rewriting your background process into a simple daemon. Then you can submit the task to it and then collect it when it's done.

    If the CGI gets a lot of activity, this model will also help reduce the load from forking. If you really felt like investing the time, you could then extend that so that your compute-intensive task could run on a separate machine.

    And then you could, oh, wait, any more than that and you'd be better off with an application server :-)
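    One simple way to do the hand-off elwarren describes is a spool directory: the CGI drops a job file in, and the daemon polls for `.job` files. This is a sketch of the CGI side only, under that assumption (the spool path and the `submit_job` helper are hypothetical, not part of any existing daemon):

    ```perl
    use strict;
    use warnings;
    use File::Temp qw(tempfile);

    # submit_job: write the payload to a temp file in the spool
    # directory, then rename it to *.job so the daemon never sees
    # a half-written file (rename is atomic on the same filesystem).
    sub submit_job {
        my ($spool_dir, $payload) = @_;
        my ($fh, $tmp) = tempfile(DIR => $spool_dir, SUFFIX => '.tmp');
        print $fh $payload;
        close $fh or die "close: $!";
        (my $final = $tmp) =~ s/\.tmp\z/.job/;
        rename $tmp, $final or die "rename: $!";
        return $final;                 # daemon picks this file up later
    }
    ```

    The daemon end would just loop over `*.job` files, process each, and unlink it; the CGI returns to the browser as soon as the rename completes.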
Re: Running a process in the background from CGI scripts...
by chipmunk (Parson) on Dec 09, 2000 at 01:08 UTC
    How to resolve this depends on where you want the background process to write to, I guess. In any case, the child should close STDOUT so that the webserver is not waiting for output from it.
Apache::SubProcess()
by mattr (Curate) on Dec 10, 2000 at 13:16 UTC
    You might like to check out Apache::SubProcess which does not seem too well documented, but I am looking at using it as a way to do the same from within a mod_perl environment.

    The module overrides system() and creates a process from within perl which I believe is completely detached from the calling program. In mod_perl, if you fork() you can destroy the gains you made by using mod_perl in the first place, and so encapsulation in a perl module is recommended. If you are having trouble getting your process detached (memory-wise, and stdin/stdout/stderr-wise) why not try it out?

    If anyone else has experience with Apache::SubProcess() I'd sure like to hear about it too! The mod_perl guide includes a note from its author saying that he needs to study it as well!