ndnalibi has asked for the wisdom of the Perl Monks concerning the following question:

Hello Purveyors of Wisdom, I have a web-based application that runs a CGI script on execution. The CGI script takes a long time to run (and should). Does anyone have any suggestions as to how to either: a. run the CGI/Perl as a "forked" process, or b. dynamically update the web page (CGI-generated) so that users don't think the process has stalled? Bonus: the script should continue to run if the user closes the page. I've read a little about the "fork" command, but am unsure whether it is appropriate in a CGI script.

Replies are listed 'Best First'.
Re: Long running CGI script
by Corion (Patriarch) on Sep 15, 2008 at 20:44 UTC
Re: Long running CGI script
by GrandFather (Saint) on Sep 15, 2008 at 22:52 UTC
Re: Long running CGI script
by gregor-e (Beadle) on Sep 15, 2008 at 22:59 UTC
    Or maybe have a look at CGI::Ajax. It gives a fairly straightforward mechanism for polling for updated info.
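    The polling idea above might look something like this minimal sketch: CGI::Ajax exports a Perl sub to the browser as a JavaScript function, and a timer calls it every few seconds to refresh a status div. The status-file path (`/tmp/job42.status`) is a hypothetical stand-in for wherever your background job writes its progress.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use CGI;
use CGI::Ajax;

my $cgi = CGI->new;

# Server-side sub the browser polls; returns the current status text.
sub check_status {
    # Hypothetical progress file written by the long-running job.
    open my $fh, '<', '/tmp/job42.status' or return 'starting...';
    local $/;
    return <$fh>;
}

# Map the JavaScript name 'check_status' onto the Perl sub.
my $pjx = CGI::Ajax->new( 'check_status' => \&check_status );

sub page {
    # Poll every 5 seconds; the result replaces the 'status' div's content.
    return <<'HTML';
<html><head><title>Job status</title></head>
<body onload="setInterval(function(){ check_status([], ['status']); }, 5000)">
<div id="status">waiting...</div>
</body></html>
HTML
}

# build_html injects the generated JavaScript glue into the page.
print $pjx->build_html( $cgi, \&page );
```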
Re: Long running CGI script
by pileofrogs (Priest) on Sep 15, 2008 at 21:33 UTC

    Does the user need to see the results or not? I.e., do you want to show them the results or just say "Thanks, we're processing your input. Go away"? If your users don't need to see the results, you could just make one script that handles the user interaction and another that handles the heavy lifting, and have the user interface script kick off the heavy lifting in the background.

    --Pileofrogs
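    A minimal sketch of that split, assuming the heavy-lifting script is a separate program (here `sleep 60` stands in for your real PDF-generation command): the CGI forks, the child calls setsid and closes the inherited handles so the web server doesn't wait on it (and the job survives the browser closing), and the parent returns the "thanks, we're working on it" page immediately.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use POSIX qw(setsid);

# Fork, detach from the web server, and exec the worker command.
# Returns the child pid to the parent (the CGI keeps running).
sub spawn_detached {
    my (@cmd) = @_;
    defined( my $pid = fork() ) or die "fork failed: $!";
    return $pid if $pid;    # parent: go back to printing the page

    # Child: new session so the server's process cleanup can't touch us,
    # and close the CGI's handles so the server knows the response is done.
    setsid() or die "setsid failed: $!";
    open STDIN,  '<', '/dev/null';
    open STDOUT, '>', '/dev/null';
    open STDERR, '>', '/dev/null';
    exec @cmd or die "exec failed: $!";
}

# Placeholder worker; substitute your real heavy-lifting script here.
spawn_detached( 'sleep', '60' );

print "Content-type: text/html\n\n";
print "<p>Thanks, we're processing your input. Check back later.</p>\n";
```

    The parent must not wait() on the child, and closing STDOUT in the child matters: otherwise Apache holds the connection open until the background job finishes, which defeats the purpose.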

      The user will get a simplified view of the end result that looks pretty, and they can check a random sampling of data to make sure everything looks OK. I am doing this with CSS using lower-resolution images. The other end spits out a high-res PDF with many thousands of pages. I can see this taking up to an hour. So your suggestion is basically what I'm looking for. I will take a look at the links and let you know my progress - I need to iron out a couple of image resize problems first. Thanks for the fast input!