amonroy has asked for the wisdom of the Perl Monks concerning the following question:

I have a CGI that does some heavy stuff that might take a looong time, so I am afraid the browser will time out. I am planning to run the heavy stuff in a child process forked by the parent CGI. The parent will just print an HTML page that will refresh every n seconds and it will check if the child has finished.
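The plan above can be sketched roughly as follows (the job id, status-file path, and the `check_status.cgi` poller are illustrative names, not part of the question):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use CGI;

my $q           = CGI->new;
my $job         = time() . ".$$";          # crude unique job id (assumption)
my $status_file = "/tmp/job-$job.status";  # illustrative path

defined(my $pid = fork()) or die "fork failed: $!";
if ($pid == 0) {
    # Child: detach from the server's filehandles, do the work, then
    # record completion where the status-checking CGI can see it.
    close STDIN; close STDOUT; close STDERR;
    do_heavy_work();                       # stand-in for the real job
    open my $fh, '>', $status_file or die "can't write status: $!";
    print {$fh} "done\n";
    close $fh;
    exit 0;
}

# Parent: return immediately with a page that refreshes every 5 seconds
# and points at a (hypothetical) status-checking CGI.
print $q->header;
print <<"HTML";
<html><head>
<meta http-equiv="refresh" content="5; URL=check_status.cgi?job=$job">
</head><body>
<p>Your job has started; this page will refresh until it finishes.</p>
</body></html>
HTML

sub do_heavy_work { sleep 1 }              # placeholder
```

One practical caveat: the child should fully detach (close inherited filehandles as above, or double-fork) or Apache may hold the connection open waiting for it.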

Question: is this the best way to approach this type of situation? In the future I might want to port all my CGIs to mod_perl 2.0.

I am using Apache 2.0 and Perl 5.8.0.

Thank you for your time.

-Andrés


Replies are listed 'Best First'.
Re: Is forking the best solution for a CGI running a heavy/slow process?
by perrin (Chancellor) on Apr 15, 2004 at 23:20 UTC
    Yes, that is a good solution. The only other common solution is to implement a queue of some kind, where your CGI just adds things to the queue and then directs users to a page where they can check their progress in the queue. A queue has the advantage that it can limit the number of simultaneous requests being processed, but requires more work to implement.
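    A minimal file-based version of such a queue might look like this (the spool directory and payload format are assumptions; a separate worker daemon would drain the directory):

```perl
use strict;
use warnings;
use File::Temp qw(tempfile);

# Enqueue a job by dropping a uniquely named file into a spool directory.
# The CGI calls enqueue() and shows the user a "check back later" page
# keyed on the returned file name; a worker process picks the files up
# one at a time, which naturally limits concurrency.
sub enqueue {
    my ($queue_dir, $payload) = @_;
    my ($fh, $path) = tempfile(DIR => $queue_dir, SUFFIX => '.job');
    print {$fh} $payload;
    close $fh or die "can't write job file: $!";
    return $path;    # doubles as the job id
}
```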
Re: Is forking the best solution for a CGI running a heavy/slow process?
by saintmike (Vicar) on Apr 15, 2004 at 23:24 UTC
    If what you're doing on the server side can't run any faster (e.g. if you're retrieving data from a database you have no control over), then that's probably your best bet. merlyn has written a column on the topic.
Re: Is forking the best solution for a CGI running a heavy/slow process?
by etcshadow (Priest) on Apr 16, 2004 at 00:02 UTC
    In general, I'd say that yes, that's a good way of doing things. Another option (maybe not as good, though it may depend on the situation) is to keep the connection to the browser open by streaming HTML as you go. You have to use non-parsed headers (CGI->nph(1)) and set $|=1.

    Anyway, all things considered, the way you're describing is usually better, but streaming content is usually easier (since you don't have to separate your processing into something that can be forked off and that reports its status to another script... and then write that other script to display the status and refresh, etc... it's just more work).
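    A minimal sketch of that streaming variant (the work loop is a placeholder):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use CGI;

my $q = CGI->new;
$q->nph(1);   # non-parsed headers: we emit the full HTTP response ourselves
$| = 1;       # unbuffer STDOUT so each print reaches the browser at once

print $q->header, $q->start_html('Working...');
for my $step (1 .. 5) {
    heavy_step($step);                     # stand-in for a unit of real work
    print $q->p("Finished step $step of 5");
}
print $q->end_html;

sub heavy_step { sleep 1 }                 # placeholder
```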

    ------------ :Wq Not an editor command: Wq
Re: Is forking the best solution for a CGI running a heavy/slow process?
by tilly (Archbishop) on Apr 16, 2004 at 00:05 UTC
    OT advice.

    Apache 2.0 and mod_perl is not a widely used combination, so you may wish to consider Apache 1.3 instead. If you do use mod_perl with Apache 2.0, then I'd advise you to configure Apache to use pre-forking. If you do that, then you may also want to avoid Windows.

    The reason for this advice is that Perl's threading model is very, very heavy-weight, so your performance with the threading model is likely to be poor. Pre-forking is likely to scale much better (with mod_perl). My advice to avoid Windows is tied to my pre-fork advice: Windows strongly assumes that applications will be multi-threaded, and doesn't play well with applications that try to fork().

Re: Is forking the best solution for a CGI running a heavy/slow process?
by Fletch (Bishop) on Apr 16, 2004 at 02:11 UTC

    Another possibility is to fork off a process that does the long-running job and serves its own status page via HTTP::Daemon or POE::Component::Server::HTTP, then redirect the browser to that process. That way status updates don't tie up an Apache process.
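    Sketched with HTTP::Daemon (the port and status text are illustrative; a real worker would update $status between chunks of actual work rather than just serving a loop):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use HTTP::Daemon;
use HTTP::Response;
use HTTP::Status qw(RC_OK RC_FORBIDDEN);

# The worker process listens on its own port; the CGI that forked it
# redirects the browser here (e.g. with a Location header to $d->url).
my $d = HTTP::Daemon->new(LocalPort => 8081) or die "can't listen: $!";
my $status = 'starting';   # updated by the worker as the job progresses

while (my $c = $d->accept) {
    while (my $r = $c->get_request) {
        if ($r->method eq 'GET') {
            my $resp = HTTP::Response->new(RC_OK);
            $resp->header('Content-Type' => 'text/html');
            $resp->content("<html><body>Job status: $status</body></html>");
            $c->send_response($resp);
        }
        else {
            $c->send_error(RC_FORBIDDEN);
        }
    }
    $c->close;
    undef $c;
}
```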

Re: Is forking the best solution for a CGI running a heavy/slow process?
by jayrom (Pilgrim) on Apr 16, 2004 at 14:16 UTC
    You might want to read this short Perl.com article about forking with mod_perl.