MonsieurBon has asked for the wisdom of the Perl Monks concerning the following question:

Hello

I have been searching for a solution to this problem for quite a while now: I have a website where people can upload a file (usually tens to hundreds of MB). After the upload I need to do some heavy processing on the file, like encryption, conversion between file types, etc. The user does not need to wait for the processing to finish; they could keep browsing the site and come back later to see the result. The problem is that lighttpd buffers all output until all processes terminate. So printing a redirect before processing, forking a child process (even with $SIG{CHLD} = 'IGNORE';), and using Proc::Simple to start a function in the background did not work! Does anyone have a solution to this problem?

Best regards
MonsieurBon


Re: Background process with perl and lighttpd
by moritz (Cardinal) on Jun 02, 2010 at 14:13 UTC
    You probably need to fork, and close STDIN, STDOUT and STDERR in the child. Some webservers won't consider a request finished until these file handles are closed.
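    A minimal sketch of that approach (do_heavy_processing() is just a stand-in, not from the original post):

    use strict;
    use warnings;

    defined( my $pid = fork() ) or die "Cannot fork: $!";

    if ($pid) {
        # Parent: the HTTP response is done; hand the request back
        # to the webserver and exit.
        exit 0;
    }

    # Child: close the handles inherited from the webserver so that
    # lighttpd considers the request finished.
    close STDIN;
    close STDOUT;
    close STDERR;

    do_heavy_processing();

    # Stand-in for the real work (encryption, conversion, ...).
    sub do_heavy_processing { sleep 30 }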
Re: Background process with perl and lighttpd
by jau (Hermit) on Jun 02, 2010 at 16:14 UTC
    The problem is that lighttpd buffers all output until all processes terminate.

    Have you tried to flush the output stream?

    This should do the trick:

    sub flush {
        my $fh  = select(STDOUT);    # remember the selected handle, select STDOUT
        my $hot = $|;                # remember STDOUT's autoflush setting
        $| = 1;                      # turn autoflush on ...
        print STDOUT '';             # ... so this empty print flushes STDOUT
        $| = $hot;                   # restore the autoflush setting
        select($fh);                 # restore the previously selected handle
        return;
    }
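    For what it's worth, IO::Handle offers the same thing as a method, which avoids juggling $| by hand:

    use IO::Handle;
    STDOUT->flush;    # flush STDOUT without toggling autoflush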

        Flushing the output does not work...

        The only thing that works is closing STDIN, STDOUT and STDERR. But then the script stops!

        This is what I have:

        use CGI;
        use CGI::Session;
        use File::Slurp qw(append_file);

        my $q        = CGI->new;
        my $session  = CGI::Session->load($q);
        my $tmpfile  = $q->upload('upfile');
        my $filename = $q->param('upfile');

        print $q->redirect('showProgress.cgi');

        close STDOUT;
        close STDIN;
        close STDERR;    # <-- script stops here!

        append_file( 'debug', "start\n" );
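        One way to restructure this (a sketch, untested against the poster's setup; the /tmp path and process_upload() are hypothetical stand-ins): print the redirect, then fork, and close the inherited handles only in the child, so the main script can return to lighttpd while the work continues in the background. The upload is copied to a stable location first, because CGI.pm removes its temporary files when the request object is destroyed:

        use strict;
        use warnings;
        use CGI;
        use File::Copy qw(copy);

        my $q        = CGI->new;
        my $tmpfile  = $q->upload('upfile');
        my $filename = $q->param('upfile');

        # Copy the upload out of CGI.pm's temp area before the parent
        # exits and the request object is destroyed.
        my $workfile = "/tmp/work-$$.dat";    # hypothetical path
        copy( $tmpfile, $workfile ) or die "Copy failed: $!";

        print $q->redirect('showProgress.cgi');

        defined( my $pid = fork() ) or die "Cannot fork: $!";

        if ($pid) {
            exit 0;    # parent: response is done, return to lighttpd
        }

        # Child: detach from the webserver, then do the heavy lifting.
        close STDIN;
        close STDOUT;
        close STDERR;

        process_upload( $workfile, $filename );

        # Hypothetical worker routine.
        sub process_upload {
            my ( $file, $name ) = @_;
            # encryption, file conversion, etc. would happen here
        }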