A possible strategy for this could be (pseudo-code!):
    $| = 1; # keep info flowing to the browser as soon as it's printed

    # Do some piece of long running stuff; this will
    # prevent Apache from timing out the connection
    # but might need to be called often.
    $r->reset_timeout();

    # At the same time check that the user still cares.
    $r->connection->aborted() and stop_it_all();

    # While in a browser nearby, some clever JavaScript is
    # showing "Processing" "dot" "dot" "dot" and can
    # be dynamically fed updates by something like
    $r->print('<div class="status_' . $count++ . '">+</div>');
    # which goes straight to the JS because of $|.

    # Probably something client side is the only way to
    # give decent status/"progress" messaging; the
    # mod_perl side can send approximations of how much
    # is left to go.
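Fleshed out, that loop could look like the following minimal sketch, assuming mod_perl 1.x; My::LongJob and process_chunk are hypothetical stand-ins for your own code, not a drop-in implementation:

    package My::LongJob;
    use strict;
    use warnings;
    use Apache::Constants qw(OK);

    sub handler {
        my $r = shift;
        local $| = 1;                # flush output as soon as it is printed

        $r->content_type('text/html');
        $r->send_http_header;

        my $count = 0;
        for my $step (1 .. 10) {     # stand-in for the real long-running work
            process_chunk($step);    # hypothetical: one slice of the job

            $r->reset_timeout;       # keep Apache from dropping the connection

            # bail out if the user closed the window or hit stop
            return OK if $r->connection->aborted;

            # feed the waiting JavaScript another progress marker
            $r->print('<div class="status_' . $count++ . '">+</div>');
        }

        $r->print('<div class="done">Finished.</div>');
        return OK;
    }

    sub process_chunk { sleep 1 }    # placeholder so the sketch runs

    1;

The key design point is doing the work in slices: every slice gives you a spot to reset the timeout, check for an aborted connection, and push a status fragment to the browser.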
This is a sticky problem. It might be better served with an email receiving queue to take the files, or by breaking the job into single-file uploads instead of letting a user upload, say, 10 large files at once.
It seems likely that, because of all the up-front work CGI.pm does, you might need to (and probably should) write the upload handling in native mod_perl so you have complete control. I don't usually use CGI.pm with mod_perl, so that's just a guess. It's a fantastic module, but mod_perl gives you all of its facilities (well, not as easily) and much finer control at every level of the request, not just reading an already accepted request and sending the response.
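For the upload side, here is a minimal sketch of the kind of control native mod_perl gives you, again assuming mod_perl 1.x; save_chunk and log_progress are hypothetical, and a real handler would still have to parse the multipart/form-data boundaries that CGI.pm normally handles for you:

    package My::Upload;
    use strict;
    use warnings;
    use Apache::Constants qw(OK);

    sub handler {
        my $r = shift;
        my $total     = $r->header_in('Content-Length') || 0;
        my $remaining = $total;

        # Read the raw request body in chunks instead of letting
        # CGI.pm slurp and parse the whole thing up front.
        while ($remaining > 0) {
            my $want = $remaining < 8192 ? $remaining : 8192;
            my $buf  = '';
            $r->read($buf, $want);
            last unless length $buf;   # client went away or body ended early
            $remaining -= length $buf;

            save_chunk($buf);                          # hypothetical: spool to disk
            log_progress($total, $total - $remaining); # hypothetical: record bytes so far
            $r->reset_timeout;                         # slow clients need the grace
        }

        $r->content_type('text/plain');
        $r->send_http_header;
        $r->print('Received ', $total - $remaining, " of $total bytes\n");
        return OK;
    }

    # placeholders so the sketch compiles
    sub save_chunk   { }
    sub log_progress { }

    1;

Seeing the bytes as they arrive is what makes honest progress reporting possible at all; by the time CGI.pm hands you the parameters, the upload is already over.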
Good luck. (update: added close to div tag, #2: speling problamz)