I don't know how you're going to do that if you're using straight CGI. Essentially, the Web server is going to call your script only after all of the data has been uploaded to the server. What you need is something like mod_perl, which will let you write code that handles all of the HTTP request phases.
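For instance, here's a very rough, untested sketch against the mod_perl 1 API of a handler that reads the POSTed body itself, chunk by chunk, so your code sees the upload while it is still arriving. The package name is made up, and what you do with the running byte count (stash it in a file, a dbm, whatever a status page can read) is up to you:

    package My::UploadWatcher;     # made-up package name
    use strict;
    use Apache::Constants qw(OK);

    sub handler {
        my $r = shift;

        # Read the POSTed body ourselves, in chunks, instead of letting
        # CGI.pm slurp the whole thing after the fact.
        my $len      = $r->header_in('Content-Length') || 0;
        my $received = 0;
        while ($received < $len) {
            my $wanted = $len - $received;
            $wanted = 8192 if $wanted > 8192;
            $r->read(my $buf, $wanted);
            last unless length $buf;
            $received += length $buf;
            # ... stash $received somewhere a progress display can see it ...
        }

        $r->content_type('text/plain');
        $r->send_http_header;
        $r->print("Received $received of $len bytes\n");
        return OK;
    }
    1;

Under plain CGI you never get control until the server has already read the whole body, which is exactly your problem.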
Cheers,
Ovid
Join the Perlmonks Setiathome Group or just click on the link and check out our stats.
I had a similar question a few days ago about how to handle a long server calculation where the server/client communication was timing out. I was referred (thanks sevensven) to this article by merlyn. It's basically a technique that forks two server processes: one writes a "please wait, refresh to see if I'm done" page that auto-refreshes, and immediately sends the browser its URL; the business-end process does the calculation (or upload, or whatever), eventually replaces the original file with the final results, and does cleanup afterwards. Very nifty, and maybe what you want, but too high-maintenance, IMHO. I also thought about forking a process that just sends a "." every few seconds, sort of like a progress monitor, but I'm not sure whether that would work, or how to kill it after the final result is sent.
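If it helps, here's a rough sketch of that idea. This is not merlyn's actual code; the paths, the URL mapping, and do_long_calculation() are placeholders I made up, and the status directory is assumed to be served by the web server:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use CGI qw(redirect);

    # Hypothetical status file in a directory the web server can serve.
    my $session = time() . ".$$";
    my $file    = "/var/www/html/status/$session.html";
    my $url     = "/status/$session.html";

    # Write the initial "please wait" page, which refreshes itself.
    open my $fh, '>', $file or die "open $file: $!";
    print {$fh} "<html><head><meta http-equiv='refresh' content='5'></head>\n",
                "<body>Please wait, refresh to see if I'm done...</body></html>\n";
    close $fh;

    defined(my $pid = fork) or die "fork: $!";
    if ($pid) {
        # Parent: point the browser at the status page and get out of the way.
        print redirect($url);
        exit 0;
    }

    # Child: detach from the web server so it doesn't wait on us, do the
    # slow work, then overwrite the status page with the real results.
    close STDIN; close STDOUT; close STDERR;
    my $result = do_long_calculation();
    open $fh, '>', $file or die "open $file: $!";
    print {$fh} "<html><body>All done: $result</body></html>\n";
    close $fh;
    exit 0;

    sub do_long_calculation {
        sleep 30;          # stand-in for the real calculation or upload handling
        return "42";
    }

The nice part is that after the first hit the browser is only ever fetching a static file, so nothing stays tied up waiting on the long-running job.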
drinkd