sstevens has asked for the wisdom of the Perl Monks concerning the following question:

Oh wise monks, I am hoping you can help me. From various posts on this site, I have figured out how to set up a fork. I am now trying to upload a large file from a web browser and display some kind of status to the uploader. I've implemented this using CGI::Ajax, but I need a way to do this without using javascript. Here is the code I have:
#!/usr/local/bin/perl

$|++;

use CGI;

my $req = new CGI;

if (-f "temp.txt") {
    print "Content-type: text/html\n\n";
    print "done!\n";
    unlink("temp.txt");
    exit(0);
}

my $pid = fork();

if ($pid == 0) {
    print "Content-type: text/html\n\n";
    print "<META HTTP-EQUIV='Refresh' CONTENT='2'>\n";
    print "working...\n";
    exit(0);
}
else {
    $file = $req->param("file");    # yeah, I'll change this later
    open(OUTFILE, ">temp.jpg");
    while (my $bytesread = read($file, my $buffer, 1024)) {
        print OUTFILE $buffer;
    }
    close(OUTFILE);
    open(TEMP, ">temp.txt");
    print TEMP "fdas\n";
    close(TEMP);
    waitpid($pid, 0);
}
The problem is that nothing is displayed until the file upload has completed; after that, the forking all works (it does the meta refresh and everything). There must be a way for the uploading to take place in the background, but I can't figure it out. Any help would really be appreciated!

Replies are listed 'Best First'.
Re: forking large file uploads
by pc88mxer (Vicar) on Jan 23, 2008 at 08:45 UTC
    I'm pretty sure the reason you are seeing this behavior is that CGI processes the uploaded file (i.e., reads the HTTP stream) at the time you create the CGI object (my $req = new CGI;).

    This suggests that putting my $req = new CGI; only in the child process might fix your problem. However, I would be very surprised if that worked. You need the client's browser to run two requests at the same time: one that uploads the file and another that queries the status of the upload. The querying request will be the 'visible' one, whereas the uploading one is hidden (if only for aesthetic purposes). So I think it's the client's browser that has to fork() (in a sense). When the file upload is initiated, the browser has to create another page (via javascript) that queries the status of the upload periodically. The file upload request isn't going to display the page returned to it until the upload is complete.
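    To make the two-request idea concrete, here is a rough sketch of what the status-querying request might look like as its own CGI script, opened in a second window or frame. It assumes the upload handler periodically writes its running byte count to a progress file and touches a done-flag file when it finishes (upload_progress.txt and upload_done.txt are made-up names, in the spirit of the temp.txt trick in the original code):

        #!/usr/local/bin/perl
        # status.pl (hypothetical) - the "visible" request that the browser
        # polls while the hidden upload request is still running.
        use strict;
        use warnings;
        use CGI;

        my $q = CGI->new;
        print $q->header('text/html');

        if (-f "upload_done.txt") {                          # flag file
            print "done!\n";
        }
        elsif (-f "upload_progress.txt") {
            open my $fh, '<', 'upload_progress.txt' or die "open: $!";
            chomp(my $bytes = <$fh>);
            close $fh;
            print "<META HTTP-EQUIV='Refresh' CONTENT='2'>\n";   # re-poll in 2s
            print "working... $bytes bytes received so far\n";
        }
        else {
            print "<META HTTP-EQUIV='Refresh' CONTENT='2'>\n";
            print "waiting for upload to start...\n";
        }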

      Unfortunately, moving my $req = new CGI; to the child process doesn't fix the problem. I like your suggestion about creating another "status" page via javascript, but I'm not allowed to use any type of javascript because our users' computers have javascript disabled. I'll keep hacking away and post a solution, if I find one. In the meantime, any other suggestions are welcome!
Re: forking large file uploads
by hipowls (Curate) on Jan 23, 2008 at 02:04 UTC

    STDOUT is buffered; try turning buffering off with $|++; before forking. When set to a non-zero value, $| (or $OUTPUT_AUTOFLUSH if you use English) forces a flush after every write. It affects the current default filehandle. See http://perldoc.perl.org/perlvar.html for more details.
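    For illustration, a small sketch of how $| interacts with the default filehandle; upload.log is just an example name:

        use strict;
        use warnings;

        $| = 1;    # autoflush the currently selected filehandle (STDOUT by default)

        open my $log, '>', 'upload.log' or die "Can't open upload.log: $!";

        # To autoflush a different handle, select() it first, set $|, then restore:
        my $old_fh = select($log);
        $| = 1;                     # now affects $log, not STDOUT
        select($old_fh);

        print {$log} "this line is flushed immediately\n";
        print "and so is this one on STDOUT\n";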

    Update: Sorry, I just noticed that you did have $|++ at the top of your code. I guess I just skimmed the first few lines thinking it was the usual #!perl strict warnings stuff. Teach me to pay better attention ;)

Re: forking large file uploads
by kyle (Abbot) on Jan 23, 2008 at 15:52 UTC

    I could be wrong, but I think you're stuck. From the browser's perspective, it's sending some large file as part of a request to the server. For it to display anything, it has to be finished sending that request. If you're trying to read what it's sending and give it something to show, I don't see how that's going to work. With JavaScript, you can have a display going at the same time, but there isn't any "at the same time" in a single HTTP request.
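    One server-side workaround worth mentioning: CGI.pm supports an upload hook, a callback that it invokes as each chunk of the upload arrives. That still can't make the uploading request display anything early, but it can record progress somewhere for a separate status request (in another window) to read. A rough sketch, with upload_progress.txt as a made-up filename:

        #!/usr/local/bin/perl
        use strict;
        use warnings;
        use CGI;

        # Called repeatedly by CGI.pm while the request body is being read.
        sub track_progress {
            my ($filename, $buffer, $bytes_read, $data) = @_;
            open my $fh, '>', 'upload_progress.txt' or return;
            print {$fh} "$bytes_read\n";    # bytes read so far
            close $fh;
        }

        my $q = CGI->new(\&track_progress);

        # We only get here once the whole upload has been read.
        print $q->header('text/html');
        print "done!\n";
        unlink 'upload_progress.txt';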

    As an aside, I usually see a fork done this way:

    my $pid = fork();
    die "Can't fork: $!\n" if !defined $pid;

    if ($pid) {
        # parent
    }
    else {
        # child
        exit;
    }
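    The defined test matters for the code in the original post, by the way: fork returns undef on failure, and undef compares numerically equal to zero, so a failed fork would send the parent down the $pid == 0 (child) branch.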