I've solved a similar problem lately, namely importing a big CSV file of records into my system. I'll outline my solution path; maybe it helps you.
The problem:
The user should be able to upload a file that gets sanity-checked and imported into a database. As such files can be very big and the import can take long, the user shall be shown a progress bar until the import has finished.
The idea:
- User triggers upload
- Server takes the file and saves it into an "incoming" directory.
- Server generates a unique token and renames the CSV file accordingly
- Server creates a "progress counter file" for that token and initializes it with "0"
- Server makes a non-blocking call to the "background importer" and tells it which file (and therefore token) to import
- Server sends back the HTML with the progress bar and a piece of JavaScript to update it.
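The token and progress-counter bookkeeping from the steps above can be sketched like this. This is a minimal sketch, not the production code: the directory layout, the time-plus-PID token scheme, and the function names are all my assumptions (a real system might prefer Data::UUID for tokens).

```perl
use strict;
use warnings;
use File::Temp qw(tempdir);

# Generate a unique token for one upload (time + pid is enough for a
# sketch; production code might use Data::UUID instead).
sub new_token { return time() . "-$$" }

# Create the per-token progress counter file, initialized to "0".
sub init_progress {
    my ($dir, $token) = @_;
    open my $fh, '>', "$dir/$token" or die "cannot create progress file: $!";
    print {$fh} "0";
    close $fh;
    return;
}

# Read the counter back (this is what the progress component will do).
sub read_progress {
    my ($dir, $token) = @_;
    open my $fh, '<', "$dir/$token" or die "cannot read progress file: $!";
    local $/;                       # slurp the whole (tiny) file
    my $value = <$fh>;
    close $fh;
    return $value;
}

my $dir   = tempdir(CLEANUP => 1);  # stands in for the "incoming" area
my $token = new_token();
init_progress($dir, $token);
print read_progress($dir, $token), "\n";   # prints 0
```

The same token is used to name the uploaded CSV file, so every later step (importer, progress component, JavaScript poller) only needs to know the token.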
The biggest part of the problem is making the call to the importer non-blocking, so that the caller can finish and send its response. That's the reason why forking is not an option here. My first attempt to solve this was with IPC::Open2, but that didn't work out.
I found the easiest way to do this is to use LWP::UserAgent with a timeout of 1. In my case I actually used an RPC call to a web service, but the principle remains the same: avoid blocking by making an HTTP request and not waiting for the response.
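A minimal sketch of that fire-and-forget trick. The importer URL is made up for illustration; in the real setup it would point at the background importer script:

```perl
use strict;
use warnings;
use LWP::UserAgent;
use Time::HiRes qw(time);

# "Fire and forget": the request is sent, but we stop waiting for the
# response after 1 second, so the caller can go on and finish its own
# response to the browser.
my $ua = LWP::UserAgent->new(timeout => 1);
my $t0 = time();
$ua->get('http://localhost/cgi-bin/importer.cgi?token=12345');
printf "returned after %.1f seconds\n", time() - $t0;
```

One caveat worth noting (my observation, not from the original post): the importer on the other end must keep running even though the client drops the connection after a second, so it should not die on a broken pipe when it tries to print a response.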
So what we now have is a background job on the server that does the importing. While running, this job also updates the progress counter file.
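The importer's progress updates might look like this. Writing to a temp file and renaming it is my addition, not necessarily what the original importer does; it just keeps the reader from ever seeing a half-written value, since rename is atomic on the same filesystem:

```perl
use strict;
use warnings;
use File::Temp qw(tempdir);

# Rewrite the counter file with the current percentage, atomically.
sub update_progress {
    my ($file, $percent) = @_;
    open my $fh, '>', "$file.tmp" or die "cannot write $file.tmp: $!";
    print {$fh} $percent;
    close $fh;
    rename "$file.tmp", $file or die "cannot rename progress file: $!";
}

my $dir  = tempdir(CLEANUP => 1);
my $file = "$dir/token-1";
my @rows = (1 .. 1000);             # stand-in for the parsed CSV rows
for my $i (0 .. $#rows) {
    # ... insert $rows[$i] into the database here ...
    update_progress($file, int(100 * ($i + 1) / @rows))
        if ($i + 1) % 100 == 0;     # don't rewrite the file on every row
}
open my $fh, '<', $file or die $!;
print scalar <$fh>, "\n";           # prints 100
```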
The last part of the setup is a little server component that is passed a token and returns the progress for that token (i.e. it reads the progress counter file and returns its content as plain text).
This component will be called by the Javascript in order to update the progress bar.
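A minimal sketch of such a progress component. The directory layout and the token validation are my assumptions; in the real setup the token would come from the query string of the JavaScript's request:

```perl
use strict;
use warnings;
use File::Temp qw(tempdir);

# Given a token, read its counter file and return the content, which the
# caller sends back as plain text for the polling JavaScript.
sub progress_body {
    my ($dir, $token) = @_;
    ($token // '') =~ /^[\w-]+$/ or die "bad token";   # no path tricks
    open my $fh, '<', "$dir/$token" or die "unknown token $token: $!";
    my $percent = <$fh>;
    close $fh;
    return $percent;
}

# Demo: fake a counter file at 42% and serve it.
my $dir = tempdir(CLEANUP => 1);
open my $fh, '>', "$dir/token-1" or die $!;
print {$fh} "42";
close $fh;

print "Content-Type: text/plain\r\n\r\n";
print progress_body($dir, 'token-1'), "\n";   # body is 42
```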
The solution:
This is the import controller:
This is the HTML for the progress bar:
This is the updater JavaScript:
And finally the server-side progress component:
The snippets are taken directly from the production server and contain German. Let me know if that is a problem.
Thanks. I could use some code from your example.
#!/usr/bin/perl --
use strict;
use warnings;
use CGI qw(:standard);
$| = 1;                        # unbuffer STDOUT so the header goes out at once

if (my $pid = fork) {          # parent: send the response and exit
    print header(), q~
<HTML>
<HEAD>
<META HTTP-EQUIV="Refresh" CONTENT="3; URL=/cgi_data/page1.html">
<TITLE>New Site notification</TITLE>
</HEAD>
<BODY>My homepage has moved to a new location. I am here</BODY></HTML>
~;
} elsif (defined $pid) {       # child: do the long-running work
    close STDOUT;              # tell Apache there is no more output
    sleep 30;                  # stand-in for the actual work
} else {
    die "Cannot fork: $!";
}
exit 0;
From this example and from the link in the first response above, I guess all I need is to insert the
close STDOUT;
in the child process after I print the HTML page, but it does not seem to do the trick. Your example does not work either: the page still hangs for 30 seconds before it gets redirected to page1.html for some reason. I have the example at
http://129.107.52.101/cgi-bin/testpull.cgi
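One possible explanation, and this is my guess at the cause of the 30-second hang rather than a verified fix: Apache keeps the connection open as long as *any* process still holds the request's file handles, so the child must close STDIN and STDERR too, not just STDOUT, and ideally detach from the server's process group with setsid(). A sketch:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use POSIX qw(setsid);

$| = 1;
print "Content-Type: text/html\r\n\r\n";
print "<html><body>import started</body></html>\n";

my $pid = fork;
die "Cannot fork: $!" unless defined $pid;

if ($pid == 0) {
    # Child: release every handle tied to the client connection --
    # closing STDOUT alone is often not enough -- and detach from the
    # server's process group so Apache stops waiting for us.
    close STDIN;
    close STDOUT;
    close STDERR;
    setsid();
    sleep 3;            # stand-in for the long-running import
    exit 0;             # child must not fall through into parent code
}
# Parent: falls through and exits normally, finishing the request.
```

With this layout the browser should get the page (and the META refresh) immediately, while the child keeps working in the background.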