in reply to Creating a CGI Caching Proxy
This solves several problems. First, there's very little latency, since data is streamed to the browser as it arrives. Second, it handles two browsers requesting the same file at once. Third, it handles the user pressing stop mid-download.
Here's some pseudocode, which is probably clearer:
use CGI;
use Fcntl qw(:flock);
use FileHandle;

my $cgi  = CGI->new;
my $url  = $cgi->param('url');
my $file = url2file($url);
my $fh   = FileHandle->new("< $file");

if ($fh) {
    # No +x bit means it's still a partial download
    unless (-x $file) {
        # If it's not locked, the download process has died
        if (flock($fh, LOCK_EX | LOCK_NB)) {
            # Release the test lock so get() can take it, then restart
            flock($fh, LOCK_UN);
            $fh = get($file, $url) or die "Couldn't get URL!\n";
        }
    }
    stream($fh);
}
else {
    $fh = get($file, $url) or die "Couldn't get URL!\n";
    stream($fh);
}

sub get {
    # Open the filehandle for read and write,
    # lock the filehandle for write, and
    # fork off a process to start the download.
    # Child process: download the URL,
    # then set the +x bit when done.
    # Return a dup of the filehandle.
}

sub stream {
    # Keep streaming data from the filehandle until
    # the executable bit is set. Works pretty much like
    # tail -f.
}
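The two subs are left as stubs above, so here's one way they might be fleshed out. Treat this as a minimal sketch under a couple of assumptions, not a drop-in implementation: it assumes LWP::UserAgent is available for the fetch, and it deviates from the stub comments in one respect, returning a fresh read-only handle from get() rather than a dup of the write handle, so the stream always starts at byte zero.

use strict;
use warnings;
use Fcntl qw(:flock O_RDWR O_CREAT SEEK_CUR);
use FileHandle;
use LWP::UserAgent;

sub get {
    my ($file, $url) = @_;

    # Open without truncating, and truncate only once we hold the lock,
    # so a concurrent restarter can't clobber a live download.
    sysopen(my $wfh, $file, O_RDWR | O_CREAT) or return;
    flock($wfh, LOCK_EX) or return;
    truncate($wfh, 0);

    my $pid = fork();
    return unless defined $pid;

    if ($pid == 0) {
        # Child: inherits the flocked $wfh, so the lock is released
        # automatically if we die mid-download.
        close STDOUT;    # don't hold the client connection open
        my $ua  = LWP::UserAgent->new;
        my $res = $ua->get($url, ':content_cb' => sub {
            my ($chunk) = @_;
            print {$wfh} $chunk;
            $wfh->flush;
        });
        chmod 0755, $file if $res->is_success;    # +x marks "complete"
        exit 0;
    }

    # Parent: drop our copy of the write handle (the child's copy keeps
    # the flock alive) and hand back a read handle for streaming.
    close $wfh;
    return FileHandle->new("< $file");
}

sub stream {
    my ($fh) = @_;
    local $| = 1;    # push each chunk to the browser immediately
    while (1) {
        # Sample the completion flag *before* draining, so bytes written
        # just before the +x bit was set can't slip past the final read.
        my $done = -x $fh;
        while (read($fh, my $buf, 8192)) {
            print $buf;
        }
        last if $done;
        sleep 1;
        seek($fh, 0, SEEK_CUR);    # clear EOF so read() sees appended data
    }
}

Because the child inherits the flocked descriptor, the lock evaporates the moment the downloader dies, which is exactly what the LOCK_EX|LOCK_NB test in the main flow relies on to detect a dead download.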