Fellow Monks,
I have a piece of code that authenticates users and lets them download files from a server. The files are in a folder hidden from the public and can only be downloaded by users with the correct permissions (username/password).
To avoid handing out a real URL, I read the file and dump it to STDOUT. Here's the code that does that:
sub dl_file {
    my $file = param('file');

    # strip leading dots and slashes so the path can't climb out of $folder
    $file =~ s/^\.+//;
    $file =~ s/^\///;

    if (-e "$folder/$file") {
        # slurp the whole file into memory
        my $data;
        open IN, "$folder/$file" or die "Cannot open file $file, $!\n";
        binmode IN;
        { local $/; $data = <IN>; }
        close IN;

        # last path component becomes the suggested filename
        my $filename = (reverse (split /\//, $file))[0];

        print header(-type       => "application/x-msdownload",
                     -attachment => $filename,
                     -cookie     => [$cookie1, $cookie2]);
        print $data;
    }
    else {
        print $top, h1('Error'), p('file not found.');
        print hr, a({-href => "$me?command=MAIN"}, 'Back to Main'), $bottom;
    }
}
However, a problem arose when the admin of the site placed a 12 MB file there: on a download, the script tries to read the whole 12 MB into memory before it starts responding, so the browser times out while waiting for the response to begin.
I had the idea of issuing an exec "cat", "$folder/$file", but I can't, because they are running the script on an NT server.
What's the best solution to this? Would a while loop that reads a fixed-size buffer solve it, or should I look into another way of doing it?
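For reference, here is a minimal, untested sketch of what I mean, using the same variables as in dl_file above; the file is streamed in fixed-size chunks instead of slurped, and the 8192-byte buffer size is an arbitrary guess on my part:

# untested sketch: stream the file in chunks instead of reading it all at once
open IN, "$folder/$file" or die "Cannot open file $file, $!\n";
binmode IN;
binmode STDOUT;
$| = 1;    # autoflush, so the browser starts receiving data right away

print header(-type       => "application/x-msdownload",
             -attachment => $filename,
             -cookie     => [$cookie1, $cookie2]);

my $buffer;
while (read(IN, $buffer, 8192)) {    # 8 KB per chunk; size is a guess
    print $buffer;
}
close IN;

This way the response starts as soon as the first chunk is read, memory use stays constant regardless of file size, and the browser shouldn't time out.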