Chady has asked for the wisdom of the Perl Monks concerning the following question:

Fellow Monks,
I have a piece of code that authenticates users and lets them download files from a server. The files live in a folder hidden from the public and may only be downloaded by users with the correct permissions (username/password).

To avoid exposing a real URL, I read the file and dump it to STDOUT. Here's the code that does that:

sub dl_file {
    my $file = param('file');
    $file =~ s/^\.+//;
    $file =~ s/^\///;
    if (-e "$folder/$file") {
        my $data;
        open IN, "$folder/$file" or die "Cannot open file $file, $!\n";
        binmode IN;
        { local $/; $data = <IN>; }    # slurp the whole file into memory
        close IN;
        my $filename = (reverse (split /\//, $file))[0];
        print header(-type       => "application/x-msdownload",
                     -attachment => $filename,
                     -cookie     => [$cookie1, $cookie2]);
        print $data;
    }
    else {
        print $top, h1('Error'), p('file not found.');
        print hr, a({-href => "$me?command=MAIN"}, 'Back to Main'), $bottom;
    }
}

However, one problem arose when the admin of the site placed a 12 MB file there: on a download request, the script tries to read the whole 12 MB into memory before it even begins responding, and the wait results in a browser timeout.

I had the idea of issuing an exec "cat", "$folder/$file", but couldn't because they are running the script on an NT server.

What's the best solution to this? Would a while loop that reads a fixed-size buffer at a time solve it, or should I look into another way of doing it?


He who asks will be a fool for five minutes, but he who doesn't ask will remain a fool for life.

Chady | http://chady.net/

Replies are listed 'Best First'.
Re: Reading a big file and passing to output
by Corion (Patriarch) on Jul 26, 2002 at 09:13 UTC

    The best way indeed would be to have a loop that reads small chunks and then passes them on to the socket:

    sub dl_file {
        my $file = param('file');
        $file =~ s/^\.+//;
        $file =~ s/^\///;
        if (-e "$folder/$file") {
            open IN, "$folder/$file" or die "Cannot open file $file, $!\n";
            my $filename = (reverse (split /\//, $file))[0];
            print header(-type       => "application/x-msdownload",
                         -attachment => $filename,
                         -cookie     => [$cookie1, $cookie2]);
            my $data;
            binmode IN;
            # read and print 16 KB at a time instead of slurping
            while (read(IN, $data, 16384)) {
                print $data;
            }
            close IN;
        }
        else {
            print $top, h1('Error'), p('file not found.');
            print hr, a({-href => "$me?command=MAIN"}, 'Back to Main'), $bottom;
        }
    }
    perl -MHTTP::Daemon -MHTTP::Response -MLWP::Simple -e ' ; # The
    $d = new HTTP::Daemon and fork and getprint $d->url and exit;#spider
    ($c = $d->accept())->get_request(); $c->send_response( new #in the
    HTTP::Response(200,$_,$_,qq(Just another Perl hacker\n))); ' # web
Re: Reading a big file and passing to output
by dada (Chaplain) on Jul 26, 2002 at 10:39 UTC
    Since you're on NT and handling binary data, I suggest you put a binmode STDOUT; statement right after the print header(...) line.
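    For example (a minimal sketch splicing the suggestion into Corion's loop above; the buffer size and variable names are carried over from that reply):

        print header(-type       => "application/x-msdownload",
                     -attachment => $filename,
                     -cookie     => [$cookie1, $cookie2]);
        binmode STDOUT;    # on NT, stop Perl translating "\n" into "\r\n" in the binary stream
        while (read(IN, $data, 16384)) {
            print $data;
        }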

    cheers,
    Aldo

    __END__
    $_=q,just perl,,s, , another ,,s,$, hacker,,print;
Re: Reading a big file and passing to output
by CubicSpline (Friar) on Jul 26, 2002 at 12:53 UTC
    I had the idea of issuing an exec "cat", "$folder/$file", but couldn't because they are running the script on an NT server.

    Not that I would really advocate this type of approach, but don't let NT's silly idiomatic commands get in your way. Most UNIX commands are actually available on NT, just under stupid, stupid names.

    You could use NT's type command in place of cat and achieve the same effect.
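    A rough sketch of that (one caveat I'd add: type is a cmd.exe builtin rather than an .exe, so a single-string system, which goes through the shell, is a safer bet than the list-form exec; the path munging and the $| flush are my assumptions, untested on NT):

        # flush the already-printed CGI header before the child writes to STDOUT
        $| = 1;
        binmode STDOUT;
        # cmd.exe wants backslashes; quote the path in case of spaces
        (my $ntpath = "$folder/$file") =~ s{/}{\\}g;
        system(qq(type "$ntpath")) == 0
            or die "type failed: $?";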

    ~CubicSpline
    "No one tosses a Dwarf!"