cosmicperl has asked for the wisdom of the Perl Monks concerning the following question:

Hi,
  I'm trying to make all my scripts more secure. Some of my programs save details about reports or customers to a data folder, and those files can then be downloaded via a link in the software manager area. For that link to work, the data folder sits inside the public html folder.
  Obviously this isn't secure, as anyone who knows the folder exists could download the files. I can think of two solutions:

1) Keeping the data folder somewhere secure (in cgi-bin or outside public html) and using code to read the file and output it to the browser (the manager area is password protected).

2) Using .htaccess.

I'd prefer to use 1) as the script also runs on IIS and .htaccess wouldn't always be available.

The problem is, I'm not sure how to do it, or how to make sure both binary and ASCII files are sent correctly when I don't know in advance what format a file will be.


Once again, I much appreciate it.

Lyle

Re: Sending file data to a browser
by monarch (Priest) on Dec 05, 2005 at 23:00 UTC
    There are a number of different approaches you can take for the security portion: CGI::Session or Apache::Session, for example, or .htaccess, or even a couple of POST variables.
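    For example, a minimal sketch of the CGI::Session route, assuming a file-based session store in /tmp/sessions and a 'logged_in' flag set by your login script (both of those names are made up for this illustration):

    use strict;
    use warnings;
    use CGI;
    use CGI::Session;

    my $cgi = CGI->new;

    # Look up the session the login script created (the session id is
    # picked up automatically from the cookie or query string).
    my $session = CGI::Session->new( "driver:File", $cgi,
                                     { Directory => '/tmp/sessions' } );

    unless ( $session && $session->param('logged_in') ) {
        print $cgi->header( -status => '403 Forbidden' ),
              "<p>Please log in first.</p>";
        exit;
    }

    # ...authenticated: safe to serve the requested file from here...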

    As for serving up the file, always send in binary mode; that way it doesn't matter whether the file is text or binary. If it is text, you just hope that the client knows how to deal with different newline formats.

    Serving up a file consists of something like the following (which is for mod_perl, but you could apply the principles to CGI):

    sub outputfile( $ $ ) {
        my $r     = $_[0];    # Apache mod_perl request object
        my $view  = $_[1];    # name of the file to send
        my $error = undef;    # error message to return

        if ( ! -f ( "$view" ) ) {
            $error = "<p>Error, file \"$view\" could not be found</p>";
            return( $error );
        }
        my $fsize = ( stat( _ ) )[7];    # size from the stat cached by -f

        if ( ! open( INFILE, "<$view" ) ) {
            $error = "<p>Error opening file \"$view\":<br />$!</p>";
            return( $error );
        }
        binmode( INFILE );    # raw bytes, whether the file is text or binary

        # headers for the download
        my %export_headers = (
            "Content-Disposition" => "attachment; filename=\"$view\"",
            "Content-Length"      => $fsize,
        );
        $r->content_type( "application/octet-stream" );
        $r->no_cache( 0 );    # req for WinXP, which refuses to open non-cacheable downloads

        eval {
            # mod_perl 1
            $r->header_out( $_ => $export_headers{$_} ) for keys %export_headers;
        };
        if ( $@ ) {
            # mod_perl 2
            $r->headers_out->{$_} = $export_headers{$_} for keys %export_headers;
        }

        my $data;
        while ( read( INFILE, $data, 8192 ) ) {
            print( $data );
        }
        close( INFILE );
        return( undef );
    }
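    For plain CGI (the original poster also runs on IIS), the same idea looks roughly like the sketch below. The data directory path and the "file" parameter name are assumptions made up for this example; the important parts are the binmode calls and basename() to keep requests from escaping the data folder.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use CGI;
    use File::Basename qw( basename );

    my $cgi      = CGI->new;
    my $data_dir = '/home/site/data';                      # assumed private folder
    my $name     = basename( $cgi->param('file') || '' );  # drop any path parts

    my $path = "$data_dir/$name";
    unless ( $name && -f $path ) {
        print $cgi->header( -status => '404 Not Found' ), "File not found.";
        exit;
    }

    open( my $fh, '<', $path ) or die "open $path: $!";
    binmode $fh;       # raw bytes in...
    binmode STDOUT;    # ...raw bytes out, so binary files survive on Windows

    print $cgi->header(
        -type           => 'application/octet-stream',
        -attachment     => $name,    # becomes a Content-Disposition header
        -Content_Length => -s $fh,
    );

    my $buf;
    print $buf while read( $fh, $buf, 8192 );
    close $fh;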
      Thanks. I think I can adapt this. But why does it read 8192 bytes at a time? I've seen 1024 before, so why 8192? Does it result in the same output, or is there a difference?
        Both numbers are arbitrary: 1024 bytes is 1 KB, 8192 bytes is 8 KB.
        You can set the number of bytes per read to whatever size suits you. It's a balancing act between making system calls (expensive in time) and how much data you hold at a time (expensive in memory). Anything will work, from one byte to a million (or even more); the last read will simply only partially fill the buffer.

        Have a look at perldoc -f read for more details on the read function.
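        To get a feel for the tradeoff, the loop stays the same whatever size you pick; only the constant changes. A sketch, assuming $fh is an already-opened handle in binary mode:

        my $bufsize = 65536;    # try 1024, 8192, 65536 ... and time each
        my $buf;
        while ( read( $fh, $buf, $bufsize ) ) {
            print $buf;         # fewer, larger reads mean fewer system calls
        }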

        On another note, it is interesting to see that Microsoft Internet Explorer on Windows XP refuses to let you download a file if the server marks the response as non-cacheable (hence the no_cache(0) call above). Why this is so confuses the blazes out of me. It's not the only thing that Microsoft have $!@#ed up in the latest version of IE.

Re: Sending file data to a browser
by pileofrogs (Priest) on Dec 06, 2005 at 00:05 UTC

    Don't forget about file permissions. If this is like many CGI setups, you'd have to make the data world-readable and world-writable.

    You might want to look at something like suexec or cgiwrap. These allow your script to run as a different user from the web server user, so you can then use file permissions to protect the data from other users.
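    Once the script runs as its own user, locking the data down is a couple of lines of Perl. A sketch, assuming the data directory path (made up here) is owned by that user:

    use strict;
    use warnings;

    umask 0077;    # anything this script creates defaults to owner-only

    my $data_dir = '/home/appuser/data';    # assumed path, owned by the CGI user
    chmod 0700, $data_dir
        or warn "could not tighten permissions on $data_dir: $!";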

    If you're on a shared system, other users could otherwise read or write your data, either directly or by writing a CGI script that reads or writes it.

    Even if you own the web server and you're the only user, this is a good idea, as it partitions data, protecting it in the event that a different script, or even the web server itself, gets hacked.

    In my environment, I create a new user for each different CGI task.

    -Pileofrogs