stumbler has asked for the wisdom of the Perl Monks concerning the following question:

I am running Perl CGI scripts on an Apache (1.3.33) web server.

I have set up individual working directories for all users under a path such as /usr/local/wd/<unique user id>.

If user 'sam' logs in and executes some programs, the logfiles are stored in the path /usr/local/wd/sam/.

I have written a Perl CGI script to view the log files using HTML::Template.

CGI script

# Create the HTML page from the template
print $query->header();

$user     = ...;    # get username
$log_file = ...;    # get logfile

my $tmpl_file = "err.tmpl";

# Instantiate a new HTML::Template object
my $template = HTML::Template->new( filename => $tmpl_file );
$template->param( username => $user );
$template->param( log_file => $log_file );

# Generate the HTML page
print $template->output();

Template File (err.tmpl)

<html>
<head><title>Show Log Files</title></head>
<body>
<script>
function show_file(){
    window.open('/wd/<tmpl_var name="username">/<tmpl_var name="log_file">');
}
</script>
<form method="post" enctype="application/x-www-form-urlencoded" name="results">
<table align="center">
<tr><td><a href="javascript:show_file()">Log File</a></td></tr>
</table>
</form>
</body>
</html>

As I have set a soft link from 'wd' to '/usr/local/wd/', I am able to open the link and view the logfile, and everything seemed fine.

Then I realized that by pasting the URL (say http://host.com/wd/sam/logfile1.log) into the browser, I am able to view the logfile. It also opens up the possibility that I can paste http://host.com/wd/ and look at all the folders and files of all users, as the webserver runs as a special user.

I am not sure how to prevent someone from accessing the folders this way. (I use CGI::Session to track each user's 'logged in' status for all other pages, but the logfiles are served directly by Apache, outside the control of CGI::Session.)

Looking for some suggestions on how to handle this securely.

Replies are listed 'Best First'.
Re: Perl CGI - Viewing logfiles - Security Issues
by Melly (Chaplain) on Jan 10, 2007 at 16:39 UTC

    Unless I've misunderstood, the problem you have is that your soft-link is underneath the web-root directory.

    The logs need to be accessible to the user that your web-server runs under, but that doesn't mean that they need to live inside your web-environment.

    Here's an example script that I run on an intranet to allow me to view some log files - note that I cannot browse to these files directly from my browser.

    #!/usr/bin/perl
    use strict;
    use CGI qw(:standard);

    my @logs = qw(/var/log/httpd/error_log /var/log/httpd/access_log /var/log/mysqld.log);

    my $log = $logs[0];
    $log = $logs[param('log')] if param('log') =~ /^[12]$/;

    print header();
    print start_html(-title => $log);
    print '<pre>';
    open(LOG, $log) || warn "cannot open $log: $!\n";
    print while (<LOG>);
    print '</pre>';
    print end_html();

    Note that I cannot just paste "/var/log/httpd/error_log" into my browser and expect it to return anything...

    map{$a=1-$_/10;map{$d=$a;$e=$b=$_/20-2;map{($d,$e)=(2*$d*$e+$a,$e**2 -$d**2+$b);$c=$d**2+$e**2>4?$d=8:_}1..50;print$c}0..59;print$/}0..20
    Tom Melly, pm@tomandlu.co.uk

      You got exactly the issue I have. Your approach would work fine, except for one problem arising from my requirements.

      I have a softlink to the user directories under /usr/local/apache/htdocs as follows:

      wd -> /usr/local/wd/

      Then, I build the link using the 'soft link', 'username' & 'log file name' and present it in HTML format.

      One of the requirements is that I show the log files as 'links' in the HTML page so that the user can click and view the file(s), if they want to. Hence, I coded it as explained in the original post.

      Is there any way to still have the files as 'links' and also take care of the security issue?

        "Is there any way to still have the files as 'links'..."
        You could consider linking to a CGI script similar to Melly's above. You could pass the script a query string identifying the particular log needed.
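        As a minimal sketch of that suggestion, assuming the working-directory layout from the original post: the template links could point at something like /cgi-bin/showlog.cgi?file=logfile1.log, and the script below serves the file only to the logged-in user. The script name, the session key 'username', and the helper safe_log_path are all illustrative, not a definitive implementation.

```perl
#!/usr/bin/perl
# Hypothetical "showlog.cgi": serves one of the logged-in user's
# own log files, addressed by a bare filename in the query string.
use strict;
use warnings;
use File::Spec;

# Map a bare filename into the user's directory; reject anything
# that could escape it (path separators, "..", empty names).
sub safe_log_path {
    my ($user, $file) = @_;
    return undef unless defined $user && $user =~ /\A\w+\z/;
    return undef unless defined $file && $file =~ /\A[\w.-]+\z/;
    return undef if $file =~ /\.\./;
    return File::Spec->catfile('/usr/local/wd', $user, $file);
}

# The CGI part only runs under the web server.
if ($ENV{GATEWAY_INTERFACE}) {
    require CGI;
    require CGI::Session;
    my $q       = CGI->new;
    my $session = CGI::Session->load($q) or die CGI::Session->errstr;
    my $user    = $session->param('username');
    my $path    = defined $user
                ? safe_log_path($user, scalar $q->param('file'))
                : undef;

    if (defined $path && open my $fh, '<', $path) {
        print $q->header(-type => 'text/plain');
        print while <$fh>;
        close $fh;
    }
    else {
        print $q->header(-status => '403 Forbidden');
    }
}
```

        Because the script checks the session before opening anything, pasting the URL directly no longer bypasses the login, and the /wd soft link under the web root can be removed entirely.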
Re: Perl CGI - Viewing logfiles - Security Issues
by Anonymous Monk on Jan 10, 2007 at 16:33 UTC
    Remove the soft link. Remove the javascript. Print the logfile.
Re: Perl CGI - Viewing logfiles - Security Issues
by Sagacity (Monk) on Jan 11, 2007 at 03:12 UTC

    Just a thought...

    It appears that you need the server to step in here!

    Controlling access to the directories and files is its job. Even though we may have to help it out as much as possible, it is still up to the server to handle this.

    We do this by placing an empty index.html, default.html, or whatever your server requires as the natural "default" file to load if the directory is accessed by itself. The HTML file should have the head, title, and body tags (with or without a message) and the proper closing tags as well. This just gives visitors a blank page in their browser.

    Also, look at placing -Indexes and other allow/deny access-control directives in a separate .htaccess file in ANY directory you do not want the server to return a listing of to a browser. Your scripts read the files straight from the filesystem, so the .htaccess does not affect them.
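    Assuming the standard Apache 1.3 directives (and that AllowOverride permits Options and Limit in this directory), such a .htaccess file might look like:

```
# .htaccess in the directory served as /wd
# No auto-generated directory listings:
Options -Indexes
# Refuse all direct browser access; CGI scripts still read
# the files from the filesystem unaffected:
Order allow,deny
Deny from all
```

    With "Deny from all" in place, direct URLs like http://host.com/wd/sam/logfile1.log return 403 Forbidden, and the files can only reach the browser through your CGI scripts.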

    http://httpd.apache.org/docs/1.3/howto/htaccess.html is a good reference link to read more on this subject.

    Ahhh, but they may be nosy, know the pattern, and type it in directly, including the logfile name.

    To solve this, combine the first CGI script, which handles the login and returns a filtered list of the files you want them to see, with a .htaccess file in the directory that serves a 403 Forbidden on direct directory access.

    Then use a second CGI script, activated by the link(s), to fetch and output the logfile contents. You can still meet the requirement for links by pointing them at the new script, which reads the file the user chooses.

    My only caution on this is to make sure that the path to the file is as hard-coded as possible, since you want to prevent directory traversal. You will have to do some pattern matching on the input to look for anything like '\', '.', '../', or other cracking patterns/techniques. If you do get a match, deny any further execution of the script and write a separate log entry to a file for you to review later. Don't get caught up in filtering the input and substituting characters to be a nice guy; just deny execution and make them go back and do it right. We don't want to help the wrong group of users here.
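    A minimal sketch of that "deny and log" check; the security-log location and the allowed-filename pattern are assumptions, so adjust both to your setup.

```perl
# Hypothetical input check: return 0 for an acceptable bare
# filename, otherwise log the attempt and return 1 (= reject).
use strict;
use warnings;

sub reject_suspicious {
    my ($name) = @_;
    # Allow only a plain filename: word characters, dots,
    # hyphens; never a path separator or a ".." sequence.
    return 0 if defined $name
        && $name =~ /\A[\w.-]+\z/
        && $name !~ /\.\./;
    # Record the attempt for later review, then refuse.
    # (Log path is illustrative.)
    if (open my $fh, '>>', '/usr/local/wd/security.log') {
        print {$fh} scalar(localtime), " rejected input: ",
            (defined $name ? $name : '(undef)'), "\n";
        close $fh;
    }
    return 1;
}
```

    The caller simply aborts when this returns 1 - no cleaning, no substitution, just a refusal and a log entry.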

Re: Perl CGI - Viewing logfiles - Security Issues
by Anonymous Monk on Jan 11, 2007 at 08:05 UTC
    Create a .htaccess file under the 'wd' directory and place the following statements in it:
    AuthType Basic
    AuthName "Password Required"
    AuthUserFile /usr/local/password/passwd.file
    Require user <username>

    Use the htpasswd command to create the password file:
    htpasswd -c /usr/local/password/passwd.file <username>
    Finally, add AccessFileName to the virtual host section of the Apache configuration:
    <VirtualHost *>
        ServerName www.xyz.com
        DocumentRoot <root-Dir>
        AccessFileName .htaccess_www.xyz.com
    </VirtualHost>