blahblah has asked for the wisdom of the Perl Monks concerning the following question:

This is a pseudo-Perl question: I am implementing this in Perl. I want to put a "frontdoor" on my website so that access to the content is only possible with a valid password. The password protection and cookie functionality are pretty trivial. The problem I'm having is:

How can I allow access to content via a script, but disallow it via a direct URL? The content needs permissions that allow access by Apache, but disallow outside access at the same time.

If people access www.mysite.com/?file=foo.txt

I don't want them to be able to type

www.mysite.com/foo.txt and get my file.

ideas?

Thanks,
Alex

janitored by ybiC: Prepend node title with "(OT) "



Update:
Thanks all for the responses. It's not mentioned specifically by anyone, but I think the magic words I was looking for were Content-Disposition: inline. iburrell and davido seemed to be talking about it though... ++ thanks again everyone!

Replies are listed 'Best First'.
Re: Protecting Content
by davido (Cardinal) on Feb 18, 2004 at 17:05 UTC
    You could put your text in a part of your directory structure not available via HTTP request.

    In other words, if my web pages are in ~/public_html, and my CGI scripts are in ~/public_html/cgi-bin, I could put the content files I want to "hide" into ~/content (i.e., outside the document path available to the HTTP server).

    Then you would write a small set of CGI scripts that simply do the work of authenticating individuals and grabbing data from the files in ~/content.

    Your server should be configured so that ~/content isn't available by direct HTTP request, but your scripts can still read from that path and serve the content.
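    A minimal sketch of such a gatekeeper CGI. The /home/alex/content path, the `session` cookie name, and the is_authenticated() stub are all hypothetical -- substitute your own cookie/password check; assumes the CGI module is installed:

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $content_dir = '/home/alex/content';   # hypothetical: outside the document root

# Untaint the requested name: allow plain file names only, so a
# request like ?file=../../etc/passwd can't escape $content_dir.
sub sanitize_name {
    my ($name) = @_;
    return ($name // '') =~ m{^([\w.-]+)$} ? $1 : undef;
}

sub is_authenticated {      # stub -- your real cookie/password check goes here
    my ($q) = @_;
    return defined $q->cookie('session');
}

sub serve {
    require CGI;
    my $q    = CGI->new;
    my $file = sanitize_name(scalar $q->param('file'));

    unless ($file && is_authenticated($q)) {
        print $q->header(-status => '403 Forbidden');
        return;
    }
    open my $fh, '<', "$content_dir/$file"
        or do { print $q->header(-status => '404 Not Found'); return };
    print $q->header('text/plain');
    print while <$fh>;
}

serve() if $ENV{GATEWAY_INTERFACE};   # only run when invoked as a CGI
```

    The untainting step matters: without it, ?file=../../etc/passwd would let a caller walk right back out of the content directory.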

    Update: Another idea would be to put your content into a database that itself requires password access. The scripts would have access, but outsiders would have no way of directly getting at the DB's contents.

    ...just a thought.


    Dave

Re: Protecting Content
by hardburn (Abbot) on Feb 18, 2004 at 17:02 UTC

    You simply put foo.txt outside your server's document path and modify the Perl code to read it there instead of wherever you have it now. A Perl CGI can read any file on the system it has permission to read.

    ----
    : () { :|:& };:

    Note: All code is untested, unless otherwise stated

      That would still allow people to view the contents through the "direct URL" (using the script), e.g. http://www.mysite.com/?file=foo.txt, but maybe I don't understand the question right.

      --
      b10m

      All code is usually tested, but rarely trusted.
        I think the key line is: How can I allow access to content via a script, but disallow it via a direct URL?

        So accessing via the CGI is what he wants, just not accessing the file directly. The simple solution is to put the file(s) in a directory that is readable by the Apache user (or CGI user) but not under the document root. The CGI would then be responsible for verifying the session/auth and presenting the file if access is allowed, or an error page if it is not.


        -Waswas

        My understanding is that the poster wants to allow a CGI to print out the file (presumably because the CGI handles password protections itself), but wants to make sure that the file isn't available without going through the CGI.

        ----
        : () { :|:& };:

        Note: All code is untested, unless otherwise stated

Re: Protecting Content
by bean (Monk) on Feb 18, 2004 at 17:04 UTC
    Easy - don't put the files you want to protect in a web accessible directory. Or put them in a password protected directory - see mod_auth and its cousins.
Re: Protecting Content
by tilly (Archbishop) on Feb 18, 2004 at 17:08 UTC
    Have both the script and the program construct a unique string by combining the time, details of the request, and a hidden password. Then use Digest::MD5 to produce an encrypted signature for that string. Have the client send that signature in the request. Have the server check the signature and refuse to do anything if it doesn't match.

    If you make the timestamp have a high resolution (like down to the second), then the server should check several possible seconds for a match, to take into account the possibility of the request taking time, or the two clocks not matching perfectly.

    You might also want to use https for further privacy.
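    A sketch of this signing scheme (the shared secret and the 5-second window are illustrative; Digest::MD5 has been core since Perl 5.8). Sending the timestamp along with the request lets the server bound clock skew with a single comparison instead of probing several candidate seconds:

```perl
use strict;
use warnings;
use Digest::MD5 qw(md5_hex);

my $secret = 'shared-secret';   # hypothetical; known only to client and server

# Sign a request by hashing the timestamp, request details, and secret.
sub sign {
    my ($file, $time) = @_;
    return md5_hex("$time:$file:$secret");
}

# Verify a signature, allowing a few seconds for transit time and
# imperfectly synchronized clocks.
sub verify {
    my ($file, $time, $sig, $window) = @_;
    $window //= 5;
    return 0 if abs(time() - $time) > $window;   # stale, or from the future
    return $sig eq sign($file, $time);
}

# Client side:  my $t = time; my $sig = sign('foo.txt', $t);
# then request  ?file=foo.txt;t=$t;sig=$sig
```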

      I've used the timestamp method, along with some other features, for safely storing and distributing pay-per-view video content. Worked VERY well too!
Re: Protecting Content
by iburrell (Chaplain) on Feb 18, 2004 at 21:12 UTC
    One mechanism is to use a mod_perl authentication handler to control access to the content. If you are using cookies, the handler validates the cookies and allows access to the content file. Apache::AuthCookie provides a base class for this.

    If you are willing to feed the content dynamically through a CGI script, then the CGI script can check the cookie and push the file to the client. The content files don't need to be visible through the web server for this to work.

    You can even make the URL look like a static file by using the path info after the URL to the script.

    http://www.example.com/cgi-bin/content.cgi/foo.txt
    $cgi->path_info eq '/foo.txt'
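    A sketch of reading that trailing path (the regex is illustrative; it doubles as an untainting step so names containing slashes are rejected):

```perl
use strict;
use warnings;

# For http://www.example.com/cgi-bin/content.cgi/foo.txt the server sets
# PATH_INFO to '/foo.txt'; reduce it to a bare, traversal-safe file name.
sub name_from_path_info {
    my ($path) = @_;
    my ($name) = ($path // '') =~ m{^/([\w.-]+)$};
    return $name;
}

if ($ENV{GATEWAY_INTERFACE}) {          # only when invoked as a CGI
    require CGI;
    my $q    = CGI->new;
    my $file = name_from_path_info($q->path_info);
    # ...validate the cookie here, then open and print the file...
}
```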
Re: Protecting Content
by b10m (Vicar) on Feb 18, 2004 at 17:01 UTC
    How can I allow access to content via a script, but dissallow it via a direct URL? The content needs to have permissions to allow access by Apache, but dissallow outside access at the same time.

    To put it bluntly, you can't. There's always a way to script around these "safety measures". You could check HTTP_REFERER, but that would mean blocking people who choose not to send such information to your server, and it is easily forged.

    The best way to protect your data is keeping it offline.

    --
    b10m

    All code is usually tested, but rarely trusted.
Re: Protecting Content
by fraktalisman (Hermit) on Feb 18, 2004 at 18:55 UTC
    Or make a subdirectory and put a file into it called ".htaccess" containing the single line "Deny from all". After checking the user's authentication, your script can open any file inside that directory, send an HTTP header to the browser, and follow it with the contents of the file.
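    The .htaccess side of that might look like this (2.2-era syntax, as described above; the directory name is illustrative, and Apache 2.4 replaces these directives with `Require all denied`):

```apache
# protected/.htaccess -- refuse all direct HTTP requests for this directory
Order deny,allow
Deny from all

# Apache 2.4 equivalent:
# Require all denied
```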
Re: Protecting Content
by Anonymous Monk on Feb 18, 2004 at 20:38 UTC
    You can use .htaccess to password-protect a directory; then anything in the directory would require a password to access. The only problem is that .htaccess authentication is not very efficient, so it can leave the server susceptible to an inadvertent DoS attack if someone tries to brute-force the password, but a simple bit of code that blocks an IP address after 3-4 failed attempts would take care of that.
Re: Protecting Content
by coreolyn (Parson) on Feb 18, 2004 at 17:11 UTC

    Make the file readable only by another user, then write a sudo rule that is accessed via your script.