nponte has asked for the wisdom of the Perl Monks concerning the following question:

Hello, relative newbie here... I have a file upload script working, but I am concerned about denial of service attacks by individuals attempting to upload large files. I have successfully done my own byte counting to eliminate large files, but I would like to intercept CGI::POST_MAX errors and send a useful message informing the user that the file is too large. Hopefully this will eliminate the upload of the temp file by CGI prior to my script getting control. I have been unable to get CGI::POST_MAX working. When I enable POST_MAX and attempt to send a file larger than the max, I get "Page cannot be displayed". I have been through the postings on this topic but have been unable to solve it. Thanks for reading, Nick
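A minimal sketch of the kind of after-the-fact byte counting the question describes (the field name 'upload' and the 500K limit are assumptions, not from the original script):

    use strict;
    use warnings;
    use CGI qw(:standard);

    my $MAX_BYTES = 500_000;    # hypothetical limit

    my $fh = upload('upload');
    exit unless defined $fh;

    my ( $buf, $total ) = ( '', 0 );
    while ( my $n = read( $fh, $buf, 8192 ) ) {
        $total += $n;
        if ( $total > $MAX_BYTES ) {
            print header(), "Sorry, that file is too large.\n";
            exit;
        }
        # ...otherwise append $buf to the destination file...
    }

Note that by the time upload() returns a filehandle, CGI.pm has already spooled the entire upload to a temp file, which is exactly the exposure POST_MAX is meant to remove.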

Replies are listed 'Best First'.
Re: file upload, max_post, and cgi-error
by fglock (Vicar) on Oct 04, 2004 at 14:14 UTC

    From the docs (see Avoiding Denial of Service Attacks):

    $uploaded_file = param('upload');
    if ( !$uploaded_file && cgi_error() ) {
        print header( -status => cgi_error() );    # <== change this line
        exit 0;
    }
      Thanks for responding... After a bit more testing I have found that my script is getting control, but with an empty parameter set, as expected. I am able to retrieve the "413 Request entity too large" error from cgi_error(), but I am finding that I cannot send an HTML message to the user. I seem to fall right through the code that prints that error message, get the "Page cannot be displayed" error instead, but am able to subsequently send myself an email with the 413 error. I believe the code that prints the HTML error message is good - I am using it for other errors.

        You should probably try to reduce it to a minimal case that works, like fglock's sample from the CGI.pm docs, and then expand it back toward the code you're currently trying to use.

        Make sure you're printing the server headers yourself with header() and that you're including your own status. You might need to set it to "200 OK" to appease Apache, so that your own HTML is sent instead of the default ErrorDocument for 413, as sketched below.
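        A minimal sketch of that suggestion (the page title and wording are placeholders):

            use CGI qw(:standard);

            # Claim success at the HTTP level so Apache serves this page
            # rather than its default ErrorDocument for 413.
            print header( -status => '200 OK' ),
                  start_html( -title => 'Upload error' ),
                  h2('Sorry, that file is too large.'),
                  end_html();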

Re: file upload, max_post, and cgi-error
by nponte (Initiate) on Oct 04, 2004 at 17:07 UTC
    Here is the code that my script executes when there are no parms (max_post exceeded)... Don't know how I could make it any dumber. The emailwebadmin and dienice calls execute, but I am still getting "Page cannot be displayed" from the HTML portion. That code runs if I put it in a standalone module. Is there any type of disconnect happening here, instigated by post_max?

        #
        # No input parms
        #
        else {
            $cgierr = CGI::cgi_error();
            if ( $cgierr ) {
                print "Content-type:text/html\n\n";
                print "<HTML><HEAD><TITLE>Error</TITLE></HEAD>";
                print "<BODY>";
                print "$cgierr\n";
                print "</BODY>";
                print "</HTML>";
                #
                # Send an email to the web admin
                #
                emailwebadmin("A submitted RESUME is too large. The CGI error is: $cgierr");
            }
            else {
                emailwebadmin("No input parameters found!");
                dienice("jobappr.pl: No input parameters found! The Web administrator has been notified.");
                exit();
            }
        }

      Well, your code (the header/error part of it) ran properly for me. Here is a slightly more explicit version you might try on your own server. I'm running Apache 2.

      use warnings;
      use strict;
      use CGI qw(:standard);

      $CGI::POST_MAX = 2_000;    # play nice

      my $uploaded_file = param('upload');

      if ( not $uploaded_file and cgi_error() ) {
          print header( -status => '200 OK' ),    # this might help, or not...
                start_html( -title => 'TOO BIG' ),
                h2("Why are you disrespecting me with that file size?");
      }
      else {
          print header(),
                start_html( -title => 'Just right!' ),
                h2("I'm Okay, You're not so bad.");
      }

      print start_multipart_form(),
            filefield( -name => 'upload' ),
            submit('Upload'),
            end_form(),
            end_html();
        Thanks for the script. I ran this on my system and get "Page cannot be displayed" when I specify a file larger than post_max. Same behavior as my script.
Re: file upload, max_post, and cgi-error
by zentara (Cardinal) on Oct 05, 2004 at 13:39 UTC
    I haven't been playing around with CGI much lately, but I recall having trouble getting the post_max stuff to work right. I seem to remember that the CGI script would often actually upload the entire file, no matter the size, before reporting that the file exceeded the max size.

    So I just resort to using the CONTENT_LENGTH environment variable:

    my $maxsize = 1024 * 100;    # max 100K
    my $upload_dir = 'uploads';

    my $q = new CGI;
    print $q->header();

    if ( $ENV{CONTENT_LENGTH} > $maxsize ) {
        print "file too large - must be less than $maxsize bytes";
        exit;
    }
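    A minimal variant of this idea (a sketch, not from the post above): since CGI.pm reads and spools the request body when the CGI object is constructed, checking CONTENT_LENGTH before new CGI should keep the body from being parsed at all:

        use strict;
        use warnings;
        use CGI;

        my $maxsize = 1024 * 100;    # max 100K

        # Inspect the raw request header before CGI.pm touches STDIN.
        if ( ( $ENV{CONTENT_LENGTH} || 0 ) > $maxsize ) {
            print "Content-type: text/html\n\n";
            print "file too large - must be less than $maxsize bytes";
            exit;
        }

        my $q = new CGI;    # now it's safe to let CGI parse the request
        print $q->header();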

    I'm not really a human, but I play one on earth. flash japh
      I've been able to do that OK. But the problem is that CGI must download the upload to a temp file on your server before passing control to your script, exposing you to denial of service attacks. If I can't trap it via cgi_error, my server is at risk. The interesting thing about this is that I can still execute code to send myself an email after trying to put out the HTML message to the user. It looks like CGI is somehow disconnecting from the process. I've tried reinstantiating CGI after getting the error, but still no luck.

        Uh, okay, errr... maybe try...?

        #!...perl -T
        use strict;

        eval {
            # entire script as before, but try 'require CGI'
            # instead of 'use CGI'
        };
        deal_with_problem($@) if $@;

        sub deal_with_problem {
            # mail, print feedback page, etc.
        }