PerlMonks  

Re: hash collision DOS

by Jenda (Abbot)
on Jun 02, 2003 at 10:52 UTC ( [id://262345] )


in reply to hash collision DOS

Well, maybe CGI.pm and/or CGI::Lite could let us restrict which CGI parameters are accepted and stored, and throw away all others. Why should the CGI object remember parameters we are not interested in anyway?
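A minimal sketch of that idea, assuming the query has already been parsed into a plain hash of name => value pairs (neither the helper name nor the parameter names come from CGI.pm; they are made up for illustration):

```perl
use strict;
use warnings;

# Whitelist of parameter names we actually care about.
my %allowed = map { $_ => 1 } qw(name email comment);

sub restrict_params {
    my ($params) = @_;          # hashref of raw CGI parameters
    my %kept;
    for my $key (keys %$params) {
        $kept{$key} = $params->{$key} if $allowed{$key};
    }
    return \%kept;              # everything else is thrown away
}

my $raw  = { name => 'Jenda', junk1 => 'x', junk2 => 'y' };
my $safe = restrict_params($raw);
# $safe now holds only the 'name' entry
```

Anything not on the whitelist never makes it into the object, so the script never has to carry parameters it did not ask for.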

Jenda
Always code as if the guy who ends up maintaining your code will be a violent psychopath who knows where you live.
   -- Rick Osborne

Edit by castaway: Closed small tag in signature

Replies are listed 'Best First'.
Re^2: hash collision DOS (CGI.pm protection)
by Aristotle (Chancellor) on Jun 02, 2003 at 21:19 UTC
    1. See the fine manual: you can already ->delete() parameters, so just grep unrequested parameters out of ->param() and dump them in the bit bucket.
    2. All webservers have a relatively tight maximum size for GET requests. (I think the default is something like 4kb for Apache.) You can set $CGI::POST_MAX for POST requests.
    Use those well and it shouldn't be possible to dump enough data on a script to slow it down significantly.
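Both suggestions together look roughly like this (the parameter names and the 100 KB cap are arbitrary examples; ->param, ->delete, and $CGI::POST_MAX are real CGI.pm features):

```perl
use strict;
use warnings;
use CGI;

# Cap the size of POST bodies; must be set before CGI->new reads the request.
$CGI::POST_MAX = 100 * 1024;    # refuse POSTs over ~100 KB

# A query string is passed here only to make the example self-contained;
# in a real script CGI->new reads the incoming request instead.
my $q = CGI->new('user=jenda;junk=1;more_junk=2');

my %wanted = map { $_ => 1 } qw(user action);

# grep the unrequested parameters out and dump them in the bit bucket
for my $name ($q->param) {
    $q->delete($name) unless $wanted{$name};
}
# only 'user' survives
```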

    Makeshifts last the longest.

Calling delete would happen only after the problem has already occurred. I concur: if the length of $ENV{QUERY_STRING} bothers you, simply cut it down (the same goes for $CGI::POST_MAX).

I do feel a nice addition would be something like

    acceptOnly( thesekeys => [qw( these keys )] );
    acceptOnly( thismanykeys => 44 );

This would be trivial to add ... just a thought
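One way the proposed acceptOnly could look, sketched against a plain hash of already-parsed parameters rather than a real CGI object (acceptOnly is not part of CGI.pm, and all names here are hypothetical):

```perl
use strict;
use warnings;

# Hypothetical acceptOnly(): keep only named keys, or only the first N keys.
sub acceptOnly {
    my ($params, %opt) = @_;
    if (my $keys = $opt{thesekeys}) {
        my %ok = map { $_ => 1 } @$keys;
        delete @$params{ grep { !$ok{$_} } keys %$params };
    }
    if (defined(my $max = $opt{thismanykeys})) {
        my @names = sort keys %$params;     # deterministic order for the demo
        delete @$params{ @names[$max .. $#names] } if @names > $max;
    }
    return $params;
}

my $p = { these => 1, keys => 2, junk => 3 };
acceptOnly($p, thesekeys => [qw( these keys )]);
# $p is now { these => 1, keys => 2 }
```

As the reply below points out, filtering after parsing does not undo work already done; a real fix would apply such limits while the request is being parsed.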


      MJD says you can't just make shit up and expect the computer to know what you mean, retardo!
      I run a Win32 PPM repository for perl 5.6x+5.8x. I take requests.
      ** The Third rule of perl club is a statement of fact: pod is sexy.

      PodMaster is right. ->delete() comes too late. And even the $CGI::POST_MAX doesn't help much.

Imagine you have a file upload script. There you need to keep $CGI::POST_MAX rather high, so an attacker can still post quite a few CGI parameters. And then even building the hash that CGI.pm uses to store the data may take a lot of time, and a grep and delete afterwards would only make the issue worse.

      Jenda
      Always code as if the guy who ends up maintaining your code will be a violent psychopath who knows where you live.
         -- Rick Osborne

      Edit by castaway: Closed small tag in signature
