The CGI script reads the keys (params, in this case) via HTTP -- end of story, since the client can ship anything over HTTP. Of course the content will be "maliciously crafted" -- it's an attack, and a rather trivial one, as iburrell just demonstrated.
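For instance, nothing stops a client from doing something like this (example.com and the key list here are placeholders; a real attacker substitutes keys precomputed to collide under perl's hash function):

    use strict;
    use warnings;
    use LWP::UserAgent;

    # Stand-in keys; the actual attack uses precomputed colliders.
    my @keys = map { "key$_" } 1 .. 10_000;
    my $body = join '&', map { "$_=x" } @keys;

    my $ua = LWP::UserAgent->new;
    $ua->post(
        'http://example.com/script.cgi',
        'Content-Type' => 'application/x-www-form-urlencoded',
        Content        => $body,
    );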
> New question: Is there a way to "fix" CGI.pm?
Sure, and you've mentioned one -- limit which params are allowed. But a simpler way to deal with this attack would be to limit the number of params allowed, with a sensible built-in default (something considerably less than 10000). Another possibility is to refuse to enter any more entries into the hash once its size exceeds some threshold; in Perl 5, scalar(%hash) reports bucket usage as a "used/allocated" string, so the table size is visible. However, that doesn't work in Perl6, where scalar(%hash) just returns the number of entries. (But maybe there's some other way to get the table size?)
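Roughly what I have in mind, as a sketch (parse_params and $MAX_PARAMS are made-up names, and URL-decoding is omitted -- this isn't CGI.pm's actual parsing code):

    use strict;
    use warnings;

    my $MAX_PARAMS = 1_000;   # sensible built-in default, well under 10000

    sub parse_params {
        my ($query_string) = @_;
        my @pairs = split /[&;]/, $query_string;

        # Refuse the whole request before hashing any attacker-chosen key.
        die "Too many parameters\n" if @pairs > $MAX_PARAMS;

        my %param;
        for my $pair (@pairs) {
            my ($key, $value) = split /=/, $pair, 2;
            push @{ $param{$key} }, $value;   # CGI allows repeated keys
        }
        return \%param;
    }

The point is to bail out before a single key goes into the hash, so the attacker never gets to trigger the O(n**2) collision behavior.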
> I'm still hoping to get feedback from more "senior" members of the community, to get a handle on what they really think of the issue.
It's a real problem, as iburrell trivially demonstrated. The Linux kernel folks are addressing it, and the Perl folks would be wise to do the same.
> Also, does/will this affect Perl6?
It will, if they don't do something about it. The current Perl6 hash function is the same as the so-called one-at-a-time function of Perl 5.8.0. Scott Crosby's solution of using a universal hash function is expensive, and the Linux folks have rejected it. Their approach -- keeping a cheap hash function but seeding it with a random value -- does, however, make the hash function indeterminate, which would be a problem for many perl applications where reproducibility is important. Perl6 could have a "use determinate;" that would prevent it from implicitly randomizing the hash function (or anything else). If perl doesn't do it by default, then it falls upon CGI.pm, among others, to deal with the problem, and that gets rather ugly in terms of "the module's varied duties", as you put it.
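To make the seeding idea concrete, here's the one-at-a-time function written out in Perl with a per-process random seed (illustrative only -- perl's real hashing is done in C, in hv.h, starting from a fixed value):

    # One-at-a-time hash, 32-bit arithmetic done with explicit masks.
    my $SEED = int rand 0xFFFFFFFF;   # chosen once per process

    sub one_at_a_time {
        my ($key) = @_;
        my $hash = $SEED;             # a seed of 0 gives the 5.8.0 behavior
        for my $ch (unpack 'C*', $key) {
            $hash = ($hash + $ch) & 0xFFFFFFFF;
            $hash = ($hash + (($hash << 10) & 0xFFFFFFFF)) & 0xFFFFFFFF;
            $hash ^= $hash >> 6;
        }
        $hash = ($hash + (($hash << 3) & 0xFFFFFFFF)) & 0xFFFFFFFF;
        $hash ^= $hash >> 11;
        $hash = ($hash + (($hash << 15) & 0xFFFFFFFF)) & 0xFFFFFFFF;
        return $hash;
    }

With a random $SEED the attacker can't precompute colliding keys, but the same program hashes (and thus iterates) differently from run to run -- which is exactly the determinism problem "use determinate;" would be for.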