in reply to order of hash

If security was a reason for increasing the randomness of hashes, then an obvious question arises: what are the entropy requirements of hash operations?

At times I have found a lack of available entropy in the system to be a major bottleneck for running programs. If the randomness is supplied by /dev/random, might you not end up contending for the entropy pool when you are storing random numbers in a hash while the insertion or bucket-redistribution operations themselves require random numbers to operate? Is /dev/random actually used in hash mechanics now?
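
On Linux, at least, you can watch the kernel's entropy pool drain directly. This is only a rough sketch, and it assumes the Linux /proc interface (which Red Hat 9 has); other systems won't have this file at all:

# Watch the kernel's entropy pool drain (assumes Linux's /proc interface;
# the file won't exist elsewhere).
perl -e 'while (1) {
    open(F, "<", "/proc/sys/kernel/random/entropy_avail") or die $!;
    print "entropy_avail: ", scalar <F>;
    close(F);
    sleep 1;
}'

The number reported is bits of entropy available; once it reaches zero, reads from /dev/random block until the pool refills.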

I am not trying to be facetious. Some operating systems really will have a problem with operations that require a lot of random numbers. I saw a visible problem just displaying random numbers in a console. Try this (just tried on Red Hat 9 in an xterm):

cat /dev/random | hexdump

After a little while (maybe 1000 lines) the output stops and only restarts when you move the mouse or hit a key. Kill it with ctrl-c and subsequent runs show only one or two lines before waiting for input again. Before long it won't output even a single line until you let the machine sit idle for a minute (maybe the kernel's TCP randomization code is also draining the pool? anyway). If /dev/random is not being used, then the bucket order is just being hashed again, which is more obfuscation than real security, I'd think. I tried to mix hash operations with reading from the pipeline above, but it looks like it keeps hanging until I let the pool refill for a minute or so:

perl -e 'open(IN, "cat /dev/random | hexdump |") or die $!; while (<IN>) { $h{$_} = $_; print $c++ . " "; } close(IN);'

Considering I was still able to do hash operations after thoroughly exhausting my system's entropy, I would say the security gained is fragile: it is based on a hash function and not on an entropy source, so the order is not random at all, but calculable from the hash being reordered.

So is the "increased randomness" of hash order in newer perls mainly just meant as a reminder not to depend on the order of the hash, or is it something real which could be made to draw from an authentic entropy source if you really wanted to? Another use for /dev/cdrom ..

Re: Re: order of hash -- entropy requirements?
by hardburn (Abbot) on Nov 04, 2003 at 15:02 UTC

    Exhausting entropy like that is a serious problem for some applications. However, I doubt it's a big deal for Perl.

    The problem the increased hash randomness was trying to solve was that certain well-crafted inputs could make the hash's internal data structures degenerate, piling many keys into the same buckets and consuming far more resources than normal, which makes it a potential DoS attack (the attack can be generalized to many languages and algorithms, not just Perl's hashes). By putting a little randomization into the hash, it becomes much harder for an attacker to predict how the data structure will behave.
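
    To make the idea concrete, here is a toy sketch using a deliberately weak, hypothetical hash function (toy_hash below is made up for illustration and is NOT Perl's real hash function): because it only sums character codes, an attacker can trivially craft keys that all land in the same bucket.

        use strict;
        use warnings;

        # Hypothetical toy hash (NOT Perl's real hash function): it just sums
        # character codes, so rearrangements of one string always collide.
        my $buckets = 64;
        sub toy_hash {
            my ($key) = @_;
            my $sum = 0;
            $sum += ord($_) for split //, $key;
            return $sum % $buckets;
        }

        my @normal  = map { sprintf "key%d", $_ } 1 .. 20;     # ordinary keys
        my @crafted = map { my $r = $_ % 8;                    # rotations of one string,
                            substr("abcdefgh", $r)             # so every one collides
                          . substr("abcdefgh", 0, $r) } 1 .. 20;

        for my $set ([normal => \@normal], [crafted => \@crafted]) {
            my %used;
            $used{ toy_hash($_) }++ for @{ $set->[1] };
            printf "%-7s keys fill %d distinct bucket(s)\n", $set->[0], scalar keys %used;
        }

    With the real thing the colliding keys are harder to find, but the principle is the same: once all the keys chain up in one bucket, each insert and lookup has to walk that whole chain, so the work grows quadratically with input size. A secret random seed makes those collisions unpredictable from the outside.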

    The random number generator used need not be an extremely high-quality source; it just has to be good enough that an attacker can't predict the hash seeds. It would be pretty easy to foil a remote attacker this way, but a bit harder for a local attacker, depending on their system privileges.
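
    As a quick check of where that seed actually comes from, something like the sketch below shows it is a value fixed at startup rather than a stream from /dev/random. It assumes perl 5.8.1 or later, where Hash::Util provides hash_seed(); the return format and how much the key order actually varies differ between perl versions.

        use strict;
        use warnings;
        use Hash::Util qw(hash_seed);

        # On 5.8.x hash_seed() returns a plain integer; newer perls return a
        # packed byte string, so show it as hex in that case.
        my $seed = hash_seed();
        print "hash seed: ",
              ($seed =~ /^\d+\z/ ? $seed : unpack("H*", $seed)), "\n";

        # The same data can come back in a different key order from run to run
        # when the seed differs between runs.
        my %h = map { $_ => 1 } 'a' .. 'e';
        print join(",", keys %h), "\n";

    Setting the PERL_HASH_SEED environment variable pins the seed for a run on most perls, which is handy for reproducing order-dependent bugs and also shows that the seed is a one-time value fed to the hash function rather than an ongoing draw from an entropy device.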

    ----
    I wanted to explore how Perl's closures can be manipulated, and ended up creating an object system by accident.
    -- Schemer

    : () { :|:& };:

    Note: All code is untested, unless otherwise stated

      Thank you very much for your illumination! I had no entropy problems just now creating 100,000 small hashes, so I guess the seed is being chosen without using /dev/random at all. Thanks again. --Matt
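
      For reference, the quick test was just something along these lines (a rough sketch, not the exact code):

      perl -e 'for (1 .. 100_000) { my %h = map { $_ => $_ } 1 .. 10 } print "done\n";'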