in reply to Any limit to large hashes?

Perl's hashing algorithm computes a 32-bit number and then does a shift trick, so collisions could start happening noticeably more often after about 100,000,000 elements. Of course, if you are hashing that much stuff you are likely to run into the fact that a Perl process (even on 64-bit systems, unless Perl was built with 64-bit support) cannot address more than about 2 GB of memory (assuming you have that much available between RAM and swap).
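
If you know roughly how many keys are coming, you can at least avoid repeated rehashing along the way. Here is a minimal sketch (the counts are made up) that presizes a hash by assigning to keys():

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Presize the hash so Perl allocates enough buckets up front
    # instead of rehashing repeatedly as the hash grows.
    my %big;
    keys(%big) = 10_000_000;    # hint: expect about 10 million keys

    for my $i (1 .. 1_000_000) {
        $big{"key$i"} = $i;
    }

    # On perls before 5.26, scalar(%hash) reports used/allocated
    # buckets, which gives a crude view of how well the keys are
    # spreading; on newer perls it just reports the key count.
    print scalar(%big), "\n";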

If this is an issue, you can tie the hash to a temporary file using DB_File. That is fine for data sets up to your available disk space, your filesystem's maximum file size (on many systems that is 2 GB), or IIRC about 100 terabytes (depending on how Berkeley DB was compiled), whichever is smallest.
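
A minimal sketch of the DB_File approach (the filename is just an example):

    use strict;
    use warnings;
    use DB_File;
    use Fcntl;    # for O_RDWR and O_CREAT

    # Tie the hash to an on-disk Berkeley DB hash file. Lookups and
    # stores now go to disk instead of living in process memory.
    my %hash;
    tie %hash, 'DB_File', '/tmp/bighash.db', O_RDWR | O_CREAT, 0666, $DB_HASH
        or die "Cannot tie to /tmp/bighash.db: $!";

    $hash{somekey} = "some value";    # stored on disk, not in RAM
    print $hash{somekey}, "\n";

    untie %hash;                      # flush and close the file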

If you have only 100,000 or so keys in your hash, I wouldn't worry about it. :-)
