in reply to Re (tilly) 1: millions of records in a Hash
in thread millions of records in a Hash

If he is hitting his OS's 2GB file size limit, there is an alternative. I quote:

Tie::DB_File::SplitHash is designed to support OSes with file size limits. It transparently splits a DB_File database into as many distinct files as desired, distributing hash entries between the files with a randomization algorithm. This has the effect of allowing DB_File hashes to grow to the full size of the partition. Requires 'Digest::SHA1' and 'DB_File' to be installed.
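For reference, here is a minimal sketch of what the tie might look like. The filename, file count, and flags are just illustrative values; the argument order follows the module's documented DB_File-style interface (the usual DB_File tie arguments plus a trailing file count), as best I recall it, so check the module's own docs before relying on this.

    use strict;
    use warnings;
    use Fcntl;                      # for O_CREAT, O_RDWR
    use DB_File;                    # exports $DB_HASH
    use Tie::DB_File::SplitHash;

    my %hash;
    my $filename = '/tmp/splithash/mydb';  # illustrative base name for the fragment files
    my $multi_n  = 4;                      # illustrative: how many files to split across

    # Same arguments as a plain DB_File tie, plus the number of
    # underlying files as the final argument.
    tie %hash, 'Tie::DB_File::SplitHash',
        $filename, O_CREAT|O_RDWR, 0666, $DB_HASH, $multi_n
        or die "Couldn't tie hash: $!";

    $hash{key} = 'value';   # entries are distributed between the files transparently

    untie %hash;

From there it behaves like any other tied hash, so the original code shouldn't need changes beyond the tie line.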

 
______crazyinsomniac_____________________________
Of all the things I've lost, I miss my mind the most.
perl -e "$q=$_;map({chr unpack qq;H*;,$_}split(q;;,q*H*));print;$q/$q;"
