in reply to Another "out of memory!" problem
If instead of putting the URLs in a hash, you use their MD5 digests (computed with Digest::MD5, in binary form) as the keys, you will save a substantial amount of space. 1 million binary MD5s stored as hash keys uses about 36 MB:
    use Digest::MD5 qw[ md5 ];
    use Devel::Size qw[ total_size ];

    my %h;
    undef $h{ md5( $_ ) } for 1 .. 1e6;
    print total_size( \%h );    # prints 35665714
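To see how much that buys you over storing the raw strings, here is a rough sketch that sizes the same set of keys both ways with Devel::Size (the URL shapes below are made up purely for illustration):

    use Digest::MD5 qw[ md5 ];
    use Devel::Size qw[ total_size ];

    # Hypothetical URLs, only for comparing key sizes.
    my @urls = map { "http://www.example.com/some/fairly/long/path?id=$_" } 1 .. 100_000;

    my( %by_url, %by_md5 );
    undef $by_url{ $_ }         for @urls;   # raw strings as keys
    undef $by_md5{ md5( $_ ) }  for @urls;   # 16-byte binary digests as keys

    printf "raw URL keys: %d bytes\n", total_size( \%by_url );
    printf "MD5 keys:     %d bytes\n", total_size( \%by_md5 );

The exact numbers depend on your URL lengths and perl build, but the digest keys stay at 16 bytes each no matter how long the URLs get.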
Note also the use of undef $h{ ... }. This autovivifies the key without allocating any storage for a value, thereby saving space. Whilst some will view this as an unconscionable "trick", for this type of lookup application that is pushing your memory limits, the savings are worth having.
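If you want to see the difference for yourself, something along these lines (again assuming Devel::Size is available) compares undef'd entries against entries holding a real value; how big the gap is depends on your perl version:

    use Devel::Size qw[ total_size ];

    my( %undef_vals, %real_vals );
    undef $undef_vals{ $_ }   for 1 .. 100_000;   # keys only, values left undef
    $real_vals{ $_ } = 1      for 1 .. 100_000;   # keys plus a defined scalar each

    printf "undef values: %d bytes\n", total_size( \%undef_vals );
    printf "real values:  %d bytes\n", total_size( \%real_vals );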
At roughly 36 MB per million keys, you should be able to index close to 60 million URLs on a typical 2GB machine without problems, and far more quickly than any tie or DB mechanism that requires disk accesses.
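Putting it together, a minimal sketch of the lookup loop might look like this (the filehandle and the processing step are placeholders for whatever your script actually does):

    use strict;
    use warnings;
    use Digest::MD5 qw[ md5 ];

    my %seen;

    while ( my $url = <STDIN> ) {         # or wherever your urls come from
        chomp $url;
        my $key = md5( $url );            # 16-byte binary digest
        next if exists $seen{ $key };     # seen it before: skip
        undef $seen{ $key };              # remember it without storing a value
        # ... handle the new url here ...
    }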