    use Devel::Size 'total_size';
    my %hash = map { $_, rand() . "" } 1 .. 100_000;
    print total_size(\%hash), "\n";
    __END__
    7713194

That's less than 8 MB, which shouldn't be much of a problem on a modern machine.
But the fact is, Perl is memory-hungry. The more structures you have, the more memory you use. The more complex the structures are, the more memory you use.
Speeding it up is only possible by using less memory at a time. Using tie as you propose is not going to solve it: if there were a known way to speed up hash access, it would already be in the core! Not to mention that tying is a slow mechanism, since every hash access, no matter how trivial, results in a call to a Perl subroutine. It is possible to use the tie mechanism to store the hash on disk instead of in memory, but unless you would otherwise run out of memory, that won't change your performance for the better: regular disk access is not likely to be faster than accessing your swap area.
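For illustration, here is a minimal sketch of what tying a hash to disk looks like, using the core SDBM_File module (the filename 'giant_hash' and the key count are arbitrary examples; DB_File or BerkeleyDB would be used the same way):

```perl
use strict;
use warnings;
use Fcntl;       # O_RDWR, O_CREAT
use SDBM_File;   # ships with core Perl

# After the tie, every access to %hash goes through tied FETCH/STORE
# method calls plus disk I/O -- functional, but slower than an
# in-memory hash unless memory itself is the bottleneck.
tie my %hash, 'SDBM_File', 'giant_hash', O_RDWR | O_CREAT, 0666
    or die "Cannot tie hash: $!";

$hash{$_} = rand() . "" for 1 .. 1_000;
my $count = keys %hash;    # keys in scalar context: number of entries
print "$count\n";

untie %hash;
unlink 'giant_hash.pag', 'giant_hash.dir';   # SDBM's on-disk files
```

Note that SDBM imposes a fairly small limit on the combined size of each key/value pair, so for large records one of the other DBM backends is usually the better choice.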
In reply to Re: Efficient giant hashes
by Anonymous Monk
in thread Efficient giant hashes
by crenz