in reply to Efficient giant hashes
At roughly 1-2K per element, 100_000 elements comes to about 100M-200M of RAM. Unless each element is some huge data structure in its own right (like an array of hashes or somesuch), you're probably not doing a lot of swapping to disk or anything like that.
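If you'd rather measure than guess, the CPAN module Devel::Size can report the actual footprint. A minimal sketch, assuming Devel::Size is installed and using a made-up hash shape in place of yours:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Devel::Size qw(total_size);   # CPAN module, not in core

    # Build a throwaway hash roughly the shape of a "giant" hash.
    my %hash = map { sprintf('key%06d', $_) => 'x' x 100 } 1 .. 100_000;

    # total_size() follows references, so it counts the keys, the
    # values, and the hash's own overhead.
    printf "hash uses about %.1f MB\n", total_size(\%hash) / (1024 * 1024);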
It sounds like your algorithm(s) aren't scaling with your data structures. For example, doing

    foreach my $key (keys %hash) { ... }

is going to be considerably slower than the equivalent

    while (my ($key, $value) = each %hash) { ... }

foreach has to build the full list of keys in memory, then iterate over it. each will only bring in one key/value pair at a time.
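To make that concrete, here is a small self-contained sketch (the hash contents are invented for illustration) showing both styles side by side; the each loop never has to materialize the full 100_000-key list:

    #!/usr/bin/perl
    use strict;
    use warnings;

    my %hash = map { "item$_" => $_ * 2 } 1 .. 100_000;

    # keys() builds a temporary list of all 100_000 keys before
    # the loop body ever runs.
    my $sum_keys = 0;
    foreach my $key (keys %hash) {
        $sum_keys += $hash{$key};
    }

    # each() walks the hash's internal iterator, handing back one
    # key/value pair per call, so no big temporary list is built.
    my $sum_each = 0;
    while (my ($key, $value) = each %hash) {
        $sum_each += $value;
    }

    print "sums match\n" if $sum_keys == $sum_each;

One caveat: adding or deleting keys (other than deleting the key each just returned) while iterating with each can cause entries to be skipped or seen twice.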
My bet is on your algorithms, not your data structures. Maybe if you posted a few snippets of how you use this massive hash, we might be able to help you out.
Being right does not endow the right to be rude; politeness costs nothing.
Being unknowing is not the same as being stupid.
Expressing a contrary opinion, whether to the individual or the group, is more often a sign of deeper thought than of cantankerous belligerence.
Do not mistake your goals as the only goals; your opinion as the only opinion; your confidence as correctness. Saying you know better is not the same as explaining you know better.