How large is "large", how many is "many"?
I have the impression that it's rather a space problem; my approach would be to split the hash into a HoH to allow swapping of the second tier.
This works well if you can organize your look-ups in a way (ordering or caching) such that accesses to the second tier of hashes are bundled, so that you need a minimal amount of swapping. See also Re: write hash to disk after memory limit and Re: Small Hash a Gateway to Large Hash?
For instance, looking up $point{$x}{$y} would be fastest if you can bundle all look-ups to points ($x,*) for one fixed $x.
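A minimal sketch of the idea (the point set and values here are made up for illustration): split %point into a first tier keyed by $x and a second tier keyed by $y, then iterate per $x so only one second-tier hash needs to be "hot" (e.g. a tied/on-disk hash) at a time.

```perl
use strict;
use warnings;

# Hypothetical point set as a HoH: first tier $x, second tier $y.
my %point;
$point{1}{2} = 'a';
$point{1}{5} = 'b';
$point{9}{2} = 'c';

# Bundle all look-ups for one fixed $x, so swapping of the
# second-tier hashes stays minimal.
for my $x ( sort { $a <=> $b } keys %point ) {
    my $tier2 = $point{$x};    # could be a tied, on-disk hash
    for my $y ( sort { $a <=> $b } keys %$tier2 ) {
        print "($x,$y) => $tier2->{$y}\n";
    }
}
```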
tl;dr maybe I'm missing the point...(?)
On a side note: are you aware of multi-dim hashes in Perl?
C:\Windows\system32>perl -e "$x=1;$y=2;$h{$x,$y}=4; print $h{$x,$y}"
4
If yes, why do you need to join the keys yourself?
> String: $x . ':' . $y
see perldata#Multi-dimensional-array-emulation
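For reference, the multi-dim emulation simply joins the subscripts with $; (SUBSEP) under the hood, so both spellings address the same hash slot. A small demo (not from the thread):

```perl
use strict;
use warnings;

my %h;
my ( $x, $y ) = ( 1, 2 );

$h{ $x, $y } = 4;    # multi-dim emulation: key is "$x$;$y"

# Joining by hand hits the very same slot:
print $h{ join $;, $x, $y }, "\n";    # prints 4
```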
Cheers Rolf
(addicted to the Perl Programming Language and ☆☆☆☆ :)
Je suis Charlie!
In reply to Re: Fastest way to lookup a point in a set
by LanX
in thread Fastest way to lookup a point in a set
by eyepopslikeamosquito