Also not to be excluded entirely is ... uhh ... an array of integers that is searched sequentially each time.
That is a surprise winner for my 'Worst Advice of the Month (January, 2013)' prize.
... set up a few short test-runs ...
I'd suggest you take your own advice before doling it out to others.
NB: The following uses 1/10th the number of values and 1/10th the number of lookups of the tests above for arrays, hashes and bit vectors, because it takes so long to complete:
@a = map $_*20, 1 .. 1e4;;
say total_size \@a;;
320176

$found = 0; say time;
for my $tgt ( 1 .. 20e4 ) { $tgt == $_ and ++$found and last for @a; };
say time; say $found;;
1359149323.02034
1359149541.80419
10000
Using that as the basis for estimation, the full-size array would require 3.2MB -- 12 times that required by the bit-vector solution.
And the lookups that took the hashes 1 second, the arrays 2/5ths of a second, and the bit vectors 1/3rd of a second would take 25 days!
It is really hard to see any circumstance when this would be a viable solution.
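For contrast, here is a minimal sketch of the bit-vector lookup being compared against, using Perl's built-in vec. It assumes the same scaled-down data as the test above (multiples of 20 up to 2e5); each membership test is a single bit probe rather than a scan of the whole array:

```perl
use strict;
use warnings;
use feature 'say';

# Build the bit vector: one bit set per stored key
# (assumed same data as above: multiples of 20, 1e4 values).
my $bits = '';
vec( $bits, $_ * 20, 1 ) = 1 for 1 .. 1e4;

say length $bits;   # 25001 bytes for keys up to 2e5

# Each lookup is one O(1) bit probe, no sequential scan:
my $found = 0;
vec( $bits, $_, 1 ) and ++$found for 1 .. 20e4;
say $found;         # 10000 -- same hit count as the sequential scan
```

The whole structure costs one bit per possible key, which is where the 12x space advantage over the integer array comes from.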
In reply to Re^2: Array vs. Hash for sparsely integer-indexed data
by BrowserUk
in thread Array vs. Hash for sparsely integer-indexed data
by puterboy