Fullness can be described in terms of an overall percentage — this table is 50% full, that one’s 90% — but researchers often deal with much fuller tables. So instead, they may use a whole number, denoted by x, to specify how close the hash table is to 100% full. If x is 100, then the table is 99% full. If x is 1,000, the table is 99.9% full. This measure of fullness offers a convenient way to evaluate how long it should take to perform actions like queries or insertions.
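(For scale: if I read that description right, the x they use is just 1/(1 - fullness), so fullness = 1 - 1/x. A quick throwaway Perl sketch of that arithmetic, nothing taken from the paper itself:)

    #!/usr/bin/perl
    use strict;
    use warnings;

    # The quoted measure: a table with parameter x is 1 - 1/x full.
    sub fullness_from_x { my ($x) = @_; 1 - 1 / $x }
    sub x_from_fullness { my ($f) = @_; 1 / (1 - $f) }

    # x = 100 -> 99.00% full, x = 1_000 -> 99.90%, x = 10_000 -> 99.99%
    printf "x = %6d  ->  %.2f%% full\n", $_, 100 * fullness_from_x($_)
        for 100, 1_000, 10_000;

    # And the other direction, for the "50% full, 90% full" tables:
    printf "50%% full -> x = %g, 90%% full -> x = %g\n",
        x_from_fullness(0.50), x_from_fullness(0.90);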
Seems once again the researchers are researching exactly the wrong thing. I mean, if the table is 99% full, you should have increased its size a long, long time ago. If it's even fuller, you are an ... well, you are clearly not interested in doing anything practical, but rather seem to be working towards a degree in theoretical something-or-other.
While it's possible that there's something of some practical use in that paper (so far I've only read the article), I would not hold my breath. Even though the topic is already pathologically overstuffed hashes, there might be something in it for the sanely proportioned ones too, but...
Jenda
1984 was supposed to be a warning,
not a manual!
In reply to Re: Better Hash Tables? by Jenda
in thread Better Hash Tables? by QM