in reply to Giant Tie'd data structures
For you to be getting that error, you must be storing (and hashing) individual items that are each longer than 64k. The recommendation for the pagesize (bsize) parameter is to set it to 4x the size of your estimated biggest element, with lower/upper bounds of 512 bytes and 64k.
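That sizing rule can be sketched as a small helper (the 4x multiplier and the 512/64k clamp come from the recommendation above; the function name and the DB_File::HASHINFO usage shown in the comments are my own illustration, not from any particular module):

```perl
use strict;
use warnings;

# Sketch of the rule above: bsize = 4x the largest expected element,
# clamped to the 512-byte / 64k bounds.
sub suggested_bsize {
    my ($biggest_item) = @_;
    my $bsize = 4 * $biggest_item;
    $bsize = 512   if $bsize < 512;
    $bsize = 65536 if $bsize > 65536;
    return $bsize;
}

# The result would then be handed to DB_File when tying, e.g.:
#   my $info = DB_File::HASHINFO->new;
#   $info->{bsize} = suggested_bsize($biggest_item);
#   tie my %h, 'DB_File', 'data.db', O_RDWR|O_CREAT, 0666, $info;
```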
It's generally not a good idea to hash/index the entirety of entities that size. For most applications there is some obvious subset of each item that can be used as a key. At worst, you could MD5 each item and use the digest as its key, store the items themselves separately (in individual files, or in a fixed-record-length file), and use the hash only to look up the file/record number and load the item from there.
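A minimal sketch of that MD5-as-key approach (the function names and the "offset,length" record format are mine, chosen for illustration; in a real application %index would itself be tied to a DB_File hash on disk rather than kept in memory):

```perl
use strict;
use warnings;
use Digest::MD5 qw(md5_hex);
use File::Temp  qw(tempfile);

# Keep only digest -> "offset,length" in the hash; the large items
# themselves live in a separate flat store file.
my %index;
my ($store, $storefile) = tempfile(UNLINK => 1);

sub put_item {
    my ($item) = @_;
    my $key = md5_hex($item);       # 32-char hex digest as the hash key
    seek $store, 0, 2;              # append at end of the store file
    my $offset = tell $store;
    print {$store} $item;
    $index{$key} = join ',', $offset, length $item;
    return $key;
}

sub get_item {
    my ($key) = @_;
    return undef unless exists $index{$key};
    my ($offset, $len) = split /,/, $index{$key};
    seek $store, $offset, 0;        # seek also flushes between write and read
    read $store, my $item, $len;
    return $item;
}
```

This keeps each hashed value tiny (a short digest and a location string), so the pagesize limit never comes into play no matter how large the items themselves get.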
Anyway, you might find this page useful.
Re^2: Giant Tie'd data structures
by mast (Novice) on Oct 26, 2005 at 16:48 UTC