Reloading the DB dump you receive into a local DB, as Flech suggests, is the best option. If you can't afford to set up a database server, your next best choice is to use something like Search::Binary, implementing its read() function with seek(). Searching wouldn't be as fast, but it might be acceptable, and it's far better than a linear search, which would take a long time on a file that size even with a very large disk cache. Other than a DB or binary search, squeezing the data into memory is your only remaining option; storing it as strings with separators, and/or with some compression, is the thing to try.
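Here's a minimal sketch of the Search::Binary approach, assuming the file is sorted on the key and each record is one "PIN<TAB>data" line; the file name, sample PIN, and field layout are placeholders, and the read callback follows Search::Binary's documented interface (called with handle, value, and position, returning a comparison result and the record's position):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Search::Binary;

    my $file = 'big_sorted_file.txt';   # hypothetical: sorted "PIN<TAB>data" lines
    open my $fh, '<', $file or die "Can't open $file: $!";
    my $size = -s $file;

    my $read = sub {
        my ($handle, $val, $pos) = @_;
        if (defined $pos) {
            seek $handle, $pos, 0;
            <$handle> if $pos;          # discard the partial line we landed in
        }
        my $record_pos = tell $handle;  # start of the whole record we're about to read
        my $line = <$handle>;
        return unless defined $line;
        my ($key) = split /\t/, $line, 2;
        return ($val cmp $key, $record_pos);
    };

    my $pin = '1234567';                # hypothetical key to look up
    my $pos = binary_search(0, $size, $pin, $read, $fh);
    seek $fh, $pos, 0;
    my $hit = <$fh>;
    print $hit if defined $hit && $hit =~ /^\Q$pin\E\t/;

Each probe costs one seek and one line read, so lookups are O(log n) disk reads instead of a full scan.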
On the other hand, if you have a batch process with a list of keys (PINs), you can fgrep the full file against the list of PINs, if it isn't too big, and then pipe the matches into a hash.
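A rough sketch of that batch approach, again assuming "PIN<TAB>data" records and a file of one PIN per line (both file names are placeholders); note that fgrep -f matches the PINs anywhere on a line, so a stray substring match is possible:

    use strict;
    use warnings;

    my $pin_file = 'pins.txt';          # hypothetical: one PIN per line
    my $big_file = 'big_file.txt';      # hypothetical: PIN<TAB>data records

    # Let fgrep pull out only the lines we care about, then read its output.
    open my $matches, '-|', 'fgrep', '-f', $pin_file, $big_file
        or die "Can't run fgrep: $!";

    my %data;
    while (my $line = <$matches>) {
        chomp $line;
        my ($pin, $rest) = split /\t/, $line, 2;
        $data{$pin} = $rest;
    }
    close $matches;

    print "$_ => $data{$_}\n" for sort keys %data;

This way only the matching records ever land in memory, which keeps the hash small even when the source file is huge.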