in reply to fast lookups in files
You already mentioned using a database, which is a fine solution. It need not be heavyweight - I have used DBD::SQLite for problems like this. It would take less time to set up and have working than you have already spent thinking about the problem.
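To illustrate the idea (in Python with its built-in sqlite3 module rather than Perl's DBD::SQLite, but the SQL is the same), here is a minimal sketch of loading key/value pairs into an indexed table and doing a single-key lookup. The table name, columns, and sample data are all assumptions for the example:

```python
import sqlite3

# Hypothetical sketch: load key/value pairs into SQLite, then look a
# key up via the primary-key index instead of scanning a flat file.
conn = sqlite3.connect(":memory:")  # use a filename like "lookup.db" on disk
conn.execute("CREATE TABLE kv (k INTEGER PRIMARY KEY, v TEXT)")

records = [(1, "alpha"), (42, "beta"), (7, "gamma")]  # stand-in data
conn.executemany("INSERT INTO kv VALUES (?, ?)", records)
conn.commit()

# The PRIMARY KEY index makes this a direct lookup, not a table scan.
row = conn.execute("SELECT v FROM kv WHERE k = ?", (42,)).fetchone()
print(row[0])  # beta
```

With DBD::SQLite the structure is the same: `prepare` the SELECT once and `execute` it per key.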
FYI for the future - if the keys were strings of letters instead of numbers, breaking the file into 26 smaller files based on starting letter and then doing a binary search within each would be even better. It may work here too, but it would take a bit of analysis to know for sure. You would need to figure out a good way to distribute your keys between N files and a simple operation that determines which file your desired key is in.

Update: As an afterthought, depending on how sparse your keys are, it may make sense to transform your file into fixed-width records. You determine how many bytes the largest value needs and make all records that size. You no longer need to store keys in the file, because the record number equates to the key. Unfortunately, you need to pad missing records, which may grow your file to an unmanageable size depending on how sparse things are. Something to think about.
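The bucket-plus-binary-search idea above can be sketched as follows (Python used for illustration; the bucket count and the modulo routing function are assumptions - for letter keys you would route on the first character instead):

```python
import bisect

# Hypothetical sketch: route each key to one of N buckets with a cheap,
# deterministic operation, keep each bucket sorted, then binary search
# only the one bucket the key can be in.
N = 26
buckets = [[] for _ in range(N)]

def bucket_for(key):
    return key % N  # the "simple operation" that picks the file/bucket

for key in [3, 29, 55, 4, 30, 100]:
    bisect.insort(buckets[bucket_for(key)], key)  # keep bucket sorted

def contains(key):
    b = buckets[bucket_for(key)]
    i = bisect.bisect_left(b, key)  # O(log n) within the bucket
    return i < len(b) and b[i] == key

print(contains(29))  # True
print(contains(28))  # False
```

On disk, each bucket would be its own sorted file and the binary search would seek within that file rather than a list.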
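The fixed-width-record idea reduces every lookup to one seek, since the byte offset is just `key * record_width`. A minimal sketch (Python, with an in-memory buffer standing in for the file; the record width and space padding are assumptions):

```python
import io

# Hypothetical sketch: the record number IS the key, so no keys are
# stored. Missing keys are padded slots, which is where the file can
# balloon if the key space is sparse.
RECORD_WIDTH = 8  # bytes per record, sized to hold the largest value

def write_records(f, data, max_key):
    # Every key from 0..max_key gets a slot, present or not.
    for key in range(max_key + 1):
        value = data.get(key, "")
        f.write(value.ljust(RECORD_WIDTH).encode("ascii"))

def lookup(f, key):
    f.seek(key * RECORD_WIDTH)   # jump straight to the record
    raw = f.read(RECORD_WIDTH)
    value = raw.decode("ascii").rstrip()
    return value or None         # an all-padding slot means "missing"

buf = io.BytesIO()
write_records(buf, {0: "zero", 5: "five"}, max_key=9)
print(lookup(buf, 5))  # five
print(lookup(buf, 3))  # None
```

The trade-off is exactly as stated above: 10 keys spread over a range of a million would still cost a million records of padding.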
Cheers - L~R