In reply to: split file into smaller chunks

It could be that you are making this way too complicated for your application!

Some strategies:
1. Build a memory-resident structure with all the data you need, in one pass through the data file (fancy hash table structures, etc.).
2. Search the file again and again and let the O/S do the "dirty work". Use regex and just do "something that appears stupid".
3. Create a DB (which is expensive to set up) and then query that DB.
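A minimal sketch of option 1 in Python, assuming a plain text file and assuming you want to key on whitespace-separated words (the keying scheme is hypothetical; adapt it to whatever field you actually query):

```python
from collections import defaultdict

def build_index(path):
    """One pass through the file: map each word to the line numbers
    on which it appears. Everything stays memory resident, so every
    later lookup is a dict access instead of a file scan."""
    index = defaultdict(list)
    with open(path, encoding="utf-8", errors="replace") as f:
        for lineno, line in enumerate(f, start=1):
            for word in line.split():
                index[word].append(lineno)
    return index
```

After the single pass, `index["some_word"]` returns the matching line numbers immediately, at the cost of holding the whole index in RAM.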

I mean, how big is this file? If it is "small" (like 250 MB), after the first search it all winds up memory resident anyway, and subsequent searches (even linear ones) are 10x+ as fast.

I recommend option (2): do something stupid and let the O/S do the work. If that is not "fast enough", then start thinking about option 1 or 3.

If, say, there are only 5,000 files and the total data size is 500 MB, do something easy: that is actually considered "small"! Don't get complex until you need to!

Update: Anyway, you will be amazed at how quickly even a linear regex search runs on a huge file once you have run it once before. On Win XP this held for file sizes < 1 GB.