in reply to 15 billion row text file and row deletes - Best Practice?

You might also find that your handling of the file speeds up if you gzip it. If you're I/O-bound, there are simply fewer bytes to read on each pass through the file, and the CPU cost of decompressing is usually cheaper than the disk reads it saves.

Of course, you'll need to make one full pass through the file in order to gzip it, but hey, you'll save on every read after that.
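
The nice part is that the rest of your code barely changes: the core IO::Uncompress::Gunzip module lets you iterate a gzipped file line by line through an ordinary filehandle. A minimal sketch (the filename here is made up):

    use strict;
    use warnings;
    use IO::Uncompress::Gunzip qw($GunzipError);

    my $file = 'bigfile.txt.gz';    # hypothetical name for your compressed file

    # The object behaves like a filehandle, so <$fh> reads one decompressed line
    my $fh = IO::Uncompress::Gunzip->new($file)
        or die "Cannot open $file: $GunzipError\n";

    while (my $line = <$fh>) {
        # ... process one row here ...
    }
    close $fh;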

In fact, to do anything with this file other than append, you'll need to pass through pretty much the entire file. I don't know what you use it for (can you tell us?), but almost every operation apart from 'append new item' is going to be faster in a database with an index.
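
To make that concrete, here's a rough sketch using DBI with DBD::SQLite (from CPAN), purely as an illustration; the schema, filenames, and key column are assumptions, not anything from your setup. The point is that once the rows live behind an index, a delete touches a few pages instead of rewriting a 15-billion-row file:

    use strict;
    use warnings;
    use DBI;

    my $dbh = DBI->connect('dbi:SQLite:dbname=rows.db', '', '',
                           { RaiseError => 1, AutoCommit => 0 });

    # INTEGER PRIMARY KEY is SQLite's built-in rowid index
    $dbh->do('CREATE TABLE IF NOT EXISTS rows (id INTEGER PRIMARY KEY, line TEXT)');

    # One-time load; batch the commits so the transaction stays bounded
    open my $in, '<', 'bigfile.txt' or die "open: $!\n";
    my $ins   = $dbh->prepare('INSERT INTO rows (line) VALUES (?)');
    my $count = 0;
    while (my $line = <$in>) {
        chomp $line;
        $ins->execute($line);
        $dbh->commit unless ++$count % 100_000;
    }
    close $in;
    $dbh->commit;

    # A row delete is now an indexed lookup, not a full-file rewrite
    $dbh->do('DELETE FROM rows WHERE id = ?', undef, 42);
    $dbh->commit;
    $dbh->disconnect;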
