in reply to Re: 15 billion row text file and row deletes - Best Practice?
in thread 15 billion row text file and row deletes - Best Practice?

This is somewhat similar to what I would suggest, except that instead of marking certain records as deleted, I'd keep two pointers into the file, one for reading and one for writing, and go through the entire file line by line, copying each line from the read position to the write position but simply not writing any line that contains one of the unwanted serial numbers. This still means reading the whole file (though only a line at a time, so it needs very little memory) and writing most of it back, so it will take about as long as writing the wanted lines to a new file and then renaming it, but it won't require 380GB of intermediate storage.

Don't forget to truncate() the file when you've finished writing the last record, or you'll end up with rubbish tacked onto the end.
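A minimal sketch of the two-pointer approach in Python (the file name, the set of unwanted serials, and the assumption that the serial number is the first comma-separated field on each line are all made up for illustration). The write pointer can never overtake the read pointer, because each line is written only after it has been read, so it's safe to do this in a single pass over one file handle:

```python
def compact_in_place(path, unwanted):
    """Remove lines whose first comma-separated field is in `unwanted`,
    rewriting the file in place with separate read/write positions."""
    with open(path, "r+b") as f:
        read_pos = 0   # where the next line will be read from
        write_pos = 0  # where the next kept line will be written to
        while True:
            f.seek(read_pos)
            line = f.readline()
            if not line:
                break
            read_pos = f.tell()
            # Hypothetical record layout: serial number is the first field.
            serial = line.split(b",", 1)[0]
            if serial in unwanted:
                continue  # skip: the write pointer does not advance
            f.seek(write_pos)
            f.write(line)
            write_pos = f.tell()
        # Chop off the leftover tail from the old copy of the file.
        f.truncate(write_pos)
```

The final `truncate()` is the step mentioned above: without it, the bytes between the last kept record and the old end-of-file would remain as rubbish.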