in reply to 15 billion row text file and row deletes - Best Practice?

One alternative way to look at the problem requires co-operation from the downstream processes, and avoids having to create a copy of the file.

The idea is to open the file for read/write and, instead of deleting the record (which really means "not writing it to the new file"), seek to the beginning of the record and overwrite the serial number with dashes or tildes, or whatever it takes for a downstream process to ignore it.

This is akin to how DOS deleted files: it simply replaced the first character of the filename by a twiddly character, and the directory read-first/read-next system calls knew to ignore them when asked to return the files in a directory.

If you have relatively few serial numbers to delete you can store them in a hash (to the extent that "relatively few" fits comfortably in 2Gb of space).
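As a rough illustration of both points, here's a minimal sketch (the file name, the serials in %kill, and the assumption that the serial number is the first whitespace-delimited field are all invented for the example): the serials to drop sit in a hash, and any matching record has its serial overwritten in place with tildes of the same length, so no offsets in the file ever move.

<code>
use strict;
use warnings;
use Fcntl qw(SEEK_SET);

# Assumed: serials to drop, few enough to hold in a hash in memory.
my %kill = map { $_ => 1 } qw(100234 100777 209001);

open my $fh, '+<', 'rows.txt' or die "open rows.txt: $!";

while (1) {
    my $pos  = tell $fh;               # offset of the record about to be read
    my $line = <$fh>;
    last unless defined $line;

    my ($serial) = $line =~ /^(\S+)/;  # assume the serial is the first field
    next unless defined $serial and $kill{$serial};

    # Mask the serial with tildes of identical length: the record keeps
    # its size, so every later record stays exactly where it was.
    seek $fh, $pos, SEEK_SET           or die "seek: $!";
    print {$fh} '~' x length $serial;
    seek $fh, $pos + length $line, SEEK_SET
        or die "seek: $!";             # a seek is required between a write and the next read
}
close $fh or die "close: $!";
</code>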

If you have more than that, you may find that assembling them all into a regexp (with Regexp::Assemble) yields a tractable pattern. The economy of sharing the common prefixes of many serial numbers means that the pattern won't be as big as you'd think. Then you test whether the line matches the regexp, rather than whether the field exists in the hash.
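A hedged sketch of the regexp route (the serial numbers and the DATA section are invented for illustration): Regexp::Assemble folds shared prefixes into one compact pattern, and each line's first field is tested against that pattern instead of being looked up in a hash.

<code>
use strict;
use warnings;
use Regexp::Assemble;

my $ra = Regexp::Assemble->new;
$ra->add($_) for qw(100234 100235 100236 209001);   # invented serials to delete

my $re = $ra->re;    # shared prefixes (10023...) collapse, keeping the pattern small

while (my $line = <DATA>) {
    my ($serial) = $line =~ /^(\S+)/;
    next unless defined $serial;
    print "drop: $line" if $serial =~ /\A$re\z/;     # anchored to avoid partial matches
}

__DATA__
100234 some payload
555555 keep this one
209001 more payload
</code>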

• another intruder with the mooring in the heart of the Perl


Replies are listed 'Best First'.
Re^2: 15 billion row text file and row deletes - Best Practice?
by DrHyde (Prior) on Dec 04, 2006 at 10:46 UTC
    This is somewhat similar to what I would suggest, except that instead of marking certain records as having been deleted, I'd keep two pointers into the file, one for read and one for write, then go through the entire file line by line reading and writing, but just not writing if I find one of the unwanted serial numbers. This approach will involve reading the whole file (but only a line at a time, so won't need much memory) and writing the whole file, so will take just as much time as writing the wanted lines to a new file and then renaming it, but it won't require 380GB of intermediate storage.

    Don't forget to truncate() the file when you've finished writing the last record, or you'll end up with rubbish tacked onto the end.
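    A small sketch of that two-pointer approach, under the same assumptions as above (file name, serial as the first field, %kill as the delete list, all invented): one handle reads ahead, a second handle on the same file writes kept lines back at the write offset, and truncate() drops the stale tail past the last kept record.

    <code>
    use strict;
    use warnings;
    use Fcntl qw(SEEK_SET);

    my %kill = map { $_ => 1 } qw(100234 209001);   # assumed delete list

    open my $in,  '<',  'rows.txt' or die "open for read: $!";
    open my $out, '+<', 'rows.txt' or die "open for write: $!";

    my $wpos = 0;                                   # write pointer: end of the kept data so far
    while (my $line = <$in>) {
        my ($serial) = $line =~ /^(\S+)/;
        next if defined $serial and $kill{$serial}; # unwanted record: simply don't write it

        seek $out, $wpos, SEEK_SET or die "seek: $!";
        print {$out} $line;
        $wpos += length $line;
    }

    truncate $out, $wpos or die "truncate: $!";     # chop off the rubbish after the last kept record
    close $out or die "close: $!";
    close $in;
    </code>

    The write pointer never gets ahead of the read pointer, so kept lines only ever overwrite data that has already been read.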