in reply to 15 billion row text file and row deletes - Best Practice?
A possible algorithm would be to read one serial number to remove from the exclude file.
You then go through the main file line by line, copying each line to the output file until you've either found the serial number you want to remove, or have gone past the serial number you picked from the exclude file.
If it is one to exclude, you just don't write it; pick the next serial number from the exclude file and carry on.
If you've gone past it, you just read from the exclude file until you've got a serial number higher than the one you've just picked from the main file. Then, as before, just carry on.
This algorithm is single pass, only ever holds two lines of text in memory, and relies on both files being sorted by serial number.
Update: I mean something like this:
#!/usr/bin/perl
use warnings;
use strict;

# lexical filehandles and three-arg open; $! (not $1) holds the OS error
open my $data_fh,    '<', 'data.txt'    or die "Can't open file for reading: $!";
open my $exclude_fh, '<', 'exclude.txt' or die "Can't open file for reading: $!";
open my $out_fh,     '>', 'out.txt'     or die "Can't open file for writing: $!";

my $exclude_serial = <$exclude_fh>;
chomp($exclude_serial) if defined $exclude_serial;

my $line_data = <$data_fh>;
while (defined $line_data) {
    chomp($line_data);
    my ($serial, $name, $flag) = split /,/, $line_data;

    # if we've run out of numbers to exclude, just print the line to the outfile
    if (! defined $exclude_serial) {
        print $out_fh "$line_data\n";
    # if we've not yet reached the serial to exclude, again just print the line
    } elsif ($serial < $exclude_serial) {
        print $out_fh "$line_data\n";
    # we must need a new exclude number then; pull it off the file, keeping track
    # of whether the current or subsequently read exclude serials mean we shouldn't
    # print the current line
    } else {
        my $write_current_line = 1;   # assume it's okay unless we find a match
        do {
            $write_current_line = 0 if $exclude_serial == $serial;
            $exclude_serial = <$exclude_fh>;
            chomp($exclude_serial) if defined $exclude_serial;
        } until (! defined $exclude_serial or $exclude_serial > $serial);
        print $out_fh "$line_data\n" if $write_current_line;
    }
    $line_data = <$data_fh>;
}
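For illustration, here's a tiny hypothetical pair of input files and the output the script above should produce, assuming both files are sorted by serial number and the main file is one comma-separated record per line:

data.txt:
1,alpha,0
2,bravo,1
3,charlie,0
4,delta,1
5,echo,0

exclude.txt:
2
4

Running the script leaves out.txt containing the lines for serials 1, 3 and 5; the rows for 2 and 4 are skipped, and neither file is ever held in memory beyond the current line.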
Re^2: 15 billion row text file and row deletes - Best Practice?
by pemungkah (Priest) on Dec 02, 2006 at 00:32 UTC