PerlMonks  

Re: 15 billion row text file and row deletes - Best Practice?

by reasonablekeith (Deacon)
on Dec 01, 2006 at 13:39 UTC [id://587187]


in reply to 15 billion row text file and row deletes - Best Practice?

You don't mention whether either of the files is sorted. If they are both sorted by serial number, then it doesn't matter how big either file is; you just need enough room for the output file.

A possible algorithm would be to read one serial number to remove from the exclude file.

You then go through the main file line by line, copying each line to the output file until you've either found the serial number you want to remove, or gone past the serial number you picked from the exclude file.

If it is one to exclude, you just don't write it; pick the next serial number from the exclude file and carry on.

If you've gone past it, you just read from the exclude file until you've got a serial number higher than the one you've just read from the main file. Then, as before, just carry on.

This algorithm is single pass, and only has a memory overhead of two lines of text.

Update: I mean something like this.

#!/usr/bin/perl
use warnings;
use strict;

open my $data_fh,    '<', 'data.txt'    or die "Can't open file for reading: $!";
open my $exclude_fh, '<', 'exclude.txt' or die "Can't open file for reading: $!";
open my $out_fh,     '>', 'out.txt'     or die "Can't open file for writing: $!";

my $exclude_serial = <$exclude_fh>;
chomp($exclude_serial) if defined $exclude_serial;

while (my $line_data = <$data_fh>) {
    chomp($line_data);
    my ($serial, $name, $flag) = split /,/, $line_data;

    # if we've run out of numbers to exclude, just print the line to the outfile
    if (! defined $exclude_serial) {
        print $out_fh "$line_data\n";

    # if we've not yet reached the serial to exclude, again just print the line
    } elsif ($serial < $exclude_serial) {
        print $out_fh "$line_data\n";

    # we must need a new exclude number then; pull it off the file, keeping track
    # of whether the current or subsequently read exclude serials mean we shouldn't
    # print the current line
    } else {
        my $write_current_line = 1; # assume it's okay unless we find a match
        do {
            $write_current_line = 0 if $exclude_serial == $serial;
            $exclude_serial = <$exclude_fh>;
            chomp($exclude_serial) if defined $exclude_serial;
        } until (! defined $exclude_serial or $exclude_serial > $serial);
        print $out_fh "$line_data\n" if $write_current_line;
    }
}

close $out_fh or die "Can't close output file: $!";
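To see the merge logic in isolation, here is a minimal sketch of the same idea working on in-memory arrays instead of files. The function name filter_sorted and the sample records are my own invention for illustration, not part of the original post; both inputs must already be sorted by serial number.

#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical helper: given an array ref of sorted CSV lines and an
# array ref of sorted serial numbers to drop, return the kept lines.
sub filter_sorted {
    my ($data, $exclude) = @_;
    my @out;
    my $i = 0;  # index of the current exclude serial
    for my $line (@$data) {
        my ($serial) = split /,/, $line;
        # advance past exclude serials we've already gone beyond
        $i++ while $i < @$exclude && $exclude->[$i] < $serial;
        # keep the line unless it matches the current exclude serial
        push @out, $line unless $i < @$exclude && $exclude->[$i] == $serial;
    }
    return \@out;
}

my $kept = filter_sorted(
    [ "1,alice,0", "2,bob,1", "3,carol,0", "5,dave,1" ],
    [ 2, 4, 5 ],
);
print "$_\n" for @$kept;   # lines for serials 1 and 3 survive

Each input is walked exactly once, so the whole pass is O(n) in the combined size of the two inputs, which is why this scales to files far larger than memory when streamed from disk as in the full script above.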
---
my name's not Keith, and I'm not reasonable.

Replies are listed 'Best First'.
Re^2: 15 billion row text file and row deletes - Best Practice?
by pemungkah (Priest) on Dec 02, 2006 at 00:32 UTC
    Keith has the best solution here. It's an old-fashioned match-merge. It's also the fastest: guaranteed O(n) (after the sort, of course).
