in reply to 15 billion row text file and row deletes - Best Practice?
If you assume the name field is only 15 characters on average, then a rough calculation is as follows (based on plain old ASCII).
You need to lose at least 80GB from the unfiltered file. Your average record length is:

11 (serial) + 1 (comma) + 15 (name) + 1 (comma) + 1 (y|n) + 1 (\n) = 30 bytes in total.

Now 80GB = 85,899,345,920 bytes, and 85,899,345,920 / 30 ≈ 2,863,311,531. That's how many records need to be in your kill list. I doubt grep could handle that.
Interestingly, 380GB = 408,021,893,120 bytes, which implies 408,021,893,120 / 30 ≈ 13,600,729,771 records. There aren't even that many people in the world! So either you've left some fields out, my average name length assumption is wrong, or each character in your file is more than a byte wide!
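For what it's worth, here's the same back-of-the-envelope arithmetic as a quick Python sketch. The 15-character average name, 11-digit serial, and binary gigabytes (1024³ bytes) are all my assumptions, not figures confirmed by the question:

```python
# Rough sanity check of the record-size arithmetic above.
# Assumptions: 11-digit serial, 15-char average name, a single y/n flag,
# plain ASCII, and '\n' line endings.

GIB = 1024 ** 3  # bytes per GB (binary, as used in the figures above)

record_bytes = 11 + 1 + 15 + 1 + 1 + 1  # serial, comma, name, comma, y|n, newline
assert record_bytes == 30

to_remove_bytes = 80 * GIB    # 85,899,345,920
full_file_bytes = 380 * GIB   # 408,021,893,120

kill_list_rows = to_remove_bytes // record_bytes   # ~2,863,311,530
total_rows = full_file_bytes // record_bytes       # ~13,600,729,770

print(f"Record length:      {record_bytes} bytes")
print(f"Rows to delete:     {kill_list_rows:,}")
print(f"Rows in whole file: {total_rows:,}")
```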
PS: I am sure that there are plenty of holes in my calculations :)
Update: Of course you said there were 15 billion rows!