in reply to 15 billion row text file and row deletes - Best Practice?
Another strategy that no one's mentioned is to split the incoming file into smaller chunks and work on each chunk.
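A minimal sketch of that idea in Perl, assuming the kill file is small enough to hash in memory and that the key is the whole row; file names and the 10-million-row chunk size are hypothetical and would need tuning:

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $chunk_size = 10_000_000;    # rows per chunk (arbitrary; adjust to taste)

# Pass 1: carve big_file.txt into fixed-size chunk files, so a failure only
# costs one chunk and chunks could be handed to separate processes.
open my $in, '<', 'big_file.txt' or die "big_file.txt: $!";
my ($row, $n, $out) = (0, 0, undef);
my @chunks;
while (my $line = <$in>) {
    if ($row++ % $chunk_size == 0) {
        close $out if $out;
        my $name = sprintf 'chunk_%05d.txt', ++$n;
        push @chunks, $name;
        open $out, '>', $name or die "$name: $!";
    }
    print {$out} $line;
}
close $out if $out;
close $in;

# Load the kill list into a hash for constant-time lookups.
open my $kf, '<', 'kill.txt' or die "kill.txt: $!";
chomp(my @kill = <$kf>);
close $kf;
my %kill = map { $_ => 1 } @kill;

# Pass 2: work on each chunk in turn, keeping only rows not on the kill list.
for my $chunk (@chunks) {
    open my $cin,  '<', $chunk        or die "$chunk: $!";
    open my $cout, '>', "$chunk.kept" or die "$chunk.kept: $!";
    while (my $line = <$cin>) {
        chomp(my $key = $line);
        print {$cout} $line unless $kill{$key};
    }
    close $cin;
    close $cout;
}
```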
But it looks like you have plenty of approaches from which to choose -- although it would have been nice to know how large your kill file was, to put the problem into better perspective.
Re^2: 15 billion row text file and row deletes - Best Practice?
by tubaandy (Deacon) on Dec 01, 2006 at 15:42 UTC