in reply to Re^2: 15 billion row text file and row deletes - Best Practice?
in thread 15 billion row text file and row deletes - Best Practice?

That was my first thought, too (well, almost - I thought of DBM::Deep because I played with that one before).

I would have suggested a db if the OP hadn't clearly stated "Without using a DB..." :-) It also depends on how this data is being used/processed, though. If this is a one-time filtering step, then loading it all into a db, filtering, and exporting again could be very inefficient. OTOH, if this is just one of many steps that require searching through the data, a db could be better.
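For the one-time case, a single streaming pass over the raw file avoids the db round-trip entirely, provided the set of keys to delete fits in memory. A minimal sketch (the file name, the tab delimiter, and the key being the first field are all my assumptions, not from the OP):

```perl
use strict;
use warnings;

# Load the IDs to delete into a hash (assumes they fit in RAM).
sub load_delete_set {
    my ($path) = @_;
    open my $fh, '<', $path or die "$path: $!";
    my %delete;
    while (my $id = <$fh>) {
        chomp $id;
        $delete{$id} = 1;
    }
    close $fh;
    return \%delete;
}

# Stream rows from one handle to another, skipping rows whose
# first tab-separated field is in the delete set.
sub filter_rows {
    my ($in, $out, $delete) = @_;
    while (my $row = <$in>) {
        my ($key) = split /\t/, $row, 2;
        print {$out} $row unless $delete->{$key};
    }
}

# Typical use, reading the big file on STDIN:
# filter_rows(\*STDIN, \*STDOUT, load_delete_set('delete_ids.txt'));
```

Memory stays proportional to the delete set, not the 15 billion rows, and the big file is read exactly once.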
