Re: 15 billion row text file and row deletes - Best Practice?

by mattr (Curate)
on Dec 02, 2006 at 13:35 UTC ( [id://587395] )


in reply to 15 billion row text file and row deletes - Best Practice?

Hi, BrowserUK's reply looks quite good.

I just would like to chip in that if you are willing to chunk your data file into 100MB pieces, then based on my timing of unix sort on a 100MB file of 9-digit numbers it would cost you 100 hours to sort all the data first. If you can ensure fixed-length records at the same time, that will help too.
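Untested sketch of the split-then-sort step, assuming GNU split and sort are available and the data is one record per line; the file names are just placeholders:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Split the big file into ~100MB pieces, keeping lines intact,
    # then sort each piece on its own. Timing one piece lets you
    # extrapolate the total cost (number of pieces x per-piece time).
    system('split', '-C', '100m', 'bigfile.txt', 'chunk.') == 0
        or die "split failed: $?";

    for my $chunk (glob 'chunk.??') {
        system('sort', '-o', "$chunk.sorted", $chunk) == 0
            or die "sort of $chunk failed: $?";
    }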

You really do not want to do any unnecessary disk I/O, since on my machine it takes 7 hours just to read a file that size once. If you have a sorted kill list that is chunked according to the extents of each data chunk, you also only have 1/40 of the kill list to worry about for any given chunk (assuming an even distribution).
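For the delete pass itself, a merge-style walk over one sorted data chunk and the matching slice of the sorted kill list would keep it to a single sequential read and write per chunk. Again untested, and the file names are made up:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Assumes chunk.aa.sorted and kill.aa.sorted are sorted the same
    # way and that kill-list entries match data rows exactly.
    open my $data, '<', 'chunk.aa.sorted' or die $!;
    open my $kill, '<', 'kill.aa.sorted'  or die $!;
    open my $out,  '>', 'chunk.aa.kept'   or die $!;

    my $k = <$kill>;
    while (my $row = <$data>) {
        # Skip kill entries that sort before the current row.
        $k = <$kill> while defined $k && $k lt $row;
        next if defined $k && $k eq $row;   # row is on the kill list
        print $out $row;                    # keep everything else
    }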
