For my own satisfaction, I generated just over 10GB of data spread across 4 files:
    23/09/2016  20:18     2,737,000,000 data1.txt
    23/09/2016  20:18     2,737,000,000 data2.txt
    23/09/2016  20:18     2,737,000,000 data3.txt
    23/09/2016  20:18     2,737,000,000 data4.txt
                   4 File(s) 10,948,000,000 bytes
That roughly approximates the data you described:
    C:\test\10gb>head data1.txt
    @NS500278
    GCGCCGGCGGCCATAGGGGCTGGTCAGTAAGAAGTAATGTCCCCAGCTGATTCGGATGGTGCGAATAAGGTCTTATCACTTACCCAAAGTATCTGGCTCTATATTGAGATACCGGAAGCCCCTTGTTGGTACTATGTGATCGATATTTCT
    +
    =G=C=TATAAA=TGGTTG==C=TAT=GT=C==TAA==CCTA=AAACTTAAG==T=TTTGTCCAGGTAGCGTC=CCA=TG=TA=CT=TT=CC=AA==TCA=ACCGTCGTGGCC=A==GAA=AT=TCAGGCC=CC=GCTAGTG=CCA===TA
    1
    @NS500278
    GTATACCGGTTGAAACAACACGAGAACGAAAGGGTCGCGACTCCATCATAAGGCTAGAAAACCAATTGTATGGATCTGAACATGTTGTGGCGTTACGCGGAACTCCCTGGCTAAAGTGAGACGATTATCAATAGAAAGCAAACTCTACGT
    +
    AAAC=A=CTC=ATCAA==TGATACCGACCT=CCG=CCAG=CC=G=TC=TGGTG=GAGGTAGGT=AAC=GCACA=CC==GTGGACCCTA=C=C=CG==TA=G=CCA=G==CTGGCGTT=CCGAACAGGC===ATCGATC==TTTCTTTTTT
    2
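For anyone who wants to reproduce a similar test set, something along these lines will do. This is a sketch, not the generator actually used: the record layout mimics the sample above (header, sequence, "+" separator, quality line, record number), but the helper names, sequence length, and record count are all made up for illustration.

```python
# Hypothetical generator for FASTQ-like test data resembling the sample
# above. All names and sizes here are assumptions, not the original code.
import random

def make_record(n, seq_len=150):
    """Build one record: header, sequence, '+', quality line, record number."""
    seq = ''.join(random.choice('ACGT') for _ in range(seq_len))
    # the sample's quality lines mix '=' with base letters
    qual = ''.join(random.choice('=ACGT') for _ in range(seq_len))
    return '\n'.join(['@NS500278', seq, '+', qual, str(n)])

def write_file(path, records):
    """Write `records` consecutive records to `path`."""
    with open(path, 'w') as fh:
        for n in range(1, records + 1):
            fh.write(make_record(n) + '\n')

if __name__ == '__main__':
    # Scale `records` up by a few orders of magnitude for a multi-GB file.
    write_file('data1.txt', 1000)
```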
I implemented the code I described under "How would I attempt the task as now described?" in 55 lines, and ran it:
    C:\test\10gb>..\1172352 *.txt > combined.out
    Pass one found 1048269 groups of records in 4 files in 361.736613 seconds
    Total time:4826.534832 seconds
So 4826 seconds is just over 1 hour 20 minutes. My "couple of hours" guesstimate wasn't too far wrong.
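The 55-line program itself is not posted here, so the following is only a loose sketch of what "pass one" of such a two-pass approach might look like (in Python for illustration): scan every file once, bucketing records by a key such as the sequence, and count the distinct groups; a second pass would then emit the combined output group by group so the whole data set never has to sit in memory at once. The five-lines-per-record framing and the choice of the sequence as the key are assumptions based on the sample above.

```python
# Sketch of "pass one" of a two-pass grouping approach over FASTQ-like
# files. The record layout (5 lines per record) and the grouping key
# (the sequence line) are assumptions, not the original program's logic.
from collections import defaultdict

def pass_one(paths):
    """Count how many times each distinct record key appears across all files."""
    groups = defaultdict(int)
    for path in paths:
        with open(path) as fh:
            while True:
                header = fh.readline()
                if not header:
                    break              # end of this file
                seq = fh.readline()    # sequence line used as the group key
                fh.readline()          # '+' separator line
                fh.readline()          # quality line
                fh.readline()          # trailing record number (as in the sample)
                groups[seq.strip()] += 1
    return groups
```

A real second pass would revisit the files (or spill each group to a temporary file, as discussed in this thread) and write one combined record per group.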
In reply to Re^5: storing hash in temporary files to save memory usage
by BrowserUk
in thread storing hash in temporary files to save memory usage
by Anonymous Monk