This is more of a guess, given the lack of detail in your post, but:
You'll never have more than one record from each file in memory at any given time, so beyond the size of the hash required to hold the keys, the size of the files doesn't matter.
Three O(N) passes (still O(N) overall) should be substantially quicker than two O(N log N) sorts plus an O(N) merge, if coded properly.
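As a rough illustration of the approach described above (shown here in Python for brevity, though the same idea applies directly in Perl): one pass builds a hash of keys from the first file, and a second pass streams the other file, emitting records whose key was seen. The field layout, separator, and function names are assumptions for the sketch, not anything from the original post.

```python
def build_key_index(path, key_field=0, sep="\t"):
    """Pass 1: read the first file once, keeping only its keys in a hash (set).

    Only the keys are retained, so memory use is bounded by the key set,
    not by the file size -- assumed tab-separated records, key in column 0.
    """
    keys = set()
    with open(path) as fh:
        for line in fh:
            keys.add(line.rstrip("\n").split(sep)[key_field])
    return keys


def matching_records(path, keys, key_field=0, sep="\t"):
    """Pass 2: stream the second file one record at a time.

    Yields each record whose key appears in the index; at no point is
    more than one record from this file held in memory.
    """
    with open(path) as fh:
        for line in fh:
            record = line.rstrip("\n")
            if record.split(sep)[key_field] in keys:
                yield record
```

Both passes are linear in the file sizes, and neither file is ever sorted or loaded whole, which is the point of the argument above.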
In reply to Re: Working on huge (GB sized) files by BrowserUk, in thread Working on huge (GB sized) files by vasavi