I have yet to see a case where a hash too large for memory can be managed within a reasonable time frame by a tied-hash solution. Tied hashes are just too slow for very large data sets. If the data gets that big, then a database is probably a better solution.
Is your size problem due to having a few very large files, or to having a very large number of moderately large files? The appropriate solution would be quite different in each case.
I regularly compare truly huge files (hundreds of millions of records) that cannot fit into memory. In my experience (having tried many, many things), the fastest way is to use the system sort utility to sort both files on the comparison key and then read the two files sequentially in parallel. The comparison routine is slightly tricky (you have to make sure the two cursors stay in sync), but nothing insurmountable.
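For illustration, here is a minimal sketch of that sort-then-merge approach in Perl. The file names, the tab delimiter, and the key layout (first tab-separated field) are all assumptions for the example, not part of the original post:

    #!/usr/bin/perl
    use strict;
    use warnings;

    $ENV{LC_ALL} = 'C';    # byte-order collation, so sort(1) agrees with Perl's lt/gt

    # Hypothetical input files; the comparison key is assumed to be
    # the first tab-separated field of each record.
    my ($file_a, $file_b) = ('old.txt', 'new.txt');

    # Let the system sort utility do the heavy lifting on disk.
    system("sort -t'\t' -k1,1 -o $file_a.sorted $file_a") == 0 or die "sort failed: $?";
    system("sort -t'\t' -k1,1 -o $file_b.sorted $file_b") == 0 or die "sort failed: $?";

    open my $fa, '<', "$file_a.sorted" or die $!;
    open my $fb, '<', "$file_b.sorted" or die $!;
    my $line_a = <$fa>;
    my $line_b = <$fb>;

    # Walk both sorted files in parallel, always advancing the side
    # with the smaller key so the two streams stay in sync.
    while (defined $line_a and defined $line_b) {
        my ($key_a) = split /\t/, $line_a;
        my ($key_b) = split /\t/, $line_b;
        if    ($key_a lt $key_b) { print "only in $file_a: $line_a"; $line_a = <$fa>; }
        elsif ($key_a gt $key_b) { print "only in $file_b: $line_b"; $line_b = <$fb>; }
        else {
            # Keys match: compare the full records here if needed.
            $line_a = <$fa>;
            $line_b = <$fb>;
        }
    }
    # Drain whatever is left on either side.
    while (defined $line_a) { print "only in $file_a: $line_a"; $line_a = <$fa>; }
    while (defined $line_b) { print "only in $file_b: $line_b"; $line_b = <$fb>; }

Setting LC_ALL=C before calling sort keeps its collation in plain byte order, so it matches Perl's lt/gt string comparisons; if the two orderings disagree, the merge loop can silently drift out of sync.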