If you're not going to keep it all in memory, you're going to write it all to disk. The question is how exactly to do that.
Instead of a straight in-memory hash, you could use a DBM::Deep file. This would require very little modification of your current code.
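A minimal sketch of what that might look like (the file name counts.db and the key are placeholders for this example):

    use strict;
    use warnings;
    use DBM::Deep;

    # Tie the hash to a file on disk; every read and write goes
    # through DBM::Deep instead of living in RAM.
    tie my %hash, 'DBM::Deep', 'counts.db';

    # After that, %hash is used exactly as before.
    $hash{some_key} = [ 1, 2, 3 ];          # nested structures work too
    push @{ $hash{some_key} }, 4;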
Alternatively, a solution based on Cache::Cache (such as Cache::FileCache) might serve well.
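Roughly like this, assuming a made-up namespace of merge_job and a placeholder key:

    use strict;
    use warnings;
    use Cache::Cache qw( $EXPIRES_NEVER );
    use Cache::FileCache;

    # A disk-backed cache that keeps entries until we clear them.
    my $cache = Cache::FileCache->new( {
        namespace          => 'merge_job',
        default_expires_in => $EXPIRES_NEVER,
    } );

    $cache->set( 'some_key', [ 1, 2, 3 ] );   # serialized out to a file
    my $values = $cache->get( 'some_key' );   # read back from disk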
A cruder solution would be to turn your hash keys into filenames and store each hash entry as a file with one line per value. You'd end up with millions of files, each 12 lines long, but you can post-process them into one big file pretty easily.
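Something along these lines (the directory, output file, and helper names are invented for the example, and it assumes your keys are already safe to use as filenames):

    use strict;
    use warnings;
    use File::Spec;

    my $dir = 'hash_entries';
    mkdir $dir unless -d $dir;

    # Append one value to the file named after the key.
    sub add_value {
        my ( $key, $value ) = @_;
        open my $fh, '>>', File::Spec->catfile( $dir, $key )
            or die "Can't append to '$key': $!";
        print {$fh} "$value\n";
        close $fh;
    }

    # Post-process: fold every per-key file into one big file,
    # one tab-separated line per key.
    sub merge_all {
        open my $out, '>', 'merged.txt' or die "Can't write merged.txt: $!";
        opendir my $dh, $dir or die "Can't read '$dir': $!";
        for my $key ( sort grep { !/^\./ } readdir $dh ) {
            open my $in, '<', File::Spec->catfile( $dir, $key )
                or die "Can't read '$key': $!";
            chomp( my @values = <$in> );
            print {$out} join( "\t", $key, @values ), "\n";
        }
        close $out;
    }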
In all cases, it's probably going to be a lot slower than memory.