Oh -- so you're not processing a single file at once -- that's what I took away from your OP. My mistake.
If you're loading everything into a hash, and *that's* what's overflowing memory, then it sounds like you'll need another approach, and the one that comes to mind right away is to use a database -- the data lives on disk, and memory use stays flat no matter how many keys you accumulate.
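To make that concrete, here's a minimal sketch of the idea -- swapping the in-memory hash for an on-disk SQLite table via DBI. This is my own illustration, not code from the thread: it assumes DBD::SQLite is installed, and the table name, column names, and sample data are all made up.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;
use File::Temp qw(tempfile);

# Hypothetical sketch: keep key/value pairs in an on-disk SQLite table
# instead of a Perl hash, so memory stays flat as the data grows.
my ( undef, $dbfile ) = tempfile( SUFFIX => '.db' );
my $dbh = DBI->connect( "dbi:SQLite:dbname=$dbfile", '', '',
    { RaiseError => 1, AutoCommit => 0 } );

$dbh->do('CREATE TABLE IF NOT EXISTS pairs (k TEXT PRIMARY KEY, v TEXT)');
my $ins = $dbh->prepare('INSERT OR REPLACE INTO pairs (k, v) VALUES (?, ?)');

# Where the original code did $hash{$key} = $value, insert a row instead.
my %sample = ( apple => 1, pear => 2 );
while ( my ( $key, $value ) = each %sample ) {
    $ins->execute( $key, $value );
}
$dbh->commit;    # batching inserts in one transaction keeps them fast

# A lookup replaces $hash{'apple'}:
my ($v) = $dbh->selectrow_array(
    'SELECT v FROM pairs WHERE k = ?', undef, 'apple' );
print "apple => $v\n";
$dbh->disconnect;
```

The `INSERT OR REPLACE` mimics hash-assignment semantics (last write wins), and wrapping the inserts in a single transaction avoids an fsync per row, which matters a lot when you're loading millions of pairs.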
In reply to Re^3: Parallel::Forkmanager and large hash, running out of memory by talexb
in thread Parallel::Forkmanager and large hash, running out of memory by mabossert