in reply to Re: Parallel::Forkmanager and large hash, running out of memory
in thread Parallel::Forkmanager and large hash, running out of memory

Sorry if I was not clear enough. I am processing thousands of files; the files containing the values I load into the hash number about 1800 right now.


Re^3: Parallel::Forkmanager and large hash, running out of memory
by talexb (Chancellor) on Apr 24, 2013 at 15:30 UTC

    Oh -- so you're not processing a single file at once -- that's what I took away from your OP. My mistake.

    If you're loading stuff into a hash, and *that's* overflowing memory, then it sounds like you'll need another approach, and the one that comes to mind right away is to use a database.
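    To make that concrete, here is a minimal core-Perl sketch of the lightest version of the idea: a hash tied to an on-disk DBM file via SDBM_File, so entries live on disk instead of in RAM. The file name and sample keys are made up for illustration; note SDBM_File limits each key+value pair to roughly 1 kB, so for larger records DB_File or a real database via DBI/DBD::SQLite would be the next step.

    ```perl
    #!/usr/bin/perl
    use strict;
    use warnings;
    use Fcntl qw(O_RDWR O_CREAT);
    use File::Temp qw(tempdir);
    use SDBM_File;

    # Keep the on-disk DBM files in a throwaway directory for this demo.
    my $dir = tempdir( CLEANUP => 1 );

    # Tie the hash to an on-disk SDBM file: each key/value is fetched and
    # stored per access, so the hash can grow far beyond available memory.
    tie my %count, 'SDBM_File', "$dir/values", O_RDWR | O_CREAT, 0666
        or die "Cannot tie SDBM file: $!";

    # Stand-in for the values collected from the ~1800 input files.
    $count{apple}++;
    $count{apple}++;
    $count{pear}++;

    my $summary = "apple=$count{apple} pear=$count{pear}";
    print "$summary\n";

    untie %count;
    ```

    The nice part is that once the hash is tied, the existing `$hash{$key}` code keeps working unchanged; only the `tie` line is new.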

    Alex / talexb / Toronto

    "Groklaw is the open-source mentality applied to legal research" ~ Linus Torvalds