Maybe something can be done using ulimit first from the shell to set the maximum amount of memory the process can use, and then $^M to reserve a little emergency memory inside perl.
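Something like this, as a rough sketch (assuming a bash-like shell and a perl built with -DPERL_EMERGENCY_SBRK, which $^M needs and which most stock builds don't have; the limit, buffer size and loop are only for illustration):

    #!/bin/sh
    # Cap the process's virtual memory at about 512 MB before starting perl
    ulimit -v 524288

    perl -e '
        # Reserve a 64 KB emergency pool so a later "Out of memory!"
        # can be trapped like an ordinary die instead of killing perl
        # outright (needs -DPERL_EMERGENCY_SBRK, i.e. perl'\''s own malloc).
        $^M = "\0" x 65536;

        my %big;
        eval {
            # memory-hungry work goes here; this loop will blow past
            # the ulimit on purpose
            $big{$_} = 1 for 1 .. 10_000_000;
        };
        if ($@ && $@ =~ /Out of memory/) {
            warn "ran out of memory, bailing out gracefully\n";
            %big = ();   # free what we can, then exit or carry on reduced
        }
    '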
That's a good idea. However, most of the time, if the script really hits the point where it sucks up all available RAM, there's probably something to optimize :) In my case, the script indexed several million files from 80 terabytes of storage in one big hash: I had to split the job into parts to use less memory.
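A rough sketch of that kind of batching (the /storage layout, the one-batch-per-top-level-directory split and the write_batch helper are just made up for illustration):

    use strict;
    use warnings;
    use File::Find;

    # Walk the storage one top-level directory at a time instead of
    # building a single hash for everything at once.
    my @top_dirs = glob('/storage/*');   # hypothetical mount point

    for my $dir (@top_dirs) {
        my %index;                       # per-batch hash, freed after each pass

        find(sub {
            return unless -f;
            $index{$File::Find::name} = -s _;   # e.g. path => size
        }, $dir);

        write_batch(\%index, $dir);      # flush this batch, then drop the hash
    }

    sub write_batch {
        my ($index, $dir) = @_;
        # In a real run this would append to an on-disk index or database;
        # here it just reports the batch so the sketch runs as-is.
        printf "indexed %d files under %s\n", scalar keys %$index, $dir;
    }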