in reply to when a script runs out of memory

I wrote a script that I had to optimize because it used too much memory. It ran out of memory on a machine with 1GB RAM and 2GB swap :) The message was: "Killed, out of memory". The kernel (Linux) killed it, so I guess it was impossible to handle the problem from within the script itself.

Re^2: when a script runs out of memory
by salva (Canon) on May 11, 2005 at 10:34 UTC
    Maybe something can be done using ulimit first from the shell to set the maximum amount of memory the process can use, and then $^M to reserve a little emergency memory inside perl.
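    A minimal sketch of that combination (the 512MB cap, the script name indexer.pl, and the die handler are all illustrative; note that $^M only has an effect if perl was compiled with -DPERL_EMERGENCY_SBRK and uses Perl's own malloc, see perldoc perlvar):

        # shell: cap virtual memory at ~512MB so that malloc fails
        # inside perl instead of the kernel OOM killer striking
        ulimit -v 524288
        perl indexer.pl

        # indexer.pl
        use strict;
        use warnings;

        # reserve a 64KB emergency pool; when malloc fails, perl frees
        # this buffer so the die handler below has room to run
        $^M = 'a' x (1 << 16);

        $SIG{__DIE__} = sub {
            if ($_[0] =~ /Out of memory/) {
                # flush partial results, log, etc., then bail out cleanly
                warn "ran out of memory, saving what we have\n";
            }
            die @_;    # re-raise
        };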
      That's a good idea. However, most of the time, if the script really hits the point where it sucks up all available RAM, there's probably something to optimize :) In my case, the script indexed several million files from 80 terabytes of storage in one big hash: I had to split the job into parts to use less memory, roughly as sketched below.
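      Something along these lines, where the batch size, the index-part filenames, and the per-file metadata are all made up for illustration; the point is that the hash gets flushed to disk and emptied before it can eat all the RAM:

          #!/usr/bin/perl
          use strict;
          use warnings;
          use Storable qw(store);

          # read file paths on stdin (e.g. piped from `find`) and index
          # them in batches so the hash never grows past BATCH_SIZE keys
          use constant BATCH_SIZE => 100_000;

          my (%index, $batch);
          while (my $path = <STDIN>) {
              chomp $path;
              $index{$path} = -s $path;    # whatever metadata you index
              if (keys(%index) >= BATCH_SIZE) {
                  store \%index, sprintf("index-part-%04d.sto", $batch++);
                  %index = ();             # release the memory
              }
          }
          store \%index, sprintf("index-part-%04d.sto", $batch) if %index;

      The partial index files can then be merged in a second pass, which only ever needs two parts in memory at a time.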