esharris has asked for the wisdom of the Perl Monks concerning the following question:

When a script runs out of main memory, does the interpreter invoke Perl's die? Or does it just bomb out?

I have CGI scripts that seem to bomb out when given tasks that require lots of memory. I'm wondering if it is possible to output an "out of memory" message.

That memory is the source of the problem is just a conjecture: I tried running the script under a debugger, and the debugger bombed out too!

Replies are listed 'Best First'.
Re: when a script runs out of memory
by Joost (Canon) on May 10, 2005 at 19:53 UTC
Re: when a script runs out of memory
by wazoox (Prior) on May 10, 2005 at 21:52 UTC
    I wrote a script that I had to optimize because it used too much memory. It ran out of memory on a machine with 1 GB RAM and 2 GB swap :) The message was "Killed, out of Memory". The kernel (Linux) killed the process, so I guess it was impossible to handle the problem from within the script.
      Maybe something can be done by first using ulimit from the shell to set the maximum amount of memory the process may use, and then $^M to reserve a little emergency memory inside perl.
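      A rough sketch of that combination (the script name oom_demo.pl is made up; it assumes bash for ulimit and a perl built with its own malloc and -DPERL_EMERGENCY_SBRK, without which setting $^M does nothing and the "Out of memory!" error usually isn't trappable at all):

          #!/usr/bin/perl
          use strict;
          use warnings;

          # Run under a cap so malloc fails before the kernel's OOM killer fires:
          #     ulimit -v 262144     # bash: limit virtual memory to 256 MB (KB units)
          #     perl oom_demo.pl

          # Reserve a 64 KB emergency pool so perl can still die() cleanly
          # when an allocation fails (requires -DPERL_EMERGENCY_SBRK).
          $^M = 'a' x 65536;

          my @hog;
          eval {
              push @hog, 'x' x 1_000_000 while 1;   # deliberately exhaust memory
          };
          if ($@ && $@ =~ /Out of memory/) {
              @hog = ();   # release the ballast so there is room to respond
              print "Content-type: text/plain\n\n";
              print "Sorry, this request needed more memory than the server allows.\n";
          }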
        That's a good idea. However, most of the time, if a script really hits the point where it sucks up all available RAM, there's probably something to optimize :) In my case, the script indexed several million files from 80 terabytes of storage in one big hash: I had to split the job into parts to use less memory.
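        For illustration, a rough sketch of that kind of chunking (the file names, batch size, and the -s call standing in for the real per-file work are all made up):

            #!/usr/bin/perl
            use strict;
            use warnings;

            my $chunk_size = 100_000;   # hypothetical batch size; tune to fit in RAM
            open my $list, '<', 'filelist.txt' or die "filelist.txt: $!";
            open my $out,  '>', 'index.txt'    or die "index.txt: $!";

            my %index;
            while (my $path = <$list>) {
                chomp $path;
                $index{$path} = -s $path || 0;   # stand-in for the real per-file work
                flush_chunk(\%index, $out) if keys(%index) >= $chunk_size;
            }
            flush_chunk(\%index, $out);          # write out the final partial batch
            close $out or die "index.txt: $!";

            # Append one finished batch to disk, then empty the hash so memory
            # use stays bounded by the batch size, not by the total file count.
            sub flush_chunk {
                my ($idx, $fh) = @_;
                print {$fh} "$_\t$idx->{$_}\n" for sort keys %$idx;
                %$idx = ();
            }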