I had always assumed that when such an error occurred, the OS would simply kill the perl process... Is my assumption incorrect?
Actually, the OS isn't involved. Perl just exits (via my_exit() in perl.c):
    if ((p = nextf[bucket]) == NULL) {
        MALLOC_UNLOCK;
    #ifdef PERL_CORE
        {
            dTHX;
            if (!PL_nomemok) {
    #if defined(PLAIN_MALLOC) && defined(NO_FANCY_MALLOC)
                PerlIO_puts(PerlIO_stderr(), "Out of memory!\n");
    #else
                char buff[80];
                char *eb = buff + sizeof(buff) - 1;
                char *s = eb;
                size_t n = nbytes;

                PerlIO_puts(PerlIO_stderr(), "Out of memory during request for ");
    #if defined(DEBUGGING) || defined(RCHECK)
                n = size;
    #endif
                *s = 0;
                do {
                    *--s = '0' + (n % 10);
                } while (n /= 10);
                PerlIO_puts(PerlIO_stderr(), s);
                PerlIO_puts(PerlIO_stderr(), " bytes, total sbrk() is ");
                s = eb;
                n = goodsbrk + sbrk_slack;
                do {
                    *--s = '0' + (n % 10);
                } while (n /= 10);
                PerlIO_puts(PerlIO_stderr(), s);
                PerlIO_puts(PerlIO_stderr(), " bytes!\n");
    #endif /* defined(PLAIN_MALLOC) && defined(NO_FANCY_MALLOC) */
                my_exit(1);   /* **************** <-- exits here **************** */
            }
        }
    #endif
        return (NULL);
    }
"What's the surefire way of generating an OOM error in a perl script ?"
This does it for me:
    c:\test\perl-5.13.6>perl -E"$x = chr(0)x2**31"
    Out of memory!
Personally, I think that if malloc fails for a request larger than, say, 64KB, Perl should die rather than exit.
In reply to Re^2: die rather than exit on out-of-memory failure?
by BrowserUk
in thread die rather than exit on out-of-memory failure?
by chm