pkal has asked for the wisdom of the Perl Monks concerning the following question:

Hi All,

I have a problem that I cannot resolve. A script, when run as a CGI, prints "Out of memory!" to stderr. However, the same script does not produce that error when run from the command line.

I could not isolate the problem to a simple test case; it does not occur with a simpler script.

The CGI script calls another script (the "worker") via open(). The worker forks and execs a "daemon" command. The problem does not occur if that daemon is not launched:

    my $pid = fork;
    unless (defined($pid)) {
        return -1;
    }
    if ($pid != 0) {    # parent
        return 1;
    }
    POSIX::setsid();
    close(STDIN);
    close(STDOUT);
    close(STDERR);
    fork() && exit 0;
    exec("$command") or return 0;
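
For reference, the CGI side reads the worker through a pipe, roughly like this (a sketch only; the worker's file name and the handling of its output are assumptions, not the actual code):

    # in the CGI script: start the worker and read its output through a pipe (sketch)
    open(my $worker_fh, '-|', './worker.pl')
        or die "cannot start worker: $!";
    while (my $line = <$worker_fh>) {
        # ... process output from the worker ...
    }
    close($worker_fh);
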
The limits on the host are the following:
bash-3.00# ulimit -a

core file size (blocks, -c) unlimited
data seg size (kbytes, -d) unlimited
file size (blocks, -f) unlimited
open files (-n) 1024
pipe size (512 bytes, -p) 10
stack size (kbytes, -s) 8192
cpu time (seconds, -t) unlimited
max user processes (-u) 29995
virtual memory (kbytes, -v) unlimited

prstat and top show up to 33 MB of memory used by the scripts.

When I added some trace messages, the "Out of memory!" message appeared after the last statement of the "worker" script.

I also tried running truss to trace the worker process. It didn't help much:
worker: ending main
write(2, " w o r k e r : e n d i".., 19) = 19
worker ending
write(2, " w o r k e r e n d i n".., 13) = 13
setcontext(0xFFBFEAE0)
Out of memory!
write(2, " O u t o f m e m o r".., 15) = 15

Printing the text "worker ending" is the last statement of the script:

    print STDERR "worker ending";
    exit 0;

Do you have any ideas what I should do next? What could be the root cause of the problem? What tools might help to trace it?

Thanks in advance,
pkal

Replies are listed 'Best First'.
Re: Out of memory! in CGI
by Eliya (Vicar) on Feb 21, 2011 at 15:10 UTC

    Two ideas:

    You could check if the webserver process has the same ulimits as the shell you ran ulimit from.  For this, just have a CGI script run ulimit and return its output.
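
    For example, a throwaway CGI along these lines would show the limits the webserver process actually runs under (a minimal sketch; the file name is made up, and the backticks run the command via /bin/sh, where ulimit is a builtin):

        #!/usr/bin/perl
        # limits.cgi -- dump the ulimits of the webserver process (sketch)
        use strict;
        use warnings;

        print "Content-type: text/plain\n\n";
        print `ulimit -a`;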

    Since you get the "Out of memory!" after the last statement of the worker script, it could be that cleanup code in destructors, END blocks, or the Perl interpreter itself (I suppose the worker is a Perl script, too) is using more memory than is available at that point.  So you could try POSIX::_exit to exit the process immediately, bypassing any cleanup.
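
    In the worker, the tail end could look roughly like this (a sketch of the last lines only; the surrounding code is assumed):

        use POSIX ();

        print STDERR "worker ending";
        # _exit() terminates the process at once: no END blocks, no object
        # destructors, no global destruction pass by the interpreter
        POSIX::_exit(0);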

      Regarding ulimit - I did it earlier; the output from CGI is the same as from the shell.

      Regarding POSIX::_exit - it really makes a difference; there is no "Out of memory!" message when this function is called instead of the normal exit. I am not yet sure whether it is enough to solve the problem - it would be good to know what happens during the normal exit - but the "Out of memory!" message is gone. Thanks!

Re: Out of memory! in CGI
by scorpio17 (Canon) on Feb 21, 2011 at 15:08 UTC

    Check your web server configuration file. There may be a parameter that limits how much memory individual CGI scripts may use. With Apache, I think it's called RLimitMEM.
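
    For example, the directive could be set like this (a sketch; the 64 MB value is only an illustration, and the optional second argument is the hard limit):

        # httpd.conf (or the relevant <VirtualHost>/<Directory> section)
        RLimitMEM 67108864 max   # soft limit 64 MB, hard limit = OS maximum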

      The RLimitMEM directive was not set. I set it to max [max] and restarted Apache, but nothing changed.
Re: Out of memory! in CGI
by NetWallah (Canon) on Feb 21, 2011 at 16:12 UTC
    I had a similar issue recently when parsing thousands of XML files.

    I resolved it by increasing the swap partition size from 2 GB to 4 GB (physical memory was 2 GB).

         Syntactic sugar causes cancer of the semicolon.        --Alan Perlis

Re: Out of memory! in CGI
by sundialsvc4 (Abbot) on Feb 21, 2011 at 18:31 UTC

    I have certainly seen this (under most awkward and unpleasant circumstances...) when using Perl in a commercial shared-hosting environment. Hosting companies naturally impose absolute limits on the amount of RAM that a web request can use, and that limit can at times be fairly prohibitive. My solution was to switch the site to dedicated hosting, which turned out to be the best solution for this and a variety of other reasons, and was not significantly more expensive given the additional power gained. (I pretty much don't use shared hosting anymore, anywhere.)