in reply to How to put a fat program on a (memory) weight-loss diet? [SOLVED]

There's no such thing as running out of memory without failing outright. What does it do when it runs out of memory?

Re^2: How to put a fat program on a (memory) weight-loss diet?
by sundialsvc4 (Abbot) on Mar 04, 2009 at 16:17 UTC

    I can now see conclusively that the failure occurs while DBD::mysql is being loaded (specifically, while mysql.so is being loaded under conditions otherwise known to be correct): the shared object is found and loaded, but the module then fails to initialize. This produces the message:

    Undefined subroutine &DBD::mysql::db::_login called ...
    ... because the _login subroutine is implemented in the XS-extension .so that never successfully initialized. (Other messages, such as complaints that this-or-that “unexpectedly increased,” can occur for the same fundamental reason.)
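    A minimal sketch of how that symptom can be probed, assuming only that DBD::mysql is installed: the pure-Perl half of the driver may load fine, but if the XS half never bootstraps, entry points such as DBD::mysql::db::_login are simply not defined.

        #!/usr/bin/perl
        use strict;
        use warnings;

        # If mysql.so fails to initialize, the Perl side of the driver can
        # still be present while its XS entry points are missing.
        my $loaded = eval { require DBD::mysql; 1 };

        if (!$loaded) {
            print "DBD::mysql did not load at all: $@\n";
        }
        elsif (!defined &DBD::mysql::db::_login) {
            print "DBD::mysql loaded, but its XS extension never initialized\n";
        }
        else {
            print "DBD::mysql and mysql.so appear to be healthy\n";
        }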

    Despite that error, the program keeps going, by design, and it even produces a credible “an error occurred” output of its own making. (In my local tests it does that about 50% of the time; the rest of the time it bombs with Perl's sudden-death “Out of memory!”.)

    As the final coffin-nail in the diagnosis: the program is known to work properly both in my test rig and in the hosting company's test rig, and I can induce it to fail ... 50% or so of the time, “in the same way” ... through the use of ulimit as described in my other recent threads.
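    For completeness, a sketch of applying the same squeeze from inside Perl rather than from the shell's ulimit builtin; it assumes the CPAN module BSD::Resource is installed and that the platform supports RLIMIT_AS, and the 38-megabyte figure is simply the limit observed in the CGI environment.

        #!/usr/bin/perl
        use strict;
        use warnings;
        use BSD::Resource qw(setrlimit RLIMIT_AS);

        # Cap the address space at roughly 38 MB before the driver loads,
        # mimicking the limit imposed on the CGI process.
        my $cap = 38 * 1024 * 1024;
        setrlimit(RLIMIT_AS, $cap, $cap)
            or die "setrlimit failed: $!";

        # Now try to bring in the XS driver under that cap.
        if (eval { require DBD::mysql; 1 }) {
            print "DBD::mysql loaded under the cap\n";
        }
        else {
            print "DBD::mysql failed under the cap: $@\n";
        }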

    So the fundamental nature of the problem, by now, is conclusively known ... copious thanks to the Esteemed Monks!

      This doesn't sound at all like running out of memory. It sounds like failing to access files or having a broken DBD::mysql. The only time you have memory problems is when you cause them with ulimit.

        Having wrestled with this problem for as long as I have, I believe I have accumulated enough evidence to conclude that it really is memory-related and that all of the intended files are in fact being accessed.

        When the program runs as a CGI (and only then; not in the hosting company's test rig, and not from the command line), it can be shown to be running under a ulimit of approximately 38 megabytes, and that limit can be shown to exist only in that case. Whenever the program is run under the same limit, in any environment, the failure scenario can be reproduced.
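        One way to see that limit directly (only a sketch, and Linux-specific, since it assumes /proc/self/limits exists): have the CGI script report the limits imposed on its own process, rather than inferring them.

            #!/usr/bin/perl
            use strict;
            use warnings;

            # Dump the resource limits the web server imposed on this very
            # CGI process.
            print "Content-type: text/plain\n\n";

            if (open my $fh, '<', '/proc/self/limits') {
                print while <$fh>;
                close $fh;
            }
            else {
                print "Could not read /proc/self/limits: $!\n";
            }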

        I grant that you are entirely correct in your theory as it applies to a great many cases, and I would stress, “to those who come after us,” that the recommendation you make should be considered carefully. But I feel comfortable now in saying that the root cause in this case really is “memory.” I say that in part because I have demonstrated that the correct .so files are being found, in the correct locations, and loaded (but they do not initialize), and furthermore that I have been able to capture the $@ output from that failed initialization.
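        And a sketch of the kind of check behind that last claim, assuming a typical Linux install where the shared object is named mysql.so (the extension differs on some platforms): walk @INC for auto/DBD/mysql/mysql.so, which is where DynaLoader looks for it, and then capture whatever $@ reports when the driver is actually pulled in.

            #!/usr/bin/perl
            use strict;
            use warnings;
            use File::Spec;

            # Report every candidate mysql.so visible through @INC.
            for my $dir (@INC) {
                my $so = File::Spec->catfile($dir, qw(auto DBD mysql mysql.so));
                print "found: $so\n" if -e $so;
            }

            # Capture whatever the loader says when the driver is pulled in.
            unless (eval { require DBD::mysql; 1 }) {
                print "load error: $@\n";
            }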

        Thank you, nevertheless, for the important suggestion: it is what one should consider first; and in fact, I did so.