Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

I'm writing Perl that searches collected process data for memory-leaking processes, and I'm getting either "out of memory" or "Memory fault(coredump)" errors. The code works fine on smaller machines with fewer processes, but it dies (on the last day of the week, argh!) when trying to create weekly summaries for busier boxes.

My questions:

  1. Is there a way to increase the memory available to Perl, so I can get past this without code changes?
  2. Is there a way to tell Perl to allow larger numbers of elements in subscripted arrays? Think of variables like:
    $process_details{$p_key}[ 5 ] = $mem_kbytes;
  3. Or is there something simple that I don't know enough about perl yet to have stumbled over?

I'd post the code, but it is huge, and I'm not sure that it would help. I'd like to try this in general first and then see what happens.

Thanks in advance, oh hooded ones!

20060926 Janitored by Corion: Added formatting, code tags, as per Writeup Formatting Tips


Replies are listed 'Best First'.
Re: Out Of Memory or Memory fault(coredump) errors, is there a way out?
by perrin (Chancellor) on Sep 26, 2006 at 19:28 UTC
    See this FAQ. There's also some info in the Camel Book aka Programming Perl.
Re: Out Of Memory or Memory fault(coredump) errors, is there a way out?
by Argel (Prior) on Sep 26, 2006 at 21:39 UTC
    Are you thinking of Java, where you can e.g. give the JVM a larger heap? Perl doesn't really work that way, so you should treat the out-of-memory error as legitimate. There is a chance you could build your own Perl that uses less memory than the one you are running now, but that sounds worse than throwing more memory at the problem. I don't see how you can avoid some code changes, so take a look at the FAQ perrin mentioned. You could also post your code (preferably inside code and readmore tags) and see if anyone offers advice.
Re: Out Of Memory or Memory fault(coredump) errors, is there a way out?
by badaiaqrandista (Pilgrim) on Sep 26, 2006 at 22:48 UTC
    How about using tied hashes? With those, you're trading memory for CPU cycles (and disk I/O).
    -cheepy-
Re: Out Of Memory or Memory fault(coredump) errors, is there a way out?
by andyford (Curate) on Sep 26, 2006 at 20:38 UTC
    I'm confused. Are you saying that you have a perl program that runs out of memory while looking for other processes that have run out of memory?

    andyford
    or non-Perl: Andy Ford

Re: Out Of Memory or Memory fault(coredump) errors, is there a way out?
by swampyankee (Parson) on Sep 27, 2006 at 17:35 UTC

    It may not even be a Perl issue: the id the script runs under (you've not mentioned the O/S being used…) may be prevented from allocating the memory Perl needs. If that's the case, there may be a way around the "out of memory" error you're getting from the Perl program, but raising those limits is most emphatically not risk free.

    One suggestion: use tie. This should minimize (but not eliminate) the required code changes, and reduce the chance of running out of memory by keeping the hash on disk. You could also use the deprecated dbmopen, but it's deprecated for a reason, and I would advise against introducing it into new code.
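    A minimal sketch of that suggestion (the file name, key format, and field layout below are made up for illustration): tie a hash to an on-disk DBM file via DB_File, so only the entries currently being touched live in RAM. Note that a plain tied DBM hash stores flat strings, so a hash-of-arrays like the poster's %process_details would need its per-process fields packed into a single value (or MLDBM from CPAN for nested structures):

    ```perl
    use strict;
    use warnings;
    use Fcntl;      # O_CREAT / O_RDWR flags for the tie
    use DB_File;    # bundled with Perl where Berkeley DB is available

    # Hypothetical file name; the tied hash lives on disk, not in memory.
    my %process_details;
    tie %process_details, 'DB_File', 'process_details.db',
        O_CREAT | O_RDWR, 0644, $DB_HASH
        or die "Cannot tie process_details.db: $!";

    # A DBM hash stores flat strings, so pack the per-record fields
    # (name, pid, day, memory) into one value instead of an array ref.
    $process_details{'sshd:1234'} = join "\t", 'sshd', 1234, 5, 10_240;

    my ($name, $pid, $day, $mem_kbytes) =
        split /\t/, $process_details{'sshd:1234'};
    print "$name uses $mem_kbytes KB\n";

    untie %process_details;
    unlink 'process_details.db';    # remove the demo file
    ```

    Because tie keeps the same hash syntax, most of the existing code should keep working; the main change is flattening the nested array into a packed string.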

    emc

    At that time [1909] the chief engineer was almost always the chief test pilot as well. That had the fortunate result of eliminating poor engineering early in aviation.

    —Igor Sikorsky, reported in AOPA Pilot magazine February 2003.