Pstack has asked for the wisdom of the Perl Monks concerning the following question:
I am looking for ideas to contain and manage a Segmentation Fault that arises after several iterations of a batch process which, admittedly, is huge both in Perl code and in the data and data structures it builds (which may not be getting cleaned up properly when finished with). The system monitor shows 100% CPU usage and 98% memory usage after the crash.
The process suite, normally invoked for just single iterations, has given no problems before. But in a batch loop, after 5 to 12 or so iterations it runs out of puff with a Segmentation Fault. Eval does not trap the crash.
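A segfault arrives as a signal (SIGSEGV), not as a Perl exception, so eval {} has nothing to catch. A common workaround is to run each iteration in a forked child so that only the child dies and the parent can carry on. This is a minimal sketch of that idea, not the original code; @jobs and run_one_iteration() are hypothetical stand-ins for the real batch items and worker code.

    #!/usr/bin/perl
    use strict;
    use warnings;

    my @jobs = (1 .. 20);               # hypothetical list of batch items

    sub run_one_iteration {             # hypothetical stand-in for the real work
        my ($job) = @_;
        # ... load modules, build structures, write output ...
    }

    for my $job (@jobs) {
        my $pid = fork();
        die "fork failed: $!" unless defined $pid;

        if ($pid == 0) {
            # Child: do one iteration, then exit so every byte it allocated
            # is returned to the OS, whatever the modules leak internally.
            run_one_iteration($job);
            exit 0;
        }

        waitpid($pid, 0);               # parent waits for the child
        if (my $sig = $? & 127) {
            warn "job $job died with signal $sig (11 = SIGSEGV); continuing\n";
            # the parent is untouched, so the batch can retry or move on
        }
    }

Because the segfault kills only the child, the parent keeps its own modest footprint and can keep looping, retry the failed item, or just log it.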
The code is far too big and complex to go into here: it uses many external modules (some large), such as BerkeleyDB and Spreadsheet::WriteExcel, literally tons of hashes and arrays, and maybe 20 or so internal modules in the worst cases.
If I could somehow predict the overload (my best guess at the cause, given the circumstances) in advance, I could break the batch runs up into non-arbitrary smaller chunks; or if there were a module that could clean up after certain modules before they are re-invoked, perhaps I could make use of that. Unfortunately there is no message besides the raw "Segmentation Fault".
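On the prediction side, since this is Linux, one rough option is to read the process's own VmSize from /proc before each iteration and stop (or re-exec) the batch once it crosses a chosen ceiling. A minimal sketch, assuming /proc/$$/status is readable (it is on a 2.4 kernel); the threshold, @jobs and run_one_iteration() are again hypothetical.

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Pull the "VmSize:   123456 kB" line out of /proc/$$/status.
    sub vm_size_kb {
        open my $fh, '<', "/proc/$$/status" or return;
        while (my $line = <$fh>) {
            return $1 if $line =~ /^VmSize:\s+(\d+)\s+kB/;
        }
        return;
    }

    sub run_one_iteration { }            # hypothetical stand-in for the real work

    my $limit_kb = 1_500_000;            # hypothetical ceiling; tune for the box
    my @jobs     = (1 .. 20);            # hypothetical list of batch items

    for my $job (@jobs) {
        my $kb = vm_size_kb();
        if (defined $kb and $kb > $limit_kb) {
            warn "VmSize is ${kb} kB; stopping the batch before job $job\n";
            last;    # or re-exec the script here to continue with a clean slate
        }
        run_one_iteration($job);
    }

Stopping early (or restarting the interpreter) before memory is exhausted gives the non-arbitrary chunking without having to know in advance how many iterations are safe.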
OK, you probably get the idea. This is Perl 5.8.7 compiled on Red Hat 8 with Linux kernel 2.4.20.
Any helpful thoughts appreciated.
cheers
Pstack
Replies are listed 'Best First'.

Re: segmentation fault
  by superfrink (Curate) on May 05, 2007 at 07:10 UTC
  by Pstack (Scribe) on May 05, 2007 at 23:48 UTC
  by Pstack (Scribe) on May 06, 2007 at 23:16 UTC

Re: segmentation fault
  by RL (Monk) on May 05, 2007 at 14:01 UTC
  by Pstack (Scribe) on May 05, 2007 at 23:57 UTC