in reply to Re: segmentation fault
in thread segmentation fault

Some good thoughts there to work on, thanks.

>> ..try to run each job in a separate process..?

Do you mean with a system()'d or fork()+exec()'d call through a new system shell to a fresh child perl each time, from a parent perl script? If not, I'm at a bit of a loss as to how this might be arranged.
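For anyone else following along, here's a minimal sketch of the fork()+exec() arrangement as I understand the suggestion: the parent forks, and the child execs a brand-new perl for each job, so any stack or handle leakage dies with the child. The job list here is purely illustrative.

```perl
use strict;
use warnings;

# Hypothetical job list for illustration; each entry is code the
# fresh child perl will run (in the real case, a script plus args).
my @jobs = ('print "job A done\n"', 'print "job B done\n"');

for my $job (@jobs) {
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        # Child: exec a brand-new perl ($^X is the running interpreter),
        # so each job starts with a clean stack and heap.
        exec($^X, '-e', $job) or die "exec failed: $!";
    }
    waitpid($pid, 0);    # run jobs one at a time in the parent
    my $status = $? >> 8;
    die "job failed with status $status" if $status != 0;
}
print "all jobs done\n";
```

system() would do much the same in one call, at the cost of going through a shell.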

I agree that stack overflow seems the most likely culprit and I wonder then if you think my "threaded" perl installation could be somehow contributing? While this application makes no explicit use of threads maybe the interpreter does?

regards

Pstack

UPDATE:

A higher ulimit -s did not itself fix the problem, but it helped track it down: a bigger stack size allowed more successful runs before the segfault. What was incrementally clogging up the process stack appears to be runaway Berkeley DB cursor handles that were never properly closed out. I suppose Perl would not free them merely via "going out of scope", since they belong to the underlying C library. Thought you might like to know.
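For the record, closing the cursors explicitly seems the safe habit. A sketch using the BerkeleyDB module from CPAN (the filename and loop body are hypothetical, just to show the shape):

```perl
use strict;
use warnings;
use BerkeleyDB;    # Perl wrapper around the C Berkeley DB library

# Hypothetical database file for illustration.
my $db = BerkeleyDB::Btree->new(
    -Filename => 'jobs.db',
    -Flags    => DB_CREATE,
) or die "cannot open db: $BerkeleyDB::Error";

my $cursor = $db->db_cursor();
my ($k, $v) = ('', '');
while ($cursor->c_get($k, $v, DB_NEXT) == 0) {
    # ... process $k / $v here ...
}

# Explicitly release the cursor: the handle lives in the C library,
# so relying on the Perl wrapper going out of scope may not free
# the underlying resource promptly.
$cursor->c_close();
undef $cursor;
```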

btw, running so close to the default, it seems a pity not to be able to raise that shell stack limit from within Perl just for this routine (ie not altering the default itself)? My attempts anyway ran into a Wall.
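One possible route, if I understand it right: the BSD::Resource module on CPAN exposes setrlimit(), which should let a script raise its own soft stack limit without touching the shell's default. A sketch, assuming the module is installed (the 64 MB figure is illustrative):

```perl
use strict;
use warnings;
use BSD::Resource;    # non-core CPAN module

# Raise the soft stack limit for this process only; the shell's
# own default is untouched. 64 MB is an illustrative figure.
my ($soft, $hard) = getrlimit(RLIMIT_STACK);
my $want = 64 * 1024 * 1024;
$want = $hard if $hard != RLIM_INFINITY && $want > $hard;
setrlimit(RLIMIT_STACK, $want, $hard)
    or die "setrlimit failed: $!";
```

The soft limit can be raised only up to the hard limit, which is why the sketch clamps the request first.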

cheers

Pstack

Re^3: segmentation fault
by Pstack (Scribe) on May 06, 2007 at 23:16 UTC
    btw am making some progress via ulimit resets, getting feedback etc.