in reply to Segmentation fault

You're getting a stack overflow in the refcounting garbage collector.

Increase your stack size to whatever is necessary.

On Unix with bash:

  ulimit -s 30000
  perl test.pl

runs fine (and probably has plenty of room left).
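
If you'd rather raise the limit from inside the script itself, here is a minimal sketch assuming the BSD::Resource module from CPAN; whether a mid-run raise actually takes effect can vary by platform, so treat it as a starting point rather than a guaranteed fix:

  # Sketch only: raise the soft stack limit to ~30 MB from within Perl.
  # Assumes BSD::Resource is installed. Note the units: ulimit -s counts
  # kilobytes, while setrlimit() takes bytes.
  use BSD::Resource qw(getrlimit setrlimit RLIMIT_STACK);

  my ($soft, $hard) = getrlimit(RLIMIT_STACK);
  setrlimit(RLIMIT_STACK, 30_000 * 1024, $hard)
      or warn "could not raise stack limit: $!";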

Re^2: Segmentation fault
by jettero (Monsignor) on Jun 25, 2007 at 20:18 UTC
    Doesn't it seem like perl should have a cleaner way of recovering from this?
    $SIG{__ATROOSS__} = sub { die "about to run out of stack space!!" };

    This and the regexp one have always bugged the crap out of me.

    -Paul

        Does anyone know if that's a general stack fix or just fixes the regex engine?
        Just the regex engine

        Dave.

      The problem is there is no portable way to determine that you are about to run out of stack space...


      We're not surrounded, we're in a target-rich environment!
        I know almost nothing about it ... but it seems to me you could set a static limit on architectures where there's no way to check how much stack space is left? Or is the stack dynamically allocated at unpredictable sizes on those platforms?

        Is it unreasonable to keep track of this sort of thing some place other than the stack? (I seem to recall the word heap for this context or something similar to it.)
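
        Something like the following is what I have in mind (just a rough sketch; the drain() name and the hash-of-hashes shape are made up for illustration): keep the pending work in a plain Perl array, which lives on the heap, instead of on the call stack.

          # Sketch: empty a deeply nested structure without deep recursion
          # by keeping a worklist on the heap instead of the call stack.
          sub drain {
              my @work = (shift);
              while (@work) {
                  my $node = pop @work;
                  next unless ref $node eq 'HASH';
                  push @work, values %$node;  # keep the children alive for now
                  %$node = ();                # detach them; nothing recurses
              }
          }

        Calling drain(\%h) before the hash goes out of scope would empty it one node at a time, so the destructor never sees anything deep.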

        -Paul

Re^2: Segmentation fault
by bloonix (Monk) on Jun 25, 2007 at 20:30 UTC
    Hello Joost, big thanks for your fast answer. Yes, if I increase the ulimit the script runs fine without a stack overflow. Do you also know why the segfault is only produced when the script ends, and not immediately?
      Do you also know why the segfault is only produced when the script ends, and not immediately?
      The stack overflow is triggered when the top-level hash goes out of scope. In your example that happens when the script ends, but that's just coincidence. If your hash goes out of scope before that, and your stack size isn't big enough, you'll get a segfault at that point:

      {
          my $i = 60000;
          my %h = (new => {});
          my $r = \%h;
          for (0..$i) {
              print $_, " ";
              $r = $r->{new};
              $r->{new} = {};
          }
          print "end 1\n";
      } # <-- hash goes out of scope; segfault here if the stack size is too low
      print "end 2\n";
      I'm not familiar with the implementation of perl's garbage collector, but I've seen this problem once before. In that case, as in this one, the problem was caused by a program letting a long linked list (i.e. item a holds a reference to item b, which holds a reference to item c, and so on) go out of scope.

      If you're really careful, you can clean up the linked list before it goes out of scope and prevent this error:

      {
          my $i = 60000;
          my %h = (new => {});
          my $r = \%h;
          my $end = $r;               # end of the list
          for (0..$i) {
              print $_, " ";
              $r = $r->{new};
              $r->{new} = {};
              $r->{new}->{old} = $r;  # keep back-references
              $end = $r->{new};
          }
          print "end 1\n";
          # Delete the list from the end: each node is already empty by the
          # time its last reference drops, so the frees never recurse.
          while ($end) {
              delete $end->{new};
              $end = delete $end->{old};
          }
      }
      print "end 2\n";
        If you're really careful, you can clean up the linked list before it goes out of scope and prevent this error:

        [two-way linked list example]

        True, but way too much trouble. You may keep the original structure and just add a non-recursive disassembly:

        my $i = 30000;
        my %h = (new => {});
        my $r = \%h;
        for (0..$i) {
            print $_, "\r";
            $r = $r->{new};
            $r->{new} = {};
        }
        $r = $h{new};   # keep the first node alive,
        undef %h;       # so emptying %h frees only %h itself
        while ($r) {
            # Stepping forward drops the last reference to the node we just
            # left, so the list is freed one node at a time, iteratively.
            $r = $r->{new};
        }
        print "end\n";