in reply to XML Parsing Out of Memory error

This might actually be a perl autovivify bug. Check this out:
    use strict;
    my $ref;
    $ref->{'hash'}->[0]->{'hash2'}->{'hash3'} = 'go go gadget autovivify!';
    print $ref->{'hash'}->{'hash2'}->{'hash3'};
Produces 'Out of memory!' when run.

While this:

    use strict;
    my $ref;
    $ref->{'hash'}->[0]->{'hash2'} = 'go go gadget autovivify!';
    print $ref->{'hash'}->{'hash2'};
produces 'Bad index while coercing array into hash at t3.pl line 6.'

I tried a couple of different combinations; the trick to making it barf is to have TWO or more levels of dereferencing after the level you reference incorrectly.

This is Perl 5.6.1 on Red Hat Linux 7.2.

/\/\averick
perl -l -e "eval pack('h*','072796e6470272f2c5f2c5166756279636b672');"

Re: Re: XML Parsing Out of Memory error
by Matts (Deacon) on Feb 28, 2002 at 19:39 UTC
    It's actually a pseudohash bug, fixed in 5.7.2 (5.8 to be).

    Remember how pseudohashes work: they're actually arrays whose first element is a hash mapping field names to array positions. What's happening here is that perl is trying to create an enormous array. I can't recall the exact semantics, but it's something to do with extending the array to a size taken from the memory location of "0", or something like that. Maybe someone can find the p5p discussion on this, so I don't have to look for it ;-)
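
    For anyone who hasn't met pseudohashes: a minimal sketch of the layout described above (the field names and values here are made up for illustration). On a pseudohash-aware perl, $ph->{name} was roughly sugar for the explicit two-step lookup shown below, which is why a bogus index hash can send the array-extension code off into the weeds:

        use strict;
        use warnings;

        # A pseudohash is an array ref whose first element is a hash
        # mapping field names to positions in that same array.
        my $ph = [ { name => 1, age => 2 }, 'Alice', 42 ];

        # $ph->{name} desugared to: look the index up in slot 0,
        # then fetch that array element.
        print $ph->[ $ph->[0]{name} ], "\n";   # prints "Alice"
        print $ph->[ $ph->[0]{age}  ], "\n";   # prints 42

    The explicit form runs on any perl, old or new; only the $ph->{name} shorthand (and the fields pragma built on it) needed pseudohash support, which was dropped after the 5.8 era.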