nozz has asked for the wisdom of the Perl Monks concerning the following question:

Hi,

I have a huge XML file that gives out-of-memory errors when I try to parse it with XML::EasyOBJ. So what I've done is break up the large file: I read it line by line and write a small temporary file containing each pair of the tags I am interested in. I then create a new XML::EasyOBJ object and parse this small temp file. I then delete the file and create a new one for the next set of tags. I'm still getting the same out-of-memory error:

Out of memory during request for 1016 bytes, total sbrk() is 4266734992 bytes!

It's as though each XML::EasyOBJ object is not being destroyed when the temp file is unlinked.

I know I could raise the limits with ulimit, but I was wondering: what's wrong with the code?

A code snippet is below:

    open FILE, $file;
    my $in_entry = 'no';
    while (my $line = <FILE>) {
        if ($line =~ /<BIND-Interaction>/) {
            $in_entry = 'yes';
            open OUTPUT, '>/tmp/bind_single.xml';
            print OUTPUT $line;
        }
        elsif ($line =~ /<\/BIND-Interaction>/) {
            print OUTPUT $line;
            $in_entry = 'no';
            close OUTPUT;
            my $doc = new XML::EasyOBJ('/tmp/bind_single.xml');
            ## do the parsing here
            ## stop the parsing here
            unlink('/tmp/bind_single.xml');
        }
        elsif ($in_entry eq 'yes') {
            print OUTPUT $line;
        }
    }
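One way to check whether each $doc is really being freed is to watch DESTROY fire. A pure-Perl sketch (the Clean and Cycle packages are hypothetical stand-ins for the parser object): a lexically scoped object is reclaimed at the end of each loop iteration, unless it participates in a reference cycle — which is exactly what DOM-style trees tend to contain, and which Perl's reference counting cannot reclaim on its own.

```perl
use strict;
use warnings;

our $destroyed = 0;

package Clean;
sub new     { bless {}, shift }
sub DESTROY { $main::destroyed++ }

package Cycle;
sub new {
    my $self = bless {}, shift;
    $self->{self} = $self;    # circular reference, as in DOM-style trees
    return $self;
}
sub DESTROY { $main::destroyed++ }

package main;

for (1 .. 3) {
    my $doc = Clean->new;     # refcount drops to 0 each iteration
}
my $clean = $main::destroyed; # 3: DESTROY fired once per iteration

$main::destroyed = 0;
for (1 .. 3) {
    my $doc = Cycle->new;     # cycle keeps refcount > 0; memory accumulates
}
my $cyclic = $main::destroyed; # 0: nothing freed until global destruction

print "clean destroyed: $clean, cyclic destroyed: $cyclic\n";
```

If the accumulation comes from cycles in the underlying document object, unlinking the temp file cannot help; the fix is to break the cycle explicitly (XML::DOM, for instance, provides a dispose method for this) before $doc goes out of scope.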
cheers Rich

Replies are listed 'Best First'.
Re: XML::EasyOBJ and out of memory
by holli (Abbot) on Mar 18, 2005 at 12:43 UTC
    So what I've done is break up the large file: I read it line by line and write a small temporary file containing each pair of the tags I am interested in.
    Have a look at XML::Twig.
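    A minimal sketch of the streaming approach, assuming the record element is named BIND-Interaction as in the post (the surrounding BIND-Submit wrapper here is a made-up container for the demo; on the real file you would call parsefile instead of parse): XML::Twig reads the document incrementally, hands each completed element to a handler, and purge releases everything parsed so far, so only one record is in memory at a time.

    ```perl
    use strict;
    use warnings;
    use XML::Twig;

    my $count = 0;
    my $twig  = XML::Twig->new(
        twig_handlers => {
            'BIND-Interaction' => sub {
                my ($t, $elt) = @_;
                $count++;    # do the real per-record work here
                $t->purge;   # free the parsed portion of the tree
            },
        },
    );

    # Demo input; for the real file: $twig->parsefile($file);
    $twig->parse('<BIND-Submit>'
        . '<BIND-Interaction>x</BIND-Interaction>' x 3
        . '</BIND-Submit>');
    print "interactions seen: $count\n";
    ```

    This avoids the temp-file shuffle entirely, since no small files need to be written or unlinked.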


    holli, /regexed monk/
      good idea, thanks