Sihal has asked for the wisdom of the Perl Monks concerning the following question:

Fellow monks,

I'm using XML::DOM to parse a 250 KB XML document.
The parser works fine, but my perl process takes up to 1.85 GB of memory while parsing. Is that normal? I suspect not, but I don't have even the slightest hint of what might be causing it, bearing in mind that once the document is parsed, the results are valid.

Thanks in advance for any hints on what might be causing this.

Replies are listed 'Best First'.
Re: Huge amount of memory used with XML::DOM
by Aristotle (Chancellor) on Dec 09, 2003 at 13:02 UTC
    Maybe this is related to the leak described in (XML::Parser) Finding and fixing a bug? Although I'm rather puzzled by the disparity between a 250 KB input size and a 1.9 GB process size. Something has to have gone seriously awry there.

    Makeshifts last the longest.

Re: Huge amount of memory used with XML::DOM
by artist (Parson) on Dec 08, 2003 at 18:10 UTC
    That's huge. If you post the code, we may be able to identify the problem.
Re: Huge amount of memory used with XML::DOM
by bear0053 (Hermit) on Dec 08, 2003 at 18:12 UTC
    Perhaps some of your code would help... post an example snippet of where the parsing is performed.
      Ok

      This is what I used to do:

      my $xml    = getXML($url);            # retrieve it thru http
      my $parser = new XML::DOM::Parser;
      my $doc    = $parser->parse($xml);
      # This first snippet would take as much as 2 Gigs of memory.
      Now, I do this:

      my $xml    = getXML($url);            # retrieve it thru http
      my $file   = writeToFile($xml) or return;
      my $parser = new XML::DOM::Parser;
      my $doc    = $parser->parsefile($file);
      # This isn't even noticeable.
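
      Whichever of the two calls is used, one thing the XML::DOM documentation does point out is that the DOM nodes hold circular parent/child references, so Perl's reference counting alone never reclaims a parsed document; dispose() has to be called on it once it is no longer needed. A minimal sketch along the lines of the snippets above (LWP::Simple stands in for the getXML() helper here, and the URL is only a placeholder):

      use strict;
      use warnings;
      use LWP::Simple qw(get);
      use XML::DOM;

      my $url = 'http://example.com/data.xml';            # placeholder URL
      my $xml = get($url) or die "could not fetch $url";

      my $parser = XML::DOM::Parser->new;
      my $doc    = $parser->parse($xml);

      # ... work with the tree ...
      print $doc->getDocumentElement->getTagName, "\n";

      # XML::DOM nodes refer to each other circularly, so the tree is never
      # freed by reference counting alone; dispose() breaks the cycles.
      $doc->dispose;
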


      Any ideas why the first form is so greedy with memory?
      I never found anything about this in the docs.

      Thanks a lot.