Reading between the lines of your last couple of OPs, you seem to be parsing multiple XML files with XML::Simple; and doing so in threads in the hope of preventing leaks?
From a couple of quick experiments, it seems that whichever underlying parser you use -- XML::SAX, XML::SAX::PurePerl, or XML::Parser -- they all leak substantial amounts of memory: XML::SAX more so than XML::Parser, but all in the range of 10 to 30MB per iteration for a 12MB XML file.
My suggestion to cure that -- essentially what Zentara suggested earlier, but with a twist -- is to run the parser in a separate process. I used this to parse the file and then export the structure back to the parent:
#! perl -slw
use strict;
use Storable qw[ freeze ];
use XML::Simple;

$XML::Simple::PREFERRED_PARSER = 'XML::Parser';

binmode STDOUT;
print freeze XMLin( $ARGV[ 0 ] );
And then in the main script just a simple backticks command and thaw:
use Storable qw[ thaw ];

for ( 1 .. 1000 ) {
    my $xml = thaw `xmlSto.pl junk.xml`;
    print mem $_;    # mem(): memory-usage helper (not shown here)
    <STDIN>;
}
It took about 1 extra second or so for the parent process to get access to the data structure, but it cures the leak completely.
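For completeness, the same idea works without a separate script file: fork a child, do the leaky work there, and ship the frozen structure back through a pipe, so the leaked memory dies with the child. This is only a minimal sketch of the pattern -- the `XMLin` call is replaced by a stub hash so it runs without XML::Simple installed, and `build_in_child` is an illustrative name, not anything from the posts above.

```perl
#!/usr/bin/perl
# Sketch: do the memory-hungry parse in a forked child and return the
# result to the parent via Storable, so any leak is reclaimed when the
# child exits. The stub hash stands in for XMLin( $file ).
use strict;
use warnings;
use Storable qw( freeze thaw );

sub build_in_child {
    my ($file) = @_;
    pipe my $rd, my $wr or die "pipe: $!";
    defined( my $pid = fork ) or die "fork: $!";
    if ( $pid == 0 ) {                    # child: leaky work happens here
        close $rd;
        binmode $wr;                      # Storable output is binary
        my $data = { file => $file, rows => [ 1 .. 5 ] };  # stand-in for XMLin($file)
        print {$wr} freeze $data;
        close $wr;
        exit 0;
    }
    close $wr;                            # parent: slurp the frozen bytes
    binmode $rd;
    my $frozen = do { local $/; <$rd> };
    waitpid $pid, 0;
    return thaw $frozen;                  # rebuild the data structure
}

my $data = build_in_child('junk.xml');
print "rows: @{ $data->{rows} }\n";
```

The pipe avoids the shell quoting and temp-file concerns of backticks, at the cost of only working where fork is cheap (on Win32, backticks to a separate script as above is the simpler route).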
In reply to Re: Memory Leak Package by BrowserUk, in thread Memory Leak Package by Bauldric