Afternoon
Yeah, I get the same results (perl, v5.8.0 built for i386-linux-thread-multi; WWW::Mechanize 0.70). Like Roy Johnson suggested above, it's because WWW::Mechanize keeps a list of HTTP results in a page stack: whenever it starts to fetch a new page, it pushes the last response it received onto an internal array (that's what makes back() possible).
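If you want to watch this happening, something like the following rough sketch should show the stack growing by one entry per request (the URL is just a placeholder, and it peeks at the page_stack internals the same way the subclass below does):

    use strict;
    use warnings;
    use WWW::Mechanize;

    my $agent = WWW::Mechanize->new();
    for my $i ( 1 .. 5 ) {
        $agent->get('http://www.example.com/');   # any reachable page will do
        # one entry is pushed per fetch after the first
        printf "after request %d, page stack holds %d entries\n",
            $i, scalar @{ $agent->{page_stack} };
    }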
If this is a problem for you - for example, if you've got a long-running process and it's getting too fat - you could create a subclass of WWW::Mechanize that keeps a limit on the size of the page stack, perhaps by redefining the _push_page_stack method:

    package WWW::Mechanize::KeepSlim;

    use strict;
    use warnings;
    use WWW::Mechanize;             # make sure the parent class is loaded
    our @ISA = qw/WWW::Mechanize/;

    sub _push_page_stack {
        my $self = shift;
        if ( $self->{res} ) {
            # temporarily empty the stack so clone() doesn't copy it into itself
            my $save_stack = $self->{page_stack};
            $self->{page_stack} = [];
            push( @$save_stack, $self->clone );
            # HERE! - stop the stack getting bigger than 10 entries
            if ( @$save_stack > 10 ) {
                shift(@$save_stack);
            }
            $self->{page_stack} = $save_stack;
        }
        return 1;
    }

    package main;
    my $agent = WWW::Mechanize::KeepSlim->new();
    # ....
If you use this class with your example that demonstrates the problem, you should see the memory usage grow with each of the first 10 requests, then stop increasing.
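To check it without having to watch top, you can peek at the stack directly (again a rough sketch; the URL is a placeholder):

    my $agent = WWW::Mechanize::KeepSlim->new();
    for my $i ( 1 .. 25 ) {
        $agent->get('http://www.example.com/');
        printf "request %2d: %d entries on the page stack\n",
            $i, scalar @{ $agent->{page_stack} };
    }
    # the count should climb to 10 and then stay there

(I believe more recent versions of WWW::Mechanize grew a stack_depth argument to new() that does much the same thing, so it may be worth checking your version before subclassing.)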
cheers
ViceRaid