in reply to WWW::Mechanize memory leak???
Afternoon
Yeah, I get the same results (perl v5.8.0 built for i386-linux-thread-multi; WWW::Mechanize 0.70). As Roy Johnson suggested above, it's because WWW::Mechanize keeps a list of HTTP responses in a page stack: whenever it starts to fetch a new page, it pushes the last response it received onto an array, and that array grows without bound.
If this is a problem for you - for example, if you've got a long-running process and it's getting too fat - you should create a subclass of WWW::Mechanize that keeps a limit on the size of the page stack, perhaps by redefining the _push_page_stack method:

```perl
package WWW::Mechanize::KeepSlim;

our @ISA = qw/WWW::Mechanize/;

sub _push_page_stack {
    my $self = shift;
    if ( $self->{res} ) {
        my $save_stack = $self->{page_stack};
        $self->{page_stack} = [];
        push( @$save_stack, $self->clone );

        # HERE! - stop the stack getting bigger than 10 entries
        shift @$save_stack if @$save_stack > 10;

        $self->{page_stack} = $save_stack;
    }
    return 1;
}

package main;

my $agent = WWW::Mechanize::KeepSlim->new();
# ....
```
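As an aside: if I remember right, versions of WWW::Mechanize released well after the 0.70 discussed here grew a stack_depth option on the constructor that does this without any subclassing, so on a newer install the whole class above collapses to one line (check your installed version's docs before relying on it):

```perl
use strict;
use warnings;

# stack_depth caps the page stack; it is NOT available in 0.70,
# only in later releases of WWW::Mechanize.
use WWW::Mechanize;

my $agent = WWW::Mechanize->new( stack_depth => 10 );
```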
If you use this class with your example that demonstrates the problem, you should see the memory usage increase arithmetically for the first 10 requests, then stop increasing.
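To see why the growth stops, here's a pure-Perl sketch of the same shift-on-overflow logic in isolation, with plain strings standing in for the cloned agent objects (MAX_DEPTH and push_capped are illustrative names of mine, not part of WWW::Mechanize):

```perl
use strict;
use warnings;

use constant MAX_DEPTH => 10;

# Same idea as the redefined _push_page_stack above: push the new
# entry, then drop the oldest one if the stack has grown too big.
sub push_capped {
    my ( $stack, $item ) = @_;
    push @$stack, $item;
    shift @$stack if @$stack > MAX_DEPTH;
    return $stack;
}

my @page_stack;
push_capped( \@page_stack, "response $_" ) for 1 .. 30;

print scalar(@page_stack), "\n";   # depth stays at 10 after 30 pushes
print $page_stack[0], "\n";        # oldest surviving entry: "response 21"
```

The first 10 pushes grow the array; every push after that is matched by a shift, so memory usage levels off exactly as described.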
cheers
ViceRaid
Replies are listed 'Best First'.

- Re: Re: WWW::Mechanize memory leak??? by pg (Canon) on Jan 07, 2004 at 20:28 UTC
- Re^2: WWW::Mechanize memory leak??? by Anonymous Monk on May 09, 2018 at 07:36 UTC
- Re: Re: WWW::Mechanize memory leak??? by Anonymous Monk on Jan 08, 2004 at 15:52 UTC