in reply to WWW::Mechanize memory leak???
I don't believe it is leaking memory. Here's a quote from the perldoc for WWW::Mechanize:

    Mech also stores a history of the URLs you've visited, which can be queried and revisited.

Also, here's the output I get from your program:
    agent size = 123086
    agent size = 245249
    agent size = 367383
    agent size = 489517
    agent size = 611651
This shows me that the size is growing by a nearly constant amount on each loop. I then modified your program to use Data::Dumper and dump the object after five GETs. Here's the code:
    #!/usr/bin/perl -w

    use strict;
    use WWW::Mechanize;
    use Devel::Size qw(size total_size);
    use Data::Dumper;

    my $agent = WWW::Mechanize->new();

    for (my $i = 0; $i < 5; $i++) {
        $agent->get(qq(http://www.yahoo.com));
        print "agent size = " . total_size($agent) . "\n";
    }

    print Dumper($agent);
Looking at the output, there is a data structure inside the object called page_stack. I'm guessing that the implementation of the back() method uses the page_stack: rather than re-requesting the pages, they are simply reloaded from memory. I don't think this is a leak; it just appears to be the documented functionality of the module.
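If the growth is a concern in a long-running script, the WWW::Mechanize docs describe a stack_depth setting that caps the page history (a depth of 0 keeps no history at all). A minimal sketch of both ways to set it, assuming a recent WWW::Mechanize:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use WWW::Mechanize;

    # Keep no page history: with stack_depth => 0 the object should
    # stay roughly the same size no matter how many GETs you do.
    my $agent = WWW::Mechanize->new( stack_depth => 0 );

    # Or change it on an existing object; note that back() will no
    # longer be able to revisit earlier pages once the history is gone.
    # $agent->stack_depth(0);

With that change, re-running your loop should show the total_size holding roughly steady instead of climbing each iteration.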