It's dying because of an uncaught exception, so, erm, perhaps catch it?
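If you've got autocheck turned on, get() will die on a failed fetch rather than hand back a response you can inspect, so catching it means wrapping the call in an eval (or Try::Tiny, if you prefer). A rough sketch only; the URL is just a stand-in:

use WWW::Mechanize;

my $m = WWW::Mechanize->new( autocheck => 1 );

my $resp = eval { $m->get('http://example.com/flaky') };
if ( my $err = $@ ) {
    warn "get blew up: $err";
    # decide here whether to retry, skip this page, or bail out entirely
}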
Update: Additionally, perhaps check the frelling status of your get before blithely and unconditionally proceeding on as if everything were hunky-dory?
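With autocheck off, that means actually looking at what get() hands back; roughly (a sketch, $m and $url being whatever you've already got in hand):

my $resp = $m->get($url);
unless ( $resp->is_success ) {
    warn "GET $url failed: ", $resp->status_line, "\n";
    return;    # or retry, or skip, as appropriate
}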
If you've got a site that intermittently has hiccups, something like this get wrapper might be useful:
use constant MAX_FETCH_ATTEMPTS => 5;
use constant RETRY_FETCH_DELAY  => 2;

# Retry a get() up to MAX_FETCH_ATTEMPTS times, sleeping between tries.
# Returns the last response either way, so the caller still needs to
# check is_success on whatever comes back.
sub _get_with_retry {
    my ( $m, $url ) = @_;
    my $tries = 0;
    my $result;
    until ( $tries == MAX_FETCH_ATTEMPTS() ) {
        $result = $m->get($url);
        last if $result->is_success;
        $tries++;
        sleep( RETRY_FETCH_DELAY() );
    }
    return $result;
}
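Used something like this (the URL is a stand-in, and since the wrapper hands back the last response even when every attempt failed, you still check it):

my $resp = _get_with_retry( $m, 'http://example.com/flaky' );
die "gave up after ", MAX_FETCH_ATTEMPTS(), " attempts\n"
    unless $resp->is_success;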
Update the second: Of course, _get_with_retry would be kinda useless with autocheck enabled. Then again, since you didn't bother to wrap your code in the correct tags, I missed that it was set.
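If you do want the wrapper, the simplest route is to turn autocheck off when you build the object (or else wrap the get inside the loop in an eval); a sketch of the former:

use WWW::Mechanize;
my $m = WWW::Mechanize->new( autocheck => 0 );   # failed gets now return a response instead of dying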
The cake is a lie.
In reply to Re: WWW::Mechanize goes bang by Fletch, in thread WWW::Mechanize goes bang by kansaschuck