perlmonkey2 has asked for the wisdom of the Perl Monks concerning the following question:
Then register the beginning URLs:

    my $ua = Spider::LWP->new($depth, $path, $max_sockets, $ignore, $exclude);
    $ua->duplicates(0);           # don't ignore duplicates here, as this is done in the subclass a gazillion times more efficiently
    $ua->cookie_jar({});          # where else would you store cookies?
    $ua->redirect(1);             # follow redirects. HACKED BASE LIBRARY to make this work with the subclass.
    $ua->in_order(1);             # do the URLs in order, as we randomize their entry into the queue
    $ua->remember_failures(0);    # don't remember failures here, as the lib stores the entire object; this is done in the subclass
    $ua->max_hosts($max_sockets); # max open requests at any given moment
    $ua->max_req(1);              # max requests per host
    $ua->nonblock(1);             # don't block on LWP::UserAgent socket reads
Then block until everything is done:

    $ua->wait(300); # block until we are all finished, or until everything has stopped for 5 minutes

Can anyone see anything wrong with this? What I think is happening (and maybe I'm way off course here) is that for some reason a socket gets BLOCKED and times out, and that timeout causes all the other sockets to time out.
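For comparison, here is a minimal sketch of the same register/wait cycle written directly against LWP::Parallel::UserAgent (no Spider::LWP subclass, which isn't shown in the post), with a per-connection timeout set so that one stalled socket is reaped on its own rather than dragging the whole run to the wait() deadline. The 30-second timeout and the use of @ARGV for the seed URLs are my assumptions, not the original code:

```perl
use strict;
use warnings;
use LWP::Parallel::UserAgent;
use HTTP::Request;

my $ua = LWP::Parallel::UserAgent->new;
$ua->timeout(30);     # assumed value: kill any single stalled connection after 30s
$ua->max_hosts(20);   # max open requests at any given moment
$ua->max_req(1);      # max requests per host
$ua->nonblock(1);     # don't block on socket reads
$ua->in_order(1);     # service the URLs in registration order

# register the starting URLs (here taken from the command line)
for my $url (@ARGV) {
    if (my $res = $ua->register(HTTP::Request->new(GET => $url))) {
        warn $res->error_as_HTML;   # registration itself failed
    }
}

# block until all requests finish, or until nothing has moved for 300s
my $entries = $ua->wait(300);
for my $entry (values %$entries) {
    my $response = $entry->response;
    printf "%s: %s\n", $response->request->url, $response->code;
}
```

The point of the per-connection timeout() is that wait(300) only guards against the whole spider going quiet; it does nothing about one connection sitting half-open while the rest drain, which is the symptom described above.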