in reply to Building a Parallel Robot
Don't forget that spidering a site too rapidly is a good way to stop anyone else getting to it, and thus a good way to really annoy the webmaster.
If you're going to try some kind of parallel robot, then make sure you're not hammering all of their bandwidth. Sometimes it pays to do these things a little slower; it's more polite.
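By way of illustration only (this is my own sketch, not anything the original poster wrote: the module choice, the three-worker cap, and the two-second pause are all my own picks), something like Parallel::ForkManager lets you run a handful of fetchers in parallel while still pacing the requests:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;
use Parallel::ForkManager;

# Cap the number of simultaneous fetchers; three is already
# plenty to inflict on one site.
my $pm = Parallel::ForkManager->new(3);
my $ua = LWP::UserAgent->new( agent => 'polite-spider/0.1' );

my @urls = @ARGV;    # URLs to fetch, taken from the command line

for my $url (@urls) {
    $pm->start and next;    # fork; the parent moves on to the next URL
    my $res = $ua->get($url);
    print "$url: ", $res->status_line, "\n";
    sleep 2;                # hold this worker slot a couple of seconds
                            # so the site gets some breathing room
    $pm->finish;
}
$pm->wait_all_children;
```

The sleep keeps each worker's slot occupied for a while after its fetch completes, which caps the overall request rate even with several children running at once.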
If you're on dialup then it's not overly important, but I have been on the receiving end of a very high-bandwidth client spidering us, and I can assure you I wasn't saying 'Oh, how nice... they're showing an interest'. I was phoning a colleague to modify the firewall rules to stop them eating any more of our bandwidth before it choked everything.
Yes, I know this server could have been set up better, but then so could most of them out there on the Internet.
I haven't really used the Perl robot stuff in anger yet, although that may change soon. I do know that most spidering tools have a 'wait period' and a 'maximum bandwidth' setting, and I'd heavily recommend using them.
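If you do reach for the Perl stuff, LWP::RobotUA is the obvious starting point for that wait period: it honours robots.txt and won't hit the same host more often than its delay() setting allows. (There's no built-in bandwidth cap; you'd have to roll that yourself.) A minimal sketch, with a made-up agent name and contact address:

```perl
use strict;
use warnings;
use LWP::RobotUA;

# Identify the robot and give a contact address, so an unhappy
# webmaster can work out who to complain to.
my $ua = LWP::RobotUA->new('my-robot/0.1', 'me@example.com');

# The wait period: delay() is measured in minutes between requests
# to the same server, so 0.5 is thirty seconds.
$ua->delay(0.5);

my $res = $ua->get('http://www.example.com/');
print $res->status_line, "\n";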