Although 100/sec sounds fun, I rather think it would look like some sort of feeble DDoS attack and get my IP blocked. I've read that 5/sec is considered high by some spider writers. Apparently you can register your site with Google and set a parameter to limit its strike rate, though sometimes the Google spider just ignores it.
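Related to that: a Crawl-delay line in robots.txt asks spiders to leave so many seconds between requests. Plenty of polite spiders honour it, though Googlebot is one of the ones that doesn't (Google's own rate setting lives in their webmaster console instead). The 5-second figure here is just an example value, not a recommendation from this thread:

    # robots.txt -- ask well-behaved spiders to wait 5 seconds between hits
    User-agent: *
    Crawl-delay: 5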
I don't run a web server, but I bet the logs are just stuffed full of bots gathering pages.
I do wonder what a polite rate is, though: fast enough that the results don't go stale, but slow enough not to be annoying.
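For what it's worth, here's a minimal sketch of the kind of throttle I have in mind, using a plain LWP::UserAgent with a sleep between requests. The URLs, the 2-per-second ceiling, and the agent string are placeholders I've made up, not anything settled in this thread:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use LWP::UserAgent;
    use Time::HiRes qw(sleep time);

    # Placeholder URLs -- substitute whatever you're actually spidering.
    my @urls = map { "http://www.example.com/page$_.html" } 1 .. 10;

    my $max_per_sec = 2;                 # assumed "polite" ceiling; tune to taste
    my $min_gap     = 1 / $max_per_sec;  # minimum seconds between requests

    my $ua = LWP::UserAgent->new( agent => 'PoliteSpider/0.1' );

    my $last_hit = 0;
    for my $url (@urls) {
        my $wait = $min_gap - ( time() - $last_hit );
        sleep($wait) if $wait > 0;       # only pause if we're ahead of schedule
        $last_hit = time();

        my $res = $ua->get($url);
        printf "%s => %s\n", $url, $res->status_line;
    }

LWP::RobotUA does much the same thing with a per-host delay, and it honours robots.txt as well, which is probably the more respectable route when you're hitting other people's sites.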