Randal Schwartz wrote an article on this subject, "Throttling Your Web Server," which might be useful.
Otherwise, I would think directives in your robots.txt might be sufficient to tell the spider either to slow down (there's a Crawl-delay directive) or to stop spidering your site altogether.
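As a minimal sketch, a robots.txt at your web root could look like this (the delay value and the "BadSpider" user-agent name are just placeholders for illustration, and keep in mind robots.txt is purely advisory, so only well-behaved crawlers honor it, and Crawl-delay in particular is not supported by every crawler):

    # robots.txt -- served from the web root
    # Ask crawlers to wait 10 seconds between requests:
    User-agent: *
    Crawl-delay: 10

    # Or tell one misbehaving spider to stay out entirely:
    User-agent: BadSpider
    Disallow: /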
If you are using Apache (or, I'd think, any modern HTTP server), you can of course simply deny requests from certain IP addresses.
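For example, assuming Apache 2.4 syntax (the 192.0.2.0/24 range is just a documentation-reserved example, not a real offender; older 2.2-style configs would use Order/Deny from instead):

    # In httpd.conf, a <Directory> block, or .htaccess:
    <RequireAll>
        Require all granted
        Require not ip 192.0.2.0/24
    </RequireAll>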