Simply don't hammer the site. Slow your requests down by sleeping between them; sleep at least as long as the previous request took to be processed. Any other "circumvention ideas" will only lead to an arms race between you and the hosting people.
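A minimal sketch of such a self-throttling fetch loop, assuming HTTP::Lite and a hypothetical list of URLs (Time::HiRes supplies fractional-second time() and sleep()):

    use strict;
    use warnings;
    use HTTP::Lite;
    use Time::HiRes qw(time sleep);

    my @urls = ('http://example.com/page1', 'http://example.com/page2');

    my $http = HTTP::Lite->new;
    for my $url (@urls) {
        my $start = time();
        $http->reset;    # required before reusing an HTTP::Lite handle
        my $status = $http->request($url)
            or die "Request for $url failed: $!";
        my $elapsed = time() - $start;

        # ... process $http->body() here ...

        # Be polite: wait at least as long as the server took to answer,
        # so a slow (loaded) server automatically slows the crawler down.
        sleep($elapsed);
    }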
Note that the hosting people have no interest in your task; they likely only care about keeping the site up and keeping bots from crawling it.
Test your crawler on a local copy of some pages.
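One way to do that, as a sketch: serve previously saved pages from a local directory and point the crawler at localhost. The mirror/ directory and port 8080 here are assumptions, and HTTP::Daemon (from libwww-perl) stands in for whatever local server you prefer:

    use strict;
    use warnings;
    use HTTP::Daemon;
    use HTTP::Status qw(RC_NOT_FOUND);

    # Serve files saved under ./mirror/ so the crawler can be tested
    # against http://localhost:8080/ instead of the live site.
    my $d = HTTP::Daemon->new(LocalAddr => 'localhost', LocalPort => 8080)
        or die "Cannot start local server: $!";
    print "Test server at ", $d->url, "\n";

    while (my $c = $d->accept) {
        while (my $r = $c->get_request) {
            my $path = 'mirror' . $r->uri->path;    # map URL path to saved file
            -f $path
                ? $c->send_file_response($path)
                : $c->send_error(RC_NOT_FOUND);
        }
        $c->close;
    }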