That said, although RFC 2616 is talking about persistent connections in this paragraph, I'd take it to heart even for non-persistent connections:
Clients that use persistent connections SHOULD limit the number of simultaneous connections that they maintain to a given server. A single-user client SHOULD NOT maintain more than 2 connections with any server or proxy. A proxy SHOULD use up to 2*N connections to another server or proxy, where N is the number of simultaneously active users. These guidelines are intended to improve HTTP response times and avoid congestion.

Finally, I'm a bit wary of the line "I estimate that the robot will send about one request per second." If that's your estimate, have some mechanism in place that kills the script when it goes above 90 requests/minute. I've seen far too many programs go wrong because of a simple misplaced comma to trust that some program I write won't suddenly go wild without doing some testing first.
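Here's a minimal sketch of such a kill-switch; the 60-second sliding window is my own choice, and check_rate() is just a hypothetical helper name:

    use strict;
    use warnings;

    my @timestamps;     # times of the most recent requests
    my $LIMIT  = 90;    # maximum requests allowed per window
    my $WINDOW = 60;    # window length in seconds

    # Die loudly if the request rate climbs above $LIMIT per $WINDOW.
    sub check_rate {
        my $now = time;
        push @timestamps, $now;
        # keep only the timestamps that fall inside the window
        @timestamps = grep { $now - $_ < $WINDOW } @timestamps;
        die "Runaway robot: over $LIMIT requests/minute, bailing out\n"
            if @timestamps > $LIMIT;
    }

Call check_rate() right before every request the robot sends; if a bug puts it into a tight loop, the script dies instead of hammering the server.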
The simplest way to do this is to log all requests to the screen and be fast with the Ctrl-C when things go bad.
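Something like this wrapper would do for the on-screen log, assuming the robot uses LWP::UserAgent (logged_get() is a hypothetical helper name). Note the unbuffered output, so the log keeps pace with the requests and the Ctrl-C lands in time:

    use strict;
    use warnings;
    use POSIX qw(strftime);
    use LWP::UserAgent;

    $| = 1;    # unbuffered STDOUT, so each request shows up immediately

    my $ua = LWP::UserAgent->new;

    # Print a timestamped line for every request before sending it.
    sub logged_get {
        my ($url) = @_;
        print strftime('[%H:%M:%S]', localtime), " GET $url\n";
        return $ua->get($url);
    }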