hi,
I've recently built a web robot that is meant to spider through an internet forum. The site has no robots.txt file. I estimate the robot will send about one request per second. Offhand that doesn't seem like much of a burden for the site (it has about 1,000 people online at any given time, so the extra traffic wouldn't be substantial), but just to make sure, I'd like to know whether I'd be violating any rules of netiquette.
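For reference, here's a minimal sketch of that throttled, single-threaded approach in Python. It assumes the third-party requests library; the URLs, User-Agent string, and contact address are placeholders, not anything from the actual site. A descriptive User-Agent also gives the admin a way to identify the spider and reach you.

```python
import time
import requests

# Hypothetical starting points; substitute the forum's real thread URLs.
URLS = [
    "http://example-forum.com/thread/1",
    "http://example-forum.com/thread/2",
]

# A descriptive User-Agent lets the admin identify (and contact) you
# instead of mistaking the spider for an attack.
HEADERS = {"User-Agent": "MyForumSpider/0.1 (contact: you@example.com)"}

DELAY_SECONDS = 1.0  # roughly one request per second

def fetch_all(urls):
    pages = {}
    for url in urls:
        try:
            resp = requests.get(url, headers=HEADERS, timeout=30)
            resp.raise_for_status()
            pages[url] = resp.text
        except requests.RequestException as exc:
            print(f"skipping {url}: {exc}")
        time.sleep(DELAY_SECONDS)  # be polite: pause between requests
    return pages

if __name__ == "__main__":
    fetch_all(URLS)
```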
I also thought of having the spider download pages in parallel. How would I go about doing this? Multi-threading? If I implement it, the traffic will increase substantially for a short time. Could that be construed as an attack on the server? One way to stay parallel without flooding the server is sketched below.
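If you do go parallel, one way to keep the aggregate request rate bounded is a small thread pool plus a global minimum interval between request starts. A minimal sketch, again assuming Python and requests; MIN_INTERVAL and MAX_WORKERS are illustrative numbers, not recommendations, and the throttling scheme is just one possible design:

```python
import threading
import time
from concurrent.futures import ThreadPoolExecutor

import requests

HEADERS = {"User-Agent": "MyForumSpider/0.1 (contact: you@example.com)"}

MIN_INTERVAL = 0.5   # global floor between request *starts*, in seconds
MAX_WORKERS = 4      # small pool: some parallelism without hammering the server

_lock = threading.Lock()
_last_start = 0.0

def _wait_for_slot():
    """Space out request starts so the overall rate stays bounded,
    no matter how many worker threads are running."""
    global _last_start
    with _lock:
        now = time.monotonic()
        wait = _last_start + MIN_INTERVAL - now
        if wait > 0:
            time.sleep(wait)
        _last_start = time.monotonic()

def fetch(url):
    _wait_for_slot()
    resp = requests.get(url, headers=HEADERS, timeout=30)
    resp.raise_for_status()  # a failed fetch raises and propagates to the caller
    return url, resp.text

def fetch_parallel(urls):
    with ThreadPoolExecutor(max_workers=MAX_WORKERS) as pool:
        return list(pool.map(fetch, urls))

if __name__ == "__main__":
    urls = [f"http://example-forum.com/thread/{i}" for i in range(1, 6)]
    for url, html in fetch_parallel(urls):
        print(url, len(html))
```

With this scheme the server never sees more than about 1/MIN_INTERVAL requests per second in aggregate, regardless of thread count, so a short parallel run looks no worse than a slightly faster polite client rather than a burst of simultaneous connections.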
p.s. I realize I could clear some of this up by contacting the site administrator, but at this point, for various reasons, I'd rather not, although I may in the future.