Hi,
I've recently built a web robot that is supposed to spider through an internet forum. The site has no robots.txt file. I estimate the robot will send about one request per second. Offhand that doesn't seem like much of a burden for the site (it has about 1,000 users online at any given time, so the extra traffic would be marginal), but just to be sure, I'd like to know whether I'd be violating any rules of netiquette.
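For what it's worth, here is roughly the fetch loop I have in mind: a minimal sketch, where the robot name, the contact address, and the `print` stand-in for my real parsing code are all placeholders. LWP::RobotUA enforces a per-host delay for me (it also checks robots.txt, which in this case doesn't exist, so everything is allowed):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use LWP::RobotUA;

    # Identify the robot and give the admin a way to reach me
    # (the name and address here are placeholders).
    my $ua = LWP::RobotUA->new('forum-spider/0.1', 'me@example.com');
    $ua->delay(1/60);   # minimum delay between requests, in minutes (= 1 second)

    for my $url (@ARGV) {           # page URLs passed on the command line
        my $res = $ua->get($url);   # get() waits as needed to honor the delay
        if ($res->is_success) {
            print $res->content;    # stand-in for the real parsing code
        }
        else {
            warn "failed to fetch $url: ", $res->status_line, "\n";
        }
    }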
I also thought about downloading the pages in parallel. How would I go about that? Multi-threading? If I implement it, the traffic will increase substantially for a short time. Could that be construed as an attack on the server?
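In case it clarifies the question, here is the sort of parallel fetcher I was picturing. This sketch uses forked processes rather than threads (Parallel::ForkManager is one common way to do it); the cap of 5 workers and the filename scheme are arbitrary choices of mine, not anything the site recommends:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use LWP::UserAgent;
    use Parallel::ForkManager;

    my $max_workers = 5;    # arbitrary cap; the real limit is a politeness question
    my $pm = Parallel::ForkManager->new($max_workers);

    my $ua = LWP::UserAgent->new(agent => 'forum-spider/0.1');   # placeholder name

    for my $url (@ARGV) {
        $pm->start and next;          # parent: spawn a child, move to the next URL
        my $res = $ua->get($url);     # child: fetch one page
        if ($res->is_success) {
            # save the page under a filename crudely derived from the URL
            (my $file = $url) =~ s{[^\w.-]+}{_}g;
            open my $fh, '>', $file or die "can't write $file: $!";
            print {$fh} $res->content;
            close $fh;
        }
        $pm->finish;                  # child exits here
    }
    $pm->wait_all_children;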
P.S. I realize I could clear some of this up by contacting the site administrator but, for various reasons, I don't want to do that yet, although I may in the future.
Replies are listed 'Best First'.

•Re: spidering, multi-threading and netiquette
  by merlyn (Sage) on Feb 21, 2004 at 12:12 UTC
    by dannoura (Pilgrim) on Feb 21, 2004 at 12:18 UTC
      by merlyn (Sage) on Feb 21, 2004 at 13:08 UTC
•Re: parallel downloading
  by fizbin (Chaplain) on Feb 21, 2004 at 13:48 UTC
    by flyingmoose (Priest) on Feb 21, 2004 at 17:35 UTC
    by hossman (Prior) on Feb 21, 2004 at 20:54 UTC