in reply to Re^2: My Crawler script
in thread My Crawler script
Your crawler should read the robots.txt and follow its strictures, including skipping the site altogether if you see

    User-agent: *
    Disallow: /

or a "Disallow" that names your particular user agent.
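As a rough sketch of that check (in Python, using the stdlib `urllib.robotparser`; the agent names and URL here are made up for illustration):

```python
from urllib import robotparser

def blocked(robots_txt, agent, url):
    """Return True if this robots.txt forbids `agent` from fetching `url`."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return not rp.can_fetch(agent, url)

# Blanket ban: every crawler must skip the whole site.
ban_all = "User-agent: *\nDisallow: /\n"
print(blocked(ban_all, "MyCrawler", "https://example.com/page"))      # True

# Targeted ban: only the named agent is shut out; others may proceed.
ban_one = "User-agent: MyCrawler\nDisallow: /\n"
print(blocked(ban_one, "MyCrawler", "https://example.com/page"))      # True
print(blocked(ban_one, "SomeOtherBot", "https://example.com/page"))   # False
```

If you're writing the crawler in Perl, WWW::RobotRules (or LWP::RobotUA, which applies it automatically) gives you the same kind of check.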
I should note that some sites are a bit weird about who crawls them; at Blekko we had a certain site that wasn't sure it agreed with us on some philosophical points, to put it kindly, and it specifically blocked our crawler. This can happen, and being polite and following the robots.txt directives helps keep people from taking more aggressive action, like blocking your IP (or worse, your entire IP block).
(Edit: updated last sentence to clarify it slightly.)