Hi Corion,
True, but... all LWP::RobotUA gets you is (a) client-side processing of robot rules (i.e., once the user agent has downloaded robots.txt for a site, it will abort a request for a banned URL before ever sending it), and (b) an optional, configurable delay between requests, so your program can be a good "netizen" and avoid hammering websites too hard. None of this prevents the web server from evaluating your User-Agent identification string and applying its own robot rules to accept or reject your request.
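For illustration, here's a minimal sketch of that client-side behavior (the URL and contact address are placeholders). When robots.txt disallows a path, LWP::RobotUA synthesizes a 403 "Forbidden by robots.txt" response itself, without ever contacting the server; delay() sets the minimum time between requests, in minutes:

```perl
use strict;
use warnings;
use LWP::RobotUA;

# RobotUA requires both an agent name and a "from" contact address.
my $ua = LWP::RobotUA->new(
    agent => 'MyBot/0.1',                # hypothetical bot name
    from  => 'bot-admin@example.com',    # hypothetical contact address
);

# delay() is in minutes: 1/60 = wait at least one second between requests.
$ua->delay(1/60);

my $res = $ua->get('http://example.com/some/page');
if ($res->is_success) {
    print $res->decoded_content;
}
elsif ($res->code == 403 && $res->message =~ /robots\.txt/) {
    # RobotUA refused client-side: the site's robots.txt bans this URL,
    # so no request was ever sent to the server.
    warn "Blocked by robots.txt before the request was made\n";
}
else {
    # Any other failure, including a real 403 the server itself returned
    # after inspecting our User-Agent string.
    warn "Request failed: ", $res->status_line, "\n";
}
```

Note the two different 403s: the synthetic one RobotUA generates locally, and a genuine one the server may still send based on your User-Agent string, which is exactly the distinction above.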
Cheers,
Larry