bluesplay106 has asked for the wisdom of the Perl Monks concerning the following question:

So I'm trying to create a web crawler, and for some reason Mechanize is giving me some weird errors. When I run my crawler it's fine at first, but then it starts giving me GET errors for every link. I tried validating the link in my browser and with $mech->get, and I received no error. Does Mechanize or the website I'm crawling have some sort of search limit? Thanks

Re: WWW::Mechanize giving GET Errors
by kcott (Archbishop) on Jun 13, 2013 at 16:19 UTC

    G'day bluesplay106,

    "So I'm trying to create a web crawler and for some reason Mechanize is giving me some weird errors. So when I run my crawler it's fine, but then it just starts giving me GET errors for every link. I tried validating the link on my browser and using $mech->get and I received no error. Does Mechanize or the website I'm crawling have some sort of search limit? Thanks"

    All the psychic monks are on a retreat this month. If you'd like an answer before their return, you'll need to provide some additional information:

    • What are the errors?
    • What code generates the errors?
    • Under what conditions does it run fine?
    • Under what conditions does it return errors?
    • What website are you crawling?

    See: How do I post a question effectively?

    -- Ken
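
    As a starting point for answering the first two questions yourself, here is a minimal sketch (the @links queue and URLs are placeholders, not from the original post) that turns off Mechanize's autocheck so a failed GET doesn't die, and instead reports the actual HTTP status for each link:

    ```perl
    use strict;
    use warnings;
    use WWW::Mechanize;

    # autocheck => 0 stops WWW::Mechanize from die()ing on a failed
    # request, so we can inspect each response ourselves.
    my $mech = WWW::Mechanize->new( autocheck => 0 );

    my @links = ( 'http://example.com/' );    # placeholder crawl queue

    for my $url (@links) {
        $mech->get($url);
        unless ( $mech->success ) {
            # status_line tells you *which* GET error you are
            # getting: 403, 404, 429, a timeout, and so on.
            warn sprintf "GET %s failed: %s\n",
                $url, $mech->response->status_line;
            next;
        }
        # ... process $mech->content here ...
    }
    ```

    Posting the status lines this prints, along with the code that produced them, would let the monks answer rather than guess.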

Re: WWW::Mechanize giving GET Errors
by Anonymous Monk on Jun 13, 2013 at 16:18 UTC
    Websites are free to be whatever they want, so if they want to block you, they'll block you; that's life.
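
    If that's what is happening — the first few requests succeed and then everything returns 403 or 429 — a common mitigation (sketched here with assumed values; the agent string, delays, and @links queue are illustrative, not from the original post) is to identify your crawler and throttle it:

    ```perl
    use strict;
    use warnings;
    use WWW::Mechanize;

    my $mech = WWW::Mechanize->new(
        autocheck => 0,
        # An identifying agent string (assumed value) so the site
        # admin can see who you are instead of blocking you outright.
        agent => 'MyCrawler/0.1 (crawler-contact address here)',
    );

    my @links = ( 'http://example.com/' );    # placeholder crawl queue

    for my $url (@links) {
        $mech->get($url);
        if ( $mech->status == 429 or $mech->status == 403 ) {
            warn "Server is refusing requests at $url; backing off\n";
            sleep 60;    # crude back-off; tune to the site's limits
            next;
        }
        sleep 2;         # be polite: pause between requests
    }
    ```

    Checking the site's robots.txt and terms of use first is also worth the minute it takes.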