pwl0lwp has asked for the wisdom of the Perl Monks concerning the following question:

I have a script that uses & re-uses the same WWW::Mechanize object over & over again. After 194 uses (combinations of get, follow_link & submit), I suddenly get a "Bad Request" response from the server.

This happens reliably: I have a list of things that I'm iterating through, and if I change the list, it still stops after 194 successful calls.

I tried putting in a 5-second sleep once the number of calls hit 190, in case the server was getting annoyed (though that wouldn't normally produce a client error ??)

stack_depth is set to 10.

As a workaround I've undef'd & recreated the mech object every few iterations, which seems to have solved the problem, but I haven't found any reference to a limit.
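For reference, the workaround looks roughly like this sketch — the base URL, item list, and the recreate-every-50-calls interval are placeholders, not the real script's values:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use WWW::Mechanize;

# Placeholders for illustration only:
my $BASE_URL = 'http://example.com/';
my @items    = (1 .. 500);

my ($mech, $count) = (undef, 0);

for my $item (@items) {
    # Rebuild the object every 50 calls; cookies and page
    # history start fresh each time.
    if ($count % 50 == 0) {
        undef $mech;
        $mech = WWW::Mechanize->new( stack_depth => 10, autocheck => 0 );
    }
    $mech->get("$BASE_URL?item=$item");
    warn "item $item: " . $mech->status . "\n" unless $mech->success;
    $count++;
}
```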

Anyone seen something like this before and/or can explain what's going on ?


Re: WWW::Mechanize gets bad request after running OK for a while
by Anonymous Monk on Aug 19, 2016 at 03:36 UTC

    Which version of WWW::Mechanize/LWP are you using? It always pays to have the latest installed.

    Anyone seen something like this before and/or can explain what's going on ?

    The simple answer to all such questions (what's going on) is that webservers can be as crazy/dumb as they want.

    For example, the webserver sets a counter in a cookie which, after 194 requests, reaches a value the webserver doesn't know how to handle, so it rejects the request ("Bad Request").
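    One way to test that theory: WWW::Mechanize inherits its cookie jar from LWP::UserAgent (an HTTP::Cookies object), so you can dump it to see what the server has been setting, or wipe it so no counter can accumulate. A sketch (URL is a placeholder):

```perl
use strict;
use warnings;
use WWW::Mechanize;

my $mech = WWW::Mechanize->new();
$mech->get('http://example.com/');   # placeholder URL

# Inspect whatever cookies the server has set so far ...
print $mech->cookie_jar->as_string, "\n";

# ... or wipe the jar entirely between iterations.
$mech->cookie_jar->clear;
```

    If the "Bad Request" goes away with the jar cleared, a runaway cookie is the likely culprit.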