in reply to HTTP::Request GET - url length restriction?

There is a maximum size a server will accept for a GET request. IIRC, it's often around 1024 bytes, but this varies from server to server (and from proxy to proxy). If you need more than that, use a POST. In general, you should probably be using POST whenever possible, unless you need to be proxy-friendly.
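To make the switch concrete, here is a minimal sketch with HTTP::Request::Common (the endpoint and form field are made up); the parameters that would have gone into the query string travel in the POST body instead, so the URL-length limit no longer applies:

```perl
use strict;
use warnings;
use HTTP::Request::Common qw(POST);
use LWP::UserAgent;

# Hypothetical endpoint and form data -- substitute the real ones.
my $req = POST 'http://example.com/map',
    [ postcodes => 'AB1 2CD,EF3 4GH' ];

print $req->method,  "\n";   # the parameters now live in the body,
print $req->content, "\n";   # not in the URL, so its length limit is moot

# To actually send it:
#   my $res = LWP::UserAgent->new->request($req);
```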


Re: Re: HTTP::Request GET - url length restriction?
by lovelost (Initiate) on Jan 13, 2004 at 09:33 UTC
    To be more specific: I am calling a mapping server with lots of postcodes, and it returns a map with those postcodes plotted on it. The company can only accept GET requests. Their server evidently accepts very long URLs, because my very long test example works fine with wget from the shell, and even when the URL is pasted into Mozilla / Netscape.

    I can't give an example because it would break data protection law, but the GET requests will be bigger than 1 KB. Note that the same request does work with LWP when it is a lot smaller. I'm restricted to Perl 5.6.1 on Red Hat 8. LWP is 5.6.9, I believe.

    Cheers
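For anyone wanting to reproduce the scenario above without real data, a minimal sketch (the endpoint and postcodes are invented) that builds a comparably oversized GET URL with the core URI module, which LWP uses internally:

```perl
use strict;
use warnings;
use URI;

# Hypothetical mapping endpoint and dummy postcodes -- substitute your own.
my $uri       = URI->new('http://maps.example.com/plot');
my @postcodes = ('AB1 2CD') x 200;   # enough data to push the URL past 1 KB

# query_form handles the URL-escaping of spaces and commas for us.
$uri->query_form( postcodes => join ',', @postcodes );

printf "URL length: %d bytes\n", length $uri;

# A request this size can then be attempted with:
#   my $res = LWP::UserAgent->new->get($uri);
```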

      Well, if you've tried the EXACT same request using Mozilla, Netscape, and wget and it still only broke under LWP, then--just out of curiosity--how do you know it worked under Mozilla, Netscape, or wget? Are you sure the server received the entire request with them, or did they just not raise errors like LWP did?

      You might want to contact Gisle Aas about this via the CPAN request tracker.