I don't know exactly what you're trying to do, but as you mention:
somewhere on the order of 20 pages on about 200 URLs
It sounds to me like you're just trying to watch pages that you're linking to, and check whether they still resolve. If that's the case, you might try one of the specialized scripts that exist for this sort of thing, such as linklint, or search for 'link checker' with your favorite internet search engine.
...
As for other approaches ... if they're static pages, you can use GET but set a current 'If-Modified-Since' header, so that a live URL answers with a short 304 Not Modified instead of the full body. Otherwise, you can also start retrieving a document, then close the connection. (I don't know of any HTTP clients for Perl that do this ... you'd likely have to use IO::Socket and write your own.)
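A minimal sketch of the conditional-GET idea above, using LWP::UserAgent (the URL is a placeholder, and the timeout value is an arbitrary choice):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;
use HTTP::Date qw(time2str);

# Placeholder URL -- substitute the pages you actually link to.
my $url = 'http://www.example.com/';

my $ua = LWP::UserAgent->new( timeout => 10 );

# Conditional GET: with an If-Modified-Since of "right now", an
# unchanged static page should answer 304 Not Modified with no body,
# so we learn the URL resolves without downloading the whole page.
my $res = $ua->get( $url, 'If-Modified-Since' => time2str(time) );

if ( $res->code == 304 or $res->is_success ) {
    print "$url resolves (", $res->code, ")\n";
}
else {
    print "$url failed: ", $res->status_line, "\n";
}
```

As a rough stand-in for "start retrieving, then close the connection", LWP::UserAgent's max_size attribute ($ua->max_size(1024)) tells it to abort the retrieval after that many bytes, which may be close enough without dropping down to IO::Socket.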
In reply to Re: Check if URL exists
by jhourcle
in thread Check if URL exists
by sivel