in reply to Using URI::URL

Wrong question. You are trying to pull links from a web page, so you most likely should be using WWW::Mechanize instead:
    use strict;
    use warnings;
    use WWW::Mechanize;

    my $mech = WWW::Mechanize->new();
    $mech->get('http://some.site.com');
    print $_->url, "\n" for @{ $mech->links };

jeffa

L-LL-L--L-LL-L--L-LL-L--
-R--R-RR-R--R-RR-R--R-RR
B--B--B--B--B--B--B--B--
H---H---H---H---H---H---
(the triplet paradiddle with high-hat)

Re: Re: Using URI::URL
by Anonymous Monk on Feb 21, 2004 at 17:53 UTC
    Does WWW::Mechanize follow the rules set by LWP::RobotUA? I know that before, if I tried to get() something that a robots.txt file didn't allow me to, my get came up empty (a good thing). Does WWW::Mechanize let me do the same thing? I have been to CPAN and looked at it, but didn't see anything about obeying the rules. Thanks
      Kudos to you for wanting polite bots. The problem with getting LWP::RobotUA to play nicely with WWW::Mechanize is that they are both subclasses of LWP::UserAgent, so you cannot simply stack one on top of the other. By itself, WWW::Mechanize does not consult the /robots.txt file, but you can check each URL yourself with WWW::RobotRules. There might be a better way though ... ahh, how about "WWW::Mechanize::Polite"? And if I didn't just reinvent a wheel, you might be seeing this on the CPAN. ;)

      Here is a minimal sketch of that manual approach, trying to grab two files from a server (the host, paths, and agent name below are only placeholders):
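
      use strict;
      use warnings;
      use WWW::Mechanize;
      use WWW::RobotRules;

      # placeholders -- substitute a real host, paths, and agent name
      my $agent = 'PoliteBot/0.1';
      my $host  = 'http://example.com';
      my @paths = ('/allowed.html', '/private/secret.html');

      # autocheck => 0 so a failed get() returns a response instead of dying
      my $mech  = WWW::Mechanize->new( agent => $agent, autocheck => 0 );
      my $rules = WWW::RobotRules->new($agent);

      # fetch and parse the server's robots.txt first
      my $robots_url = "$host/robots.txt";
      my $res = $mech->get($robots_url);
      $rules->parse($robots_url, $res->decoded_content) if $res->is_success;

      # only fetch a URL when the parsed rules allow it
      for my $path (@paths) {
          my $url = "$host$path";
          if ($rules->allowed($url)) {
              $mech->get($url);
              print "fetched $url\n";
          }
          else {
              print "robots.txt forbids $url -- skipping\n";
          }
      }

      Note that WWW::RobotRules only parses the file and answers allowed(); actually skipping the disallowed URLs is up to the calling code, as in the loop above.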

      jeffa

      
        Thanks jeffa, you rock!