What I want to do is set up a main subroutine which grabs URLs from a database table, passes them one at a time to parse_page, and receives back a corresponding array of links. And to be perfectly clear, I want to do this serially.
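To make that concrete, here is a rough sketch of the driver loop I have in mind. It assumes the parse_page sub from the script below is in the same file; the DSN, credentials, table name (url_queue), and column name (url) are just placeholders, not my real schema:

use DBI;

sub main {
    # Placeholder DSN and credentials -- substitute the real connection info
    my $dbh = DBI->connect('dbi:mysql:dbname=crawler', 'user', 'password',
                           { RaiseError => 1 });

    # Hypothetical table "url_queue" with one URL per row in a "url" column
    my $sth = $dbh->prepare('SELECT url FROM url_queue');
    $sth->execute;

    # Strictly serial: feed one URL at a time to parse_page
    while (my ($url) = $sth->fetchrow_array) {
        my @links = parse_page($url);
        print "$url -> $_\n" for @links;
    }

    $dbh->disconnect;
}

main();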
I'm sorry if I have done something stupid. I have been sick all week, and I am working on less than my normal powers of abstract reasoning and concentration. I shamelessly leveraged the example code in the perldoc for HTML::LinkExtor and squinted at the screen for a while to get this far.
Dave Aiello
Chatham Township Data Corporation
#!/usr/local/bin/perl

use LWP::UserAgent;
use HTML::LinkExtor;
use URI::URL;

sub parse_page {
    my ($url) = @_;
    my $ua    = LWP::UserAgent->new;
    my @links = ();

    sub attachment_link_extractor {
        my ($tag, %attr) = @_;
        push(@links, values %attr)
            if (($tag eq 'a') && ($attr{href} =~ m/attachments/));
    }

    my $p = HTML::LinkExtor->new(\&attachment_link_extractor);

    $res = $ua->request(HTTP::Request->new(GET => $url),
                        sub { $p->parse($_[0]) });
    my $base = $res->base;

    @links = map {
        if ($_ =~ /^http/) {
            $_ = "/" . url($_, $base)->rel;
        }
        else {
            $_ = $_;
        }
    } @links;

    $p->links;

    return(@links);
}