I'm trying to parse the links in a webpage and then write a routine that will go to the destination of each link and scan for a set of keywords. If any of the keywords appear on a linked page, that page's contents will be copied.
I'm using LWP::Simple to retrieve the web page into $webpage. I'm planning on looking into the benefits of using HTML::LinkExtor to pull the links out of the page, and then I'll set up a subroutine to go through those links, fetch the contents of each one (using LWP::Simple again), search it for keyword matches, and decide whether that page needs to be copied.
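Roughly, here's the sort of thing I have in mind (untested; the starting URL, keyword list, and output filenames are just placeholders):

    use strict;
    use warnings;

    use LWP::Simple qw(get);
    use HTML::LinkExtor;
    use URI;

    # Placeholder starting page and keywords.
    my $start_url = 'http://example.com/';
    my @keywords  = ('foo', 'bar');

    my $webpage = get($start_url)
        or die "Couldn't fetch $start_url\n";

    # Grab the href from every <a> tag, resolving relative links
    # against the starting URL.
    my @links;
    my $extor = HTML::LinkExtor->new(sub {
        my ($tag, %attr) = @_;
        return unless $tag eq 'a' and $attr{href};
        push @links, URI->new_abs($attr{href}, $start_url)->as_string;
    });
    $extor->parse($webpage);

    # Visit each link and save the page if any keyword appears in it.
    for my $url (@links) {
        my $content = get($url) or next;
        next unless grep { $content =~ /\Q$_\E/i } @keywords;

        # "Copying" here just means writing the HTML to a local file;
        # the filename scheme is only an illustration.
        (my $file = $url) =~ s{[^\w.-]+}{_}g;
        open my $fh, '>', "$file.html" or next;
        print {$fh} $content;
        close $fh;
    }

The /\Q...\E/ is just there so any regex metacharacters in the keywords are matched literally.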
If anyone has any hints on a better way to do this, or any advice whatsoever before I get too invested in this approach, it would be greatly appreciated.
thanks,
perl neophyte cdherold