nic12385 has asked for the wisdom of the Perl Monks concerning the following question:
I have also got code that can fetch a single page and write its content out to a file. Maybe my description is off, but this script works:

```perl
use strict;
use warnings;
use LWP::UserAgent;

my $UserAgent = LWP::UserAgent->new;
my $Request   = HTTP::Request->new( GET => 'http://www.website.com' );
my $Response  = $UserAgent->request($Request);

open my $fh, '>', '/strawberry/perl/file.txt'
    or die "Can't open file.txt for write: $!";
print $fh $Response->decoded_content;   # use the accessor, not the {_content} internals
close $fh;
```
Now I want to combine the two and have the 'get' loop through every single entry in my website.txt file and record the information. Here is the code that reads the file into an array, one entry per line:

```perl
use strict;
use warnings;

my $file = '/strawberry/perl/website.txt';
open my $fh, '<', $file or die "Can't open $file for read: $!";
my @lines = <$fh>;
print @lines;
close $fh or die "Cannot close $file: $!";
```
I have seen a lot of modules (and tutorials) that allow people to get info from web sites, but I haven't seen anything on accessing multiple sites. I don't care which module I use, really; I just want to be able to access a large list of web sites that are in a text file. Any thoughts? Thanks in advance. My guesses so far look like:

```perl
my $Request = HTTP::Request->new( GET => '@lines' );
# or
my $URL = get('@lines');
```
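For reference, one way the two snippets above could be combined is sketched below: read the URLs (one per line) from the list file, GET each one, and save each successful response to its own numbered file. The `fetch_list` sub and the `site1.txt`, `site2.txt`, ... output names are my own assumptions, not from the thread, and the paths are just the thread's examples.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;

# Read URLs (one per line) from $list, fetch each, and write each
# page's content to a numbered file in $dir. Returns the number of
# pages saved; failed fetches are warned about and skipped.
sub fetch_list {
    my ( $list, $dir ) = @_;
    open my $fh, '<', $list or die "Can't open $list for read: $!";
    chomp( my @urls = <$fh> );
    close $fh;

    my $ua    = LWP::UserAgent->new( timeout => 10 );
    my $saved = 0;
    for my $url (@urls) {
        next unless $url =~ /\S/;          # skip blank lines
        my $response = $ua->get($url);     # one GET per list entry
        unless ( $response->is_success ) {
            warn "$url: ", $response->status_line, "\n";
            next;
        }
        $saved++;
        open my $out, '>', "$dir/site$saved.txt"
            or die "Can't write site$saved.txt: $!";
        print $out $response->decoded_content;
        close $out;
    }
    return $saved;
}

# Example call, using the paths from the question:
fetch_list( '/strawberry/perl/website.txt', '/strawberry/perl' )
    if -e '/strawberry/perl/website.txt';
```

Note that `'@lines'` in single quotes is just a literal string; you need an actual loop (or a method like `get`) applied to each element of the array, as above.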
Replies are listed 'Best First'.
Re: "Web Scrapping" Using a List of Web Pages
by Corion (Patriarch) on Jun 26, 2014 at 20:59 UTC
Re: "Web Scrapping" Using a List of Web Pages
by Discipulus (Canon) on Jun 27, 2014 at 07:33 UTC