haiihh has asked for the wisdom of the Perl Monks concerning the following question:
Greetings monks. I'm new to programming and new to Perl. I'm trying to fetch multiple pages with WWW::Mechanize using a for loop, but I can't store/gather the links across iterations. How can this be done?
```perl
for $cnt (1..3) {
    my $url = "http://www.foo.com/p/$cnt";
    $mech = WWW::Mechanize->new();
    $mech->get($url) or die "no such url";
    @link{$cnt} = $mech->find_all_links(tag => "a", class => "foo")
        or die "can't find link";
    @link = @link{$cnt};
    $linksize = scalar @link;
    $hlink = "got $linksize :\n";
    print $hlink;
    $linkurl .= $_->url . "\n" for @link;
    print $linkurl;
}
```
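One way to collect the links from every page is to push them onto a single array instead of assigning through the hash slice `@link{$cnt}` (which overwrites rather than accumulates). A minimal sketch, reusing the example URL and the link class `"foo"` from the question above (both placeholders, not a real site):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use WWW::Mechanize;

# One Mechanize object is enough; no need to recreate it each pass.
# By default Mechanize dies on a failed get() (autocheck is on).
my $mech = WWW::Mechanize->new();

my @links;    # a single array accumulates links from every page

for my $cnt (1 .. 3) {
    my $url = "http://www.foo.com/p/$cnt";    # placeholder URL from the question
    $mech->get($url);

    # find_all_links returns a list of WWW::Mechanize::Link objects
    my @found = $mech->find_all_links(tag => "a", class => "foo");
    warn "no matching links on page $cnt\n" unless @found;

    push @links, @found;    # append, rather than overwrite
}

print "got ", scalar @links, " links:\n";
print $_->url, "\n" for @links;
```

The key change is `push @links, @found;` inside the loop: each iteration appends its links, so after the loop `@links` holds the links from all three pages.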
Thanks in advance.

**Sorry if my query is not clear; I'm a newbie on PerlMonks. If more info is needed, please let me know.
Replies are listed 'Best First'.
Re: how to store link array in for loop with WWW::Mechanize
by Eily (Monsignor) on Nov 06, 2014 at 16:08 UTC
by Eily (Monsignor) on Nov 06, 2014 at 16:20 UTC
by haiihh (Initiate) on Nov 07, 2014 at 02:33 UTC