in reply to Fastest way to download many web pages in one go?

Here is a small example I wrote for you.
It parses all the hrefs out of the starting page
and downloads each of them into the pages/ directory,
forking at every link:
use strict;
use warnings;
use LWP::Simple;

my $page = get("http://perlmonks.org/index.pl?") or die "could not fetch start page";
-d 'pages' or mkdir 'pages' or die $!;

# fork one download per absolute link found on the start page
getLink($1) while $page =~ s{a href="(http://[^"]+)"}{};
1 while wait() != -1;    # reap the children as they finish

sub getLink {
    my $url = shift;
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    return if $pid;      # parent keeps scanning for more links

    # child: fetch the page and save it as pages/<pid>-<hostname>
    my ($host) = $url =~ m{^http://([\w.]+)};
    open my $out, '>', "pages/$$-" . ($host // 'unknown') or die $!;
    print {$out} get($url) // '';
    close $out;
    exit 0;
}
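For comparison, here is a minimal sketch of the same href extraction done with HTML::LinkExtor instead of a regex (assuming that module is installed); a real HTML parser is less easily confused by unexpected markup:

use strict;
use warnings;
use LWP::Simple qw(get);
use HTML::LinkExtor;

my $html = get("http://perlmonks.org/index.pl?") or die "could not fetch start page";

my @links;
my $parser = HTML::LinkExtor->new(sub {
    my ($tag, %attr) = @_;
    # collect only absolute http links from <a href="..."> tags
    push @links, $attr{href} if $tag eq 'a' and ($attr{href} // '') =~ m{^http://};
});
$parser->parse($html);
$parser->eof;

print "$_\n" for @links;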
Please correct any mistakes you notice;
I'll accept your corrections with all gratitude
and humility.
Thank you.

Re^2: Fastest way to download many web pages in one go?
by davido (Cardinal) on Oct 13, 2013 at 01:54 UTC

    What happens when you run it?


    Dave

      Well, it fetches the first page it is pointed at
      and then uncontrollably forks at every "a href" link on that page,
      getting each one and saving it to the local pages/ directory,
      naming the files $pid-domainname.domain.
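      To keep that forking under control, here is a minimal sketch that caps the number of simultaneous children and reaps them as they exit; the URL list and the pages/ directory are assumed to come from the steps above (the URLs are taken from @ARGV so the snippet runs on its own):

      use strict;
      use warnings;
      use LWP::Simple qw(get);

      my @links    = @ARGV;           # URLs to fetch, e.g. the extracted hrefs
      my $max_kids = 5;               # assumed cap on simultaneous downloads
      my $kids     = 0;

      for my $url (@links) {
          if ($kids >= $max_kids) {   # wait for one child before starting another
              wait();
              $kids--;
          }
          my $pid = fork();
          die "fork failed: $!" unless defined $pid;
          if ($pid == 0) {            # child: fetch one page, save it, exit
              my ($host) = $url =~ m{^http://([\w.]+)};
              open my $out, '>', "pages/$$-" . ($host // 'unknown') or die $!;
              print {$out} get($url) // '';
              close $out;
              exit 0;
          }
          $kids++;                    # parent moves on to the next link
      }
      1 while wait() != -1;           # reap whatever is still running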