in reply to Using LWP instead of wget?

At its simplest, LWP::Simple will do what you've described:

    use LWP::Simple;

    my $url  = '...';
    my $file = '...';

    print "Getting $url and storing to $file returned: ",
        getstore( $url, $file );
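
If you want the script to react to failures rather than just print the status code, LWP::Simple also re-exports is_success from HTTP::Status. A minimal sketch, assuming a placeholder URL and filename:

    use LWP::Simple qw( getstore is_success );

    # Hypothetical URL and output filename, for illustration only.
    my $url  = 'http://www.example.com/data.txt';
    my $file = 'data.txt';

    # getstore returns the HTTP status code of the response.
    my $status = getstore( $url, $file );
    is_success( $status )
        or die "Failed to fetch $url: HTTP status $status\n";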


Re^2: Using LWP instead of wget?
by kingram (Acolyte) on Jul 28, 2012 at 05:08 UTC
    I like it!

      I have to say, if you are simply downloading a big list of URLs and don't mind waiting, I'd skip the Perl completely and use:

      wget -nd -i urls.list
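
      (Here -nd stops wget from recreating the server's directory hierarchy locally, and -i reads the list of URLs from the named file.)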

      wget is fast enough, but it fetches the URLs serially.

      If I were in a real hurry, and the URLs were spread across many servers, I might use something like this (untested):

      #! perl -slw
      use strict;
      use threads ( stack_size => 4096 );
      use Thread::Queue;
      use LWP::Simple;

      my $Q = Thread::Queue->new;

      sub worker {
          # Pull URLs from the shared queue until an undef sentinel arrives.
          while( my $url = $Q->dequeue ) {
              chomp $url;

              # Flatten the URL into a safe local filename.
              ( my $file = $url ) =~ tr[/:?*"][_]; #"

              my $status = getstore $url, $file;
              $status == 200 or warn "$status : $url ($file)\n";
          }
      }

      our $THREADS //= 4;

      # Queue every URL from stdin, then one undef sentinel per worker.
      $Q->enqueue( <>, (undef) x $THREADS );

      # Start the workers and wait for them all to finish.
      $_->join for map threads->create( \&worker ), 1 .. $THREADS;

      ## use as: thisScript -THREADS=8 < urls.list
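
      The (undef) x $THREADS at the end of the enqueue is deliberate: each worker's dequeue loop ends when it pulls an undef sentinel, so queueing one sentinel per worker lets every thread drain the queue and then be joined cleanly.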
