
I like it!

Re^3: Using LWP instead of wget?
by BrowserUk (Patriarch) on Jul 28, 2012 at 05:39 UTC

    I have to say, if you are simply downloading a big list of URLs and don't mind waiting, I'd skip the Perl completely and use:

    wget -nd -i urls.list
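
    Here urls.list is just a plain text file with one URL per line, e.g. (hypothetical addresses):

    http://www.example.com/page.html
    http://www.example.org/file.pdf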

    wget is fast enough, but serial: -nd suppresses the usual host/directory hierarchy, -i reads the URLs from the named file, and each download completes before the next one starts.
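
    For comparison, since the thread is about LWP: the serial LWP::Simple equivalent of that one-liner is only a few lines (a minimal, untested sketch; the tr filename mangling matches the threaded version below):

    #! perl -slw
    use strict;
    use LWP::Simple;

    while( my $url = <> ) {
        chomp $url;
        ( my $file = $url ) =~ tr[/:?*"][_];   # flatten URL into a filename
        my $status = getstore $url, $file;     # fetch and save; returns HTTP status
        $status == 200 or warn "$status : $url ($file)\n";
    }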

    If I were in a real hurry, and the URLs were spread across many servers, I might use something like this (untested):

    #! perl -slw
    use strict;
    use threads ( stack_size => 4096 );   # small per-thread stacks
    use Thread::Queue;
    use LWP::Simple;

    my $Q = Thread::Queue->new;

    sub worker {
        # An undef in the queue is the signal to stop.
        while( my $url = $Q->dequeue ) {
            chomp $url;
            ( my $file = $url ) =~ tr[/:?*"][_];   # flatten URL into a filename
            my $status = getstore $url, $file;
            $status == 200 or warn "$status : $url ($file)\n";
        }
    }

    our $THREADS //= 4;   # override with -THREADS=n

    ## queue every URL, plus one undef sentinel per worker
    $Q->enqueue( <>, ( undef ) x $THREADS );

    $_->join for map threads->create( \&worker ), 1 .. $THREADS;

    ## use as: thisScript -THREADS=8 < urls.list
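
    The ( undef ) x $THREADS tacked onto the end of the queue is the shutdown signal: each worker's dequeue loop falls through when it pulls an undef, so all the threads exit cleanly once the list is drained.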
