If the list of URLs comes from a file somewhere, here's a script I had kicking around that fetches them concurrently and writes them to files:
C:\test>type getUrlList.pl
#! perl -slw
use strict;
use threads ( stack_size => 4096 );
use Thread::Queue;
use LWP::Simple;

my $Q = new Thread::Queue;

sub worker {
    ## Pull URLs until an undef sentinel signals shutdown
    while( my $url = $Q->dequeue ) {
        chomp $url;

        ## Derive a safe local filename by replacing URL punctuation with '_'
        ( my $file = $url ) =~ tr[/:?.^"][_]; #"

        ## Default to http:// if no scheme was given
        $url = 'http://' . $url unless $url =~ m[://];

        my $status = getstore $url, $file;
        $status == 200 or warn "$status : $url ($file)\n";
    }
}

our $THREADS //= 4;

## Queue all input lines, then one undef sentinel per worker
$Q->enqueue( <>, (undef) x $THREADS );

## Start the workers and wait for them all to finish
$_->join for map threads->create( \&worker ), 1 .. $THREADS;

## use as thisScript -THREADS=8 < urls.list
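For example, with a hypothetical urls.list of one host or URL per line (the -THREADS switch is the one noted in the script's trailing comment):

C:\test>type urls.list
www.example.com
www.example.org/index.html
http://www.example.net/

C:\test>perl getUrlList.pl -THREADS=8 < urls.list

Each worker dequeues until it pulls one of the undef sentinels enqueued after the URL list, so every thread exits cleanly once the input is drained and the joins return.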