in reply to Re^4: Perl threads to open 200 http connections
in thread Perl threads to open 200 http connections
but it failed throwing the below error:
Hm. Runs fine under 5.8.9.
Also, in my original program I was trying to download 200 different files, changing the URL each time, but in the program you gave it looks like it downloads a single file 200 times. Is that right?
Sorry. I don't have a handy source of 200 different files at my disposal.
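One way to adapt the script to fetch 200 different files is to build a list of URLs up front and have each thread index into it by its $id. This is a hedged sketch, not part of the original code: the base URL and file-naming pattern below are placeholders you would replace with your own.

use strict;
use warnings;

## Hypothetical URL template; substitute your real server and naming scheme.
my $T = 200;
my @urls = map { "http://example.com/files/file$_.dat" } 1 .. $T;

## Inside the async sub, each thread would then fetch $urls[ $id - 1 ]
## instead of the single shared $url, e.g.:
##   getstore( $urls[ $id - 1 ], qq[c:/test/dl.t.$id] );
print scalar(@urls), " URLs generated; first is $urls[0]\n";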
#! perl -slw
use strict;
use threads ( stack_size => 4096 );
use threads::shared;
use LWP::Simple;
use Time::HiRes qw[ time sleep ];

our $T ||= 200;     ## This can be changed by a command-line argument -T=nnn

my $url = '...';    ## your url here

## This shared variable counts the number of running threads
my $running :shared = 0;

## This records the start time
my $start = time;

## For 1 to $T
for( 1 .. $T ) {
    ## start a new thread
    async(
        ## running this sub
        sub {
            ## Increment the running-threads count
            { lock $running; ++$running };

            ## Make all threads wait until all threads are running,
            ## so that the download requests all hit the server at the same time
            sleep 0.001 while $running < $T;

            ## The number (1..$T) passed in as $_ below
            my $id = shift;

            ## Get $url and store it in a file with $id as part of the name
            getstore( $url, qq[c:/test/dl.t.$id] );

            ## Now this thread is finished; decrement the count
            lock $running; --$running;
        },
        $_   ## $_ (1..$T) becomes $id inside.

        ## Detach means that the threads go away as soon as they are done,
        ## rather than hanging around consuming resources waiting to return
        ## a value to a join() that we have no interest in.
    )->detach;
}

## Now the main thread just sleeps until all the download threads have finished.
sleep 1 while $running;

## And tells you how long the whole thing took
printf "Took %.3f seconds\n", time() - $start;
Re^6: Perl threads to open 200 http connections
by robrt (Novice) on Aug 05, 2010 at 14:41 UTC