in reply to simple multithreading with curl

What did your Parallel::ForkManager code look like? Why didn't it work? We can't debug what we never see.

Why are you using a command-line call to curl when LWP::UserAgent does all this in Perl, and gives you more convenient error handling?

Why does it matter that you are downloading in series rather than in parallel? Parallel/threaded code is considerably more complicated, and the time saved on the downloads is probably less than the time you will spend writing and debugging it.
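For comparison, here is what the plain serial version looks like with LWP::UserAgent (a CPAN module, not core). The URLs are placeholders, and the error handling shown is the kind of thing a shelled-out curl does not give you for free:

```perl
use strict;
use warnings;
use LWP::UserAgent;   # CPAN module; install with cpan or cpanm

# Placeholder URLs, as in the original post
my @websites = (
    'http://mysite.com/page',
    'http://myothersite.com/page',
);

my $ua = LWP::UserAgent->new(timeout => 10);

my %pages;
for my $url (@websites) {
    my $result = $ua->get($url);
    if ($result->is_success) {
        $pages{$url} = $result->decoded_content;
    }
    else {
        # status_line gives e.g. "404 Not Found" or
        # "500 Can't connect to host" -- no parsing of curl's
        # exit codes or stderr required
        warn "$url: ", $result->status_line, "\n";
    }
}
```

If the serial run is fast enough, you are done; if not, the threaded version below is a straightforward extension.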

All that having been said, threads is core and will probably get your job done with the minimum of fuss. Demo code:

    use strict;
    use warnings;
    use LWP::UserAgent;
    use threads;

    my @websites = (
        'http://mysite.com/page',
        'http://myothersite.com/page',
        'http://myotherothersite.com/page',
        'http://myotherotherothersite.com/page',
    );

    my @threads;
    for my $url (@websites) {
        push @threads, threads->create(\&fetch, $url);
    }

    my @pages;
    for my $thread (@threads) {
        push @pages, $thread->join;
    }

    sub fetch {
        my $url    = shift;
        my $ua     = LWP::UserAgent->new;
        my $result = $ua->get($url);
        return $result->is_success
            ? $result->decoded_content
            : "Page retrieval failed";
    }
or, if you are comfortable with map:
my @pages = map $_->join, map threads->create(\&fetch, $_), @websites;
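One thread per URL is fine for a handful of pages, but for a long list you may want to cap concurrency with a fixed pool of workers. A sketch using core Thread::Queue; the fetch() stub here is a placeholder of mine (swap in the LWP-based fetch() from the demo above):

```perl
use strict;
use warnings;
use threads;
use Thread::Queue;

# Placeholder URL list (assumption for illustration)
my @websites = map { "http://site$_.example/page" } 1 .. 20;

# Preload the queue, then mark it finished so dequeue returns
# undef once the queue is drained (Thread::Queue 3.01+ for end())
my $queue = Thread::Queue->new(@websites);
$queue->end;

my $n_workers = 4;    # fixed pool size instead of one thread per URL
my @workers = map {
    threads->create(sub {
        my @got;
        while (defined(my $url = $queue->dequeue)) {
            push @got, fetch($url);
        }
        return @got;  # created in list context, so join returns a list
    });
} 1 .. $n_workers;

my @pages = map { $_->join } @workers;

sub fetch {
    my $url = shift;
    return "fetched $url";   # stub; real code would use LWP::UserAgent
}
```

Order of @pages is not preserved across workers; return [$url, $content] pairs if you need to match results back to their URLs.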

#11929 First ask yourself `How would I do this without a computer?' Then have the computer do it the same way.