Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:
Hi monks, I seek your wisdom; I am pretty new to Perl.
I'm using Strawberry Perl on Windows XP to download multiple HTML pages, and I want each page in its own variable.
Right now I'm doing this, but as I see it, it fetches one page at a time and doesn't start the next until the current one has finished downloading:
my $page  = `curl -s http://mysite.com/page -m 2`;
my $page2 = `curl -s http://myothersite.com/page -m 2`;
There are about four links in total, so I wanted to keep it as simple as possible.
I looked into Parallel::ForkManager but couldn't get it to work. I also tried putting the Windows start command before curl, but then the page never ends up in the variable. Is there a simpler way to do this?
Thank you in advance.
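One simple approach, sketched here and untested on XP, is the core threads module that Strawberry Perl ships with: start one thread per URL, let each thread run the same curl backtick command the question already uses, and join() the threads to collect the pages. The URLs are the two from the question; the remaining links would just be added to the list.

#!/usr/bin/perl
use strict;
use warnings;
use threads;

# The pages to fetch: the two URLs from the question.
# The other two links mentioned would be added here.
my @urls = (
    'http://mysite.com/page',
    'http://myothersite.com/page',
);

# Start one thread per URL; each thread runs curl and
# returns the page body as a single string.
my @workers = map {
    my $url = $_;
    threads->create( sub { scalar `curl -s $url -m 2` } );
} @urls;

# join() blocks until a thread finishes and hands back its
# return value, so the pages arrive in the same order as @urls.
my @pages = map { $_->join } @workers;

print "got ", length( $pages[$_] // '' ), " bytes from $urls[$_]\n"
    for 0 .. $#urls;

On Windows this sidesteps the fork emulation that makes Parallel::ForkManager awkward there, while keeping the curl calls exactly as written; the downloads run concurrently because each backtick command executes in its own thread.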
Replies are listed 'Best First'.
Re: simple multithreading with curl
by BrowserUk (Patriarch) on May 19, 2013 at 18:31 UTC
by Anonymous Monk on May 20, 2013 at 04:07 UTC
by BrowserUk (Patriarch) on May 20, 2013 at 04:27 UTC
Re: simple multithreading with curl
by kennethk (Abbot) on May 19, 2013 at 18:07 UTC
Re: simple multithreading with curl
by choroba (Cardinal) on May 19, 2013 at 18:12 UTC
by Anonymous Monk on May 19, 2013 at 19:51 UTC