In reply to simple multithreading with curl

Not quite a one-liner:

#! perl -slw
use strict;
use threads;
use LWP::Simple;

my @pages = map $_->join,
            map async( sub{ get "http://$_[0]"; }, $_ ),
            qw[ www.bbc.co.uk www.ibm.com www.cnn.com www.microsoft.com ];

print substr $_, 0, 100 for @pages;

__END__
C:\test>1034235.pl
<!DOCTYPE html> <html lang="en-GB" > <head> <!-- Barlesque 2.45.9 --> <meta http-equiv="Content-Type
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-str
<!DOCTYPE HTML> <html lang="en-US"> <head> <title>CNN.com International - Breaking, World, Business
<!DOCTYPE html> <html class="en-gb no-js" lang="en" dir="ltr" xmlns:bi="urn:schemas-microsoft-com:m

With the rise and rise of 'Social' network sites: 'Computers are making people easier to use everyday'
Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
"Science is about questioning the status quo. Questioning authority".
In the absence of evidence, opinion is indistinguishable from prejudice.

Re^2: simple multithreading with curl
by Anonymous Monk on May 20, 2013 at 04:07 UTC
    I forgot to mention something:
    I need to manage each downloaded page differently, so I need to know which variable each page is in.
    I thought this would be easier for a noob like me;
    when reading the manuals & tutorials it seems so easy :)
      so I need to know which variable each page is in

      The pages will be in the array in the same order as the URLs are in the list, so you can pair each result with its URL by position.
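
      If you want to handle each page according to its source, one option is to key the results by URL. Here is a minimal sketch along the same lines as the code above; the @urls array, the %page_for hash, and the final length-report line are just illustrative names, not part of the original post:

      #! perl -slw
      use strict;
      use threads;
      use LWP::Simple;

      my @urls = qw[ www.bbc.co.uk www.ibm.com www.cnn.com www.microsoft.com ];

      # One thread per URL; join() returns the pages in the same order as @urls.
      my @pages = map $_->join,
                  map async( sub{ get "http://$_[0]"; }, $_ ),
                  @urls;

      # Pair each URL with its page so every download can be handled differently.
      my %page_for;
      @page_for{ @urls } = @pages;

      # Example: report how much was fetched from each site
      # (get() returns undef on failure, hence the // '' fallback).
      print "$_: ", length( $page_for{ $_ } // '' ), ' bytes' for @urls;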

