RedGrinGo has asked for the wisdom of the Perl Monks concerning the following question:

Hello all, I've got a question. I have to parse 3 HTML responses from 3 different URLs, which means I need to make an HTTP::Request 3 times, concatenate all three responses, and then parse the result. Now, what I'd like is to fork those 3 requests so they run simultaneously, to reduce the running time, but I've never forked before. I tried to find information on the web but came up with nothing :-( Can anyone give me a hand here? Thank you all.

Replies are listed 'Best First'.
Re: multiple request with fork
by derby (Abbot) on Dec 10, 2009 at 13:44 UTC
Re: multiple request with fork
by Anonymous Monk on Dec 10, 2009 at 14:06 UTC
    Forking 3 different requests will not speed up the overall running time (unless you have an ultra-fast network, in which case hitting the links separately will also be fast), because the speed is limited by your network bandwidth. If you hit the links in parallel, you are dividing your bandwidth among these requests.
      Forking 3 different requests will not speed up

      ...but it could reduce the overall time taken if the servers are responding slowly (think of requests to perlmonks.org :) or the server side has limited bandwidth...

      There are two different things to consider: bandwidth and latency.

      Parallel requests don't help with (your own) bandwidth, but since the latency of the requests overlaps, the overall run time can still decrease.

      For large files the bandwidth is more important, for small files the latency.

      Perl 6 - links to (nearly) everything that is Perl 6.
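
      As a rough illustration with made-up numbers: if each of the three requests spends about 300 ms waiting for the server and 100 ms transferring data over your link, fetching them one after another costs roughly 3 x (300 + 100) = 1200 ms, while fetching them in parallel costs roughly 300 + 3 x 100 = 600 ms: the waiting overlaps, but the transfers still share the same bandwidth.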

      Hey nice!!!

      I didn't think about this one. So what you're saying is that it doesn't matter whether I make the requests simultaneously? That's interesting.

      Anyway, does anyone have an idea of how to do this, despite what you wrote above?

      I really want to try to do this with fork.
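
      A minimal sketch of the fork-and-pipe approach: one child per URL, each child fetches its page with LWP::UserAgent and sends the body back to the parent through a pipe. The URLs and the timeout are placeholders; adjust them and add whatever error handling you need.

          #!/usr/bin/perl
          use strict;
          use warnings;
          use LWP::UserAgent;

          # Placeholder URLs -- substitute the three real ones.
          my @urls = (
              'http://example.com/one',
              'http://example.com/two',
              'http://example.com/three',
          );

          my (@readers, @pids);

          for my $url (@urls) {
              pipe(my $reader, my $writer) or die "pipe failed: $!";
              my $pid = fork();
              die "fork failed: $!" unless defined $pid;

              if ($pid == 0) {                     # child: fetch one URL
                  close $reader;
                  my $ua   = LWP::UserAgent->new(timeout => 30);
                  my $res  = $ua->get($url);
                  my $body = $res->is_success ? $res->content : '';  # raw bytes; decode later if needed
                  print {$writer} $body;
                  close $writer;
                  exit 0;
              }

              close $writer;                       # parent keeps only the read end
              push @readers, $reader;
              push @pids,    $pid;
          }

          # Slurp each child's output, then reap the children.
          my $html = '';
          for my $fh (@readers) {
              local $/;                            # slurp mode
              my $body = <$fh>;
              $html .= $body if defined $body;
              close $fh;
          }
          waitpid($_, 0) for @pids;

          # $html now holds the three response bodies concatenated; parse it here.
          print length($html), " bytes fetched\n";

      Each child's body comes back through its own pipe, and because the parent reads every pipe to EOF before waiting on the children, even bodies larger than the pipe buffer won't deadlock. Parallel::ForkManager would tidy up the process management, but the plain fork above keeps the dependencies to core Perl plus LWP.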