
Re: Fastest way to download many web pages in one go?

by ig (Vicar)
on Oct 12, 2013 at 09:56 UTC ( #1057977 )

in reply to Fastest way to download many web pages in one go?

Parsing will undoubtedly be very fast compared to downloading...

If parsing takes negligible time compared with DNS lookup and document retrieval, I would explore, and either accept or rule out, asynchronous DNS and HTTP solutions before pursuing parallel processing; but you don't say much about the complexity of the parsing or where your bottlenecks are.

Years ago I customized a very nice C library for parallel DNS queries and interfaced it to Perl, but I was doing millions of lookups per job. It may have been ADNS, accessible via Net::ADNS, but I don't recall with any certainty. If I understand correctly, you are only dealing with two domain names. Unless the name-to-address resolution is liable to change, you might do best to hard-code the IP addresses and dispense with DNS and its delays altogether.
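A minimal sketch of the hard-coded-address idea, assuming hypothetical hostnames and addresses (you would substitute your two real hosts and their resolved IPs). The trick is to request the IP directly while still sending the original Host header, so name-based virtual hosting on the server still works:

```perl
use strict;
use warnings;
use LWP::UserAgent;
use HTTP::Request;

# Hypothetical: the two hosts, resolved once by hand to fixed addresses.
my %ip_for = (
    'www.example.com' => '93.184.216.34',
    'www.example.net' => '93.184.216.35',
);

my $ua = LWP::UserAgent->new;

sub get_by_ip {
    my ($host, $path) = @_;
    # Connect to the address directly (no DNS lookup), but keep the
    # Host header so the server routes the request to the right site.
    my $req = HTTP::Request->new( GET => "http://$ip_for{$host}$path" );
    $req->header( Host => $host );
    return $ua->request($req);
}
```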

That leaves only the HTTP requests to execute in parallel. Depending on the volume you are downloading, asynchronous requests might suffice. Unless you have multiple NICs to go with your multiple CPUs, it's not clear that parallel processes would be of much benefit to the download time. Will it take longer to process a packet than it takes to receive it? I haven't used anything for asynchronous HTTP requests recently, but a quick search reveals HTTP::Async, which looks like it might be worth a try, in addition to LWP::Parallel::UserAgent, which you already found.
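HTTP::Async is straightforward to try; a sketch, with a hypothetical URL list and a slot count you would tune to your situation:

```perl
use strict;
use warnings;
use HTTP::Async;
use HTTP::Request;

# Hypothetical URLs; in practice, the pages you want to fetch.
my @urls = map { "http://www.example.com/page$_.html" } 1 .. 20;

my $async = HTTP::Async->new( slots => 10 );   # up to 10 requests in flight

# add() returns an id, so we can map responses back to their URLs.
my %url_of;
for my $url (@urls) {
    my $id = $async->add( HTTP::Request->new( GET => $url ) );
    $url_of{$id} = $url;
}

# Responses arrive as they complete, not in the order submitted.
while ( my ( $response, $id ) = $async->wait_for_next_response ) {
    printf "%s: %s\n", $url_of{$id}, $response->status_line;
}
```

All the requests proceed concurrently in one process, so while one response is in flight you are free to parse another; no forking or threads needed.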

If all your HTTP requests are done in parallel, persistent connections would be irrelevant: each connection would handle a single request. On the other hand, if you are also downloading linked resources (you don't say) then persistent connections might help.
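If you do end up fetching several resources per host sequentially, LWP's connection cache gives you persistent connections with one constructor argument; a sketch with hypothetical paths:

```perl
use strict;
use warnings;
use LWP::UserAgent;

# keep_alive => N enables LWP's connection cache for up to N hosts, so
# successive requests to the same server reuse one TCP connection
# instead of paying connection setup each time.
my $ua = LWP::UserAgent->new( keep_alive => 2 );   # two hosts in your case

for my $path (qw( /index.html /style.css /logo.png )) {
    my $res = $ua->get("http://www.example.com$path");   # hypothetical URLs
    print $res->status_line, "\n";
}
```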

If, after all, parsing time is not negligible, then you will face the challenges of parallel processing and IPC, but those can be dealt with independently of the download issue.
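Should it come to that, Parallel::ForkManager keeps the forking and IPC fairly painless; a sketch, assuming the documents have already been downloaded and using a trivial regex as a stand-in for your real parsing:

```perl
use strict;
use warnings;
use Parallel::ForkManager;

# Hypothetical pre-fetched documents.
my @docs = (
    { url => 'http://www.example.com/a', html => '<a href="/x">x</a>' },
    { url => 'http://www.example.com/b', html => '<p>no links here</p>' },
);

my $pm = Parallel::ForkManager->new(4);   # e.g. one worker per CPU

# Collect each child's result in the parent as the child exits.
$pm->run_on_finish( sub {
    my ( $pid, $exit, $ident, $signal, $core, $result ) = @_;
    printf "%s: %d links\n", $ident, $result->{count} if $result;
} );

for my $doc (@docs) {
    $pm->start( $doc->{url} ) and next;          # parent continues the loop
    my $count = () = $doc->{html} =~ /<a\s/gi;   # stand-in for real parsing
    $pm->finish( 0, { count => $count } );       # ship result back to parent
}
$pm->wait_all_children;
```

The second argument to finish() is serialized back to the parent, which handles the IPC for you.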

I would start with simple solutions and experiment with more complex options only if the simple ones prove to be inadequate, at which point I would have more specific problems to deal with.

