Hi fellow Monks!
A Perl script I've just started writing needs to download a bunch of web pages from two different domains (up to 30 pages from each), parse each page to collect certain information, and then, at the very end (after all pages have been fetched and all the info extracted), print a "report" of sorts.
The script will be triggered manually, and each second it takes to complete is a second the user will spend twiddling their thumbs. Hence the desire for it to finish quickly.
Parsing will undoubtedly be very fast compared to downloading, so it is the latter where I'd really like to see some performance boost compared to simply doing sequential LWP::UserAgent requests.
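For reference, here is roughly the naive sequential baseline I'm hoping to beat; this is just a minimal sketch, and the example.com / example.org URLs are placeholders for the real pages:

    use strict;
    use warnings;
    use LWP::UserAgent;

    # Placeholder URLs: up to 30 pages from each of the two domains
    my @urls = ( map( "http://example.com/page$_", 1 .. 30 ),
                 map( "http://example.org/page$_", 1 .. 30 ) );

    my $ua = LWP::UserAgent->new( timeout => 15 );

    my @pages;
    for my $url (@urls) {
        my $response = $ua->get($url);
        if ( $response->is_success ) {
            push @pages, $response->decoded_content;    # parse these later
        }
        else {
            warn "Failed to fetch $url: ", $response->status_line, "\n";
        }
    }

    # ... parse @pages and print the report ...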
My (limited) experience with this kind of stuff suggests that one or more of the following might really help:

- performing the downloads in parallel, rather than strictly one after the other
- reusing connections for multiple requests to the same domain (HTTP keep-alive)
(Please tell me if I'm missing something entirely.)
Searching CPAN reveals many modules that seem able to help Perl developers with some of the above download acceleration techniques, including...
...which is kind of overwhelming.
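To make the question more concrete: here is an untested sketch of the parallel-downloads idea using HTTP::Async (picked more or less arbitrarily from the pile; the URLs are placeholders again):

    use strict;
    use warnings;
    use HTTP::Async;
    use HTTP::Request;

    my @urls = ( map( "http://example.com/page$_", 1 .. 30 ),
                 map( "http://example.org/page$_", 1 .. 30 ) );

    # 'slots' caps the number of simultaneous connections
    my $async = HTTP::Async->new( slots => 20, timeout => 15 );
    $async->add( HTTP::Request->new( GET => $_ ) ) for @urls;

    my @pages;
    while ( my $response = $async->wait_for_next_response ) {
        push @pages, $response->decoded_content if $response->is_success;
    }

    # ... parse @pages and print the report ...

The appeal is that responses are handled in whatever order they arrive, so a slow page on one domain shouldn't hold up the other domain's downloads. But I have no idea how this compares to the alternatives in practice.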
If there are any Monks out there who have experience with this kind of problem, would you mind sharing some of it with your fellow acolyte? :)
To the point:
Which CPAN module, or combination of CPAN modules, or other solution, is known to provide the best performance and reliability for doing a whole bunch of GET requests against two different domains?
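To clarify what I mean by "combination of modules": one thing I've been imagining is forking one worker per domain (e.g. with Parallel::ForkManager) and having each worker reuse a single keep-alive connection for its ~30 pages. An untested sketch, again with placeholder URLs:

    use strict;
    use warnings;
    use LWP::UserAgent;
    use Parallel::ForkManager;

    # Placeholder URL lists, one per domain
    my %jobs = (
        'example.com' => [ map "http://example.com/page$_", 1 .. 30 ],
        'example.org' => [ map "http://example.org/page$_", 1 .. 30 ],
    );

    my %pages;    # domain => arrayref of page bodies
    my $pm = Parallel::ForkManager->new( scalar keys %jobs );

    # Collect each child's results as it exits
    $pm->run_on_finish( sub {
        my ( $pid, $exit_code, $domain, $signal, $core_dump, $data ) = @_;
        $pages{$domain} = $data if $data;
    } );

    for my $domain ( keys %jobs ) {
        $pm->start($domain) and next;    # parent just continues the loop
        # keep_alive caches the connection so all ~30 requests can reuse it
        my $ua = LWP::UserAgent->new( keep_alive => 1, timeout => 15 );
        my @bodies;
        for my $url ( @{ $jobs{$domain} } ) {
            my $res = $ua->get($url);
            push @bodies, $res->decoded_content if $res->is_success;
        }
        $pm->finish( 0, \@bodies );    # child exits, handing its data back
    }
    $pm->wait_all_children;

    # ... parse %pages and print the report ...

But whether two forked workers with keep-alive actually beat a single-process approach like the HTTP::Async sketch above is exactly the kind of thing I'm hoping someone here has already measured.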