in reply to what modules you recommend for downloading hundreds of URLs per second in parallel?

I've heard LWP is slow and too CPU-intensive for crawlers

Then the first step should be to test that. Maybe it was too slow for somebody on their own weak machine, but it may be no issue for you.
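A minimal sketch of such a test, assuming a fetch count of 100 and using a file:// URL (fetching the script itself) so that network latency doesn't skew the numbers — this measures mostly LWP's own overhead:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;
use Time::HiRes qw(gettimeofday tv_interval);

my $n   = 100;              # number of fetches (an arbitrary assumption)
my $url = "file://$0";      # a local URL, so no network is involved

my $ua = LWP::UserAgent->new;
my $t0 = [gettimeofday];
my ($u0, $s0) = times;      # CPU time before

for (1 .. $n) {
    my $res = $ua->get($url);
    die $res->status_line unless $res->is_success;
}

my ($u1, $s1) = times;      # CPU time after
printf "%d fetches: %.2fs wall, %.2fs CPU\n",
    $n, tv_interval($t0), ($u1 - $u0) + ($s1 - $s0);
```

If the CPU column stays low compared to the wall-clock column on your machine, LWP's overhead is not your bottleneck.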

If it's too slow for you, try searching CPAN for "spider" and "crawler"; some of the results might help you.
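A CPAN search will also turn up modules like Parallel::ForkManager that manage a pool of worker processes for you. For illustration, here is a core-only sketch of the underlying pattern those modules wrap — a bounded pool of forked children, each fetching one URL (the pool size of 20 and taking URLs from @ARGV are assumptions):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;

my @urls = @ARGV;   # URLs to fetch, one child process per URL
my $max  = 20;      # pool size: at most this many children at once
my %kids;

for my $url (@urls) {
    # throttle: when the pool is full, reap one child before forking
    while (keys %kids >= $max) {
        my $pid = wait();
        delete $kids{$pid};
    }
    my $pid = fork();
    die "fork: $!" unless defined $pid;
    if ($pid == 0) {    # child: fetch one URL, then exit
        my $res = LWP::UserAgent->new(timeout => 30)->get($url);
        exit($res->is_success ? 0 : 1);
    }
    $kids{$pid} = 1;    # parent: remember the child and move on
}
1 while wait() != -1;   # reap any remaining children
```

Parallel::ForkManager gives you the same structure with less bookkeeping ($pm->start / $pm->finish), plus hooks for collecting results from the children.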

If you care a lot about CPU time, consider using curl or wget, which are written in C and are probably less CPU-intensive.
