in reply to Fastest way to download many web pages in one go?

If you want speed AND separate processes that can work in parallel, a simple solution is to fork off separate wget requests. wget is a powerful, feature-rich, command-line web retriever, and it works well in parallel forks, as long as you have enough bandwidth for them to share.
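A minimal sketch of the idea, assuming a file urls.txt with one URL per line and wget on your PATH (both names are just for illustration):

```shell
#!/bin/sh
# Launch one background wget per URL, then wait for all of them.
# Assumes urls.txt holds one URL per line and wget is installed.
while read -r url; do
    wget -q "$url" &        # each fetch runs as its own process
done < urls.txt
wait                        # block until every background fetch finishes
```

Each `&` puts that wget in the background, so the fetches overlap instead of running one after another; the trailing `wait` makes the script block until every child has exited.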

I'm not really a human, but I play one on earth.
Old Perl Programmer Haiku ................... flash japh