Hi,
I'm looking for the *fastest* way to download 100 URLs in parallel - ideally using Perl.
I'm currently using fork() and LWP::Simple but would prefer not to spawn 100+ sub-processes. I've looked at Perl threads but want to steer clear of them until they're stable.
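For reference, here's a stripped-down sketch of the kind of fork()/LWP::Simple loop I mean (the real URLs, filenames and error handling are different - this is just the shape of it):

<code>
#!/usr/bin/perl
use strict;
use warnings;
use LWP::Simple qw(getstore);

# placeholder list -- the real script gets its 100 URLs from elsewhere
my @urls = map { "http://example.com/page$_.html" } 1 .. 100;

my @pids;
for my $i (0 .. $#urls) {
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        # child: fetch one URL to disk, then exit
        getstore($urls[$i], "download_$i.html");
        exit 0;
    }
    push @pids, $pid;    # parent: keep spawning children
}
waitpid($_, 0) for @pids;    # reap all the children
</code>

It works, but it means 100 simultaneous processes, which is exactly what I'd like to avoid.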
I'm on a Linux machine --- does anyone know of a low-level C program that will do the I/O in parallel, ideally with a Perl wrapper?
Is LWP::Parallel the fastest Perl way to do this?
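If LWP::Parallel is the way to go, this is roughly what I'm picturing, adapted from the LWP::Parallel::UserAgent documentation (untested on my end, URLs are placeholders):

<code>
use strict;
use warnings;
use LWP::Parallel::UserAgent;
use HTTP::Request;

my @urls = map { "http://example.com/page$_.html" } 1 .. 100;  # placeholder list

my $pua = LWP::Parallel::UserAgent->new();
$pua->in_order(0);      # don't care about ordering, just speed
$pua->duplicates(0);    # ignore duplicate URLs
$pua->timeout(10);      # seconds per request
$pua->redirect(1);      # follow redirects

for my $url (@urls) {
    # register() only returns a response object if registration failed
    if ( my $res = $pua->register( HTTP::Request->new( GET => $url ) ) ) {
        print STDERR $res->error_as_HTML;
    }
}

# wait() blocks until all registered requests have completed
my $entries = $pua->wait();

for ( keys %$entries ) {
    my $res = $entries->{$_}->response;
    print $res->request->url, " => ", $res->code, "\n";
}
</code>

...but I have no feel for how its speed compares to plain forking.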
Nige