Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:
Hi,
I'm looking for the *fastest* way to download 100 URLs in parallel - ideally using Perl.
I'm currently using fork() and LWP::Simple, but would prefer not to spawn 100+ sub-processes. I've looked at Perl threads but want to steer clear of them until they're stable.
I'm on a Linux machine. Does anyone know of a low-level C program that will do the I/O in parallel, ideally one with a Perl wrapper?
Is LWP::Parallel the fastest Perl way to do this?
Nige
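For reference, the LWP::Parallel approach mentioned above typically runs through LWP::Parallel::UserAgent, which registers a batch of requests and services them concurrently within a single process - no fork() needed. A minimal sketch (the URL list and the limits of 10 hosts / 5 requests per host are illustrative, not from the question):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use LWP::Parallel::UserAgent;
use HTTP::Request;

# Hypothetical list standing in for the 100 URLs.
my @urls = map { "http://www.example.com/page$_" } 1 .. 100;

my $pua = LWP::Parallel::UserAgent->new();
$pua->max_hosts(10);   # contact at most 10 different hosts at once
$pua->max_req(5);      # at most 5 simultaneous requests per host

for my $url (@urls) {
    # register() returns undef on success, or an error response on failure
    my $err = $pua->register( HTTP::Request->new( GET => $url ) );
    warn "could not register $url\n" if $err;
}

# Block until all requests complete (or the 60-second timeout expires);
# returns a hashref of LWP::Parallel::UserAgent::Entry objects.
my $entries = $pua->wait(60);

for my $entry ( values %$entries ) {
    my $res = $entry->response;
    printf "%s => %s\n", $entry->request->uri, $res->code;
}
```

If sticking with fork() is acceptable, Parallel::ForkManager offers a middle ground: it caps the number of live child processes (say, 10) rather than forking once per URL, which addresses the "100+ sub-processes" concern without abandoning LWP::Simple.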
Replies are listed 'Best First'.

Re: Downloading URL's in Parallel with Perl
by mr.nick (Chaplain) on Sep 11, 2001 at 16:14 UTC

Re: Downloading URL's in Parallel with Perl
by MZSanford (Curate) on Sep 11, 2001 at 16:11 UTC

Re: Downloading URL's in Parallel with Perl
by eduardo (Curate) on Sep 11, 2001 at 17:01 UTC

Re: Downloading URL's in Parallel with Perl
by larsen (Parson) on Sep 11, 2001 at 23:49 UTC

Re: Downloading URL's in Parallel with Perl
by Ntav (Sexton) on Sep 11, 2001 at 22:39 UTC

Re: Downloading URL's in Parallel with Perl
by perrin (Chancellor) on Sep 11, 2001 at 17:24 UTC

Re: Downloading URL's in Parallel with Perl
by tachyon (Chancellor) on Sep 11, 2001 at 16:44 UTC