in reply to parallel web get with lwp::simple

To my knowledge, LWP::Simple doesn't internally provide for multiple requests running concurrently. The standard solution, as I understand it, is to fork, and then let each forked process independently issue its own LWP::Simple get() request, so the fetches happen simultaneously.
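For illustration, here's a minimal sketch of that fork-and-get approach. The URL list is made up for the example; your own per-request logic would go in the child branch:

    use strict;
    use warnings;
    use LWP::Simple;

    # Hypothetical URLs; substitute the pages you actually want to fetch.
    my @urls = ('http://server1/test', 'http://server2/test');

    my @pids;
    for my $url (@urls) {
        my $pid = fork();
        die "fork failed: $!" unless defined $pid;
        if ($pid == 0) {
            # Child: the blocking get() here doesn't hold up the other fetches.
            my $content = get($url);
            print defined $content ? "fetched $url\n" : "failed $url\n";
            exit 0;
        }
        push @pids, $pid;
    }

    # Parent: wait for every child to finish.
    waitpid($_, 0) for @pids;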

Dave

Re^2: parallel web get with lwp::simple
by biosysadmin (Deacon) on Jul 26, 2004 at 14:38 UTC
    Another very simple way to fork processes with LWP is to use Parallel::ForkManager. It's shockingly easy to parallelize your code this way; here's how you could apply it to the loop in your code:
    use LWP::Simple;
    use Parallel::ForkManager;

    my $max_forks = 20;
    my $forkmanager = Parallel::ForkManager->new($max_forks);
    for ($count = 0; $count <= $max; $count++) {   # $max, $URL, and &result as in your code
        $forkmanager->start and next;              # parent moves on; child runs the loop body
        my $content;
        unless (defined($content = get $URL)) {
            die "could not get $URL\n";
        }
        if ($content =~ /Test1/i) {
            print ".";
        }
        elsif ($content =~ /Test2/i) {
            print "Fetched page from Server2\n";
            $count++;    # runs in the child, so the parent's loop counter is unaffected
            &result;
        }
        else {
            print "Page not retrieved\n";
        }
        $forkmanager->finish;                      # child exits here
    }
    $forkmanager->wait_all_children;
    Your results will also depend on the maximum number of simultaneous clients your web server supports; 20 forks may not be enough to stress your gateway. Check the MaxClients parameter in your httpd.conf for more information. Depending on what you're testing, it may also be more fruitful to use LWP to download larger files rather than to make more requests.
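    For reference, that directive in httpd.conf looks something like this (the value shown is only illustrative; tune it to your hardware and what you're trying to stress):

        # httpd.conf -- cap on simultaneous client connections
        MaxClients 150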

    Best of luck. :)