Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

I have a WWW::Mechanize object logged into a company intranet page and I'm trying to download the org chart. I start at the CEO and work down until all the users are seen.

Getting the info for one user takes three HTTP requests, and I'm trying to run them asynchronously.

Below is a contrived example of what I'm trying to do, boiled down as far as I could.

Why would these three events not be running in parallel? How can I get them to run in parallel? Thanks
#!/usr/bin/perl
use Modern::Perl;
use AnyEvent;

my @users = ( 1 );

while ( @users ) {
    my $user_id = shift @users;
    my ( $info_1, $info_2, $info_3 );
    my $cv = AnyEvent->condvar;

    # get the first user info
    $cv->begin;
    say "request 1";
    $info_1 = "x";
    sleep 10;
    $cv->end;

    # get the second user info
    $cv->begin;
    say "request 2";
    $info_2 = "x";
    sleep 10;
    $cv->end;

    # get the third user info
    $cv->begin;
    say "request 3";
    $info_3 = "x";
    sleep 10;
    $cv->end;

    # merge point
    $cv->recv;
    say "$user_id: $info_1, $info_2, $info_3";
}

Re: AnyEvent Parallel HTTP
by Yary (Pilgrim) on Feb 19, 2015 at 20:00 UTC

      Uh - AnyEvent is a framework for running tasks in parallel that are not CPU bound, and web scraping usually is not CPU bound: most of the time is spent waiting on the network. Distributing the load across more than one CPU makes sense when the work is CPU bound, but not necessarily when scraping a website over the network.

      There is no need to involve threads or forking when using AnyEvent.
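      For illustration, here is a minimal sketch of that point, assuming AnyEvent::HTTP is installed and using placeholder URLs: both requests are issued from a single process, and the one event loop overlaps their network wait without any threads or forks.

use Modern::Perl;
use AnyEvent;
use AnyEvent::HTTP;    # non-blocking HTTP client built on AnyEvent

my $cv = AnyEvent->condvar;

# Both requests go out before either response arrives; the single
# event loop watches both sockets, so no threads or forks are needed.
# The URLs are placeholders.
for my $url ( 'http://example.com/a', 'http://example.com/b' ) {
    $cv->begin;
    http_get $url, sub {
        my ( $body, $headers ) = @_;
        say "$url -> $headers->{Status}";
        $cv->end;    # tell the condvar this request is done
    };
}

$cv->recv;    # run the event loop until both callbacks have fired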

Re: AnyEvent Parallel HTTP
by Corion (Patriarch) on Feb 21, 2015 at 16:48 UTC

    You will have to show us how you are actually making your HTTP requests. As posted, your code cannot execute asynchronously, because everything in it runs synchronously; in particular, we need to see how your callbacks actually call $cv->end().

    Also note that, unless this is just for illustration, sleep does not fit well with AnyEvent: it blocks the whole process instead of letting the event loop run.
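
    For illustration, a minimal sketch of what that could look like with AnyEvent::HTTP (the intranet URLs are placeholders, and the callbacks just keep the raw response bodies): each callback stores its result and calls $cv->end itself, so all three requests are in flight at the same time and $cv->recv returns once the last one completes.

use Modern::Perl;
use AnyEvent;
use AnyEvent::HTTP;

my $user_id = 1;
my ( $info_1, $info_2, $info_3 );
my $cv = AnyEvent->condvar;

# get the first user info
$cv->begin;
http_get "http://intranet.example/user/$user_id/info1", sub {
    ( $info_1 ) = @_;    # raw response body; parse as needed
    $cv->end;            # completion is signalled from inside the callback
};

# get the second user info
$cv->begin;
http_get "http://intranet.example/user/$user_id/info2", sub {
    ( $info_2 ) = @_;
    $cv->end;
};

# get the third user info
$cv->begin;
http_get "http://intranet.example/user/$user_id/info3", sub {
    ( $info_3 ) = @_;
    $cv->end;
};

# merge point: recv drives the event loop until every begin has a matching end
$cv->recv;
say "$user_id: fetched "
    . length( $info_1 // '' ) . ', '
    . length( $info_2 // '' ) . ', '
    . length( $info_3 // '' ) . ' bytes';

    If a deliberate pause is ever needed, a watcher such as my $w = AnyEvent->timer( after => 10, cb => sub { ... } ); is the non-blocking counterpart of sleep.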