http://qs1969.pair.com?node_id=981041

Eyck has asked for the wisdom of the Perl Monks concerning the following question:

I'm trying to imitate a web browser downloading a page. I have an array containing all of its components, and then use WWW::Mechanize to fetch them:
use strict;
use warnings;
use WWW::Mechanize;
use Time::HiRes qw(time);

my $links = [
    "http://web-page.to.download.to/",
    "http://static.to.download.to/background.jpg",
    "http://static.to.download.to/first.css",
    "http://www.google-analytics.com/ga.js",
    "http://static.ak.fbcdn.net/rsrc.php/v2/yl/r/6KM-54hh6R2.css",
];

my $mech = WWW::Mechanize->new;    # one user agent shared by all requests

my $start = time;
foreach my $url (@$links) {
    $mech->get($url);
}
my $stop = time;

This works more or less the way I intended, but there are two problems. Since the list of links is dynamic, and partly created by JavaScript, I had to use a browser to build that list by hand.

I need a way of parsing a web page and getting a list of all its components, and this is my first problem.
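For components that are referenced directly in the HTML (JavaScript-generated ones will not be found this way), a minimal sketch using HTML::TreeBuilder might look like the following; the page URL is the placeholder from the snippet above:

use strict;
use warnings;
use WWW::Mechanize;
use HTML::TreeBuilder;
use URI;

my $page = "http://web-page.to.download.to/";

my $mech = WWW::Mechanize->new;
$mech->get($page);

my $tree = HTML::TreeBuilder->new_from_content( $mech->content );

my @assets;

# <img src="..."> and <script src="...">
for my $el ( $tree->look_down( _tag => qr/^(?:img|script)$/, src => qr/./ ) ) {
    push @assets, $el->attr('src');
}

# <link rel="stylesheet" href="...">
for my $el ( $tree->look_down( _tag => 'link', rel => qr/stylesheet/i ) ) {
    push @assets, $el->attr('href') if defined $el->attr('href');
}

# resolve relative URLs against the page address
my @links = map { URI->new_abs( $_, $page )->as_string } @assets;

$tree->delete;

HTML::TreeBuilder is used here instead of Mechanize's own link and image accessors because, as far as I know, those do not report <script src> references.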

The other problem is that I'm serializing all the downloads here. I should be doing something closer to what browsers do, maybe using 4 concurrent downloaders?

How can I emulate 4 concurrent download threads?
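One possible approach, sketched here under the assumption that forking worker processes with Parallel::ForkManager is acceptable (WWW::Mechanize/LWP tends to be easier to use with one instance per process than with Perl threads):

use strict;
use warnings;
use WWW::Mechanize;
use Parallel::ForkManager;
use Time::HiRes qw(time);

my $links = [
    "http://web-page.to.download.to/",
    "http://static.to.download.to/background.jpg",
    "http://static.to.download.to/first.css",
    "http://www.google-analytics.com/ga.js",
    "http://static.ak.fbcdn.net/rsrc.php/v2/yl/r/6KM-54hh6R2.css",
];

my $pm    = Parallel::ForkManager->new(4);   # at most 4 children at a time
my $start = time;

for my $url (@$links) {
    $pm->start and next;              # parent continues with the next URL
    my $mech = WWW::Mechanize->new;   # each child gets its own user agent
    $mech->get($url);
    $pm->finish;                      # child exits
}

$pm->wait_all_children;
my $stop = time;
printf "fetched %d URLs in %.3f seconds\n", scalar @$links, $stop - $start;

Because each URL is fetched in its own child process, any per-request results that need to reach the parent would have to be passed back, for example through Parallel::ForkManager's run_on_finish callback.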