I was asked by a friend why, after starting 150 web spider threads, his system would bog down until it seemed to stop, even though it was still running.
He may be asking the wrong question. The real question might be closer to, "Is it possible to do 150 simultaneous requests without dragging the system to its knees?" Look at the problem that needs to be solved and remain open to lighter-weight solutions that might not have previously been considered.
# Concurrent non-blocking requests (synchronized with a delay)
Mojo::IOLoop->delay(
  sub {
    my $delay = shift;
    $ua->get('mojolicious.org' => $delay->begin);
    $ua->get('cpan.org'        => $delay->begin);
  },
  sub {
    my ($delay, $mojo, $cpan) = @_;
    say $mojo->result->dom->at('title')->text;
    say $cpan->result->dom->at('title')->text;
  }
)->wait;
...from the Mojo::UserAgent docs.
So that sub containing the gets could look like:
sub {
  my $delay = shift;
  $ua->get($_ => $delay->begin) for @list_of_urls;
},
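For completeness, here is a minimal, self-contained sketch of that approach. The @list_of_urls contents are hypothetical placeholders, and it assumes a Mojolicious release where Mojo::IOLoop->delay is still available (newer releases favour Mojo::Promise for this):

#!/usr/bin/env perl
use Mojo::Base -strict;   # strict, warnings, say
use Mojo::UserAgent;
use Mojo::IOLoop;

# Hypothetical list of URLs to fetch concurrently
my @list_of_urls = qw(https://mojolicious.org https://www.cpan.org https://www.perlmonks.org);

my $ua = Mojo::UserAgent->new;

Mojo::IOLoop->delay(
  sub {
    my $delay = shift;
    # Start one non-blocking request per URL; each begin() reserves a
    # slot for that transaction in the next step
    $ua->get($_ => $delay->begin) for @list_of_urls;
  },
  sub {
    my ($delay, @txs) = @_;
    # All requests have finished; @txs holds one transaction per URL
    for my $tx (@txs) {
      my $title = eval { $tx->result->dom->at('title')->text } // '(no title)';
      say $tx->req->url, ': ', $title;
    }
  }
)->wait;

Because everything runs in one event loop, there are no 150 OS threads competing for memory and scheduler time; concurrency is instead bounded by the user agent's connection limits, which is usually much kinder to the system.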
Dave