Hello, I have a question regarding multiple HTTP requests. I'd like to use Mojo, since I'm already taking advantage of Mojo::Pg, but it's not absolutely necessary. I have tried to do parallel HTTP requests using Perl threads and came close, but have given up. Forking seems like it would take up too many of my resources.
Ideally, I want to hit 100 URLs at a time, without waiting for the previous request to finish before starting the next one. Then grab the next 100, and eventually move on to the rest of my code.
I stole this from an earlier post; it's close, except that it creates a separate $tx transaction for each URL. Ideally, I want $tx to cover something like "site1 site2 site3 site4", and so on up to 100 URLs.
    use Mojo::Client;
    use Mojo::Transaction;

    my $client = Mojo::Client->new;

    my $tx  = Mojo::Transaction->new_get('http://labs.kraih.com');
    my $tx2 = Mojo::Transaction->new_get('http://mojolicious.org');
    $tx2->req->headers->expect('100-continue');
    $tx2->req->body('foo bar baz');

    $client->process_all($tx, $tx2);

    print $tx->res->code;
    print $tx2->res->code;
    print $tx2->res->content->file->slurp;
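(For reference, Mojo::Client and its process_all method are the old, long-removed API. A minimal sketch of the same batching idea using the current Mojo::UserAgent promise API ($ua->get_p plus Mojo::Promise->all, available in recent Mojo releases) might look like this; the URL list below is a made-up placeholder, not from the original post:)

    use Mojo::UserAgent;
    use Mojo::Promise;

    # Placeholder URL list -- substitute your real URLs here
    my @urls = map { "http://example.com/page/$_" } 1 .. 300;

    my $ua = Mojo::UserAgent->new;

    # Work through the list 100 URLs at a time
    while (my @batch = splice @urls, 0, 100) {

        # get_p starts each request immediately and returns a promise,
        # so all requests in the batch run concurrently
        my @promises = map { $ua->get_p($_) } @batch;

        # Block here only until the whole batch has settled
        Mojo::Promise->all(@promises)->then(sub {
            # Each element of @_ is an array ref holding one transaction
            for my $tx (map { $_->[0] } @_) {
                print $tx->req->url, ' => ', $tx->res->code, "\n";
            }
        })->catch(sub {
            # all() rejects as soon as any one promise rejects
            warn "Batch failed: $_[0]";
        })->wait;
    }

Note that get_p only rejects on connection-level errors; HTTP error statuses (404, 500, and so on) still resolve, so check $tx->res->code yourself if that matters.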
Thank you in advance