But that just processes the chunks 14 times. If I reverse the order, I chunk and process the data again 14 times. What's the best way to combine these and actually fire off forks for batches of DBI results? Thanks all for helping a new junior monk here...

    my $indexers = Parallel::ForkManager->new(14);
    while (my $data = $r->fetchall_arrayref(undef, 5000)) {
        $indexers->start and next;
        # ... do stuff ...
        $indexers->finish;
    }
    $indexers->wait_all_children;
In reply to fetchall_arrayref batches and forking by mogmismo
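Not the original poster's code, just a minimal sketch of the pattern the question is asking about: fetch each 5000-row batch in the parent, and fork only for the processing step, so every child works on exactly one batch and the statement handle is never used inside a child. The DSN, the query behind $sth, and process_batch() are placeholders standing in for whatever the real script does; AutoInactiveDestroy is there so a child's exit doesn't tear down the parent's database connection.

    use strict;
    use warnings;
    use DBI;
    use Parallel::ForkManager;

    # Placeholder connection and query -- swap in the real ones.
    my $dbh = DBI->connect('dbi:Pg:dbname=mydb', 'user', 'password',
        { RaiseError => 1, AutoInactiveDestroy => 1 });  # children won't close the parent's connection
    my $sth = $dbh->prepare('SELECT id, payload FROM items');
    $sth->execute;

    my $indexers = Parallel::ForkManager->new(14);

    while (my $batch = $sth->fetchall_arrayref(undef, 5000)) {
        last unless @$batch;        # empty batch means no more rows

        $indexers->start and next;  # parent: skip ahead and fetch the next batch
        process_batch($batch);      # child: handle this one batch only
        $indexers->finish;          # child exits here
    }
    $indexers->wait_all_children;

    # Stand-in for the real ".. do stuff .." (hypothetical)
    sub process_batch {
        my ($rows) = @_;
        # index/transform @$rows here
    }

The key point is that fetchall_arrayref only ever runs in the parent, between forks, so each batch is fetched once and handed to exactly one child rather than being re-fetched or re-processed per fork.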