mogmismo has asked for the wisdom of the Perl Monks concerning the following question:

I'm trying to fork for each batch of a fetchall_arrayref():
    my $indexers = Parallel::ForkManager->new(14);
    while (my $data = $r->fetchall_arrayref(undef, 5000)) {
        $indexers->start and next;
        # .. do stuff ..
        $indexers->finish;
    }
    $indexers->wait_all_children;
But that just processes the chunks 14 times. If I reverse the order, I chunk and process the data again 14 times. What's the best way to combine these and actually fire off forks for batches of DBI results? Thanks all for helping a new junior monk here...
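A minimal sketch of the usual shape of this (a cautious illustration, not a diagnosis of the bug): only the parent ever touches the statement handle, each batch is fully fetched and copied before the fork, and AutoInactiveDestroy keeps a child's inherited copy of the handle from tearing down the parent's connection on exit. The $dsn/$user/$pass values, the SELECT, and do_stuff() are placeholders; DBI and Parallel::ForkManager are assumed installed.

    use strict;
    use warnings;
    use DBI;
    use Parallel::ForkManager;

    my ($dsn, $user, $pass) = ('dbi:...', '...', '...');  # placeholders

    # AutoInactiveDestroy (DBI 1.614+) stops forked children from
    # closing the parent's connection when their $dbh copy is destroyed.
    my $dbh = DBI->connect($dsn, $user, $pass,
        { RaiseError => 1, AutoInactiveDestroy => 1 });

    my $sth = $dbh->prepare('SELECT id, body FROM documents');
    $sth->execute;

    my $indexers = Parallel::ForkManager->new(14);

    while (my $batch = $sth->fetchall_arrayref(undef, 5000)) {
        my @rows = @$batch;         # copy before forking, to be safe
        $indexers->start and next;  # parent: fork, then fetch next batch

        # --- child only from here on; it never calls fetch ---
        do_stuff($_) for @rows;     # do_stuff() is a placeholder

        $indexers->finish;          # child exits here
    }
    $indexers->wait_all_children;

The key point is that fetchall_arrayref runs only in the parent, so each child receives a finished batch and the handle is never shared mid-fetch.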

Replies are listed 'Best First'.
Re: fetchall_arrayref batches and forking
by NetWallah (Canon) on May 07, 2013 at 22:58 UTC
    It looks like you want to process a large set of database chunks, in sets of 14.

    This fits the "single queue, multiple server" model, and BrowserUk has a thread-based implementation that can help. Please see the discussion in dynamic number of threads based on CPU utilization. Look for his code for his "threads::Q implementation".
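    For a rough idea of the "single queue, multiple server" shape (sketched here with the core Thread::Queue module rather than BrowserUk's threads::Q; $sth and process_batch() are placeholders):

        use strict;
        use warnings;
        use threads;
        use Thread::Queue;

        my $q = Thread::Queue->new;

        # 14 worker threads, each pulling batches off one shared queue.
        my @workers = map {
            threads->create(sub {
                while (defined(my $batch = $q->dequeue)) {
                    process_batch($batch);   # process_batch() is a placeholder
                }
            });
        } 1 .. 14;

        # The main thread is the single producer: fetch, enqueue, repeat.
        while (my $batch = $sth->fetchall_arrayref(undef, 5000)) {
            $q->enqueue([ @$batch ]);   # copy, so the queue owns the data
        }
        $q->end;        # no more work; dequeue now returns undef when empty
        $_->join for @workers;

    Because only the main thread fetches, the statement handle is never shared, and the queue naturally balances batches across whichever workers are free.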

                 "I'm fairly sure if they took porn off the Internet, there'd only be one website left, and it'd be called 'Bring Back the Porn!'"
            -- Dr. Cox, Scrubs

      Thanks, I'll take a look at those resources.