shanu_040 has asked for the wisdom of the Perl Monks concerning the following question:

Hi Monks,
I am working on a web application that does parallel searching on different sites and displays the fetched content. I am doing this successfully using Parallel::ForkManager (PFM). The problem with PFM is that I have to wait for all the children to finish their searches before I can display the results. My concern is that I want to display the results as soon as they are retrieved by any child process; in a nutshell, I don't want to wait for all the children to retrieve their results before displaying, I want some kind of incremental display.
My queries are:
1. What would be the best way to go about it?
2. Should I use processes or threads?
3. What about Thread::Queue?
Shanu
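One process-based way to get the incremental display asked about above, sketched here as an illustration rather than taken from the thread: keep forking workers, but have each child write its result down a single shared pipe, so the parent can print each result the moment it arrives instead of waiting for all children. The site names and results below are made up.

```perl
use strict;
use warnings;

# Sketch: fork one worker per site and stream results back through a
# single pipe so the parent can display each one as it arrives.
pipe(my $reader, my $writer) or die "pipe failed: $!";

my @sites = qw(siteA siteB siteC);
for my $site (@sites) {
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {                       # child
        close $reader;
        my $result = "results-for-$site";  # a real child would search here
        print {$writer} "$site\t$result\n";
        close $writer;
        exit 0;
    }
}
close $writer;    # parent keeps only the read end

# Incremental display: handle each line as soon as some child sends it
my %seen;
while (my $line = <$reader>) {
    chomp $line;
    my ($site, $result) = split /\t/, $line, 2;
    $seen{$site} = $result;
    print "got $site: $result\n";
}
close $reader;    # EOF arrives once every child has closed its writer
1 while wait() != -1;    # reap all children
```

The parent's read loop ends only when every child has closed its copy of the write end, so no explicit "all done" message is needed. With threads instead of processes, Thread::Queue's `enqueue`/`dequeue` would play the role of the pipe.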
  • Comment on Replacing Parallel::ForkManger With Thread::Queue

Replies are listed 'Best First'.
Re: Replacing Parallel::ForkManger With Thread::Queue
by ysth (Canon) on May 26, 2009 at 05:50 UTC
      Hi,
      Following is the code for run_on_finish. I write the data retrieved by each child process to disk (the /tmp/ directory) and delete those files after reading them.
      $pm->run_on_finish(sub {
          my ($pid, $exit_code, $ident) = @_;
          eval {
              my $filename = '/tmp/' . $$ . escape($ident);
              my $result_set;
              if (-e $filename) {
                  $result_set = $obj->retrieve($filename);
                  unlink($filename);
              }
              else {
                  # need a new $timeout_result object every loop
                  my $timeout_result = DBWIZ::Search::ResultSet->new;
                  $timeout_result->status('timeout');
                  $timeout_result->hits(-5);
                  $result_set = $timeout_result;
              }
              $return{$ident} = $result_set;
          };
          if ($@) {
              print STDERR "Problem with search module: $@\n";
          }
          # print STDERR "database $ident done = exit $exit_code \n";
      });

      $obj is the Data::Serializer object with the following options:
      my $obj = Data::Serializer->new(
          serializer => 'Storable',
          portable   => '1',
          encoding   => 'b64',
      );
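For reference, the file-based round trip this setup relies on can be sketched with core Storable alone. This is only an illustration of the serialize-to-disk, retrieve, unlink pattern used above; it omits Data::Serializer's b64 encoding and portability wrapper, and the result-set hash and temp file here are hypothetical.

```perl
use strict;
use warnings;
use Storable qw(store retrieve);
use File::Temp qw(tempfile);

# A stand-in for one child's result set
my $result_set = { status => 'ok', hits => 42 };

my ($fh, $filename) = tempfile();
close $fh;

store($result_set, $filename);    # child side: serialize to disk

my $copy = retrieve($filename);   # parent side: read it back
unlink $filename;                 # delete the file after reading, as above

print "status=$copy->{status} hits=$copy->{hits}\n";
```

Storable's binary format is perl- and architecture-sensitive, which is presumably why the original uses Data::Serializer's `portable` and `encoding` options on top of it.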