 
PerlMonks  

Re^3: Multi threading

by sandy1028 (Sexton)
on Apr 08, 2009 at 06:38 UTC ( [id://756247] )


in reply to Re^2: Multi threading
in thread Multi threading

There are more than 10,000 articles in a directory. Using this code I am creating 5 processes. Each process should read 100 articles at a time and write to the log file.
While writing to the log file, some of the files are missed.
How do I use a lock on these processes?
How can I create different log files for all 5 processes?
Can anyone please help me?
my $pm = Parallel::ForkManager->new(5);
$pm->run_on_finish( sub {
    my ($pid, $exit_code, $ident) = @_;
    $tmpFiles[$ident] = undef;
} );
foreach my $i (0 .. $#tmpFiles) {
    # Forks and returns the pid for the child:
    my $pid = $pm->start($i) and next;
    $SIG{INT} = 'DEFAULT';
    my $filename = $tmpFiles[$i]->filename();
    my $file = IO::File->new("<$filename")
        or die "Can't open $filename\n";
    # Loop until getline() returns undef at end-of-file:
    while (defined(my $line = $file->getline())) {
        chomp $line;
        # Use a name other than $file so the filehandle isn't shadowed:
        my ($dir, $name) = split /\t/, $line;
        $processor->($dir, $name, $config, $log);
    }
    $pm->finish;    # Terminates the child process
}
$pm->wait_all_children;
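One way to keep forked children from clobbering each other's writes to a shared log is to take an exclusive flock() around each append. This is only a sketch, not the poster's code: the log_line helper and the file name are made up for illustration.

```perl
use strict;
use warnings;
use Fcntl qw(:flock SEEK_END);

# Hypothetical helper: append one line to a log file shared by
# several forked children. The exclusive lock is held only for the
# duration of the write, so lines from different processes never
# interleave and none are silently lost.
sub log_line {
    my ($logname, $msg) = @_;
    open my $fh, '>>', $logname or die "Can't open $logname: $!";
    flock($fh, LOCK_EX)         or die "Can't lock $logname: $!";
    seek($fh, 0, SEEK_END);     # another process may have appended meanwhile
    print {$fh} "$msg\n";
    close $fh;                  # closing the handle releases the lock
}

log_line('shared.log', "[$$] processed one article");
```

Each child would call log_line() instead of printing to a filehandle inherited from the parent; inherited handles share a file position and buffer, which is one common cause of "missing" lines.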

Replies are listed 'Best First'.
Re^4: Multi threading
by sandy1028 (Sexton) on Apr 13, 2009 at 09:20 UTC
    sub processdir {
        my $pm = Parallel::ForkManager->new(5);
        $pm->run_on_finish( sub {
            my ($pid, $exit_code, $ident) = @_;
            $tmpFiles[$ident] = undef;
        } );
        foreach my $i (0 .. $#Files) {
            # Forks and returns the pid for the child:
            my $pid = $pm->start($i) and next;
            $SIG{INT} = 'DEFAULT';
            my $filename = $Files[$i]->filename();
            my $file = IO::File->new("<$filename")
                or die "Can't open $filename\n";
            while (defined(my $line = $file->getline())) {
                chomp $line;
                my ($dir, $name) = split /\t/, $line;
                $processor->($dir, $name, $config, $log);
            }
            $pm->finish;    # Terminates the child process
        }
        $pm->wait_all_children;
        return;
    }
    # Note: the empty prototype in "sub processdir()" was dropped; with it,
    # calling processdir() with arguments is a compile-time error.
    processdir($input_dir, $logfile, \&clean_value);
    sub clean_value {
        $logfile->print("-- Reading '$input_file' file\n");
    }
    Here all the processes execute and write together, and the input file names overlap in the log file.
    How do I avoid overlapping file names in $logfile?
    In which portion should I use locks, or how else can I avoid overlapping of filenames?
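Another option is to skip locking entirely and give every child its own log file, named after its worker index, merging them afterwards if needed. A minimal sketch with plain fork() (Parallel::ForkManager does the same fork under the hood); the worker-N.log names are invented for illustration:

```perl
use strict;
use warnings;

my @pids;
for my $i (0 .. 4) {
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        # Child: open a log file of its own, so no two processes
        # ever share a filehandle and no locking is needed.
        open my $log, '>', "worker-$i.log"
            or die "Can't open worker-$i.log: $!";
        print {$log} "child $$ handling slice $i\n";
        close $log;
        exit 0;
    }
    push @pids, $pid;          # parent keeps the child's pid
}
waitpid($_, 0) for @pids;      # wait for all five children
```

With Parallel::ForkManager, the same thing is achieved by opening "worker-$ident.log" just after $pm->start($ident) returns zero in the child.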
      Can anyone please tell me how to create the threads for
      reading all the files from directories and sub-directories, writing either to one log file or to a separate log file per thread, without using sleep()?
      Each thread should process (read) 100 files at a time.
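On the "100 files at a time" part: the usual approach is to slice the full file list into fixed-size batches before forking and hand each child one batch, rather than having all workers read from one shared list. A sketch with an invented file list:

```perl
use strict;
use warnings;
use List::Util qw(min);    # core module since Perl 5.8

# Hypothetical list standing in for the articles in the directory.
my @files = map { "article-$_.txt" } 1 .. 1042;
my $batch = 100;

# Split @files into array refs of at most $batch names each.
my @batches;
for (my $i = 0; $i < @files; $i += $batch) {
    push @batches, [ @files[ $i .. min($i + $batch - 1, $#files) ] ];
}

# Each element of @batches can now be given to one forked child;
# children never compete for the same file.
printf "%d batches, first has %d files, last has %d\n",
    scalar @batches, scalar @{ $batches[0] }, scalar @{ $batches[-1] };
# prints: 11 batches, first has 100 files, last has 42
```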