avanta has asked for the wisdom of the Perl Monks concerning the following question:
The code may have errors, but what I want is the concept: duplicate data into two or more processes (forking) so that I parse the input file only once. The input file can be huge, so I can't store it in a variable, array, or hash.

```perl
sub processInputFiles {
    # <----doing something--->
    LogInfoMsg("Processing $file Started");

    if ($f_type eq ".gz") {
        $fp  = new IO::Zlib($working_file, "rb");
        $tfp = new IO::Zlib($working_file, "rb");
    }
    elsif ($f_type eq ".log") {
        $fp  = new IO::File($working_file, "r");
        $tfp = new IO::File($working_file, "r");
    }
    else {
        LogCriticalMsg("Unsupported file_type ($f_type), skip parsing $file.");
        return;
    }

    if (!$fp) {
        # If the open fails for some reason, rename back to the original name.
        rename($working_file, $file);
        LogCriticalMsg("Open input file $file failed. $!");
    }
    else {
        # Peek at the first line through $tfp to detect the line-ending style.
        my $line = $tfp->getline();
        if (defined $line && $line =~ m/\r/) {
            IO::File->input_record_separator("\r\n\r\n");
        }
        else {
            IO::File->input_record_separator("\n\n");
        }
        close($tfp);

        while ($record = $fp->getline()) {
            # <---parsing file per record--->
        }
    }
}
# <----doing something--->
```
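A minimal sketch of the concept being asked about (this is not the poster's code): the parent parses the file exactly once, record by record, and duplicates each record to two forked children over pipes, so no record is ever held in a large array or hash. The sample file, the blank-line record separator, and the count of two children are assumptions for illustration.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Temp qw(tempfile);

# Create a small sample input with blank-line-separated records
# (standing in for the huge .log/.gz file in the question).
my ($tmp_fh, $tmp_name) = tempfile(UNLINK => 1);
print {$tmp_fh} "record one line a\nrecord one line b\n\nrecord two\n\n";
close $tmp_fh;

my (@writers, @pids);
for my $i (0 .. 1) {                     # two consumer children
    pipe(my $rd, my $wr) or die "pipe: $!";
    my $pid = fork();
    die "fork: $!" unless defined $pid;
    if ($pid == 0) {                     # child: read records off the pipe
        close $wr;
        local $/ = "\n\n";               # same record separator as the parent
        my $n = 0;
        while (my $record = <$rd>) {
            $n++;                        # <---parsing per record---> goes here
        }
        print "child $i saw $n records\n";
        exit 0;
    }
    close $rd;                           # parent keeps only the write end
    push @writers, $wr;
    push @pids, $pid;
}

# Parent: a single pass over the (possibly huge) file, record by record.
{
    local $/ = "\n\n";
    open my $fp, '<', $tmp_name or die "open: $!";
    while (my $record = <$fp>) {
        print {$_} $record for @writers; # duplicate each record to every child
    }
    close $fp;
}
close $_ for @writers;                   # closing the write ends signals EOF
waitpid($_, 0) for @pids;
```

Closing the parent's copies of the write ends is essential: the children's reads only see EOF once every write end of the pipe is closed, otherwise they block forever.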
Replies are listed 'Best First'.
Re: passing a file content from one process to another process.
by jethro (Monsignor) on May 03, 2010 at 08:56 UTC
|
Re: passing a file content from one process to another process.
by CountZero (Bishop) on May 03, 2010 at 08:53 UTC