Those are global: some of the hashes are used to upload to the DB, and all of the hashes are used to write an xlsx sheet.
So, presumably the post-file-processing code runs where I've added the comments below?
LINKS:
foreach my $linkarray (1 .. 2)
{
    $pm->start and next LINKS;    # do the fork; the parent skips to the next iteration
    if ($linkarray == 1)    # test the loop variable; a $first counter incremented in a child never reaches the parent
    {
        # glob avoids shelling out to ls and returns names without trailing newlines
        my @cdr_list1 = glob "$cdr_directory/SMSBcastCDR_*_${bcat_cdrdate}_*.log";
        print "cdrs_file1 = @cdr_list1\n";
        SMSBcastCDR(@cdr_list1);
        ## Update DB and produce xlsx sheet from global hashes and arrays here????
    }
    if ($linkarray == 2)
    {
        my @smsc_cdr_list = glob "$smscdr_directory/SMSCDR_P*_${cdrdate}*.log";
        SMSCDR(@smsc_cdr_list);
        ## Update DB and produce xlsx sheet from global hashes and arrays here????
    }
    $pm->finish;    # do the exit in the child process
}
$pm->wait_all_children;
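One caveat about placing the DB/xlsx step at those comments: each child gets its own copy of the globals after the fork, so hashes populated in a child are never visible to the parent after wait_all_children. Either do the DB update and xlsx writing inside the child before $pm->finish, or hand the hashes back to the parent via Parallel::ForkManager's run_on_finish data-retrieval mechanism. A minimal sketch of the latter (the %stats contents and the %merged accumulator are made up for illustration):

```perl
use strict;
use warnings;
use Parallel::ForkManager;

my $pm = Parallel::ForkManager->new(2);

my %merged;    # parent-side accumulator

# run_on_finish receives the reference that a child passed to finish()
$pm->run_on_finish(sub {
    my ($pid, $exit_code, $ident, $signal, $core, $data) = @_;
    %merged = (%merged, %$data) if $data;
});

foreach my $task (1 .. 2) {
    $pm->start and next;
    # hypothetical per-task processing: build a hash in the child
    my %stats = ("task$task" => $task * 10);
    $pm->finish(0, \%stats);    # serialise and send the hash back to the parent
}
$pm->wait_all_children;
# %merged now holds the children's results; do the single DB update
# and write the xlsx sheet here, in the parent.
```

With that pattern, the "update DB and produce xlsx" step moves to after wait_all_children, once the merged data is available.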
I have another suggestion: instead of waiting for all your files to finish being written at midnight before starting your processing, run your process(es) to continuously monitor their respective directories throughout the day and process the files as they arrive.
That way, when the final file arrives there will only be a tiny amount of processing left to do, and your stats will be available very shortly afterwards.
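A simple way to do that is a polling loop that remembers which files it has already handed off. This is only a sketch under assumptions: the directory, the *.log pattern, and the callback are placeholders, and the demo drives the scanner directly instead of looping with sleep:

```perl
use strict;
use warnings;
use File::Temp qw(tempdir);

# Hand each *new* file in $dir to the callback exactly once,
# tracking what we've already seen in %seen.
my %seen;
sub scan_new_files {
    my ($dir, $cb) = @_;
    for my $file (sort glob "$dir/*.log") {
        next if $seen{$file}++;    # skip files processed on an earlier pass
        $cb->($file);
    }
}

# demo: create a temp dir, drop a file in, scan twice
my $dir = tempdir(CLEANUP => 1);
open my $fh, '>', "$dir/a.log" or die $!;
close $fh;

my @processed;
scan_new_files($dir, sub { push @processed, $_[0] });
scan_new_files($dir, sub { push @processed, $_[0] });    # second pass adds nothing
```

In production you would wrap scan_new_files() in a loop with a sleep between passes, or go event-driven with something like Linux::Inotify2 instead of polling.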