Hello Dear Monks
In my script I run a grep against each element in an array. The array now contains around 8000 elements, and processing each element takes so long that my remote system times out while waiting for the script to finish. I am wondering if there is a way to improve the performance. Below is the snippet of the code in question:
my $path = "/tmp/testpatch";
opendir DIR, $path or die $!;
my @tempfiles = readdir DIR; # this array has the list of all the files, previously processed and newly added (8000 files)
closedir DIR;
foreach my $strfile (@tempfiles) {
    if (!grep /$strfile/, @arraytocompare) # this array has the list of files which were processed in the past
    {
        push (@newarray, $strfile); # in this array I collect all the new files which I need to process now
    }
}
Is there a faster grep, or will I have to live with it?
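For context, a common way to speed this up is to replace the repeated grep scan with a single hash lookup table, turning the O(n*m) loop into roughly O(n+m). The sketch below uses hypothetical sample data in place of the real @arraytocompare and @tempfiles contents; as a side benefit, hash keys are compared as exact strings, so filenames are no longer accidentally treated as regular expressions the way /$strfile/ treats them.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical data standing in for the real lists:
my @arraytocompare = ('old1.txt', 'old2.txt');              # previously processed files
my @tempfiles      = ('old1.txt', 'new1.txt', 'new2.txt');  # everything in the directory

# Build the lookup table once, instead of scanning @arraytocompare
# from scratch for every file.
my %seen = map { $_ => 1 } @arraytocompare;

# Keep only the files not seen before; each lookup is a constant-time
# hash access rather than a linear grep over 8000 elements.
my @newarray = grep { !$seen{$_} } @tempfiles;

print "$_\n" for @newarray;   # new1.txt, new2.txt
```

The one-time cost of building %seen is quickly repaid: every membership test afterwards is a single hash lookup rather than a pass over the whole array.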
In reply to improve performance by swissknife