Rita_G has asked for the wisdom of the Perl Monks concerning the following question:
I am writing a script to search for data in around 400 files. For each file, I need to know how many times each pattern is present. I have around 8,000 patterns, and I need to search for each one in all 400 files. Currently I am using the grep command in the script, which makes the performance very poor. Now I am thinking of using multithreading here, but I am a total newbie at this. Could you please help me find the correct solution?
Here is my current code:
while (my @data = $sth->fetchrow_array()) {
    my $result = `grep -i -w -c "$data[0]" /u05/oracle/R12COE/spotlighter/Search_Files/Forms/*`;
    my @search  = split /\n/, $result;
    my $arrsize = @search;
    for (my $i = 0; $i < $arrsize; $i++) {
        my ($path, $count) = split /:/, $search[$i];
        my ($filename, $fextn) = split /\./, $path;
        $filename = `basename $path`;
        #$fextn = `echo $fname | sed 's/.*\.//'`;
        #$fname = `echo $path | perl -pe 's|.*/||'`;
    }
}
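The main cost here is not the lack of threads but the 8,000 × 400 external `grep` invocations. A sketch of one alternative, in pure Perl: combine the patterns into a single alternation regex and read each file only once, counting matching lines per pattern (mimicking `grep -i -w -c`). The pattern and file lists below are hypothetical stand-ins for what the real script gets from the DBI fetch loop and the Forms directory.

```perl
use strict;
use warnings;
use File::Basename qw(basename);
use File::Temp qw(tempdir);

# Count, per pattern and per file, how many lines contain the pattern
# as a whole word, case-insensitively (the behaviour of grep -i -w -c).
sub count_matches {
    my ($patterns, $files) = @_;
    my $alt = join '|', map { quotemeta } @$patterns;
    my $re  = qr/\b($alt)\b/i;

    my %count;    # $count{lc pattern}{filename} = matching-line count
    for my $file (@$files) {
        open my $fh, '<', $file or do { warn "Cannot open $file: $!"; next };
        my $name = basename($file);
        while (my $line = <$fh>) {
            # grep -c counts matching *lines*, so record each pattern
            # at most once per line even if it occurs several times.
            my %seen;
            while ($line =~ /$re/g) {
                my $hit = lc $1;
                $count{$hit}{$name}++ unless $seen{$hit}++;
            }
        }
        close $fh;
    }
    return \%count;
}

# Demo with a throw-away file (stand-in for the real Forms directory).
my $dir = tempdir(CLEANUP => 1);
open my $out, '>', "$dir/a.txt" or die $!;
print $out "Invoice 1\ninvoice 2 INVOICE\nnothing here\n";
close $out;

my $counts = count_matches(['invoice'], ["$dir/a.txt"]);
print "invoice in a.txt: $counts->{invoice}{'a.txt'}\n";    # 2 matching lines
```

With this shape, each of the 400 files is opened exactly once, so the per-pattern work is a regex alternation rather than a process fork; with 8,000 alternatives you may want to build the regex once outside the loop (as above) and, if it is still slow, split the pattern list into a few chunks.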
Replies are listed 'Best First'.
Re: Searching pattern in 400 files and getting count out of each file
  by Athanasius (Archbishop) on Nov 08, 2012 at 07:21 UTC
  by space_monk (Chaplain) on Nov 08, 2012 at 10:31 UTC

Re: Searching pattern in 400 files and getting count out of each file
  by grondilu (Friar) on Nov 08, 2012 at 08:44 UTC
  by space_monk (Chaplain) on Nov 08, 2012 at 11:30 UTC
  by Rita_G (Initiate) on Nov 09, 2012 at 06:52 UTC
  by Athanasius (Archbishop) on Nov 09, 2012 at 12:52 UTC