ccmadd has asked for the wisdom of the Perl Monks concerning the following question:
At work, we have a large zipped log file (over 500MB) on a Linux server, and a Perl script that generates a report from information in that log. I'm new to Perl and inherited the script. Everything ran OK (processing took about 35 seconds) until our volume increased; now the same script takes 90 minutes. I'm making several calls to the Linux grep command from within my script. Is there a faster way to do this using only Perl and not the Linux command, or is this the best way?
Some additional detail: I first build a list of the unique things I'm interested in, similar to a product ID (the list contains about 7000 unique items). Then I iterate over the big log file once, using a regex to find the lines that hold additional data about each product ID, and I write those lines out to a few new (smaller) files. Then I loop through the product ID list once and execute several different grep commands on those new smaller files. Again, this is the Linux grep command, not Perl's grep, like this:
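(A minimal sketch of the pattern described; the file names, fields, and loop details here are hypothetical stand-ins, not the actual script.)

    # For each product ID, shell out to Linux grep against the
    # smaller intermediate files (names are hypothetical).
    for my $id (@product_ids) {
        my @detail_lines = `grep '$id' details.log`;
        my ($hit_count)  = `grep -c '$id' summary.log`;
        chomp $hit_count;
        # ... gather report data from @detail_lines and $hit_count ...
    }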
Thanks
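For reference, the usual pure-Perl alternative to per-ID grep calls is to load the ~7000 IDs into a hash and make a single pass over each file. A minimal sketch, assuming a hypothetical file name and that the product ID is the first whitespace-separated field on each line:

    use strict;
    use warnings;

    # Hypothetical ID list; in the real script these would come
    # from the first pass over the big log.
    my @product_ids = qw(P1001 P1002 P1003);

    # Hash membership tests are O(1), so one pass over the file
    # replaces one grep invocation per ID.
    my %wanted = map { $_ => 1 } @product_ids;

    my %hits;
    open my $fh, '<', 'details.log' or die "Can't open details.log: $!";
    while (my $line = <$fh>) {
        # Assumption: the product ID is the first field on the line.
        my ($id) = split ' ', $line;
        $hits{$id}++ if defined $id and $wanted{$id};
    }
    close $fh;

    printf "%s: %d\n", $_, $hits{$_} // 0 for @product_ids;

The win is as much about process count as algorithm: each backtick call forks a shell and rescans an entire file, so 7000 IDs times several greps means tens of thousands of full file scans, where the hash version scans each file exactly once.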
Replies are listed 'Best First'.
Re: Best way to search large files in Perl
by Laurent_R (Canon) on May 11, 2016 at 22:59 UTC

Re: Best way to search large files in Perl
by graff (Chancellor) on May 12, 2016 at 02:46 UTC

Re: Best way to search large files in Perl
by RonW (Parson) on May 11, 2016 at 21:07 UTC

Re: Best way to search large files in Perl
by Eily (Monsignor) on May 11, 2016 at 20:59 UTC

Re: Best way to search large files in Perl
by BillKSmith (Monsignor) on May 11, 2016 at 21:06 UTC

Re: Best way to search large files in Perl
by LanX (Saint) on May 12, 2016 at 02:19 UTC