Well, you should preprocess the data as much as you can before getting into the while loop. As the other monks have said, you should use a hash for faster lookups. If building the hash from the entire file consumes too much memory, try to break the input file into pieces, if possible; the same goes for the second file. Putting an entire file into memory is faster than reading it line by line (if memory is cheap for you). See the sketch below for the general idea.
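Here is a minimal sketch of the hash-lookup approach. The file names and the assumption that the records are tab-separated with the key in the first column are made up for illustration; adapt them to your data.

    #!/usr/bin/perl
    use strict;
    use warnings;

    my %seen;

    # Index the first (smaller) file once, keyed on column 1.
    open my $keys_fh, '<', 'file_a.txt' or die "Cannot open file_a.txt: $!";
    while ( my $line = <$keys_fh> ) {
        chomp $line;
        my ($key) = split /\t/, $line;   # assume tab-separated, key in column 1
        $seen{$key} = 1;
    }
    close $keys_fh;

    # Scan the second file; each lookup is now O(1) instead of a nested loop.
    open my $data_fh, '<', 'file_b.txt' or die "Cannot open file_b.txt: $!";
    while ( my $line = <$data_fh> ) {
        chomp $line;
        my ($key) = split /\t/, $line;
        print "$line\n" if exists $seen{$key};
    }
    close $data_fh;

The point is that the expensive work (reading and splitting the lookup file) happens exactly once, before the main loop, and every lookup after that is a constant-time hash check.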
There are some other improvements you can try; this IBM article on Perl performance tuning is a good reference: http://www-128.ibm.com/developerworks/library-combined/l-optperl.html.
You should also profile your program (with a smaller input data set, of course) using Devel::DProf before you start optimizing. That will show you which parts of the code are taking the most time to execute. Here is a good article on profiling: http://www.perl.com/pub/a/2004/06/25/profiling.html
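Running the profiler is straightforward (the script name below is just a placeholder): the -d:DProf switch writes a tmon.out file, and dprofpp summarizes where the time went:

    perl -d:DProf parse_data.pl small_input.txt
    dprofpp tmon.out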
In reply to Re: Data parsing - takes too long
by glasswalk3r
in thread Data parsing - takes too long
by josephjohn