in reply to Data parsing - takes too long

Well, you should preprocess the data as much as you can before entering the while loop. As the other monks have said, you should use a hash for faster lookups. If building the hash from the entire file consumes too much memory, try breaking the input file into pieces, if possible. The same goes for the second file. Slurping the whole file into memory is faster than reading it line by line (if memory is cheap for you).
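A minimal sketch of the hash-lookup idea (the file names and tab-separated layout are assumptions; here the "files" are inlined strings so the example stands alone): build the hash once from the smaller file, outside the main loop, so every lookup inside the loop is O(1) instead of a scan.

```perl
use strict;
use warnings;

# Inlined data standing in for the two real input files.
my $small_file = "id42\tfoo\nid99\tbar\n";
my $big_file   = "id07\tx\nid42\ty\nid99\tz\n";

# Pass 1: build the lookup hash once, before the main loop.
my %wanted;
open my $keys, '<', \$small_file or die $!;
while (my $line = <$keys>) {
    chomp $line;
    my ($id) = split /\t/, $line;   # assume the key is the first field
    $wanted{$id} = 1;
}
close $keys;

# Pass 2: scan the big file; each lookup is a single hash probe.
my @matches;
open my $data, '<', \$big_file or die $!;
while (my $line = <$data>) {
    chomp $line;
    my ($id) = split /\t/, $line;
    push @matches, $line if exists $wanted{$id};
}
close $data;

print "$_\n" for @matches;
```

With real files you would replace the in-memory handles with plain open calls; the structure is the same.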

There are some other improvements you can try. I recommend this IBM article if you want to improve the performance of your program: http://www-128.ibm.com/developerworks/library-combined/l-optperl.html.

You should profile your program (with less input data, of course) using Devel::DProf before trying to optimize the code. Doing that will show you which parts of your code take the most time to execute. Here is a good link on that: http://www.perl.com/pub/a/2004/06/25/profiling.html
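The usual Devel::DProf workflow looks like this (the script and input names are placeholders for your own):

```shell
# Run the script under the profiler; this writes a tmon.out file.
perl -d:DProf yourscript.pl small_input.txt

# Summarize tmon.out: time spent per subroutine, sorted by cost.
dprofpp tmon.out
```

Optimize the subroutines at the top of that report first; the rest usually doesn't matter.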


Alceu Rodrigues de Freitas Junior
---------------------------------
"You have enemies? Good. That means you've stood up for something, sometime in your life." - Sir Winston Churchill

Re^2: Data parsing - takes too long
by salva (Canon) on Jan 14, 2006 at 11:16 UTC
    If creating the hash with the entire file is consuming too much memory, try to break the input file in pieces, if possible

    or better, use DB_File or similar to create an on-disk hash/tree.
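    A sketch of that on-disk hash via tie(). salva names DB_File (Berkeley DB); this example uses core SDBM_File so it runs anywhere, but the tie interface is the same, and the file path is just illustrative. Keys and values live on disk, so the lookup table never has to fit in RAM.

    ```perl
    use strict;
    use warnings;
    use Fcntl;
    use SDBM_File;   # core module; DB_File is tied the same way

    # Tie the hash to a disk file: stores go to disk, not to RAM.
    my %lookup;
    tie %lookup, 'SDBM_File', '/tmp/lookup_db', O_RDWR | O_CREAT, 0644
        or die "tie failed: $!";

    $lookup{'id42'} = 'foo';                   # written to disk
    print "hit\n" if exists $lookup{'id42'};   # read back from disk

    untie %lookup;
    ```

    Lookups cost a disk read instead of a memory probe, so this trades speed for memory; it wins when the in-memory hash would otherwise push the machine into swap.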