Hello all. I'm just getting started with Perl and I'm currently working on a script to convert fixed-width text records in a flat-file database to delimited-field records. The script works fine with one exception: it's really slow. I'm dealing with a large amount of data (usually millions of records), and I'm not sure I'm handling the problem in the most efficient manner. In the script, I convert each fixed-width record (a single line in a text file) into individual fields separated by some delimiter (a comma, for instance), then write the delimited record and a line ending (\n) out to an output file with a print statement.
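For reference, here's a stripped-down sketch of what the conversion loop looks like (the field widths, unpack template, and file names are just placeholders for this example, not my real layout):

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Placeholder layout: three fields of 10, 20, and 8 characters.
    # The real record has more fields, but the shape of the loop is the same.
    my $template  = 'A10 A20 A8';
    my $delimiter = ',';

    open my $in,  '<', 'input.dat'  or die "Can't read input.dat: $!";
    open my $out, '>', 'output.csv' or die "Can't write output.csv: $!";

    while (my $line = <$in>) {
        chomp $line;
        # Split the fixed-width record into fields, then rejoin with the delimiter.
        my @fields = unpack $template, $line;
        print {$out} join($delimiter, @fields), "\n";
    }

    close $in;
    close $out;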
My question is: Is there a more efficient way to do this? On a test conversion, a file containing 554,152 records took over 16 minutes to complete. I just want to make sure I'm doing this as efficiently as possible and not dragging the process out unnecessarily.
Any assistance would be greatly appreciated.
Thanks in advance,
Larry Polk
larry@larrypolk.com