Sorry for the delay, but I got distracted by work. :) The records are about 1300 characters long and need to be split into 80+ fields.
I apologize for not posting it before, but here's the code that does the bulk of the work.
    while (<IN>) {
        $record = "";
        @values = unpack($template, $_);
        foreach $field (@values) {
            $field =~ s/\s+$//;                  # strip trailing whitespace
            $record .= "\"" . $field . "\"" . $sep;
        }
        chop($record);                           # drop the trailing separator
        $record .= "\n";
        print OUT $record;
    }
I was able to trim the time down to just under 8 minutes on my 1.6 GHz P4 with ActiveState Perl. The original figure was on a Solaris server running Perl version 5.005_03.
I thought about accumulating several delimited records (say, 100 or so) into a single string to cut down on the number of print calls, figuring that might reduce the I/O overhead. Anyone know if that will actually help, or is it just wishful thinking? :)
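For what it's worth, here is a minimal sketch of that buffering idea. The template, separator, and sample lines are made up for illustration (the real ones come from your record spec), and the loop reads from an in-memory array rather than your IN filehandle so it can stand alone. It also uses join instead of per-field concatenation, which saves some work on its own:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Hypothetical template and separator -- substitute your real ones.
    my $template = 'A5 A5 A5';
    my $sep      = ',';
    my $batch    = 100;        # flush every 100 records, per the idea above

    # Stand-ins for lines read from IN.
    my @lines = ("aaa  bbb  ccc  \n", "ddd  eee  fff  \n");

    my ($buffer, $count, $out) = ('', 0, '');
    for my $line (@lines) {
        my @values = unpack($template, $line);
        # Note: unpack's 'A' format already strips trailing blanks,
        # so the explicit s/\s+$// may be redundant; kept here for safety.
        s/\s+$// for @values;
        $buffer .= '"' . join(qq{"$sep"}, @values) . qq{"\n};
        if (++$count % $batch == 0) {
            $out .= $buffer;   # in real code: print OUT $buffer;
            $buffer = '';
        }
    }
    $out .= $buffer if length $buffer;    # flush the remainder

One caveat: Perl's stdio layer already buffers output to files, so batching mostly saves the overhead of the print calls themselves rather than actual system calls. Worth timing both ways with the Benchmark module before committing to it.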
Many thanks for all the suggestions. :)
Larry Polk
In reply to Re: Re: What's the most efficient way to write out many lines of data?
by Anonymous Monk
in thread What's the most efficient way to write out many lines of data?
by Anonymous Monk