in reply to Re: What's the most efficient way to write out many lines of data?
in thread What's the most efficient way to write out many lines of data?
Sorry for the delay, but I got distracted by work. :) Records tend to be 1300 characters in length and need to be separated into 80+ fields.
I apologize for not posting it before, but here's the code that does the bulk of the work.
while (<IN>) {
    $record = "";
    @values = unpack($template, $_);
    foreach $field (@values) {
        $field =~ s/\s+$//;                      # strip trailing whitespace from the field
        $record .= "\"" . $field . "\"" . $sep;  # quote it and append the separator
    }
    chop($record);        # drop the trailing separator
    $record .= "\n";
    print OUT $record;
}
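For comparison, the same per-record work can be expressed a bit more compactly with map and join, which also avoids the chop. This is just a sketch of an equivalent loop, assuming the same $template and $sep as above; I'm not claiming it is faster:

while (<IN>) {
    my @values = unpack($template, $_);
    s/\s+$// for @values;                                # trim trailing whitespace on every field
    print OUT join($sep, map { qq("$_") } @values), "\n";
}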
I was able to trim the time down to just under 8 minutes on my 1.6 GHz P4 running ActiveState Perl; the original figure came from a Solaris server running Perl 5.005_03.
I thought about accumulating several delimited records (say, 100 or so) into a single string to cut down on the number of print calls, on the theory that this would reduce the I/O overhead. Anyone know if that will work, or is it just wishful thinking? :)
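For reference, here is a minimal sketch of that buffering idea, assuming an arbitrary flush threshold of 100 records; $buffer and $count are illustrative names, not from the code above:

my $buffer = "";
my $count  = 0;
while (<IN>) {
    my @values = unpack($template, $_);
    s/\s+$// for @values;
    $buffer .= join($sep, map { qq("$_") } @values) . "\n";
    if (++$count >= 100) {       # flush every 100 accumulated records
        print OUT $buffer;
        $buffer = "";
        $count  = 0;
    }
}
print OUT $buffer if length $buffer;   # write out whatever is left over

Keep in mind that Perl already buffers filehandle output unless $| is set, so manual batching may mostly save the overhead of the print calls themselves rather than reduce actual system calls.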
Many thanks for all the suggestions. :)
Larry Polk
Replies are listed 'Best First'.
Re: Re: Re: What's the most efficient way to write out many lines of data?
by BrowserUk (Patriarch) on Jul 11, 2003 at 09:40 UTC
by Anonymous Monk on Jul 11, 2003 at 15:49 UTC