chr1so has asked for the wisdom of the Perl Monks concerning the following question:
I have a simple fixed-length file, about 1GB in size, with records 1001 bytes long. However, a few lines are the wrong size, and I want to separate the records into two files, good.txt and bad.txt.
I started with the following, which works perfectly on a small test file:
perl -ne 'print and next if length($_)==1001; print STDERR $_' suspect.txt > goodrecs.txt 2> badrecs.txt
The problem seems to be that there is a bad record in the real-world file that is around 250MB long and near the end. So it looks like Perl is thrashing for hours trying to read that line into $_. (On Windows XP, the memory usage bounces up and down by about 100MB every few seconds, and it's reading only about 8 KB per second.)
I tried adding BEGIN {$|++} so that it would at least flush the buffer to goodrecs.txt before the thrashing starts, but it still stops writing mid-record (though it does write a little more).
What can I do? And what is a general solution? :)
Any guidance is appreciated, oh Perl monks.
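For what it's worth, here is a minimal sketch of one possible workaround (not a posted answer from the thread), assuming the 1001-byte record length includes the trailing newline and using the file names from the question: read the file in fixed-size blocks, split out complete lines yourself, and stream any over-long partial line straight to the bad file so the 250MB monster never has to sit in memory in one piece.

    #!/usr/bin/perl
    use strict;
    use warnings;

    use constant REC_LEN   => 1001;     # good record: 1000 data bytes + "\n"
    use constant BLOCK_LEN => 1 << 20;  # read 1MB at a time

    open my $in,   '<', 'suspect.txt'  or die "suspect.txt: $!";
    open my $good, '>', 'goodrecs.txt' or die "goodrecs.txt: $!";
    open my $bad,  '>', 'badrecs.txt'  or die "badrecs.txt: $!";
    binmode $_ for $in, $good, $bad;

    my $buf    = '';
    my $in_bad = 0;   # are we inside an over-long line already flushed to bad?

    while (read($in, $buf, BLOCK_LEN, length $buf)) {
        # Peel off every complete line currently in the buffer.
        while ((my $nl = index($buf, "\n")) >= 0) {
            my $line = substr($buf, 0, $nl + 1, '');   # remove line from buffer
            if ($in_bad) {
                # Tail of a line whose front we already streamed to bad.
                print {$bad} $line;
                $in_bad = 0;
            }
            else {
                print { length($line) == REC_LEN ? $good : $bad } $line;
            }
        }
        # No newline left.  Once the partial line is longer than a full
        # record it can only be bad, so stream it out now instead of
        # letting the buffer grow to hold the whole over-long line.
        if (length($buf) > REC_LEN) {
            print {$bad} $buf;
            $buf    = '';
            $in_bad = 1;
        }
    }
    print {$bad} $buf if length $buf;   # trailing data with no final newline

The key point is that line-oriented reading (-n, the <> operator) has to buffer an entire line before your code gets to look at it, so any fix has to read by block or by fixed record length instead.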
Replies are listed 'Best First'.
Re: Thrashing on very large lines
by GrandFather (Saint) on Apr 20, 2006 at 05:08 UTC

Re: Thrashing on very large lines
by ikegami (Patriarch) on Apr 20, 2006 at 06:50 UTC
  by chr1so (Acolyte) on Apr 20, 2006 at 20:12 UTC
  by ikegami (Patriarch) on Apr 20, 2006 at 21:48 UTC
  by chr1so (Acolyte) on Apr 21, 2006 at 00:47 UTC

Re: Thrashing on very large lines
by salva (Canon) on Apr 20, 2006 at 09:21 UTC
  by Anonymous Monk on Apr 20, 2006 at 18:09 UTC
  by salva (Canon) on Apr 20, 2006 at 18:51 UTC
  by chr1so (Acolyte) on Apr 24, 2006 at 23:34 UTC

Re: Thrashing on very large lines
by aufflick (Deacon) on Apr 20, 2006 at 06:05 UTC