I have a simple fixed-length file, about 1GB in size, with records 1001 bytes long. However, a few lines are the wrong size, and I want to separate the records into two files: goodrecs.txt for the good records and badrecs.txt for the bad ones.
I started with the following one-liner, which works perfectly on a small test file:
perl -ne 'print and next if length($_)==1001; print STDERR $_' suspect.txt > goodrecs.txt 2> badrecs.txt
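For clarity, that one-liner is roughly equivalent to this standalone script (same filenames; note that with -n the newline stays on $_, so the 1001-byte check includes it):

#!/usr/bin/perl
use strict;
use warnings;

# Roughly what the one-liner does: records of exactly 1001 bytes
# (newline included) go to STDOUT, everything else to STDERR.
open my $in, '<', 'suspect.txt' or die "suspect.txt: $!";
while (my $line = <$in>) {
    if (length($line) == 1001) {
        print STDOUT $line;    # -> goodrecs.txt via redirection
    }
    else {
        print STDERR $line;    # -> badrecs.txt via redirection
    }
}
close $in;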
The problem seems to be that the real-world file contains a bad record, near the end, that is around 250MB long. So Perl appears to thrash for hours trying to read that one line into $_. (On Windows XP, memory usage bounces up and down by 100MB every few seconds, and the file is being read at about 8KB per second.)
I tried adding BEGIN {$|++} so that output to goodrecs.txt would at least be flushed before the thrashing starts, but it still stops writing mid-record. (It does get a little further, though.)
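For reference, the variant with the flush looks like this ($| affects the currently selected handle, which is STDOUT here, so the goodrecs.txt side is the one being flushed; STDERR is unbuffered by default anyway):

perl -ne 'BEGIN {$|++} print and next if length($_)==1001; print STDERR $_' suspect.txt > goodrecs.txt 2> badrecs.txt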
What can I do? And what is a general solution? :)
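In case it helps frame an answer, here is a rough, untested sketch of the kind of approach I imagine might avoid the problem: read the file in fixed-size blocks and split records out of a buffer by hand, so nothing ever has to hold the entire 250MB line at once (the 1MB block size is arbitrary):

#!/usr/bin/perl
use strict;
use warnings;

my $RECLEN = 1001;       # expected record length, newline included
my $BLOCK  = 1 << 20;    # read 1MB at a time (arbitrary)

open my $in,   '<:raw', 'suspect.txt'  or die "suspect.txt: $!";
open my $good, '>:raw', 'goodrecs.txt' or die "goodrecs.txt: $!";
open my $bad,  '>:raw', 'badrecs.txt'  or die "badrecs.txt: $!";

my $buf     = '';
my $in_long = 0;         # inside an oversized record already partly flushed

while (1) {
    my $n = read($in, $buf, $BLOCK, length $buf);
    die "read: $!" unless defined $n;

    while (length $buf) {
        my $nl = index($buf, "\n");
        if ($nl < 0) {
            # No newline yet.  If the partial record is already longer
            # than a good one, stream it to badrecs.txt so the buffer
            # never grows to 250MB.
            if (length($buf) > $RECLEN) {
                print {$bad} $buf;
                $buf     = '';
                $in_long = 1;
            }
            last;                                  # need more data
        }
        my $rec = substr($buf, 0, $nl + 1, '');    # splice one record out
        if ($in_long) {
            print {$bad} $rec;                     # tail of oversized record
            $in_long = 0;
        }
        else {
            print { length($rec) == $RECLEN ? $good : $bad } $rec;
        }
    }
    last if $n == 0;                               # EOF
}
print {$bad} $buf if length $buf;                  # trailing partial record is bad

I also wondered about setting $/ to a scalar reference ($/ = \1001) so each read returns a 1001-byte chunk, but then I'd have to resynchronize on record boundaries after the first bad record myself.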
Any guidance is appreciated, oh Perl monks.