MajinMalak has asked for the wisdom of the Perl Monks concerning the following question:
I am having an issue where I'm iterating through a text file but run out of memory when I read in some of the lines. I've had text files that contain 900 MB worth of data on a single line. Is there a way to read only part of a line, say the first 20 characters, or a way to skip a line if it's too large to read?
I wrote another quick script that writes out a modified text file containing only the lines I want (which are not too big to read), but it still runs into the out-of-memory issue.
Here is the code I wrote for making the modified text file. Hopefully the logic for either skipping the large lines or reading in only the first few characters can be applied to my larger script.
```perl
open my $log, '<', $file or die "Can't open $file: $!";
open my $out, '>', $outf or die "Can't open $outf: $!";

my $header = <$log>;
print {$out} $header;

while (<$log>) {
    print {$out} $_ if /^\{\|\d{4}-\d{2}-\d{2}_\d{2}.\d{2}.\d{2}\| +LOGIN/;
}

close $log;
close $out;
```
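The replies' own code is not reproduced here, but the question itself — keep only the first 20 characters of each line, without ever holding a whole 900 MB line in memory — can be sketched (my sketch, not code from the thread; the subroutine name and chunk size are my own choices) by reading the file in fixed-size chunks with `read()` instead of the line-based `<$log>`:

```perl
use strict;
use warnings;

# Collect only the first $keep characters of every line by reading
# fixed-size chunks, so an oversized line never has to fit in memory.
sub first_chars_per_line {
    my ($path, $keep) = @_;
    open my $fh, '<', $path or die "Can't open $path: $!";
    my @out;
    my $prefix = '';   # start of the current line collected so far
    my $seen   = 0;    # bytes of the current line seen so far
    while (read($fh, my $chunk, 64 * 1024)) {
        while (length $chunk) {
            my $nl    = index $chunk, "\n";
            my $piece = $nl >= 0 ? substr($chunk, 0, $nl) : $chunk;
            if ($seen < $keep) {
                $prefix .= substr $piece, 0, $keep - $seen;
            }
            $seen += length $piece;
            if ($nl >= 0) {                  # line complete
                push @out, $prefix;
                ($prefix, $seen) = ('', 0);
                $chunk = substr $chunk, $nl + 1;
            }
            else {
                $chunk = '';                 # line continues in next chunk
            }
        }
    }
    close $fh;
    push @out, $prefix if $seen;             # last line without a newline
    return \@out;
}
```

Because `read()` never pulls in more than one 64 KiB chunk at a time, a 900 MB line costs only the chunk buffer plus the 20-character prefix. The same loop could instead skip a line entirely by checking `$seen` against a size cap before pushing it.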
Replies are listed 'Best First'.
Re: Out of Memory - Line of TXT too large
- by Corion (Patriarch) on Jan 02, 2014 at 12:43 UTC
- by MajinMalak (Initiate) on Jan 02, 2014 at 12:51 UTC
- by Corion (Patriarch) on Jan 02, 2014 at 13:09 UTC
- by MajinMalak (Initiate) on Jan 02, 2014 at 13:57 UTC
- by roboticus (Chancellor) on Jan 02, 2014 at 14:04 UTC
- by Corion (Patriarch) on Jan 02, 2014 at 14:00 UTC
- by Anonymous Monk on Jan 02, 2014 at 15:13 UTC
Re: Out of Memory - Line of TXT too large (mmap)
- by oiskuu (Hermit) on Jan 02, 2014 at 15:45 UTC