I am having an issue where I'm iterating through a text file but run out of memory when I read in some of the lines. I've had text files with 900 MB worth of data on a single line. Is there a way to read only part of a line from a text file, maybe the first 20 characters, or a way to skip a line entirely if it's too large to read?
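Something along these lines is what I have in mind, if it's even possible. This is just a rough sketch using read() in fixed-size chunks; the file name is only a placeholder:

#!/usr/bin/perl
use strict;
use warnings;

# Read the file in fixed-size chunks instead of line by line, so a 900 MB
# "line" never has to sit in memory all at once. Only the first 20
# characters of each line are kept; the rest of the line is thrown away.
open my $fh, '<', 'big.log' or die "Can't open big.log: $!";

my $keep = '';                               # first 20 chars of the current line
while (read($fh, my $buf, 64 * 1024)) {      # 64 KB at a time
    while ((my $nl = index $buf, "\n") >= 0) {
        my $rest = substr $buf, 0, $nl;      # remainder of the current line
        $keep .= substr $rest, 0, 20 - length $keep if length $keep < 20;
        print "$keep\n";                     # end of line reached: emit its prefix
        $keep = '';
        substr $buf, 0, $nl + 1, '';         # drop the processed line from the buffer
    }
    $keep .= substr $buf, 0, 20 - length $keep if length $keep < 20;
}
print "$keep\n" if length $keep;             # last line if the file lacks a final newline
close $fh;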
I wrote another quick script that writes out a modified text file containing only the lines I want (which are not too big to read), but it still runs into the same out-of-memory issue.
Here is the code I wrote to make the modified text file. Hopefully the logic for either skipping the large lines or reading in only the first few characters can also be applied to my larger script.
open (LOG, $file);
open (OUT, ">", $outf);

# Copy the header line through unchanged.
my $header = <LOG>;
print OUT $header;

# Keep only the LOGIN lines.
while (<LOG>) {
    print OUT $_ if ($_ =~ /^\{\|\d{4}-\d{2}-\d{2}_\d{2}.\d{2}.\d{2}\| +LOGIN/);
}

close LOG;
close OUT;
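For completeness, here is a rough sketch of how I picture the same filter skipping oversized lines, assuming the lines I actually want are short (well under the limit). The record-size limit, the variable names, and taking the paths from @ARGV are all my own additions, not anything I know to be the "right" way; the regex is the same one as above:

#!/usr/bin/perl
use strict;
use warnings;

my ($file, $outf) = @ARGV;           # same two paths as above, just from the command line
my $max = 64 * 1024;                 # give up on any line longer than this

open my $log, '<', $file or die "Can't read $file: $!";
open my $out, '>', $outf or die "Can't write $outf: $!";

local $/ = \$max;                    # "record mode": <$log> returns at most $max bytes
my ($line, $too_long, $seen_header) = ('', 0, 0);

while (my $rec = <$log>) {
    while ((my $nl = index $rec, "\n") >= 0) {
        $line .= substr $rec, 0, $nl unless $too_long;
        if (!$too_long
            and (!$seen_header       # always keep the header line
                 or $line =~ /^\{\|\d{4}-\d{2}-\d{2}_\d{2}.\d{2}.\d{2}\| +LOGIN/)) {
            print {$out} "$line\n";
        }
        $seen_header = 1;
        ($line, $too_long) = ('', 0);
        substr $rec, 0, $nl + 1, ''; # drop the finished line from the record
    }
    $line .= $rec unless $too_long;  # carry the unfinished line into the next record
    $too_long = 1 if length $line > $max;   # a line this big can't be one I want
}
close $log;
close $out;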