I am attempting to parse a large file (~10 GB), but I am running out of system memory while doing so. The process starts at about 3.5 GB used and climbs until all 16 GB are consumed, at which point Linux kills it.
Here is the code that is failing for me:
#!/usr/bin/perl
use strict;
use warnings;

$| = 1;

open(my $readHandle, '<', "File.txt") or die "Failed\n";
print "Start Read\n";
foreach my $line (<$readHandle>) {
    print "Read Line\n";
    print "Found!\n" if ($line =~ /MatchingText/);
}
close $readHandle;
When I run this program, "Start Read" is printed to the screen, but I never see "Read Line".
I've done plenty of Googling on this and found lots of hits, but everything I read says to just read the file one line at a time to get around the size. Isn't that what I'm already doing? All the specific examples I've found focus on other things the poster is doing that are also taking up memory.
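For reference, the line-at-a-time pattern the examples keep showing looks like the sketch below (the file name and match pattern are placeholders standing in for my real ones). As far as I can tell it is doing the same thing my foreach loop does:

#!/usr/bin/perl
use strict;
use warnings;

# The one-line-at-a-time idiom from the examples I keep finding.
# "File.txt" and /MatchingText/ are placeholders for my actual
# file and pattern.
open(my $fh, '<', "File.txt") or die "Failed to open: $!\n";
while (my $line = <$fh>) {
    print "Found!\n" if ($line =~ /MatchingText/);
}
close $fh;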