Hi all
I have a text file that I want to extract some information from, but the file is too large to read all at once. So at the moment I'm trying to read 3000 lines at a time: process them, extract the info, print it, clear the memory, and then move on to the next 3000 lines.
This is the code I am currently trying out:
    my @array;
    my $counter = 0;

    while (<Input>) {
        my $line = $_;
        chomp $line;
        push(@array, $line);
        if ($counter = 3000) {
            my @information;
            foreach my $element (@array) {
                # extract info from $element and push into @information
            }
            for my $x (@information) {
                print Output "$x\n";
            }
            $counter = 0;
            @information = ();
        }
    }
However, when I try this the output file just never stops growing, so I think I might be creating an endless loop somewhere. Any ideas/pointers?
Thanks in advance for any help
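[Editor's note] Two likely culprits in the code above: `if ($counter = 3000)` uses `=`, which *assigns* 3000 (always true) rather than comparing with `==`, so the chunk block runs on every line; and `$counter` is never incremented, nor is `@array` ever emptied, so the buffer keeps growing. A minimal corrected sketch, keeping the post's `Input`/`Output` filehandles; the `process_chunk` sub and the extraction step inside it are placeholders, not the original code:

    my @array;
    my $counter = 0;

    while (my $line = <Input>) {
        chomp $line;
        push @array, $line;
        $counter++;                      # count lines buffered so far

        if ($counter == 3000) {          # '==' compares; '=' would assign and always be true
            process_chunk(\@array);
            @array   = ();               # clear the buffer so memory is released
            $counter = 0;
        }
    }
    process_chunk(\@array) if @array;    # don't forget the final partial chunk

    sub process_chunk {
        my ($lines) = @_;
        my @information;
        for my $element (@$lines) {
            # extract info from $element and push into @information
        }
        print Output "$_\n" for @information;
    }

Note the extra call after the loop: unless the file's line count is an exact multiple of 3000, the last chunk never reaches the threshold inside the loop.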
In reply to Processing large files by Dr Manhattan