Dr Manhattan has asked for the wisdom of the Perl Monks concerning the following question:
Hi all
I have a text file that I want to extract some information from; however, the file is too large to read in all at once. So at the moment I'm trying to read 3000 lines at a time, process them and extract the info, print it, clear memory, and then go on to the next 3000 lines.
This is the code I am currently trying out:
my @array;
my $counter = 0;

while (<Input>) {
    my $line = $_;
    chomp $line;
    push (@array, $line);

    if ($counter = 3000) {
        my @information;

        foreach my $element (@array) {
            #extract info from $element and push into @information
        }

        for my $x (@information) {
            print Output "$x\n";
        }

        $counter = 0;
        @information = ();
    }
}
However, when I try this the output file just never stops growing, so I think I might be creating an endless loop somewhere. Any ideas/pointers?
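Just to be clear about what I'm aiming for, here is a rough sketch of the kind of chunked loop I have in mind (same bareword Input/Output filehandles as above, assumed to be opened elsewhere; the process_chunk sub and the extraction step are only placeholders, not my real code):

use strict;
use warnings;

my $chunk_size = 3000;    # number of lines to buffer before processing
my @buffer;

while (my $line = <Input>) {
    chomp $line;
    push @buffer, $line;

    # process and flush the buffer once it holds a full chunk
    if (@buffer == $chunk_size) {
        process_chunk(\@buffer);
        @buffer = ();         # clear memory before the next chunk
    }
}

# process whatever is left over after the last full chunk
process_chunk(\@buffer) if @buffer;

sub process_chunk {
    my ($lines) = @_;
    for my $element (@$lines) {
        # extract info from $element here, then print it
        print Output "$element\n";
    }
}

The idea is to keep only one chunk of lines in memory at a time and to empty the buffer after each chunk has been printed.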
Thanks in advance for any help
Replies are listed 'Best First'.
Re: Processing large files by BrowserUk (Patriarch) on Aug 21, 2013 at 06:25 UTC
Re: Processing large files by Athanasius (Archbishop) on Aug 21, 2013 at 06:36 UTC
Re: Processing large files by mtmcc (Hermit) on Aug 21, 2013 at 12:17 UTC
Re: Processing large files by kcott (Archbishop) on Aug 21, 2013 at 09:19 UTC
Re: Processing large files by derby (Abbot) on Aug 21, 2013 at 11:32 UTC
Re: Processing large files by Laurent_R (Canon) on Aug 21, 2013 at 11:39 UTC
Re: Processing large files by Anonymous Monk on Aug 21, 2013 at 06:45 UTC
Re: Processing large files by Preceptor (Deacon) on Aug 21, 2013 at 19:06 UTC
Re: Processing large files by zork42 (Monk) on Aug 22, 2013 at 06:07 UTC