JohnBrook has asked for the wisdom of the Perl Monks concerning the following question:
I am having a problem working with a large file under Perl 5.6.1 (build MSWin32-x86-multi-thread) on Windows XP. In addition to Googling for an answer, I have also read "How can I process large files efficiently?" in the Questions and Answers, but it was not sufficient to solve my problem. I am already processing the file line by line (I think!).
Here is my code (stripped down to essentials, as the guidelines suggest, though I had already done that anyway):

    use strict;
    use warnings;

    open IN, "test.txt" or die "Could not open 'test.txt'\n";

    for (<IN>) {
        # do nothing
    }

    close IN;

The output is simply "Out of memory!" after the hard drive runs for about two minutes. The file is about 42 MB. What in this program could be gobbling up memory? Is this not the standard way to process a file line by line?
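As it turns out, not quite: `for (<IN>)` evaluates the readline in list context, so Perl builds a list of every line in the file before the loop body ever runs, and on a 42 MB file that list alone can exhaust memory (each line carries per-scalar overhead, so the in-memory footprint can be several times the size on disk). The one-line-at-a-time idiom uses `while`, which reads in scalar context. A minimal sketch of that variant, using the same filename and filehandle style as above:

    use strict;
    use warnings;

    open IN, "test.txt" or die "Could not open 'test.txt'\n";

    # Scalar-context readline: one line is read per iteration,
    # so memory use stays flat no matter how large the file is.
    while (my $line = <IN>) {
        # process $line here
    }

    close IN;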
Lastly, it occurred to me to check whether the newlines in the file might not be standard DOS newlines (CR/LF), but they are. So that ain't it.
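Checking the line endings programmatically is straightforward; here is a small sketch that peeks at the start of the file in binary mode and counts CR/LF pairs versus bare LFs (the 4 KB sample size is an arbitrary choice):

    use strict;
    use warnings;

    open IN, "test.txt" or die "Could not open 'test.txt'\n";
    binmode IN;    # raw bytes: no CRLF translation on Windows
    read IN, my $chunk, 4096 or die "Could not read 'test.txt'\n";
    close IN;

    my $crlf    = () = $chunk =~ /\r\n/g;       # DOS-style CR/LF pairs
    my $bare_lf = () = $chunk =~ /(?<!\r)\n/g;  # Unix-style bare LFs
    print "CR/LF pairs: $crlf, bare LFs: $bare_lf\n";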
Replies are listed 'Best First'.

Re: Large file problem
  by sweetblood (Prior) on Dec 01, 2004 at 17:26 UTC
  by Happy-the-monk (Canon) on Dec 01, 2004 at 17:34 UTC
  by JohnBrook (Acolyte) on Dec 01, 2004 at 17:36 UTC

Re: Large file problem
  by radiantmatrix (Parson) on Dec 01, 2004 at 17:35 UTC
  by JohnBrook (Acolyte) on Dec 01, 2004 at 17:42 UTC
  by ikegami (Patriarch) on Dec 01, 2004 at 18:21 UTC
  by Anonymous Monk on Dec 02, 2004 at 10:47 UTC

Re: Large file problem
  by melora (Scribe) on Dec 01, 2004 at 20:11 UTC
  by Happy-the-monk (Canon) on Dec 01, 2004 at 22:26 UTC

Re: Large file problem
  by Anonymous Monk on Dec 03, 2004 at 02:57 UTC