Large file problem
by JohnBrook (Acolyte) on Dec 01, 2004 at 17:18 UTC
JohnBrook has asked for the wisdom of the Perl Monks concerning the following question:
Good afternoon (in my time zone), fellow seekers. I am a complete novitiate here, and this is my first post; although I have read the guidelines, forgive me if in my ignorance I transgress in any way. I do have a few years' self-taught experience with Perl.
I am having a problem working with a large file under Perl 5.6.1 (build MSWin32-x86-multi-thread) on Windows XP. In addition to Googling for an answer, I have read How can I process large files efficiently? in the Questions and Answers section, but it was not enough to solve my problem. I am already processing the file line by line (I think!). That node also suggests using Tie::File; I tried, but I don't seem to have it installed. I could install it, of course, but I suspect there is a more obvious problem with what I'm doing: if the file can be read line by line some way other than the one I'm already using, I don't see why Tie::File would be necessary.
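For reference, here is roughly how I understand Tie::File would be used if I installed it (a sketch only; the filename is hypothetical, and each array element is one line fetched from disk on demand rather than held in memory):

    use strict;
    use warnings;
    use Tie::File;

    # Hypothetical filename, for illustration only.
    my $file = 'big_input.txt';

    # Tie the file's lines to an array; lines are read lazily,
    # so the whole file is never slurped into memory at once.
    tie my @lines, 'Tie::File', $file
        or die "Cannot tie $file: $!";

    for my $i ( 0 .. $#lines ) {
        # $lines[$i] is line $i of the file, without its newline.
        # ... process one line at a time ...
    }

    untie @lines;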
Here is my code, stripped down to essentials as the guidelines suggest (which I had already done anyway). The output is simply "Out of memory!" after the hard drive runs for about two minutes. The file is about 42 MB. What in this program could be gobbling up memory? Is this not the standard way to process a file line by line? Lastly, it occurred to me to check whether the newlines in the file were something other than standard DOS newlines (CR/LF), but they are standard. So that ain't it.
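For context, this is what I understand "the standard way" to be: a while loop that reads one line per iteration, as opposed to a foreach over the filehandle, which I gather pulls every line of the file into a list before the loop even starts (a sketch only; the filename is hypothetical):

    use strict;
    use warnings;

    # Hypothetical filename, for illustration only.
    my $file = 'big_input.txt';

    open my $fh, '<', $file or die "Cannot open $file: $!";

    # One line per iteration; memory use stays roughly constant
    # no matter how large the file is.
    while ( my $line = <$fh> ) {
        chomp $line;
        # ... process $line ...
    }

    close $fh;

    # By contrast, this form reads ALL lines into a list first,
    # which can exhaust memory on a large file:
    #     foreach my $line (<$fh>) { ... }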