instinct_4ever has asked for the wisdom of the Perl Monks concerning the following question:
I am working on a couple of scripts for word processing/counting. One of them builds a hash that could hold around 41 million keys in the worst case. The trouble is that the script always runs out of memory while running. I can't flush the hash at intervals because all of the information in it is needed. Is there a solution to this?
I am posting a snippet of my code below:
<FH4> can contain around 41 million lines on average.

```perl
while (<FH4>) {
    if (exists $wordCount{$_}) {
        $wordCount{$_} = $wordCount{$_} + 1;
    }
    else {
        $wordCount{$_} = 1;
    }
}
```
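For reference, here is a minimal runnable sketch of the counting loop. The file handle and the 41-million-line input are simulated with a tiny in-memory sample (an assumption, purely for illustration); note that the `exists` check is unnecessary in Perl, since incrementing an absent hash key autovivifies it to 1.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical stand-in for the real <FH4> input of ~41 million lines.
my @lines = ( "apple\n", "pear\n", "apple\n" );

my %wordCount;
for my $line (@lines) {
    chomp( my $word = $line );   # copy, then strip the newline, so "apple\n" and "apple" collapse
    $wordCount{$word}++;         # autovivifies to 1 on first sight; no exists() check needed
}

print "$_ => $wordCount{$_}\n" for sort keys %wordCount;
```

If the keys genuinely cannot fit in RAM, one standard remedy (not necessarily what the repliers suggested) is to tie the hash to an on-disk B-tree, e.g. `tie my %wordCount, 'DB_File', 'counts.db', O_RDWR|O_CREAT, 0666, $DB_BTREE;` after `use DB_File; use Fcntl;` — the counting loop stays the same, but storage moves to disk at the cost of speed.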
Replies are listed 'Best First'.
Re: Out of Memory
by ikegami (Patriarch) on Oct 25, 2009 at 04:39 UTC
by Anonymous Monk on Oct 25, 2009 at 05:18 UTC
by ikegami (Patriarch) on Oct 25, 2009 at 05:39 UTC

Re: Out of Memory
by virtualsue (Vicar) on Oct 25, 2009 at 10:57 UTC

Re: Out of Memory
by afoken (Chancellor) on Oct 25, 2009 at 09:02 UTC

Re: Out of Memory
by JavaFan (Canon) on Oct 25, 2009 at 10:40 UTC

Re: Out of Memory
by BrowserUk (Patriarch) on Oct 25, 2009 at 16:07 UTC

[DUP] Re: Out of Memory
by ikegami (Patriarch) on Oct 25, 2009 at 04:41 UTC

Re: Out of Memory
by Jenda (Abbot) on Oct 26, 2009 at 11:17 UTC