pachkov has asked for the wisdom of the Perl Monks concerning the following question:
Hi All,
I need your wisdom.
In my script I read some data into a hash (a hash of arrays). It is quite big and takes around 1 GB of memory. Then I read another file line by line, and whenever a line's first field matches a hash key, the corresponding data are printed out.
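For context, a minimal sketch of what the (unshown) get_hash() might look like, assuming a whitespace-separated input file whose first column is the key; the file name and exact format are assumptions, not part of the original post:

    # Hypothetical sketch only -- the real get_hash() is not shown in the post.
    # Assumes whitespace-separated lines: a key followed by its values.
    sub get_hash {
        my $hash_file = 'data.txt';     # assumed file name
        my %h;
        open( my $fh, '<', $hash_file ) or die "Cannot open $hash_file: $!";
        while ( my $line = <$fh> ) {
            chomp $line;
            my ( $key, @values ) = split /\s+/, $line;
            push @{ $h{$key} }, @values;    # hash of arrays: key => [ values... ]
        }
        close $fh;
        return %h;    # flattened into a key/value list on return
    }

One side note on this shape: returning %h flattens it into a list of keys and values that is then copied into %hash at the call site; returning a reference (return \%h) avoids building that temporary list.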
As soon as I start printing, memory consumption grows rapidly, eventually resulting in an "out of memory" error. The machine has 4 GB of memory in total.
How can I reduce memory usage?
*** Solved! See my comment below. ***
My script looks like this:
    my %hash = get_hash();

    open( IN,   '<', $in )   or die "Cannot open $in: $!";
    open( OUT1, '>', $out1 ) or die "Cannot open $out1: $!";
    open( OUT2, '>', $out2 ) or die "Cannot open $out2: $!";
    open( OUT3, '>', $out3 ) or die "Cannot open $out3: $!";

    while (<IN>) {
        my @data = split /\s+/, $_;

        # Print only lines whose first field is a key of the big hash.
        if ( defined( $hash{ $data[0] } ) ) {
            print OUT1 "$data[0]\n";
            print OUT2 join( "\t", @data[ 1 .. $#data ] ) . "\n";
            print OUT3 join( "\t", @{ $hash{ $data[0] } } ) . "\n";
        }
    }
Thank you in advance!
Best,
Mike
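Not necessarily the fix referred to above, but one common way to shrink a hash-of-arrays like this one: the per-key arrays are only ever joined with a tab when printed, so each key could hold a single pre-joined string instead of an array reference. A plain scalar is considerably cheaper than an anonymous array holding many small scalars. A sketch, using the same assumed file name and input format as the get_hash() sketch above:

    # Store one tab-joined string per key instead of an array reference.
    my $hash_file = 'data.txt';    # assumed file name
    my %hash;
    open( my $fh, '<', $hash_file ) or die "Cannot open $hash_file: $!";
    while ( my $line = <$fh> ) {
        chomp $line;
        my ( $key, @values ) = split /\s+/, $line;
        $hash{$key} = join( "\t", @values );
    }
    close $fh;

    # In the matching loop the stored string can then be printed directly:
    # print OUT3 "$hash{$data[0]}\n";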
Replies are listed 'Best First'.
Re: Memory consumption
  by BrowserUk (Patriarch) on May 06, 2009 at 09:57 UTC
    by pachkov (Novice) on May 06, 2009 at 10:01 UTC
    by ELISHEVA (Prior) on May 06, 2009 at 10:50 UTC

Re: Memory consumption
  by pachkov (Novice) on May 06, 2009 at 09:54 UTC
    by chromatic (Archbishop) on May 07, 2009 at 00:02 UTC

Re: Memory consumption
  by Anonymous Monk on May 06, 2009 at 09:44 UTC
    by DrHyde (Prior) on May 06, 2009 at 10:25 UTC
    by Anonymous Monk on May 06, 2009 at 11:08 UTC