Hi All,
I need your wisdom.
In my script I read some data into a hash (a hash of arrays). It is quite big and takes around 1 GB of memory. Then I start reading another file line by line, and if a line matches a hash key, some data is printed out.
As soon as I start printing, memory consumption grows like crazy, resulting in an "out of memory" error. The machine has 4 GB of memory in total.
How can I reduce the memory usage?
*** Solved! See my comment underneath. ***
My script looks like this:
#####################
my %hash = get_hash();

open(IN,   "$in");
open(OUT1, "> $out1");
open(OUT2, "> $out2");
open(OUT3, "> $out3");

while (<IN>) {
    my @data = split /\s+/, $_;
    if (defined($hash{$data[0]})) {
        print OUT1 "$data[0]\n";
        print OUT2 join("\t", @data[1..$#data]) . "\n";
        print OUT3 join("\t", @{$hash{$data[0]}}) . "\n";
    }
}
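(For anyone hitting something similar: one direction worth considering, just a rough sketch and not necessarily the fix in every case, is to store each hash value as a single tab-joined string instead of an array ref, since Perl arrays carry a fair amount of per-element overhead. The function name, file name, and column layout below are assumptions; my real get_hash() may read the data differently.)

use strict;
use warnings;

# Sketch only: build the lookup table with one tab-joined string per key
# instead of an array ref, which usually takes noticeably less memory.
sub get_hash_joined {
    my ($file) = @_;
    my %h;
    open(my $fh, '<', $file) or die "Cannot open $file: $!";
    while (my $line = <$fh>) {
        chomp $line;
        my ($key, @rest) = split /\s+/, $line;   # assumed: key in first column
        $h{$key} = join("\t", @rest);            # single scalar per key
    }
    close($fh);
    return \%h;   # return a reference so the big hash is not copied
}

With that, the main loop would use the reference ($hash->{$data[0]}) and the OUT3 line becomes simply: print OUT3 "$hash->{$data[0]}\n";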
Thank you in advance!
Best,
Mike