Hello,
I'm a bit stuck trying to output a series of very large hashes, using what I thought would be memory-efficient code:
open(AGG, '>', $aggregate_file);
while ( my ($key1, $val1) = each %noninteracting_hash ) {
    while ( my ($key2, $val2) = each %$val1 ) {
        print AGG join( "\t",
            $key1,
            $key2,
            $noninteracting_hash{$key1}{$key2},
            $interacting_hash{$key1}{$key2},
            $literature_hash{$key1}{$key2},
            $predicted_hash{$key1}{$key2} ), "\n";
    }
}
close(AGG);
When I say huge hashes, I mean that each hash is keyed by a 10-character string and each entire HoH holds 3,895,529,794 elements in total (i.e. the output file should contain roughly four billion rows). The program reproducibly crashes after 6.8 million rows (the crash point varies by only about 30 rows from run to run).
The specific error is my favourite: Out of memory!
Can anyone see what might be going on here? I'm kinda lost!
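In case it helps, here is a minimal sketch of the same loop with every cross-hash lookup guarded by exists. In Perl, a nested rvalue read like $interacting_hash{$key1}{$key2} autovivifies $interacting_hash{$key1} as an empty hashref when $key1 is absent, so if the four hashes don't have identical key sets, merely printing could be growing three of them as a side effect. Whether that is actually what is biting me here is a guess on my part, and the filename below is a placeholder:

use strict;
use warnings;

# Assumed to be declared and populated elsewhere, as in the loop above.
our ( %noninteracting_hash, %interacting_hash, %literature_hash, %predicted_hash );
my $aggregate_file = 'aggregate.tsv';    # placeholder name

open( my $agg, '>', $aggregate_file ) or die "Cannot open $aggregate_file: $!";
while ( my ( $key1, $val1 ) = each %noninteracting_hash ) {
    while ( my ( $key2, $val2 ) = each %$val1 ) {
        # Guard the top level first: a bare nested lookup (or even a
        # nested exists) would autovivify $hash{$key1} when it is absent.
        my $int = ( exists $interacting_hash{$key1}
                    && exists $interacting_hash{$key1}{$key2} )
                  ? $interacting_hash{$key1}{$key2} : '';
        my $lit = ( exists $literature_hash{$key1}
                    && exists $literature_hash{$key1}{$key2} )
                  ? $literature_hash{$key1}{$key2} : '';
        my $pre = ( exists $predicted_hash{$key1}
                    && exists $predicted_hash{$key1}{$key2} )
                  ? $predicted_hash{$key1}{$key2} : '';
        # $val2 is the noninteracting value we are already iterating over.
        print {$agg} join( "\t", $key1, $key2, $val2, $int, $lit, $pre ), "\n";
    }
}
close($agg);

Note the chained checks: exists $interacting_hash{$key1}{$key2} on its own would still autovivify the intermediate $interacting_hash{$key1}, so the top-level exists has to come first.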