bernanke01 has asked for the wisdom of the Perl Monks concerning the following question:
Hello,
I'm a bit stuck trying to output a series of very large hashes, using what I thought would be memory-efficient code:
open(AGG, '>', $aggregate_file);
while ( my ($key1, $val1) = each %noninteracting_hash ) {
    while ( my ($key2, $val2) = each %$val1 ) {
        print AGG join( "\t",
            $key1,
            $key2,
            $noninteracting_hash{$key1}{$key2},
            $interacting_hash{$key1}{$key2},
            $literature_hash{$key1}{$key2},
            $predicted_hash{$key1}{$key2}
        ), "\n";
    }
}
close(AGG);
When I say huge hashes, what I mean is that each hash is keyed by a 10-char string and there are 3,895,529,794 elements in each entire HoH (i.e. the output file should contain four billion rows or so). The program reproducibly crashes after about 6.8 million rows (the crash point varies by only about 30 rows from run to run).
The specific error is my favourite: Out of memory!
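One (completely untested) thought: could the plain rvalue lookups like $interacting_hash{$key1}{$key2} be autovivifying intermediate entries in the other three hashes whenever a key pair is missing there, so those hashes keep growing as the loop runs? Here is a minimal sketch of a guarded version, assuming that is the cause; the empty-string fallback for missing entries is just a placeholder:

open(AGG, '>', $aggregate_file) or die "Cannot open $aggregate_file: $!";
while ( my ($key1, $val1) = each %noninteracting_hash ) {
    while ( my ($key2, $val2) = each %$val1 ) {
        # $val2 is already $noninteracting_hash{$key1}{$key2}, so reuse it
        my @row = ($key1, $key2, $val2);
        # guard the sibling lookups with exists so reading a missing key
        # cannot autovivify a new intermediate hashref in those hashes
        for my $h ( \%interacting_hash, \%literature_hash, \%predicted_hash ) {
            push @row, ( exists $h->{$key1} && exists $h->{$key1}{$key2} )
                       ? $h->{$key1}{$key2}
                       : '';
        }
        print AGG join( "\t", @row ), "\n";
    }
}
close(AGG);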
Can anyone see what might be going on here? I'm kinda lost!
Replies are listed 'Best First'.

Re: Outputting Huge Hashes
by brian_d_foy (Abbot) on Jan 29, 2006 at 18:47 UTC
    by bernanke01 (Beadle) on Jan 31, 2006 at 16:38 UTC

Re: Outputting Huge Hashes
by ysth (Canon) on Jan 29, 2006 at 18:34 UTC
    by bernanke01 (Beadle) on Jan 30, 2006 at 22:30 UTC

Re: Outputting Huge Hashes
by lima1 (Curate) on Jan 29, 2006 at 18:00 UTC
    by bernanke01 (Beadle) on Jan 30, 2006 at 22:30 UTC

Re: Outputting Huge Hashes
by TedPride (Priest) on Jan 30, 2006 at 02:09 UTC