Hi,
I have a VERY big hash I've built with the following structure:
$my_hash{$entry}{$timeStamp}{$uid} = number;
Basically, each (entry, timeStamp) pair should hold a list of users and a number for each user.
Naturally, I built this to gather information that was not grouped the way I wanted.
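Roughly, the hash gets filled while reading the raw data, something like this (a simplified sketch; the real record layout, field names and file name are different, of course):

    use strict;
    use warnings;

    open my $raw_fh, '<', 'raw_data.txt' or die $!;   # hypothetical input file

    my %my_hash;
    while ( my $line = <$raw_fh> ) {
        chomp $line;
        # hypothetical record layout: entry, timestamp, uid, number
        my ($entry, $timeStamp, $uid, $number) = split /\t/, $line;
        # accumulate one number per (entry, timeStamp, uid)
        $my_hash{$entry}{$timeStamp}{$uid} += $number;
    }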
The problem is that I later need to print the entire hash, so that for each (entry, timeStamp) pair I print the list of user/number pairs on the same row, and this takes a LONG time (hours; the computer is still working).
I should mention that I can't iterate with the "keys" keyword (keys %my_hash), because that leads to an "Out of memory!" message (big hash...).
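To illustrate the difference (a minimal sketch): the first form builds the whole top-level key list in memory at once, while the second walks the hash one pair at a time:

    use strict;
    use warnings;

    my %my_hash;   # the real hash is huge; empty here just so this compiles

    # runs out of memory on the real data: keys %my_hash materialises
    # every top-level key as one big list before the loop even starts
    for my $entry ( keys %my_hash ) {
        # ... process $entry ...
    }

    # works: each %my_hash hands back one (key, value) pair per call,
    # so the full key list is never built
    while ( my ($entry, $temp) = each %my_hash ) {
        # ... process $entry and the hashref $temp ...
    }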
What I've done so far is this (it works, but takes a long time):
    while ( my ($entry, $temp) = each %my_hash ) {      # one top-level pair at a time
        foreach my $timeStamp ( sort keys %$temp ) {
            my $users = $temp->{$timeStamp};            # avoid re-indexing %my_hash
            print $Hentry "$entry\t$timeStamp\t", scalar( keys %$users );
            foreach my $id ( sort keys %$users ) {
                print $Hentry "\t$id\t", $users->{$id};
            }
            print $Hentry "\n";
        }
    }
That's the only thing that finally let me iterate without using too much memory.
So what I'm looking for is either a better way to store the data or (preferably) a better way to iterate through the hash.
Any suggestions?
Thanks :)
PS
I've tried using DBM::Deep to save on memory, but it just takes too long to fill the hash this way (I have millions of entries...).
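What I tried there was essentially along these lines (a rough sketch; the database file name is just an example). It keeps memory flat because the tied hash lives on disk, but every one of the millions of nested writes hits the disk file, which is why filling it crawled:

    use strict;
    use warnings;
    use DBM::Deep;

    # tie the hash to an on-disk file instead of keeping it all in RAM
    tie my %my_hash, 'DBM::Deep', 'my_hash.db';

    # same nested assignment as before, but now every write goes to disk
    my ($entry, $timeStamp, $uid, $number) = ('e1', 1234567890, 'u1', 42);
    $my_hash{$entry}{$timeStamp}{$uid} = $number;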