Memory is stable (not climbing).
How much is it using at that point?
My best guess, based on the information so far, is that you are running out of (physical) memory and moving into swap.
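If you want to confirm that from inside the script, here is a minimal sketch, assuming a Linux system (the Vm* lines come from /proc/self/status, and VmSwap only appears on reasonably recent kernels):

    # Minimal sketch, Linux only: show this process's virtual size,
    # resident set size and (on recent kernels) swapped-out size.
    open my $status, '<', '/proc/self/status'
        or die "Cannot read /proc/self/status: $!";
    print grep { /^Vm(?:Size|RSS|Swap):/ } <$status>;
    close $status;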
One assumes that the reason you are profiling memory is because you were already using a lot of it, prior to deciding to profile. If so, building a hash that contains a reference to every SV in your application is likely to at least quadruple the amount of memory used. Then creating a list of all the keys of that hash:
foreach my $size ( keys %$size_hash ) {
##.................^^^^^^^^^^^^^^^^
is going to stretch that by (guess!) half as much again.
If you then use Data::Dumper on that same hash, the memory requirement will likely quadruple again, as Data::Dumper uses a hash internally (keyed on the SV addresses) to detect circular and duplicate references.
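That is, even a single innocuous-looking call like this (a sketch, reusing the $size_hash and $fh names from your code) pays that cost:

    use Data::Dumper;

    # One call walks every SV reachable from the structure and records
    # each SV's address in Data::Dumper's internal seen-hash, so the
    # bookkeeping grows in step with the size of the structure dumped.
    print $fh Dumper( $size_hash );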
Your best bet (based upon my wild guesswork above) would be to avoid building lists: iterate the hash using while/each, which hands you one key/value pair at a time, and iterate the arrays using the range operator (.. in a foreach loop counts through the indices rather than building a list):
while( my( $size, $ref ) = each %{ $size_hash } ) {    ## one key/value pair at a time
    print $fh $size . '=[';
    foreach my $i ( 0 .. $#{ $ref } ) {                ## counting loop; no index list built
        print $fh ref( $ref->[ $i ] ) . ", ";
    }
    print $fh ']' . "\n";
}
That might just allow you to iterate through without breaking the memory bank and moving into swapping.
Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
"Science is about questioning the status quo. Questioning authority".
In the absence of evidence, opinion is indistinguishable from prejudice.