in reply to out of memory problem after undef
The problem is that although the ~1.5 GB required by the AoHs is released back to the memory pool when you undef it, it isn't in a form that can be completely re-used by the subsequent big array.
Perl's arrays require a single, contiguous chunk of memory for the base (AV) allocation: the (C-style) array of pointers to SVs. For your large array that means a single chunk of 10e6 * 4 bytes, or ~40MB.
Whilst that pales into insignificance relative to the memory previously used and freed from the AoHs, that memory was internally fragmented into much smaller chunks by the creation of the AoHs and is not reconstituted into contiguous lumps. So, when constructing the large array, the runtime has no choice but to go back to the OS for another lump of virtual memory, and that pushes you over the edge. What's worse is that it doesn't go back for one 40MB chunk, but rather has to go back for several large chunks as the array doubles and re-doubles in size as you construct it.
You might find that if you pre-size the array, after undefing the AoH:
    ...
    undef( @h_item_selector );
    print "did undef of array\n";
    sleep 10;
    print "allocating string array\n";

    my $g_freeSelectorIndex = 10000000;
    $#idsMap = $g_freeSelectorIndex;    ### pre-allocate to final size
    ...
that you avoid the intermediate doubling allocations, and so avoid pushing things over the edge.
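A minimal, self-contained sketch of what assigning to `$#array` does; the array size here is tiny purely for illustration, where your code would use the 10e6 figure:

```perl
# Demo of pre-extending an array via $#array.
my @idsMap;
$#idsMap = 9;                    # pre-extend: 10 slots, all undef
print scalar( @idsMap ), "\n";   # 10
push @idsMap, 'x';               # later stores go after the pre-sized slots
print scalar( @idsMap ), "\n";   # 11
```

The pre-extended slots hold undef; assigning to `$#idsMap` only reserves the pointer array, it does not create any SVs.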
Prior to needing the 40MB contiguous chunk, it has a 20MB chunk. And when that fills to capacity, it needs both concurrently in order to copy the old to the new. And the same was previously required at 10MB and 5MB etc. By preallocating to the final size in one step, all those previous intermediate stages can be avoided and that may allow your process to complete.
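To put rough numbers on that, here is a back-of-the-envelope sketch; the 4KB starting allocation and strict doubling are assumptions for illustration, not measured allocator behaviour:

```perl
# Rough arithmetic for peak transient memory under repeated doubling,
# assuming 4-byte pointers (32-bit build) and a 4KB starting allocation.
my $final = 10_000_000 * 4;      # ~40MB of SV pointers for the full array
my $chunk = 4096;
my $transient = 0;
while( $chunk < $final ) {
    my $next = $chunk * 2;
    # old and new blocks must coexist while the pointers are copied across
    $transient = $next + $chunk if $next + $chunk > $transient;
    $chunk = $next;
}
printf "peak transient: ~%dMB; pre-sized: ~%dMB\n",
    $transient / 2**20, $final / 2**20;
```

Under those assumptions the last grow needs the old ~32MB block and the new ~64MB block concurrently, so the transient peak is well over twice what a single pre-sized allocation would require.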
You might also avoid accumulating some large-but-not-large-enough-for-reuse allocations by pre-allocating your hashes to their final size:
    for( my $i = 1; $i < 50; $i++ ) {
        my $start = "aaa";
        my %hash;
        keys %hash = 250_000;    ### pre-allocate the hash

        for( my $j = 1; $j < 250000; $j++ ) {
            my $val = "$cnt $i $j $start";
            $hash{ $val } = $cnt;    ### simple hash assignments
            $cnt++;
            $start++;
        }
        $h_item_selector[ $i ] = \%hash;    ### then assign a reference
    }
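For reference, `keys` used as an lvalue pre-allocates hash buckets without creating any keys, so the hash is still empty afterwards; a tiny demonstration:

```perl
# keys() as an lvalue reserves buckets up front; no keys are created.
my %hash;
keys( %hash ) = 250_000;             # pre-size the bucket array
print scalar( keys %hash ), "\n";    # 0 -- hash is still empty
$hash{foo} = 1;                      # subsequent stores avoid incremental
print scalar( keys %hash ), "\n";    # 1    bucket-doubling and rehashing
```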
In addition, avoiding compound dereferencing for the assignments might speed up the construction a little.
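The difference is between storing through the reference on every inner iteration versus filling a lexical hash and taking its reference once per outer iteration. A sketch, with names following the thread and tiny loop bounds purely for demonstration:

```perl
# Instead of $h_item_selector[ $i ]{ $val } = $cnt in the inner loop
# (an array deref plus a hash deref per store), build a lexical %hash
# and assign a single reference per outer iteration.
my @h_item_selector;
for my $i ( 1 .. 2 ) {
    my %hash;                           # fresh hash each outer iteration
    $hash{ "key$_" } = $_ for 1 .. 3;   # one-level stores only
    $h_item_selector[ $i ] = \%hash;    # one compound op per outer loop
}
print $h_item_selector[1]{ key2 }, "\n";   # 2
```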
(BTW: What is it with all those variable names? A weird mixture of lower_case_with_underscores, CamelCase and a_MixtureOfTheTwo. Just horrible to work with.)
Replies are listed 'Best First'.
Re^2: out of memory problem after undef
by sduser81 (Novice) on Dec 03, 2008 at 19:27 UTC
by Anonymous Monk on Aug 07, 2012 at 06:36 UTC
by BrowserUk (Patriarch) on Aug 07, 2012 at 07:14 UTC