in reply to out of memory problem after undef

The problem is that although the ~1.5 GB required by the AoHs is released back to the memory pool when you undef it, it isn't in a form that allows it to be completely re-used by the subsequent big array.

Perl's arrays require a single, contiguous chunk of memory for the base (AV) allocation--the (C-style) array of pointers to SVs--and for your large array that means a single chunk of 10e6 * 4 bytes, or ~40MB.

Whilst that pales into insignificance relative to the memory previously used and freed from the AoHs, that memory was internally fragmented into much smaller chunks by the creation of the AoHs and is not reconstituted into contiguous lumps. So, when constructing the large array, the runtime has no choice but to go back to the OS for another lump of virtual memory and that pushes you over the edge. What's worse is that it doesn't go back for one 40MB chunk, but rather has to go back for several large chunks as the array doubles and re-doubles in size as you construct it.

You might find that if you pre-size the array, after undefing the AoH:

    ...
    undef( @h_item_selector );
    print "did undef of array\n";
    sleep 10;

    print "allocating string array\n";
    my $g_freeSelectorIndex = 10000000;
    $#idsMap = $g_freeSelectorIndex;    ### pre-allocate to final size
    ...

that you avoid the intermediate doubling allocations, and so avoid pushing things over the edge.

Prior to needing the 40MB contiguous chunk, it has a 20MB chunk. And when that fills to capacity, it needs both concurrently in order to copy the old to the new. And the same was previously required at 10MB and 5MB etc. By preallocating to the final size in one step, all those previous intermediate stages can be avoided and that may allow your process to complete.
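If you want to see the size of that single allocation for yourself, here's a minimal sketch--it assumes the Devel::Size module from CPAN is installed (it is not part of the core distribution):

    use strict;
    use warnings;
    use Devel::Size qw( total_size );

    my @idsMap;
    $#idsMap = 10_000_000;          ### pre-extend to the final size in one step

    ### On a 32-bit perl this reports roughly 40MB: the single contiguous
    ### block of SV pointers that the AV has to obtain in one allocation.
    printf "pre-extended array occupies %.1f MB\n",
        total_size( \@idsMap ) / ( 1024 * 1024 );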

You might also avoid accumulating some large--but not large enough for re-use--allocations by pre-allocating your hashes to their final size:

    for( my $i = 1; $i < 50; $i++ ) {
        my $start = "aaa";
        my %hash;
        keys( %hash ) = 250_000;            ### pre-allocate the hash

        for( my $j = 1; $j < 250000; $j++ ) {
            my $val = "$cnt $i $j $start";
            $hash{ $val } = $cnt;           ### Simple hash assignments
            $cnt++;
            $start++;
        }
        $h_item_selector[ $i ] = \%hash;    ### then assign a reference
    }

In addition, avoiding compound dereferencing for the assignments might speed up the construction a little.
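If you want to put a number on that difference, here is a rough benchmark sketch (Benchmark is a core module; the stand-in array @aoh, the key format and the element counts are illustrative only, not taken from your code):

    use strict;
    use warnings;
    use Benchmark qw( cmpthese );

    my @aoh;

    cmpthese( -3, {
        compound => sub {                               ### deref on every store
            $aoh[ 0 ] = {};
            $aoh[ 0 ]{ "key $_" } = $_ for 1 .. 10_000;
        },
        lexical => sub {                                ### build a plain hash...
            my %hash;
            $hash{ "key $_" } = $_ for 1 .. 10_000;
            $aoh[ 0 ] = \%hash;                         ### ...then assign a ref once
        },
    } );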

(BTW: What is it with all those variable names? A weird mixture of lower_case_with_underscores, CamelCase and a_MixtureOfTheTwo. Just horrible to work with.)


Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
"Science is about questioning the status quo. Questioning authority".
In the absence of evidence, opinion is indistinguishable from prejudice.
"Too many [] have been sedated by an oppressive environment of political correctness and risk aversion."

Re^2: out of memory problem after undef
by sduser81 (Novice) on Dec 03, 2008 at 19:27 UTC
    Hi,

    Thanks for the reply.

    Regarding the allocation of memory for the AoH, I currently don't see a way of making that more efficient, because, in the context of the script where this is used, the structure is built up over time.

    Also, this seems like a fundamental problem with using such structures in Perl on Win32. By the way, I am using the ActiveState ActivePerl distribution, if that makes a difference.

    Based on what you said, because I have this huge AoH, memory is fragmented in small chunks and it can't be easily reused for a large array. Do you, or anyone else, know if there is a way to force Perl to defragment its memory pool to enable reallocation of the freed memory?

    I tried to preallocate the array, but there is not enough system memory, so Perl runs out of memory.

    Thanks,


    Tim
      Hey All, I too ran into more or less the same issue as Tim did, so does anybody have an answer for getting Perl to defragment its memory pool? It's really bad and frustrating because, despite having 2GB of memory eaten up by Perl, it is not able to meet the need of storing 200MB of data. Big Monks - please help me out....
        "... despite having 2GB of memory eaten up by Perl, it is not able to meet the need of storing 200MB of data."

        Here is a 200MB file:

        C:\>dir acktrace
        14/06/2012  18:51       206,937,739 acktrace

        And here is a perl script loading 10 copies of that file into less than 2GB of memory; then discarding it and recovering all the memory:

        c:\>perl -E"undef($/); $f[0]=<>; $f[$_]=$f[$_-1] for 1..9; say`tasklist`; undef @f; say`tasklist`;" acktrace | find "perl"
        perl.exe                    7996 Console                 1     1,927,000 K
        perl.exe                    7996 Console                 1         4,796 K
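        If the one-liner is hard to read, here it is spelled out as a throwaway script (run it against any ~200MB file named on the command line):

            use strict;
            use feature 'say';

            undef $/;                             ### slurp mode: read whole files at once
            my @f;
            $f[ 0 ] = <>;                         ### load the ~200MB file into $f[0]
            $f[ $_ ] = $f[ $_ - 1 ] for 1 .. 9;   ### nine more copies: ~2GB of string data
            say `tasklist`;                       ### perl.exe's working set at its peak
            undef @f;                             ### discard all ten copies
            say `tasklist`;                       ### and the working set drops right back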

        Don't blame Perl, or ActiveState or Windows because you don't know how to use them effectively.

        If you want actual help, post the actual code.

