in reply to out of memory problem after undef

You are bumping up really close to the 2 GB virtual memory limit for 32-bit Windows. It may be that you are actually running out of memory. It is quite possible that OS X's virtual memory limit is different: 4 GB on 32-bit hardware and far larger on 64-bit.


Perl's payment curve coincides with its learning curve.

Replies are listed 'Best First'.
Re^2: out of memory problem after undef
by JadeNB (Chaplain) on Dec 03, 2008 at 01:27 UTC
    I'm no good with memory issues, so this may be a stupid question, but: is it possible that the grandparent (as opposed to the GrandFather) just didn't realise that it's the operating system, not perl itself, that's running out of free memory (since, I think, perl doesn't release memory back to the OS until it exits)?

      The OP is, in effect, making two allocations with a free between them. The first allocation is a very large array of hashes (AoH) that causes Perl to allocate very near 2 GB. That allocation is then "freed" by undefing the array containing the hashes. Then a largish array of integers is allocated. It is in this second phase that I infer things are going pear-shaped for the OP, although I can't reproduce that behaviour.
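      The pattern described above can be sketched roughly like this (the sizes and hash layout here are illustrative, not the OP's actual data):

      ```perl
      use strict;
      use warnings;

      # Phase 1: build a large array of hashes -- size is illustrative only
      my @aoh = map { { id => $_, data => 'x' x 100 } } 1 .. 100_000;

      # "Free" the structure. Perl reclaims the space into its own memory
      # pools, but does not generally return it to the operating system.
      undef @aoh;

      # Phase 2: a largish array of integers. Perl should be able to satisfy
      # this from the memory just released in phase 1.
      my @ints = (0) x 1_000_000;
      print scalar(@ints), " integers allocated\n";
      ```

      On a 32-bit build, the process's virtual size after phase 1 stays near its peak even though the Perl-level data is gone, which is why the second allocation can still fail at the OS level.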

      Perl doesn't generally free memory back to the system, but the space released after the first large allocation should nevertheless be available for the second allocation.


      Perl's payment curve coincides with its learning curve.