in reply to Re: Re: Memory usage breakup
in thread Memory usage breakup

Pinning down how much memory a given program will use, and whether any of that memory will be recycled, either internally by perl or back to the OS, is extremely complicated. (It is also highly dependent on the OS, on perl -V, and on the individual perl build.)

For example, consider these two one-liners.

P:\test>perl -e" { for( 1 .. 100_000 ) { $x[ $_ ] = ' ' x 1000; $x[ $_ ] = undef; } <STDIN>; } <STDIN>;"

In this first example, each element of the 100_000-element global array @x is allocated a 1000-byte value, which is then immediately 'freed' by undefing it. By the end of the loop (the first prompt), 100+ MB is allocated to the process: the space for 100_000 elements of 1000 bytes each, plus the overhead of perl's array and scalar structures. This is despite only one element of the array having any usable space allocated at any given time.

P:\test>perl -e" { my @x; for( 1 .. 100_000 ) { $x[ $_ ] = ' ' x 1000; $x[ $_ ] = undef; } <STDIN>; } <STDIN>;"

The same program, except that the array is now lexically scoped. When the first prompt is reached after the loop completes, 100+ MB is again in use, meaning that 99_999 elements' worth of discarded (undef'd) space is lying around, unusable and unused. However, once the second prompt is reached, i.e. after the scope in which @x was declared has exited, the memory used by the process (on my system) drops to 12 MB.

With care and motivation, it is possible to force perl to re-use discarded memory (and even to return some of it to the OS under Win32), but every attempt I've made to formulate a general strategy for doing either has fallen on stony ground. I can do it on a case-by-case basis for many applications, and I have begun to recognise cases where I am reasonably sure I can reduce the memory requirements through fairly simple steps, but inevitably there are always exceptions to the rules of thumb I use.

Unfortunately, the exceptions are too common to make the rules of thumb viable for anything other than cases of extreme need.


Examine what is said, not who speaks.
"Efficiency is intelligent laziness." -David Dunham
"Think for yourself!" - Abigail

Re: Re: Re: Re: Memory usage breakup
by Ven'Tatsu (Deacon) on May 01, 2004 at 15:38 UTC
    $var = undef and undef $var behave differently with respect to memory deallocation. The first never seems to free memory, while the second does so immediately. BrowserUk's code uses about 136MB on my system, while the code below never grows past 4MB on my system. (Win98/AS809)
    { for( 1 .. 100_000 ) { $x[ $_ ] = ' ' x 1000; undef($x[ $_ ]); } <STDIN>; } <STDIN>;

      Indeed++. I get the same results here.

Re: Re: Re: Re: Memory usage breakup
by sgifford (Prior) on May 01, 2004 at 06:05 UTC
    Interesting. In Unix, you can allocate memory using an anonymous mmap, then return it to the system with munmap. You can do that by hand if you want, packing data into the allocated region, then unpacking it back out when you need it.
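    A minimal C sketch of the by-hand approach described above: allocate an anonymous region with mmap, store data into it, then hand the pages straight back to the OS with munmap. (This is an illustrative assumption of how one might do it on Linux/Unix, not code from the thread; the region size mirrors the ~100 MB of the Perl one-liners.)

    ```c
    #include <stdio.h>
    #include <string.h>
    #include <sys/mman.h>

    int main(void) {
        size_t len = 100000UL * 1000UL;   /* ~100 MB, like the Perl examples */

        /* Allocate an anonymous, private region directly from the OS. */
        char *buf = mmap(NULL, len, PROT_READ | PROT_WRITE,
                         MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
        if (buf == MAP_FAILED) { perror("mmap"); return 1; }

        /* "Pack" some data into the region by hand. */
        memcpy(buf, "hello", 6);
        printf("stored: %s\n", buf);

        /* Unlike free(), which may leave pages in the process heap,
           munmap really returns them to the OS. */
        if (munmap(buf, len) != 0) { perror("munmap"); return 1; }
        puts("region returned to the OS");
        return 0;
    }
    ```

    The point is the contrast with a normal heap: after munmap the process's resident and virtual size genuinely shrink, which is exactly the behaviour the perl allocator does not guarantee.
    
    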

    This is horribly inconvenient, of course. I wonder if an XS module could use mmap to allocate a big chunk of memory and instruct Perl's allocator to use it, then later free this memory and tell Perl to use the normal allocator. I don't know anything about Perl internals, so I shouldn't be speculating like this, but it's so much fun. :)