in reply to freeing hashes on Linux

That's due to how Perl manages memory. When you undef a large hash, Perl has to walk every stored value, decrement its reference count, check whether it has hit zero, and if so free its memory (possibly decrementing other refcounts along the way). Finally, it has to free the internal structure of the hash itself, which is a large set of linked bucket chains, so there are lots of little pieces of memory to be freed.
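A small sketch of the refcount-decrement part, using the core B module to peek at a value's reference count (the variable names here are just for illustration):

```perl
use strict;
use warnings;
use B ();   # core module; lets us inspect an SV's refcount

my $shared = [1, 2, 3];
my %h = map { $_ => $shared } 1 .. 3;   # three hash values point at $shared

my $rc_before = B::svref_2object($shared)->REFCNT;   # $shared plus 3 hash slots
undef %h;                                            # each freed slot decrements it
my $rc_after  = B::svref_2object($shared)->REFCNT;   # back to just $shared

print "$rc_before -> $rc_after\n";   # prints "4 -> 1"
```

Perl does that decrement-and-check for every value in the hash, which is where the time goes on a big one.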

Abigail

Replies are listed 'Best First'.
Re: Re: freeing hashes on Linux
by arthas (Hermit) on May 22, 2003 at 15:16 UTC
    Hi Abigail! How about clearing all key/value pairs with:
    %myhash = ();
    How does this work internally?

    Thanks, Michele.
      It works the same way. The killer is freeing all the bucket entries, and when you undef the hash they get freed just as they are when the hash goes out of scope. There's rather a lot of small-memory malloc'ing going on (with corresponding extra overhead in memory footprint that generally goes unaccounted for), and freeing all those small allocations triggers pathological behaviour in some versions of glibc's memory allocator.
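      A minimal sketch showing that both forms end in the same empty state (the per-entry freeing described above happens either way; the variable names are just for illustration):

```perl
use strict;
use warnings;

my (%via_assign, %via_undef);
$via_assign{$_} = "v$_" for 1 .. 10_000;
$via_undef{$_}  = "v$_" for 1 .. 10_000;

%via_assign = ();     # clear: frees every entry, keeps the hash variable usable
undef %via_undef;     # undef: frees every entry and the bucket array as well

printf "assign: %d keys, undef: %d keys\n",
    scalar keys %via_assign, scalar keys %via_undef;
# prints "assign: 0 keys, undef: 0 keys"
```

      The difference is only whether the hash's own bucket structure is kept around for reuse; the expensive walk over all the entries is common to both.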