in reply to Perl and memory usage. Can it be released?

Two thoughts:

  1. If your program processes the files line by line, then the maximum memory it needs at any given time is roughly the length of the longest line (a minimal sketch follows this list).

    For most files that is a trivial amount.

  2. If you really do need to load each file in its entirety, then slurping it into a single huge scalar, rather than an array of lines, means that when the file has been processed and the scalar is freed, the entire allocation can be returned to the OS rather than just to the process's memory pool (see the second sketch after this list).

    Note: I know this to be true of Perl running under Windows for single allocations over 1 MB.

    Whether malloc implementations on other OSes make similar arrangements for large single allocations is less clear.

    Of course, this will only help if you can avoid breaking the single scalar into an array or hash.
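A minimal sketch of the line-by-line approach from point 1. The filename is hypothetical; only the current line is held in memory at any time:

    use strict;
    use warnings;

    open my $fh, '<', 'big_file.log' or die "Cannot open: $!";
    while ( my $line = <$fh> ) {
        # $line is reused on each iteration, so peak memory for the
        # file data tracks the length of the longest line.
        chomp $line;
        # ... process $line here ...
    }
    close $fh;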
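And a sketch of the slurp approach from point 2, again with a hypothetical filename. Localising $/ (the input record separator) makes the readline return the whole file as one scalar, i.e. a single large allocation:

    use strict;
    use warnings;

    my $content = do {
        open my $fh, '<', 'big_file.log' or die "Cannot open: $!";
        local $/;       # undef the record separator: read everything at once
        <$fh>;          # the whole file as one huge scalar
    };

    # Process $content in place -- e.g. with regexes, index() and substr() --
    # without splitting it into an array or hash, so it stays one allocation.

    undef $content;     # free the scalar; being a single large allocation,
                        # malloc may hand the memory back to the OS

The key design point is the last comment: a split into an array of lines fragments that one allocation into many small ones, which typically stay in the process pool when freed.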


With the rise and rise of 'Social' network sites: 'Computers are making people easier to use every day'
Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
"Science is about questioning the status quo. Questioning authority".
In the absence of evidence, opinion is indistinguishable from prejudice.