nowise has asked for the wisdom of the Perl Monks concerning the following question:
hi

I have been facing an issue with a hash, whereby it is not releasing memory to the OS and does not appear to be reusing the memory it has already used. I have read that Perl does not release memory to the OS, which is why the process size keeps climbing. But is there any way to make it reuse the memory that has already been returned to the process? I am asking because undefing the hash or resetting it (undef %hash or %hash = ()) doesn't release the memory.

Please advise.
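For reference, a minimal sketch of the two clearing idioms mentioned above and the difference between them (the comments describe commonly documented behaviour; exact allocator details vary by build and OS):

    use strict;
    use warnings;

    my %hash = map { $_ => "x" x 100 } 1 .. 100_000;

    %hash = ();     # empties the hash; the storage stays with Perl's
                    # allocator for reuse, not with the OS

    undef %hash;    # also releases the hash's internal bucket array
                    # back to Perl's allocator

    # Either way, the memory normally remains part of the process and
    # is reused by later allocations; it is rarely handed back to the OS.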
Re: Memory issue with Hash
by zentara (Cardinal) on Dec 02, 2012 at 11:40 UTC
See Parallel::ForkManager and IO::Pipe for some helpful modules.
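The idea behind those modules is to do the memory-hungry work in a child process, so the OS reclaims everything when the child exits. A minimal sketch using Parallel::ForkManager (the work done in the child is a placeholder):

    use strict;
    use warnings;
    use Parallel::ForkManager;

    my $pm = Parallel::ForkManager->new(4);   # at most 4 children at once

    for my $job (1 .. 10) {
        $pm->start and next;                  # parent: spawn child, move on

        # Child: build the big hash here; all of its memory goes back
        # to the OS when the child exits.
        my %big = map { $_ => "x" x 1_000 } 1 .. 100_000;

        $pm->finish;                          # child exits here
    }
    $pm->wait_all_children;

Results can be passed back to the parent over a pipe (that is where IO::Pipe comes in) or via Parallel::ForkManager's own data-retrieval mechanism.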
I'm not really a human, but I play one on earth. Old Perl Programmer Haiku ................... flash japh
Re: Memory issue with Hash
by rjt (Curate) on Dec 02, 2012 at 11:54 UTC
Perl does re-use memory internally, and may release memory to the OS, although not reliably (that's OS-dependent, and not as bad as it sounds in the vast majority of cases). If you are seeing your program's memory usage increase over time, it is probable you have a memory leak.

Setting a hash to undef doesn't guarantee that all references will be garbage-collected recursively, which is what you might expect. Also keep in mind that undef is not free(); it doesn't de-allocate anything. First, as is common with GC systems, Perl maintains a reference count for each value. If you have, say, a map of keys to array refs and you delete the hash, the arrays will still exist if (and only if) something else still references them.

The second bit, which may sound more arcane, is that if anywhere in your data you have circular references ($a references $b and $b references $a, possibly through a longer assignment path, as in a circular linked list), the memory for $a and $b will never be reused, even if nothing else references $a or $b, unless you break the cycle. Breaking circular references usually isn't obvious and is potentially very expensive. See Scalar::Util's weaken($ref) function, however, for one approach.

If neither of those things seems to describe your problem, perhaps you could post some short sample code for us to have a look at.
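A minimal sketch of the circular-reference problem and the weaken() fix (the node structure here is hypothetical):

    use strict;
    use warnings;
    use Scalar::Util qw(weaken);

    my $a_node = { name => 'a' };
    my $b_node = { name => 'b' };

    # Create a cycle: each node references the other, so neither
    # refcount can ever drop to zero on its own.
    $a_node->{peer} = $b_node;
    $b_node->{peer} = $a_node;

    # Weaken one link: it no longer counts toward the refcount, so
    # both nodes can be reclaimed once the outer variables go away.
    weaken($b_node->{peer});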
Re: Memory issue with Hash
by BrowserUk (Patriarch) on Dec 02, 2012 at 18:18 UTC
"I'm asking this because undefing the hash or resetting it (undef %hash or %hash = ()) doesn't release the memory."

Prove it! (I.e., demonstrate your claim by posting working code, because what you claim is not normally true.)

With the rise and rise of 'Social' network sites: 'Computers are making people easier to use everyday'
Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
"Science is about questioning the status quo. Questioning authority".
In the absence of evidence, opinion is indistinguishable from prejudice.
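A minimal sketch of the kind of evidence being asked for here, assuming a Linux system where /proc/$$/status is available: fill a hash, clear it, and fill it again; if Perl reuses the memory, the second fill should barely grow the process:

    use strict;
    use warnings;

    sub rss_kb {
        open my $fh, '<', "/proc/$$/status" or die $!;
        while (<$fh>) { return $1 if /^VmRSS:\s+(\d+)/ }
        return;
    }

    my %hash = map { $_ => "x" x 100 } 1 .. 500_000;
    printf "after first fill:  %s kB\n", rss_kb();

    %hash = ();
    printf "after clearing:    %s kB\n", rss_kb();  # usually barely drops

    %hash = map { $_ => "x" x 100 } 1 .. 500_000;
    printf "after second fill: %s kB\n", rss_kb();  # close to the first figure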
by Anonymous Monk on Dec 03, 2012 at 04:24 UTC
by BrowserUk (Patriarch) on Dec 03, 2012 at 05:34 UTC
Should be obvious enough... Guess on if you like. I'll wait for the evidence.
Re: Memory issue with Hash
by flexvault (Monsignor) on Dec 02, 2012 at 19:53 UTC
Welcome nowise,

I had a similar problem about a year ago: a hash that just grew and grew, and it turned out to be a scoping problem. I created the hash in the main script and was trying to clear it in a subroutine. What I actually cleared was an empty local hash, and I returned to the main script without ever touching the 'real' hash. A sketch of this pitfall follows below.

If this isn't what you're doing, then you'll have to show some code. The '%Hash' is very powerful and very reliable, so it's probably something you're doing wrong. But that's how we learn.

Good Luck!

"Well done is better than well said." - Benjamin Franklin
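A minimal sketch of the scoping bug described above (names are hypothetical):

    use strict;
    use warnings;

    my %cache = map { $_ => 1 } 1 .. 100_000;   # the 'real' hash

    sub clear_cache {
        my %cache;      # oops: a new lexical that shadows the outer hash
        %cache = ();    # clears the empty local copy; the real %cache
                        # is never touched
    }

    clear_cache();
    printf "still %d keys\n", scalar keys %cache;   # prints 100000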
by nowise (Initiate) on Dec 04, 2012 at 06:15 UTC
In order to describe it more clearly, I have an example below in which I create one big hash-of-hashes. Although I have put only 2 inner hashes inside %TV here (just for clarity's sake), in my real code I have 300 or more (incrementing the key, e.g. jetsons1, jetsons2, ...) in order to check the memory usage.
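A minimal sketch of the kind of test described above, assuming the %TV hash-of-hashes example from perldsc as the template (this is not the original code):

    use strict;
    use warnings;

    my %TV;
    for my $i (1 .. 300) {           # 300 or more inner hashes
        $TV{"jetsons$i"} = {
            series  => "The Jetsons",
            nights  => [qw(wednesday saturday)],
            members => [
                { name => "George", role => "father", age => 41 },
                { name => "Jane",   role => "mother", age => 39 },
            ],
        };
    }
    printf "built %d inner hashes\n", scalar keys %TV;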
Re: Memory issue with Hash
by flexvault (Monsignor) on Dec 04, 2012 at 14:24 UTC
I ran your code (after a little clean-up) well past your record counts, and I verified the real memory usage as a constant '1776MB'. I verified this with 'top' and with 'ps'. I used the latest Debian Linux 2.6.32 distribution and perl 5.12.2 for the test. I did not use 'Proc::ProcessTable', so I can't say whether that is part of your problem. I also deleted the 'sleep( 10 );', since I saw no value in waiting 10 seconds just to watch the screen fill up again.

Some suggestions: from this test, your problem isn't the hash growing. It could be your system or the system's perl that's giving you the problem.

Good Luck!

Update: I remembered I had some AIX systems with perl 5.8.8, and I ran the code to 'Record counter => 10320'; the memory usage stayed constant at '1940MB'. So it doesn't seem to be a problem on the AIX perl 5.8.8 either.

Update2: My "MB" above should be "KB", which is how 'top' and 'ps' report the figures.

"Well done is better than well said." - Benjamin Franklin
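For anyone reproducing this kind of check, a minimal sketch that samples the script's own resident set size via 'ps' (assumes a ps supporting -o rss=, as on Linux and AIX):

    use strict;
    use warnings;

    sub report_rss {
        my ($label) = @_;
        chomp(my $rss = qx{ps -o rss= -p $$});   # resident set size in KB
        print "$label Process Status: ${rss}KB\n";
    }

    my %hash;
    for my $counter (1 .. 10_000) {
        $hash{$counter} = "x" x 100;
        report_rss("Record counter => $counter") if $counter % 1_000 == 0;
    }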
by nowise (Initiate) on Dec 05, 2012 at 10:36 UTC
I don't know how the system or perl is giving the error; can you please provide some more details on that? I had read somewhere that perl should be compiled with the "usemymalloc=y" flag in order to release memory, and when I ran perl -V on the command line I could see this flag set to "y". Also, the only factor contributing to the increase in the process memory seems to be the hash. Please see below the memory growth of the process; it remains constant up to a point, and then starts increasing again:

    Record counter => 20    Process Status: 6.328125MB
    Record counter => 30    Process Status: 6.4296875MB
    Record counter => 40    Process Status: 6.4375MB
    Record counter => 50    Process Status: 6.4375MB
    Record counter => 60    Process Status: 6.4453125MB
    Record counter => 70    Process Status: 6.4453125MB
    Record counter => 80    Process Status: 6.4453125MB
    Record counter => 90    Process Status: 6.453125MB
    Record counter => 100   Process Status: 6.453125MB
    Record counter => 110   Process Status: 6.453125MB
    Record counter => 120   Process Status: 6.453125MB
    Record counter => 130   Process Status: 6.4609375MB
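A quick way to check that build flag without scanning the full perl -V output; a minimal sketch using the core Config module:

    use strict;
    use warnings;
    use Config;

    # 'y' means perl was built with its own malloc; 'n' means it uses
    # the system malloc, whose release-to-OS behaviour differs.
    print "usemymalloc = $Config{usemymalloc}\n";

The same value can be printed directly from the shell with: perl -V:usemymalloc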