What are you doing with the file? Slurping the contents or producing large global hashes or arrays are the usual culprits. Look for places to restrict the scope of large variables and try to process the file a line at a time (or other smallish chunk).
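For example, something along these lines keeps only one line in memory at a time (just a sketch; the file name and the per-line work are made up here):

    use strict;
    use warnings;

    my $file = 'data.log';                 # hypothetical input file
    open my $fh, '<', $file or die "Can't open $file: $!";

    my %counts;                            # keep a small summary, not the raw lines
    while ( my $line = <$fh> ) {
        chomp $line;
        my ($key) = split ' ', $line;      # stand-in for your real per-line work
        next unless defined $key;
        $counts{$key}++;
    }
    close $fh;

    printf "%s: %d\n", $_, $counts{$_} for sort keys %counts;

The point is that the summary hash is all that stays around; the file itself never sits in memory in one piece.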
We can help better if you show us code and say what the data is like.
My script is a generic parser that builds a complex hash structure. I then process that hash structure again to extract only the important data, and then do a lot of boolean math on it.
Is there a way to release memory in Perl?
That depends on what you mean by the question. If you mean "give back to the OS", the answer is no on most OSes, and when it does happen, it's not really under your control. If you mean "let Perl reuse memory I no longer need", then Perl is already doing that, if and only if it can figure out that you're not going to need the memory again. Perl figures this out by keeping track of the number of references a piece of data has: as long as something is pointing to it, Perl will keep it. So make your variables lexical and give them small scopes. When they drop out of scope, Perl will reuse their memory.
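A small illustration of that (the big hash here is just a stand-in):

    use strict;
    use warnings;

    my $total = 0;
    {
        # %counts is lexical to this block and nothing outside refers to it,
        # so its reference count drops to zero when the block ends
        my %counts = map { $_ => $_ * 2 } 1 .. 100_000;   # stand-in for big data
        $total += $_ for values %counts;                  # keep only the summary
    }
    # %counts is gone here; Perl can reuse that memory internally, even though
    # the process size reported by the OS will usually not shrink
    print "$total\n";

If a large variable has to live in a long scope, you can also empty it early with undef %counts or %counts = () once you are done with it.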
Of course, if you are using a bad algorithm, like slurping an entire file into an array when all you need is to inspect it line by line, you will run out of memory because your program claims all that memory. The only way to deal with that is to use a better algorithm, such as reading the file line by line.