in reply to Perl script end up on saying "Out of Memory !"

Question: How can I erase or clear the hash before my Perl script processes the second file?

The simplest way is to declare it in such a way that it goes out of scope when you stop processing the file. Something along the lines of:

for my $filename (@files) {
    my %data1;
    # do all processing of file $filename here
}

Alternatively, you can use undef %data1 to release the hash's contents explicitly.
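A minimal sketch of that approach (the file names here are hypothetical): clearing the hash at the top of each iteration instead of letting it accumulate data across files.

```perl
#!/usr/bin/perl
use strict;
use warnings;

my %data1;
for my $filename ('file1.txt', 'file2.txt') {
    undef %data1;            # release everything stored for the previous file
    # ... fill %data1 while processing $filename ...
    $data1{$filename} = 1;   # stand-in for real per-file data
}
# After the loop, %data1 holds only data from the last file.
print scalar(keys %data1), "\n";   # prints 1
```

Note that `undef %data1` also frees the hash's internal buckets, whereas `%data1 = ()` keeps them allocated for reuse; for the out-of-memory problem here, either is fine.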

my %data1 = %$list_a_ref; # Dereference lists

That doesn't just dereference, it also creates a copy. Do you want that?
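To illustrate the difference (the subroutine names and data are assumptions, not the poster's actual code): dereferencing into a new hash copies every key and value, while working through the reference touches no extra memory.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Copies the whole hash into a lexical: memory use roughly doubles.
sub MeasureFiles_copy {
    my ($list_a_ref) = @_;
    my %data1 = %$list_a_ref;      # full copy of every key and value
    return scalar keys %data1;
}

# Works through the reference: no copy is made.
sub MeasureFiles_ref {
    my ($list_a_ref) = @_;
    # access individual elements as $list_a_ref->{$key}
    return scalar keys %$list_a_ref;
}

my %big = (a => 1, b => 2, c => 3);
print MeasureFiles_copy(\%big), " ", MeasureFiles_ref(\%big), "\n";  # prints "3 3"
```

Both return the same answer; only the copying version pays the memory cost.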

Re^2: Perl script end up on saying "Out of Memory !"
by syedumairali (Initiate) on Sep 20, 2011 at 10:38 UTC
    Thanks moritz, for your guidance. Regarding your question: in fact, I do not want a copy of the hash inside the MeasureFiles subroutine. Can you help me with how to get only the reference, and not a copy, of the hash inside the routine? Thanks!
        Hi, I did that, but with no better results: I can see the available memory in Task Manager continuously decreasing. However, I will only get the complete results 8 hours from now, after the script reaches the 61st file :(.