I recently ran into a problem with the Archive::Tar module. I create an object, perform some extractions, and explicitly destroy the object at the end, but to my surprise the output of top (the Unix command) shows that the memory allocated during object creation stays with the Perl process; it is not released for as long as the script is running. Can you please explain what is happening in the background that causes this?

(The actual scenario is processing around 150 tar.gz files one by one in a foreach loop; each time an object is created and destroyed, but because of the memory issue described above the script eventually dies with "Too many open files at ....." or sometimes "Can't locate Carp/Heavy.pm in @INC". A simplified version of the loop is shown below.)
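This is roughly what I am doing (a simplified sketch; the file names and paths are placeholders, not my real ones):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Archive::Tar;

    # placeholder pattern; in reality there are ~150 tar.gz files
    my @archives = glob("*.tar.gz");

    foreach my $file (@archives) {
        # second argument of 1 tells Archive::Tar the file is compressed
        my $tar = Archive::Tar->new( $file, 1 )
            or die "Cannot read $file: " . Archive::Tar->error;

        $tar->extract();    # extract all members of the archive

        undef $tar;         # destroy the object explicitly before the next iteration
    }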
Is this a problem with Archive::Tar, or am I mistaken somewhere? Please help me out.