bgupta has asked for the wisdom of the Perl Monks concerning the following question:

Recently I ran into a problem with the Archive::Tar module. I create an object, perform some extractions, and explicitly destroy the object at the end, but to my surprise the output of top (the Unix command) shows that the memory allocated during object creation is still held by the perl process; it is not released for as long as the script is running. Can you please suggest what is going on in the background to cause this?

(The actual scenario is processing around 150 tar.gz files one by one in a foreach loop. Each time an object is created and destroyed, but because of the memory issue described above the script eventually dies with "Too many open files at ....." or sometimes "Can't locate Carp/Heavy.pm in @INC".)

Is this a problem with Archive::Tar, or am I mistaken somewhere? Please help me out.

Re: Archive::Tar Memory deallocation problem
by mje (Curate) on Jan 28, 2009 at 11:33 UTC

    Perhaps (likely) your Archive::Tar object is not going out of scope, so the tar files are held open and you run out of file handles. Show us the code.

      Here is the sample code:

          use strict;
          use warnings;
          use Archive::Tar;

          my $base_loc = "/base/location";
          my @results  = (
              'QFU010508.865714.tar.gz',  'AAL_AGG.738208.tar.gz',
              'QFU010508.870496.tar.gz',  'QFU010508.870512.tar.gz',
              'QFU010508.1017611.tar.gz', 'QFU010508.1018350.tar.gz',
              'QFU020508.784543.tar.gz',  'QFU020508.735377.tar.gz',
              'QFU020508.784632.tar.gz',  'QFU020508.784637.tar.gz',
              'QFU020508.784641.tar.gz',  'QFU020508.869793.tar.gz',
          );

          foreach my $tar_file (@results) {
              my $path_to_file = "$base_loc/$tar_file";
              my $extracts     = Archive::Tar->new($path_to_file);

              print "Going to sleep...\n";
              sleep(5);

              if ( !defined $extracts ) {
                  print "INFO: Could not find $tar_file or archive is not valid\n";
              }
              else {
                  print "INFO: Processed $tar_file file\n";
              }

              # Destroy the object explicitly
              undef $extracts;
              print "Object should get destroyed now, sleeping again.\n";
              sleep(5);
          }
      While the above code runs, we can see in top that memory consumption keeps increasing, even after the object is destroyed.
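      If you want to confirm that the object itself really is destroyed when you undef it, here is a minimal sketch (My::Tar is a hypothetical subclass added only to make destruction visible; Archive::Tar prints nothing of its own when garbage-collected):

          use strict;
          use warnings;
          use Archive::Tar;

          # Hypothetical subclass purely to observe destruction.
          package My::Tar;
          our @ISA = ('Archive::Tar');
          sub DESTROY { print "DESTROY called on $_[0]\n" }

          package main;
          my $extracts = My::Tar->new('/base/location/QFU010508.865714.tar.gz')
              or die "could not read archive";
          undef $extracts;   # prints the DESTROY message here, even though
                             # top still reports the memory as in use by perl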

        In future, post code in code tags - read Markup in the Monastery. Your $extracts variable goes out of scope on each iteration, so the object should be destroyed without you doing anything.
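        Note also that even after an object is destroyed, perl generally keeps the freed memory pooled for reuse rather than returning it to the operating system, so top will keep showing the high-water mark; that by itself is not a leak. What matters is peak usage, and Archive::Tar->new reads the entire archive into memory. A minimal sketch of a lighter-weight loop, assuming Archive::Tar 1.40 or later (which added the iter class method) and the same $base_loc and @results as in the code above:

            use strict;
            use warnings;
            use Archive::Tar;

            foreach my $tar_file (@results) {
                # iter() returns a closure that yields one Archive::Tar::File
                # object per call, so only a single entry is held in memory
                # at a time instead of the whole archive.
                my $next = Archive::Tar->iter( "$base_loc/$tar_file", 1 )   # 1 = compressed
                    or next;

                while ( my $entry = $next->() ) {
                    print "INFO: Processed ", $entry->name, " from $tar_file\n";
                }
                # $next falls out of scope here, releasing the file handle.
            }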