in reply to Archive::Tar Memory deallocation problem

Perhaps (likely) your Archive::Tar object is not going out of scope, so the tar files are held open and you eventually run out of file handles. Show us the code.
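For example, if the object lives in a variable that stays in scope for the whole run, wrapping the work in a block (or a loop body) makes Perl destroy it, and release its file handle, as soon as the block ends. A minimal sketch, with @tar_files standing in for your list of archives:

    use strict;
    use warnings;
    use Archive::Tar;

    for my $tar_file (@tar_files) {    # @tar_files: assumed list of paths
        my $tar = Archive::Tar->new($tar_file)
            or next;                   # new() returns undef on failure
        # ... work with $tar here ...
        # $tar is lexical to the loop body, so the object is destroyed
        # (and its file handle released) at the end of each iteration.
    }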


Re^2: Archive::Tar Memory deallocation problem
by bgupta (Novice) on Jan 28, 2009 at 11:56 UTC
    Here is the sample code:

        use strict;
        use warnings;
        use Archive::Tar;

        my $base_loc = "/base/location";
        my @results  = (
            'QFU010508.865714.tar.gz',  'AAL_AGG.738208.tar.gz',
            'QFU010508.870496.tar.gz',  'QFU010508.870512.tar.gz',
            'QFU010508.1017611.tar.gz', 'QFU010508.1018350.tar.gz',
            'QFU020508.784543.tar.gz',  'QFU020508.735377.tar.gz',
            'QFU020508.784632.tar.gz',  'QFU020508.784637.tar.gz',
            'QFU020508.784641.tar.gz',  'QFU020508.869793.tar.gz',
        );

        foreach my $tar_file (@results) {
            my $path_to_file = $base_loc . "/" . $tar_file;
            my $extracts     = Archive::Tar->new($path_to_file);
            print "Going to sleep...\n";
            sleep(5);
            if ( !defined $extracts ) {
                print "INFO: Could not find $tar_file, or archive is not valid\n";
            }
            else {
                print "INFO: Processed $tar_file file\n";
            }
            # Destroy the object
            undef $extracts;
            print "Object should get destroyed now, sleeping again.\n";
            sleep(5);
        }
    While running the above code, we can see that the memory consumption keeps increasing (as observed with top) even after the object is destroyed.
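    (One way to tell whether memory is actually leaking at the Perl level, rather than just being retained by perl's allocator, is to measure the object directly; a minimal sketch, assuming the CPAN module Devel::Size is installed, since it is not in core:

        use Archive::Tar;
        use Devel::Size qw(total_size);

        my $tar = Archive::Tar->new($path_to_file);   # $path_to_file as above
        print "perl-level size: ", total_size($tar), " bytes\n";
        undef $tar;
        # The process RSS that top reports usually does NOT shrink here:
        # perl keeps freed memory around for reuse instead of returning
        # it to the operating system.
    )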

      In the future, post code in code tags; read Markup in the Monastery. Your $extracts variable goes out of scope each iteration, so the object should be destroyed without you doing anything.
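      One way to confirm that the object really is destroyed each time is to hold a weak reference to it and check it afterwards; a minimal sketch using the core module Scalar::Util (the archive path is illustrative):

          use strict;
          use warnings;
          use Archive::Tar;
          use Scalar::Util qw(weaken);

          my $watch;
          {
              my $extracts = Archive::Tar->new('QFU010508.865714.tar.gz')
                  or die "cannot read archive: " . Archive::Tar->error;
              $watch = $extracts;
              weaken($watch);    # $watch does not keep the object alive
          }
          # If the object was freed when $extracts went out of scope,
          # the weak reference has been cleared.
          print defined $watch ? "object still alive\n" : "object destroyed\n";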

        Can you please help me understand why the memory is held by the Perl code, even though the object is destroyed, until the program ends? It keeps increasing as it processes more and more tar.gz files. Is there any way to avoid it? Thanks for all your help.
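        If the aim is to keep peak memory down, note that Archive::Tar->new loads the entire archive into memory at once. The module also offers an iterator that reads one entry at a time; a minimal sketch, assuming your Archive::Tar is recent enough to have iter() and that the second argument marks the archive as compressed:

            use strict;
            use warnings;
            use Archive::Tar;

            # iter() returns a code ref; each call yields the next
            # Archive::Tar::File object without slurping the whole archive.
            my $next = Archive::Tar->iter('QFU010508.865714.tar.gz', 1)
                or die "cannot open archive: " . Archive::Tar->error;

            while ( my $file = $next->() ) {
                printf "%s (%d bytes)\n", $file->full_path, $file->size;
            }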