in reply to Re: Archive::Tar Memory deallocation problem
in thread Archive::Tar Memory deallocation problem

Here is the sample code:

    use strict;
    use warnings;
    use Archive::Tar;    # note: this was missing from the original post

    my $base_loc = "/base/location";
    my @results  = (
        'QFU010508.865714.tar.gz',  'AAL_AGG.738208.tar.gz',
        'QFU010508.870496.tar.gz',  'QFU010508.870512.tar.gz',
        'QFU010508.1017611.tar.gz', 'QFU010508.1018350.tar.gz',
        'QFU020508.784543.tar.gz',  'QFU020508.735377.tar.gz',
        'QFU020508.784632.tar.gz',  'QFU020508.784637.tar.gz',
        'QFU020508.784641.tar.gz',  'QFU020508.869793.tar.gz',
    );

    foreach my $tar_file (@results) {
        my $path_to_file = $base_loc . "/" . $tar_file;
        my $extracts     = Archive::Tar->new($path_to_file);

        print "Going to sleep...\n";
        sleep(5);

        if ( !defined $extracts ) {
            print "INFO: Could not find $tar_file, or archive is not valid\n";
        }
        else {
            print "INFO: Processed $tar_file file\n";
        }

        # Destroy the object
        undef $extracts;
        print "Object should get destroyed now, sleeping again.\n";
        sleep(5);
    }
While running the above code, we can see (using top) that memory consumption keeps increasing even after the object is destroyed.

Replies are listed 'Best First'.
Re^3: Archive::Tar Memory deallocation problem
by mje (Curate) on Jan 28, 2009 at 12:06 UTC

    In future, post code in code tags - read Markup in the Monastery. Your $extracts variable goes out of scope each iteration so the object should be destroyed without you doing anything.
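
        To illustrate the scoping point, here is a minimal sketch. The Tracker class is invented purely for the demonstration (it is not part of Archive::Tar): a lexical object's DESTROY is called at the end of each loop iteration, with no explicit undef needed.

```perl
use strict;
use warnings;

# Hypothetical Tracker class, used only to show when DESTROY fires.
package Tracker;
our @log;    # records destruction order so it can be checked
sub new     { my ( $class, $name ) = @_; return bless { name => $name }, $class }
sub DESTROY { my $self = shift; push @log, $self->{name}; print "DESTROY $self->{name}\n" }

package main;
for my $name (qw(one two three)) {
    my $obj = Tracker->new($name);
    print "created $name\n";
}    # $obj goes out of scope here, so DESTROY runs every iteration
```

        Each "created" line is followed immediately by the matching "DESTROY" line, showing the object does not survive past its iteration.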

      Can you please help me understand why the memory is held by the Perl process until it exits? It keeps increasing as more and more tar files are processed. Is there any way to avoid this?

        I tried this myself with Archive::Tar 1.44 on an Ubuntu box and ran out of memory almost straight away:

            use strict;
            use warnings;
            use Archive::Tar;
            use Devel::Leak;

            my $handle;
            my @results = qw(
                u1.tar.gz u2.tar.gz u3.tar.gz u4.tar.gz u5.tar.gz
                u6.tar.gz u7.tar.gz u8.tar.gz u9.tar.gz u10.tar.gz
            );

            foreach my $tar_file (@results) {
                my $count = Devel::Leak::NoteSV($handle);
                print $count, "\n";

                my $x = Archive::Tar->new or die "Failed to get tar object";
                my $extracts = $x->read($tar_file);
                print "extracts $extracts\n";

                $x->clear;
                undef $extracts;
                print Devel::Leak::CheckSV($handle), "\n";
            }

        With 10 identical tar.gz files of 6.2MB each in the current directory, it fails with "Out of memory!" after only one:

            ~/tmp$ perl t.pl
            28447
            extracts 1
            new 0x85fdce0 :
            new 0x85fdcec :
            [... loads and loads of these ...]
            old (1): 0
            old (1): 0
            [... quite a lot of those ...]
            28647
            28649
            Out of memory!

        Tracking it through, it failed in

            Archive::Tar::_read_tar(/usr/local/share/perl/5.8.8/Archive/Tar.pm:318):
            318:    my $offset = eval { tell $handle } || 'unknown';
              DB<1> Out of memory!
      Can you please help me understand why the memory is held by the Perl process even though the object is destroyed? It keeps increasing as more and more tar.gz files are processed. Thanks for all your help.
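
      One way to keep memory down is to avoid loading the whole archive at once. This is a hedged sketch: recent versions of Archive::Tar provide an iter class method that visits one entry at a time instead of slurping the entire archive into memory. The file names and temp-dir setup here are invented so the sketch is self-contained.

```perl
use strict;
use warnings;
use Archive::Tar;
use File::Temp qw(tempdir);

# Build a small throwaway archive in a temp dir for the demonstration.
my $dir  = tempdir( CLEANUP => 1 );
my $path = "$dir/demo.tar";
my $tar  = Archive::Tar->new;
$tar->add_data( 'a.txt', 'aaa' );
$tar->add_data( 'b.txt', 'bbb' );
$tar->write($path);

# iter() returns a closure that yields one Archive::Tar::File object
# per call, so the whole archive is never held in memory at once.
my $next = Archive::Tar->iter($path);
while ( my $entry = $next->() ) {
    print $entry->name, "\n";
}
```

      With a per-entry iterator, peak memory is roughly the size of the largest single entry rather than the whole uncompressed archive, which matters for large tar.gz files like the ones above.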