Well, given 14,000 files, I'm guessing the tarball is too large to load into memory the way Archive::Tar does by default. Still, it's worth confirming whether the file can reasonably fit: load the archive with Archive::Tar, check how much memory that perl process is using, and see whether it's unreasonable. If the memory use is acceptable, check the Archive::Tar documentation for how to list the files and extract each one you want, the way you want.
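For the in-memory route, a minimal sketch might look like this (the archive name, filter, and rename are hypothetical; list_files and extract_file are the Archive::Tar methods doing the work):

use strict;
use warnings;
use Archive::Tar;

# Archive::Tar reads the whole archive into memory by default.
my $tar = Archive::Tar->new('archive.tar.gz')   # hypothetical filename
    or die "cannot read archive";

# List everything the archive contains.
my @names = $tar->list_files;

# Extract only the entries you want, renaming as you go.
for my $name (grep { m{^wanted/} } @names) {    # hypothetical filter
    (my $dest = $name) =~ s{^wanted/}{};        # hypothetical rename
    $tar->extract_file($name, $dest)
        or warn "could not extract $name: " . $tar->error;
}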
If it is too big for memory, your next best bet is to construct a tar command line that does what you want. The -C flag, which changes to a directory before extracting, may get you there, or at least close enough that a bit of renaming afterwards finishes the job.
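A sketch of that, shelling out to tar from Perl with hypothetical paths (it assumes a tar that accepts -z for gzipped archives):

use strict;
use warnings;

# -x extract, -z gunzip, -f archive; -C switches to the target
# directory before extracting. The trailing arguments name the
# specific entries to pull out of the archive.
my @cmd = ('tar', '-xzf', '/path/to/archive.tar.gz',  # hypothetical path
           '-C', '/tmp/extract',                      # hypothetical destination
           'wanted/file1', 'wanted/file2');           # hypothetical entries
system(@cmd) == 0 or die "tar exited with status $?";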
Thank you!
I actually tried using Archive::Tar and had trouble with the objects involved; I got an error saying that Archive::Tar could not be loaded.
The -C flag is new to me; I will try that.
Thanks
Anu
Check the Module Reviews section here at the Monastery -- I posted something there about Archive::Tar a while back. HTH
anu7
Here is a simple routine to extract a zip archive and return the extracted file paths:
use strict;
use warnings;
use Archive::Zip;
use Data::Dumper;

my $file   = shift @ARGV;            # zip archive to extract
my $tmpdir = '/tmp/unzipped';        # destination directory
my @files  = unzip_tree($file);
print Dumper \@files;

sub unzip_tree {
    my $zip_file = shift;
    my $zip = Archive::Zip->new($zip_file)
        or die "cannot read $zip_file";
    $zip->extractTree( "", "$tmpdir/" );

    # Collect the plain files that landed in $tmpdir.
    opendir my $dh, $tmpdir or die "cannot open $tmpdir: $!";
    my @files = grep { -f "$tmpdir/$_" } readdir $dh;
    closedir $dh;

    return map { "$tmpdir/$_" } @files;
}
Hope this helps
Um... Does Archive::Zip work on tar files? (I haven't tried it, but I would not expect it to work.)