in reply to Archiving Files
300_000 files make for far too long an argument list. If all 300_000 files live within one folder, consider tarring the folder, not the files:
chdir $folder or die "Can't chdir to '$folder': $!\n";
my $ret = system("tar -cvzf $tar_file .")    # tar and zip on the fly
    and die "Couldn't create tarfile '$tar_file': $!\n";    # 'and', not 'or'
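Since system returns the raw wait status (0 on success), the 'and die' idiom above works. If you want tar's own exit code in the error message, a small sketch along those lines (reusing $tar_file from above) could be:

my $status = system("tar -cvzf $tar_file .");
if ($status != 0) {
    my $exit = $status >> 8;    # child's exit code lives in the high byte
    die "tar failed with exit code $exit\n";
}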
If the files live in different folders (e.g. the list was constructed with File::Find or similar), you could split the list into e.g. 1024-item chunks, create the tar file with the first chunk, then update the tar file with tar's -u flag:
my @ary = splice(@srcfiles, 0, 1024);
my $ret = system("tar -cvf $tar_file @ary")    # tar only, no 'z': a compressed archive can't be updated
    and die "Couldn't create tarfile '$tar_file': $!\n";
while (@ary = splice(@srcfiles, 0, 1024)) {
    $ret = system("tar -uvf $tar_file @ary")
        and die "Couldn't update '$tar_file': $!\n";
}
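If any of the file names contain spaces or shell metacharacters, the interpolated command string above will break. A sketch using the list form of system, which bypasses the shell (same @srcfiles and $tar_file as above, error checks simplified):

my @ary = splice(@srcfiles, 0, 1024);
system("tar", "-cvf", $tar_file, @ary) == 0
    or die "Couldn't create tarfile '$tar_file'\n";
while (@ary = splice(@srcfiles, 0, 1024)) {
    system("tar", "-uvf", $tar_file, @ary) == 0
        or die "Couldn't update '$tar_file'\n";
}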
Note also that on some systems performance drops drastically for big directories (over 100_000 files) when reading file attributes with the stat(2) or lstat(2) system calls (which tar must do to store them), so you may be better off breaking your big folder into smaller ones, e.g. as in the sketch below.
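A minimal sketch of such a split (assuming the same $folder as above; the bucket size of 10_000 and the part_NNN naming are arbitrary choices, not anything tar requires):

use File::Copy qw(move);

opendir my $dh, $folder or die "Can't opendir '$folder': $!\n";
my @files = grep { -f "$folder/$_" } readdir $dh;
closedir $dh;

my $bucket = 0;
while (my @chunk = splice(@files, 0, 10_000)) {    # 10_000 files per subfolder
    my $sub = sprintf "%s/part_%03d", $folder, $bucket++;
    mkdir $sub or die "Can't mkdir '$sub': $!\n";
    for my $f (@chunk) {
        move("$folder/$f", "$sub/$f") or die "Can't move '$f': $!\n";
    }
}

Afterwards you can tar each part_NNN subfolder on its own, or tar the parent folder as before.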
--shmem