Hi Monks, below is my code for finding files older than 90 days and deleting them. If a file's age is more than 3 days but less than 90 days, it gzips the file instead.

The problem is that I have around 120 directories, each containing around 90,000 files. Perl's unlink (which I use in the script) is much faster than the Unix rm command, but the Unix gzip command is the bottleneck: system "gzip $full_name" is taking too much time.

Is there a better way to do the gzip? Note: my script has been running for the last 2 hours and has not finished even one directory yet.
#!/usr/bin/perl -w
use strict;

my @BASE_DIR = ("/u0001/DP/tandem_all/NewArchiveInsteadOfAtCABS/ARCHIVED");
#my @BASE_DIR = ("/export/home/fwtwc/dushyant/myscript/test/test1");

chomp (my $today = `date '+%Y%m%d'`);
my $out_file = "output" . $today;
open (FILE, ">>$out_file") or die "$!\n";

my $now = time();

sub go_dir {
    my $dh;
    opendir ($dh, $_[0]) or die "$!\n";
    print FILE "Starting with directory $_[0]\n";
    foreach (readdir $dh) {
        next if (/^\./);                         # skip dotfiles, '.' and '..'
        my $full_name = $_[0] . "/" . $_;
        if ( -f $full_name ) {
            my $file_time = (stat($full_name))[9];    # mtime
            my $diff = ($now - $file_time) / 86400;   # age in days
            my $read = localtime($file_time);
            if ( $diff > 93 ) {
                print FILE "$full_name : $diff : $read\n";
                unlink $full_name;
            }
            elsif ( $diff > 3 ) {
                next if (/\.gz$/);               # already compressed
                print FILE "Gzipping file $full_name\n";
                system "gzip $full_name";        # one process per file -- the bottleneck
            }
        }
        elsif ( -d $full_name ) {
            go_dir($full_name);
        }
    }
    closedir $dh;
}

foreach (@BASE_DIR) {
    go_dir($_);
}
close FILE;
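A large part of the cost here is one fork/exec per file: with ~90,000 files per directory the script spawns up to 90,000 gzip processes per directory. One way to cut that overhead is to collect the candidate filenames and hand gzip many files per invocation, since gzip accepts multiple arguments. Below is a minimal sketch of that idea, not a drop-in patch: the batch size of 200 is an arbitrary assumption, the filenames in the usage example are hypothetical, and very large batches may need to be kept under the system's argument-length limit.

#!/usr/bin/perl -w
use strict;

# Sketch: queue candidate files first, then invoke gzip once per
# batch instead of once per file.
my @to_gzip;

sub queue_gzip {
    my ($file) = @_;
    push @to_gzip, $file;    # called where system "gzip $full_name" was
}

sub flush_gzip {
    while (my @batch = splice(@to_gzip, 0, 200)) {   # 200 = arbitrary batch size
        # List form of system() bypasses the shell, so filenames with
        # spaces or metacharacters are passed through safely.
        system('gzip', @batch) == 0
            or warn "gzip returned non-zero for a batch: $?\n";
    }
}

# Usage example with hypothetical filenames:
queue_gzip($_) for ('/tmp/a.log', '/tmp/b.log');
flush_gzip();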
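Another option is to do the compression in-process with the core IO::Compress::Gzip module, which avoids spawning external processes entirely. Again a sketch under assumptions, not a verified replacement: $full_name here is a hypothetical standalone path, and unlike the gzip binary this module does not remove the original file or carry over its timestamp and permissions, so that is handled explicitly.

#!/usr/bin/perl -w
use strict;
use IO::Compress::Gzip qw(gzip $GzipError);

my $full_name = '/tmp/example.log';   # hypothetical path for illustration

# Compress in-process, then delete the original ourselves, since
# IO::Compress::Gzip leaves the source file in place.
if (gzip($full_name => "$full_name.gz")) {
    unlink $full_name
        or warn "could not remove $full_name after gzip: $!\n";
}
else {
    warn "gzip of $full_name failed: $GzipError\n";
}

One caveat: the freshly written .gz file gets a new mtime, so the 93-day deletion check would restart its clock for compressed files; if that matters, the original file's mtime can be copied onto the .gz with utime before the original is unlinked.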