Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

I need a perl script that will grep through a base directory and report on stale files that are older than X days, or hogs that are larger than Y bytes. I'm sure this has been written before - does anyone know of a tarball, module, or URL that can help me with this? It's probably 5 lines of code for a perl monk, with another 4 to email the results. TIA, mcoughlan@gothambroadband.com

Re: Are there canned fileserver grooming scripts?
by kilinrax (Deacon) on Feb 01, 2001 at 21:36 UTC
    You want to use 'File::Find'.
    Here are a couple of examples:
    perl -M'File::Find' -e '$\ = qq(\n); find sub { (-s > 1048576) && print $File::Find::name }, q(.);'
    (lists files over a megabyte in size)
    perl -M'File::Find' -e '$\ = qq(\n); find sub { (-M $_ > 60) && print $File::Find::name . -M $_ }, q(.);'
    (lists files last modified more than 60 days ago)
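    Combining the two tests into one pass takes only a few more lines; a rough sketch (the 60-day and 1 MB thresholds are just the numbers from the one-liners above, adjust to taste):
    #!/usr/bin/perl -w
    use strict;
    use File::Find;

    my $max_age  = 60;          # days since last modification
    my $max_size = 1_048_576;   # bytes (1 MB)

    find( sub {
        return unless -f $_;    # plain files only
        if    ( -M $_ > $max_age  ) { print "stale: $File::Find::name\n" }
        elsif ( -s $_ > $max_size ) { print "hog:   $File::Find::name\n" }
    }, '.' );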
Re: Are there canned fileserver grooming scripts?
by arturo (Vicar) on Feb 01, 2001 at 21:30 UTC

    Any answer you're going to get is going to rely heavily on the filetest operators Perl provides, along with unlink, rename, and the like; File::Find (or perhaps just readdir) will probably also be in there somewhere.
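    For example, a minimal readdir loop using a couple of the filetest operators (the directory here is just a placeholder):
    my $dir = '/some/dir';      # placeholder path
    opendir( DIR, $dir ) or die "can't open $dir: $!";
    for my $file ( readdir DIR ) {
        my $path = "$dir/$file";
        next unless -f $path;   # plain files only
        printf "%s: %d bytes, modified %.0f days ago\n",
               $path, -s $path, -M $path;
    }
    closedir DIR;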

    Partly because Perl makes such things so easy to write, they tend not to get canned (writing one is a good learning exercise, for one thing).

    I know this isn't what you're looking for, but I hope it helps anyway!

    Philosophy can be made out of anything. Or less -- Jerry A. Fodor

Re: Are there canned fileserver grooming scripts?
by mikfire (Deacon) on Feb 01, 2001 at 22:40 UTC
    At the risk of heresy, why would you need perl for this? Command lines like this:
    find / -mtime +7 -o -size +5000000c | mail -s 'Old and big files' mikfire
    have been around and used for a long time.

    If you wanted to get fancy, you could even try:

    find / -mtime +7 -o -size +5000000c -ls | mail -s 'ls of Old and big files' mikfire
    which will basically dump a stat(3) on each file.
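    If you do want it in Perl after all, the mailing step can work the same way, by piping to the system mail command; a sketch with the same thresholds as above (the address is a placeholder):
    #!/usr/bin/perl -w
    use strict;
    use File::Find;

    my @offenders;
    find( sub {
        push @offenders, $File::Find::name
            if -f $_ and ( -M $_ > 7 or -s $_ > 5_000_000 );
    }, '/' );

    if ( @offenders ) {
        open( MAIL, "| mail -s 'Old and big files' somebody\@example.com" )
            or die "can't run mail: $!";
        print MAIL "$_\n" for @offenders;
        close( MAIL );
    }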

    Seeking the proper tool for the job at hand,
    mikfire

Re: Are there canned fileserver grooming scripts?
by Gloom (Monk) on Feb 01, 2001 at 21:55 UTC
    This is a skeleton for implementing many filters on files in a directory.
    Add your options and tests...
    #!/usr/bin/perl -w
    use strict;

    my( $max_size , $min_size );

    # parse args
    for (@ARGV) {
        if( /^--max_size=(.*)/ ) { $max_size = int( $1 ) }
        if( /^--min_size=(.*)/ ) { $min_size = int( $1 ) }
    }

    # end prints with \n
    $\ = "\n";

    # open current directory
    opendir( DIR , "." ) or die "can't open '.': $!";

    # walk the directory entries one by one
    while( defined( $_ = readdir( DIR ) ) ) {
        # drop "." and ".."
        next if /^\.\.?$/;
        # skip entries larger than the max size
        next if( $max_size and -s $_ > $max_size );
        # skip entries smaller than the min size
        next if( $min_size and -s $_ < $min_size );
        print;
    }

    # close...
    closedir( DIR );
    Hope this helps :)
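    A quick usage note: saved as, say, groom.pl (name made up), the skeleton above could be run as
    perl groom.pl --min_size=1048576
    to list only entries in the current directory of at least a megabyte; a --min_age option for the stale-file case could be added in exactly the same style, testing -M $_ instead of -s $_.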