in reply to Logging deleted files

use strict;
use warnings;
use File::Find::Rule qw( );
use POSIX qw( strftime );

{
    my $ts = strftime('[%Y/%m/%d %H:%M:%S]', localtime());

    # Select files whose mtime is more than 7 days in the past.
    my @files = File::Find::Rule->file()
                                ->name('some_file_x')
                                ->mtime('<' . (time() - 7*24*60*60))
                                ->in('in_some_dir');

    foreach my $file (@files) {
        if (unlink($file)) {
            print("$ts Deleted $file\n");
        } else {
            print("$ts Failed to delete $file: $!\n");
        }
    }
}

You could limit the number of changes by having the find tool simply print a list of filenames to delete, then piping the result to Perl to do the unlink and the logging.
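A minimal sketch of that pipeline as a single command (the directory and file name here are illustrative stand-ins created just for the demo, not the OP's actual command):

```shell
# Create a throwaway file so the demo is self-contained.
dir=$(mktemp -d)
touch "$dir/some_file_x"

# find prints the candidates; the Perl one-liner unlinks and logs each.
# -l chomps input lines and re-adds "\n" on print.
find "$dir" -type f -name 'some_file_x' -print |
perl -MPOSIX=strftime -nle '
    my $ts = strftime("[%Y/%m/%d %H:%M:%S]", localtime());
    if (unlink $_) { print "$ts Deleted $_" }
    else           { print "$ts Failed to delete $_: $!" }
'
```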

Replies are listed 'Best First'.
Re^2: Logging deleted files
by ikegami (Patriarch) on Oct 17, 2007 at 21:49 UTC

    Here's the code for the suggestion I made at the bottom of my last post.

    old_files:

    #!/bin/sh
    find ... -print
    find ... -print
    find ... -print
    find ... -print

    unlink_and_log:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use POSIX qw( strftime );

    my $ts = strftime('[%Y/%m/%d %H:%M:%S]', localtime());

    while (my $file = <>) {
        chomp($file);
        if (unlink($file)) {
            print("$ts Deleted $file\n");
        } else {
            print("$ts Failed to delete $file: $!\n");
        }
    }

    Usage:

    old_files | unlink_and_log > unlink.log
Re^2: Logging deleted files
by Bloodnok (Vicar) on Oct 18, 2007 at 13:22 UTC
    Hi,

    Why not use the find2perl script on your original command ... modifying the resulting Perl script to utilise the log file(s)?
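    find2perl translates a find(1) command line into a File::Find-based Perl script built around a wanted() sub, so the edit suggested here amounts to adding the logging inside that sub. A rough sketch of what the modified wanted() could look like (the file name and directory are illustrative, and the -mtime age test is omitted so the demo runs against a freshly created file):

```shell
# Set up a throwaway directory and file for the demo.
dir=$(mktemp -d)
touch "$dir/some_file_x"

# Hand-written equivalent of a find2perl-generated script, with logging
# added inside wanted(). Inside wanted(), $_ is the bare file name (the
# working directory has been changed) and $File::Find::name is the path.
perl -MFile::Find -MPOSIX=strftime -e '
    my $dir = shift;
    sub wanted {
        return unless -f $_ && /^some_file_x\z/;
        my $ts = strftime("[%Y/%m/%d %H:%M:%S]", localtime());
        if (unlink $_) { print "$ts Deleted $File::Find::name\n" }
        else           { print "$ts Failed to delete $File::Find::name: $!\n" }
    }
    find(\&wanted, $dir);
' "$dir"
```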

    At last, a user level that best describes my experience :-))