ppantazis has asked for the wisdom of the Perl Monks concerning the following question:

Monks, I have a perl script that is basically a collection of

find in_some_dir -name some_file_x -mtime +y -exec /bin/rm -f {} \;

statements like the one above. They work great, no problem. There are probably 50 or 60 of them. What I would like to do is log the files that each command deletes. So if the command above matched 10 files, I would like to dump the results into a log file in the form:

File foo_1 deleted timestamp
File foo_2 deleted timestamp
etc.

What Perl constructs do I need to be looking at? How can this be done using Perl?

Your help is appreciated.

Replies are listed 'Best First'.
Re: Logging deleted files
by ikegami (Patriarch) on Oct 17, 2007 at 21:32 UTC

    use strict;
    use warnings;
    use File::Find::Rule qw( );
    use POSIX qw( strftime );

    {
        my $ts = strftime('[%Y/%m/%d %H:%M:%S]', localtime());

        # Files modified more than 7 days ago (find's -mtime +7):
        # their mtime is earlier than now minus 7 days' worth of seconds.
        my @files = File::Find::Rule->file()
                                    ->name('some_file_x')
                                    ->mtime('<' . (time() - 7*24*60*60))
                                    ->in('in_some_dir');

        foreach my $file (@files) {
            if (unlink($file)) {
                print("$ts Deleted $file\n");
            }
            else {
                print("$ts Failed to delete $file\n");
            }
        }
    }

    You could limit the number of changes by having the find tool simply print out a list of filenames to delete, then pipe the result to Perl to do the unlink and print.

      Here's the code for the suggestion I made at the bottom of my last post.

      old_files:

      #!/bin/sh
      find ... -print
      find ... -print
      find ... -print
      find ... -print

      unlink_and_log:

      #!/usr/bin/perl
      use strict;
      use warnings;
      use POSIX qw( strftime );

      my $ts = strftime('[%Y/%m/%d %H:%M:%S]', localtime());

      while (my $file = <>) {
          chomp($file);
          if (unlink($file)) {
              print("$ts Deleted $file\n");
          }
          else {
              print("$ts Failed to delete $file\n");
          }
      }

      Usage:

      old_files | unlink_and_log > unlink.log

      Hi,

      Why not use the find2perl script on your original command ... modifying the resulting Perl script to utilise log file(s)?
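
      A sketch of how that might look, assuming find2perl is available on your system (purge.pl is just a placeholder name):

      # Translate one of the find commands into a standalone Perl script
      find2perl in_some_dir -name 'some_file_x' -mtime +7 \
          -exec /bin/rm -f {} \; > purge.pl

      # Then edit the wanted() sub in the generated purge.pl to print a
      # log line (filename plus localtime) before each file is removed.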

      At last, a user level that best describes my experience :-))
Re: Logging deleted files
by gamache (Friar) on Oct 17, 2007 at 21:32 UTC
    What I would do is remove the -exec clause from the find command, capture the command's output into Perl, and delete/log the files from inside Perl:
    chomp(my @files = `find somedir -name somename -mtime 22`);
    my %del_log = ();
    for my $f (@files) {
        unlink $f;
        $del_log{$f} = localtime;
    }
    ...or something to that effect.
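
    If you then want %del_log written out in the format from the original post, a minimal follow-up sketch (the log path is just an example):

    open my $log, '>>', '/tmp/deleted.log' or die "Can't open log: $!";
    print $log "File $_ deleted $del_log{$_}\n" for sort keys %del_log;
    close $log;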

    Hope this helps.

    -pete

Re: Logging deleted files
by MidLifeXis (Monsignor) on Oct 17, 2007 at 21:33 UTC

    find ... | perl -ne 'print join(" ", time, $_)' | tee [-a] logfile | awk '{print $NF}' | xargs /bin/rm -f

    or

    find ... | perl -ne 'chomp; unlink && print join(" ", time, $_), "\n"' | tee [-a] logfile

    --MidLifeXis

      When I try this, for some reason time comes out as a very long string of digits, some huge number of seconds or something. Does Perl have a problem with time?

        See the perldoc for time
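
        time returns seconds since the epoch, which is the long string of digits you are seeing. A sketch of the same one-liner logging a human-readable date instead (untested against your exact pipeline):

        find ... | perl -ne 'chomp; unlink && print scalar localtime, " $_\n"' | tee -a logfile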

        --MidLifeXis

Re: Logging deleted files
by bruceb3 (Pilgrim) on Oct 18, 2007 at 00:51 UTC
    While this can certainly be done in Perl, and a number of solutions have already been posted, the desired result can also be achieved with a slight change to the command.

    This is the original command:

    find in_some_dir -name some_file_x -mtime +y -exec /bin/rm -f {} \;

    If you change it to:

    find in_some_dir -name some_file_x -mtime +y | tee /tmp/files-deleted | xargs /bin/rm -rf

    A list of the files deleted will be in /tmp/files-deleted.

      Although I would not add the -r flag to the rm command. That might do more than you intend.

      --MidLifeXis

      Where is the timestamp added?

      Aside from that unsatisfied requirement, your solution is the same as MidLifeXis's.

        Wow, Monks, that is great. I am going to try and implement some of the suggestions here and see where that goes. Perl rocks!