neversaint has asked for the wisdom of the Perl Monks concerning the following question:

Dear Masters,

I have an interactive program that creates files.
These files are never modified, but they will be accessed/read.

Is there an effective strategy in Perl to (1) perpetually monitor a given directory, and (2) automatically delete these files 10 days after they are last accessed?
Or perhaps is there a better non-Perl solution?

Update: I'm under Linux/Unix system.

---
neversaint and everlastingly indebted.......

Re: Automatically Deleting Files Periodically
by bpphillips (Friar) on Jun 06, 2006 at 15:32 UTC
    As already stated, this is a perfect task for cron and find if they are available on your system.

    The appropriate (but untested) find command will probably be:
    find /path/to/dir -type f -atime +10 -exec rm {} \;
    If you don't have find available, you could accomplish the same thing with find2perl and File::Find (as blue_cowdawg mentions). There are a couple of other modules that have nicer (in my opinion) interfaces to File::Find: File::Finder or File::Find::Rule. For instance, using File::Find::Rule:
    use File::Find::Rule;

    File::Find::Rule->file()                             # only files
        ->atime( '<' . ( time - 10 * 24 * 60 * 60 ) )    # not accessed in the last 10 days
        ->exec( sub { unlink( $_[2] ) } )                # delete them ($_[2] is the full path)
        ->in('/path/to/dir');                            # starting in this path
    Regardless of how you implement the find functionality, you'll need some other scheduling utility (like cron) to handle the "perpetual" part of your question.
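    For the scheduling part, a daily crontab entry along these lines would do (the 4 a.m. run time and the script path are placeholders for whatever script you end up with):

    0 4 * * * /usr/bin/perl /path/to/cleanup.pl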

    -- Brian
Re: Automatically Deleting Files Periodically
by blue_cowdawg (Monsignor) on Jun 06, 2006 at 15:24 UTC
        automatically delete these files 10 days after they are last accessed.

    To answer the second half of your question first, you could use a very nifty tool like so:

    find2perl ${dir_I_am_watching} -atime +10 \
        -exec rm {} \; > myNewScript.pl
    which will result in a Perl script that acts just like the Unix command "find" with the same arguments supplied to it. The code it generates is, in my opinion, very "raw" and will probably need some massaging, but it is a good place to start, especially if you are not familiar with the File::Find module and all its features. It is a good module to be familiar with in any case, as are its many friends.
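    If you would rather write it by hand than massage the find2perl output, the generated script boils down to something like this minimal File::Find sketch (the directory and the 10-day threshold are placeholders, and real find2perl output is considerably more verbose):

    use strict;
    use warnings;
    use File::Find;

    my $dir = '/path/to/dir';    # directory to sweep (placeholder)

    find( sub {
        return unless -f $_;     # plain files only
        if ( -A _ > 10 ) {       # last accessed more than 10 days ago
            unlink $_
                or warn "could not delete $File::Find::name: $!";
        }
    }, $dir );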

    As to the first half of your question, I'm not sure what you are asking. If what you are asking is how to look in a particular directory and act upon any file in it, then the strategy I've mapped out above should work just fine. Just replace the shell variable I've cited (${dir_I_am_watching}) with the actual name of the directory and it will be hardcoded into your generated script.

    Of course, one of the immediate tweaks I'd make to the generated script is to eliminate the hardcoding: add Getopt::Long and provide a command line switch that sets which directory it examines, plus another that sets the age of the files to eliminate. Part of that logic would set sensible defaults, but having the switches there to override those variables increases the usability of the resultant script a bit.
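    A minimal sketch of that Getopt::Long layer, with switch names and defaults that are just my assumptions, might be:

    use strict;
    use warnings;
    use Getopt::Long;

    # defaults, overridable from the command line
    my $dir  = '/path/to/dir';
    my $days = 10;

    GetOptions(
        'dir=s'  => \$dir,      # --dir /some/other/dir
        'days=i' => \$days,     # --days 30
    ) or die "usage: $0 [--dir DIR] [--days N]\n";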

    Lastly, you failed to say what platform you are doing this all on. This set of strategies will work fine on *nix platforms, but on others YMMV. I'm not 100% sure it will work on *doze or Mac, and I'm too lazy to go find out.

    HTH


    Peter L. Berghold -- Unix Professional
    Peter -at- Berghold -dot- Net; AOL IM redcowdawg Yahoo IM: blue_cowdawg
Re: Automatically Deleting Files Periodically
by grinder (Bishop) on Jun 06, 2006 at 16:18 UTC

    Here is some code, more or less lifted straight from the horse's mouth:

    use strict;
    use warnings;
    use File::Find::Rule;

    my ( $body, $files ) = ( '', 0 );

    {
        my $now;
        sub days_ago {
            # N days ago, expressed in epoch seconds
            $now ||= time;
            return $now - ( 86400 * shift );
        }
    }

    for my $file (
        File::Find::Rule->new
            ->mtime( '<' . days_ago(2) )
            ->name( qr/^CGItemp\d+$/ )
            ->file()
            ->maxdepth(1)
            ->in('tmp'),
        File::Find::Rule->new
            ->mtime( '<' . days_ago(14) )
            ->file()
            ->maxdepth(1)
            ->in('/tmp/ep'),
        File::Find::Rule->new
            ->mtime( '<' . days_ago(30) )
            ->file()
            ->in(qw(
                /var/db/ep/epdata/reports
                /var/db/ep/cmd/sync-log
                /var/db/ep/cmd/analyse-oracle/log
            )),
    ) {
        $body = "Old temporary files purged:\n" unless length($body);
        $body .= "$file " . ( unlink($file) ? 'ok' : "not ok: $!" ) . "\n";
        ++$files;
    }

    At the end of this, if anything was processed ($files > 0), the report is sent off as an e-mail message to the interested parties (me). This is run daily at around 5:00 in the morning.
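    The mail step isn't shown above; one way it could be done, assuming a local sendmail binary and with a made-up recipient, is:

    if ($files) {
        open my $mail, '|-', '/usr/sbin/sendmail -t'
            or die "cannot run sendmail: $!";
        print {$mail} "To: me\@example.com\n";
        print {$mail} "Subject: $files old temporary files purged\n\n";
        print {$mail} $body;
        close $mail or warn "sendmail exited with status $?";
    }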

    • another intruder with the mooring in the heart of the Perl

Re: Automatically Deleting Files Periodically
by leocharre (Priest) on Jun 06, 2006 at 19:43 UTC

    What you want is: if the file has not been accessed for 10+ days, delete it. (-atime is the last access time, -mtime the last modification time, -ctime the last inode change time, not creation time.)

    Make sure, however, that whatever your contraption uses to read the files actually updates the atime; it most likely does. Whenever a file is accessed, *nix updates its atime.
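    If you want to look at those timestamps from Perl, the relevant stat fields are 8 (atime), 9 (mtime) and 10 (ctime), all in epoch seconds (the file name below is a placeholder):

    my ($atime, $mtime, $ctime) = ( stat('/path/to/dir/somefile') )[8, 9, 10];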

    Login to your shell

    Open your crontab:
    # crontab -e

    If you want it to run every hour:
    59 * * * * /usr/bin/find /path/to/dir -type f -atime +10 -exec rm {} \;

    That's just an example; for the details, look up # man crontab

Re: Automatically Deleting Files Periodically
by ambrus (Abbot) on Jun 06, 2006 at 16:07 UTC
Re: Automatically Deleting Files Periodically
by ForgotPasswordAgain (Vicar) on Jun 06, 2006 at 17:19 UTC
    On debian systems at least, there's a program called 'tmpreaper':
    This package provides a program that can be used to clean out temporary-file directories. It recursively searches the directory, refusing to chdir() across symlinks, and removes files that haven't been accessed in a user-specified amount of time. You can specify a set of files to protect from deletion with a shell pattern. It will not remove files owned by the process EUID that have the `w' bit clear, unless you ask it to, much like `rm -f'. `tmpreaper' will not remove symlinks, sockets, fifos, or special files unless given a command line option enabling it to.
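    From memory (so check the man page before relying on it), a typical invocation looks something like this, with the time spec and path as placeholders:

    tmpreaper 10d /path/to/dir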
Re: Automatically Deleting Files Periodically
by eff_i_g (Curate) on Jun 06, 2006 at 15:08 UTC
    What do you mean by monitor?
    Are you on Unix? If so, you could use 'find' within a cron script to remove the file after 10 days.
Re: Automatically Deleting Files Periodically
by davidrw (Prior) on Jun 06, 2006 at 15:47 UTC
    Using Cache::FileCache instead of directly creating/reading the files might work for you too, and you can take advantage of its innate expiration functionality.
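    A rough sketch of that approach follows; the namespace, key, and 10-day expiry are only illustrative, and note that Cache::Cache expiry is counted from when an entry is set, not from when it was last read:

    use Cache::FileCache;

    my $cache = Cache::FileCache->new({
        namespace          => 'mydata',
        default_expires_in => '10 days',
    });

    my $contents = "whatever you would otherwise have written to a file";
    $cache->set( 'report_42', $contents );    # instead of creating a file
    my $data = $cache->get('report_42');      # undef once it has expired
    $cache->purge();                          # physically remove expired entries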
Re: Automatically Deleting Files Periodically
by ahmad (Hermit) on Jun 06, 2006 at 17:01 UTC

    Well, the best way, I think, is to use the stat function:

    # this will return the last access time in epoch seconds, which you can then compare
    my $last_access = (stat($filename))[8];
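    To complete the thought, the comparison against a 10-day threshold could then be (the unlink is my assumption about what you would do with a stale file):

    if ( time() - $last_access > 10 * 24 * 60 * 60 ) {
        unlink $filename or warn "could not delete $filename: $!";
    }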

    HTH