Re: Deleting old files from directory
by chrestomanci (Priest) on Sep 27, 2011 at 20:03 UTC
If you are using unix/linux, then an alternative way to find and delete old files is with find and xargs:
find <directory> -type f -ctime +10 -name "*some*pattern*.txt" | xargs rm -f
I have fragments like that in the crontab for the systems I maintain, as it is very easy to setup and then just leave to run for ever.
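A runnable sketch of that pattern, with hypothetical paths and file names (GNU find and touch assumed; -mtime is used here rather than -ctime only because touch can backdate mtime but not ctime):

```shell
# Hypothetical demo directory -- one stale file, one fresh file
mkdir -p /tmp/prune_demo
touch -d '20 days ago' /tmp/prune_demo/old_report.txt
touch /tmp/prune_demo/new_report.txt

# Dry run first: list what would be removed
find /tmp/prune_demo -type f -mtime +10 -name "*report*.txt" | xargs ls -l

# Then delete for real
find /tmp/prune_demo -type f -mtime +10 -name "*report*.txt" | xargs rm -f
```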
To handle spaces in filenames:
find <directory> -type f -mtime +10 -name "*some*pattern*.txt" -exec rm -f '{}' \;
On the first run(s), you should consider using ls -l instead of rm -f, so you can verify which files match before deleting anything.
I see you've used -ctime; here are the differences among Unix timestamps:
- Access Time (atime) - time that the file was last accessed (e.g. read) or written to.
- Modify Time (mtime) - time the actual contents of the file were last modified.
- Change Time (ctime) - the time that the inode information (permissions, name, etc., i.e. the metadata) was last modified.
Update: atime info corrected; for more see man 2 stat.
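A quick way to see the three timestamps side by side, using a hypothetical file and GNU stat (%X atime, %Y mtime, %Z ctime, as epoch seconds); a metadata-only change such as chmod moves ctime but leaves mtime alone:

```shell
# Hypothetical demo file; GNU stat assumed
touch /tmp/ts_demo.txt
stat -c 'atime=%X mtime=%Y ctime=%Z' /tmp/ts_demo.txt

# chmod touches only the inode metadata, so ctime advances, mtime does not
mtime_before=$(stat -c %Y /tmp/ts_demo.txt)
chmod 600 /tmp/ts_demo.txt
mtime_after=$(stat -c %Y /tmp/ts_demo.txt)
echo "mtime unchanged: $mtime_before -> $mtime_after"
```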
I use an explicit pipe to xargs instead of the built-in -delete action, because that way it is easier to check that I am deleting the correct files: I can put ls on the end of the xargs command instead of rm.
Another way to handle spaces in filenames is to have find separate its output with NULs instead of newlines, and then have xargs expect those NULs.
find <directory> <conditions> -print0 | xargs -0 ls -l
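A small demonstration with a hypothetical directory: a filename containing spaces survives the NUL-separated pipeline intact, where a plain pipe would split it at each space.

```shell
# Hypothetical demo file whose name contains spaces
mkdir -p /tmp/null_demo
touch '/tmp/null_demo/file with spaces.txt'

# -print0 emits NUL-terminated names; xargs -0 reads them back whole
find /tmp/null_demo -type f -name '*.txt' -print0 | xargs -0 ls -l
```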
Or better, use the -delete action from find (note that GNU find's -delete implies -depth).
Why -ctime and not -mtime?
On the other hand, maybe using File::Find could be more efficient? I don't know.
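A sketch of the -delete variant, with hypothetical paths (GNU find and touch assumed; -mtime is used because touch can only backdate mtime, not ctime):

```shell
# Hypothetical demo directory -- one stale file, one fresh file
mkdir -p /tmp/delete_demo
touch -d '20 days ago' /tmp/delete_demo/stale.txt
touch /tmp/delete_demo/fresh.txt

# -delete implies -depth; matching old files are removed in place
find /tmp/delete_demo -maxdepth 1 -type f -mtime +10 -name '*.txt' -delete
```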
Regards,
Notice also that systems like OS X may require a slightly different syntax, as may Unix shells other than bash.
Re: Deleting old files from directory
by Lotus1 (Vicar) on Sep 27, 2011 at 20:41 UTC
Here is how to do it without the array and without using grep.
I prefer File::Glob these days, since it allows whitespace in the path, but it won't work like this in a while loop, so I end up with a foreach loop and an array.
#!/usr/bin/perl
use warnings;
use strict;

while (<test/*.txt>) {
    if (-f && 10 < -M _) {
        print "\n Deleting::: $_\n";
        unlink or warn "--Could not unlink $_: $!";
    }
}
Re: Deleting old files from directory
by TomDLux (Vicar) on Sep 27, 2011 at 19:39 UTC
while (<>) {
    chomp;
    next unless -f $_;    # ignore specials
    #...
}
From which I gather the file tests don't take a default $_. But once you've tested -f $_, you can reuse the data structure from that query for the -M test by specifying a single underscore: -M _.
Personally, I would do it as in the example, with a while loop, rather than loading all the filenames at once ... potentially a large number. I would also use a named variable rather than $_, because when you nest loops or call routines, the value of $_ might be altered.
As Occam said: Entia non sunt multiplicanda praeter necessitatem. (Entities should not be multiplied beyond necessity.)
-X FILEHANDLE
-X EXPR
-X DIRHANDLE
-X A file test, where X is one of the letters listed below. This unary
operator takes one argument, either a filename, a filehandle, or a
dirhandle, and tests the associated file to see if something is true
about it. If the argument is omitted, tests $_, except for "-t", which
tests STDIN. Unless otherwise documented, it returns 1 for true
and '' for false, or the undefined value if the file doesn't exist. Despite
the funny names, precedence is the same as any other named unary operator.
Um, grep always works with $_, and the OP already has the named variable $delfiles.
Re: Deleting old files from directory
by Anonymous Monk on Sep 27, 2011 at 19:48 UTC
Just need to know if this is efficient or if there is a better way of doing this.
It is more efficient to stat only once, i.e.:
grep {-f $_ and 10 < -M _ }
See stat and -X.
foreach (grep { -f $_ and 10 < -M _ } @files) {
    print "\n Deleting::: $_\n";
    unlink $_;
}