in reply to (jcwren) RE: Re: Deleting files over 2 weeks old?
in thread Deleting files over 2 weeks old?

OK, let's take the classic root crontab code:
find /tmp /usr/tmp -atime +7 -print | xargs /bin/rm -f
And then I come along (as an ordinary user) and do the following:
$ mkdir -p "/tmp/foo
/etc"    # yes, that's a newline after foo, before the /etc
$ touch "/tmp/foo
/etc/passwd"    # yes, that's a newline after foo again
And then sit back 7 days. Boom. You have no /etc/passwd.
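To see the split without putting a real system at risk, here's a sketch that reproduces the trap in a throwaway directory, with echo standing in for rm (the scratch path is illustrative):

```shell
d=$(mktemp -d)      # scratch area standing in for /tmp
mkdir -p "$d/foo
/etc"               # a directory whose name contains a newline
touch "$d/foo
/etc/passwd"        # the booby-trapped file
out=$(find "$d" -print | xargs echo rm -f)
echo "$out"
# xargs splits on the embedded newlines, so /etc/passwd arrives as a
# separate argument -- exactly the file the cron job would then remove.
rm -rf "$d"
```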

The problem is that you are using newline as a delimiter, and yet it is a legal filename character. You need find ... -print0 piped into xargs -0, but that's not portable. Even though Perl isn't strictly everywhere, it's everywhere the perlmonks are, so my solution succeeds in a safe way.

-- Randal L. Schwartz, Perl hacker

RE: RE: (jcwren) RE: Re: Deleting files over 2 weeks old?
by Jonathan (Curate) on Aug 03, 2000 at 13:15 UTC
    Ouch! Nice one, merlyn. Can I vote myself down for leaving crap find commands on nearly every server I've accessed?
RE: RE: (jcwren) RE: Re: Deleting files over 2 weeks old?
by jettero (Monsignor) on Aug 03, 2000 at 04:46 UTC
    I would have used:
    find /somewhere/or/something -atime +7 -exec rm -v {} \;
    Then there aren't any chars that can freak it out. If not an -exec, I simply wouldn't use find for it. Update: ahh... an excellent point, merlyn. Perl rules!

    Got Perl?

      And that causes a separate process invocation on each file, so as a denial of service attack, I can create hundreds of thousands of empty files in /tmp, and you end up firing a separate rm for each.

      The Perl solution is still the best. Fewest processes, and no problem with special characters in the filename.

      -- Randal L. Schwartz, Perl hacker
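
To make the process-count point concrete, here is a small illustration (scratch directory and file count are arbitrary; echo stands in for rm):

```shell
# -exec ... \; forks one child process per matched file, so 50 files
# below means 50 separate invocations.  The Perl unlink loop does the
# same sweep in one process, whatever the file count.
d=$(mktemp -d)
i=0
while [ "$i" -lt 50 ]; do : > "$d/f$i"; i=$((i+1)); done
runs=$(find "$d" -type f -exec echo run {} \; | wc -l | tr -d ' ')
echo "$runs"    # 50 -- one invocation per file
rm -rf "$d"
```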