in reply to Re: Deleting files over 2 weeks old?
in thread Deleting files over 2 weeks old?

An excellent point on the ctime, merlyn. However, why redirect and execute, as opposed to piping it into 'xargs', and using 'rm'?

--Chris


RE: (jcwren) RE: Re: Deleting files over 2 weeks old?
by merlyn (Sage) on Aug 03, 2000 at 02:57 UTC
    OK, let's take the classic root crontab code:
    find /tmp /usr/tmp -atime +7 -print | xargs /bin/rm -f
    And then I come along (as an ordinary user) and do the following:
    $ mkdir -p "/tmp/foo
    /etc"            # yes, that's a newline after foo, before the /etc
    $ touch "/tmp/foo
    /etc/passwd"     # yes, that's a newline after foo again
    And then sit back 7 days. Boom. You have no /etc/passwd.
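
    The trick can be reproduced harmlessly in a scratch directory. The sketch below (illustrative paths only, no real /etc involved) shows that one file whose name contains a newline looks like two separate paths to anything reading find's newline-delimited output:

    ```shell
    # Sandbox demo of the attack above: a scratch directory stands in
    # for /tmp and /etc, so nothing real is at risk.
    dir=$(mktemp -d)
    mkdir -p "$dir/foo
    etc"                              # subdirectory named "foo<newline>etc"
    touch "$dir/foo
    etc/passwd"

    # One file on disk, but its printed path spans two "lines",
    # which is exactly what xargs would split on:
    lines=$(find "$dir" -name passwd -print | wc -l)
    rm -rf "$dir"
    ```

    With a single file present, `lines` comes out as 2: the embedded newline plus find's own terminating newline. That second "line" is what rm ends up aimed at.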

    The problem is that you are using newline as a delimiter, and yet newline is a legal filename character. You need find ... -print0 and xargs with a -0, but those options aren't portable. Even though Perl isn't strictly everywhere, it's everywhere the perlmonks are, so my solution succeeds in a safe way.
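
    On systems whose find and xargs do have those extensions (GNU, as the portability caveat above notes), the NUL-delimited pipeline defuses the attack, since NUL can never appear in a filename. A sketch in a scratch directory:

    ```shell
    # Sketch of the -print0 / -0 fix (assumes GNU find and xargs).
    dir=$(mktemp -d)
    mkdir -p "$dir/foo
    etc"
    victim="$dir/foo
    etc/passwd"                       # name with an embedded newline
    touch "$victim" "$dir/innocent"

    # -print0 emits NUL-terminated paths; -0 makes xargs split on NUL,
    # so the whole newline-ridden path travels as one argument:
    find "$dir" -name passwd -print0 | xargs -0 rm -f

    victim_removed=$([ -e "$victim" ] && echo no || echo yes)
    innocent_intact=$([ -e "$dir/innocent" ] && echo yes || echo no)
    rm -rf "$dir"
    ```

    Only the newline-named file is removed; its innocent sibling survives, and nothing outside the scratch directory is ever mentioned on rm's command line.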

    -- Randal L. Schwartz, Perl hacker

      Ouch! Nice one, merlyn. Can I vote myself down for leaving crap find commands on nearly every server I've accessed?
      I would have used:
      find /somewhere/or/something -atime +7 -exec rm -v {} \;
      Then there aren't any chars that can freak it out. If -exec weren't an option, I simply wouldn't use find for it. Update: ahh... an excellent point, merlyn. Perl rules!

      Got Perl?

        And that causes a separate process invocation for each file, so as a denial-of-service attack, I can create hundreds of thousands of empty files in /tmp, and you end up firing a separate rm for each.

        The Perl solution is still the best. Fewest processes, and no problem with special characters in the filename.

        -- Randal L. Schwartz, Perl hacker
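
        As a footnote to the process-count point: later finds grew a batching form, `-exec ... {} +` (standardized in POSIX after this discussion), which gathers many paths into one rm invocation like xargs does, yet passes each path as a single argument, so newline names are safe as well. A sketch, assuming a find new enough to support it:

        ```shell
        # Sketch of find's batching -exec form ({} + postdates this
        # thread; it appears in newer POSIX finds). Many files, few rm
        # invocations, and each path travels as one whole argument.
        dir=$(mktemp -d)
        for i in 1 2 3; do touch "$dir/stale$i"; done
        touch "$dir/foo
        bar"                          # newline-named file, handled safely

        find "$dir" -type f -exec rm -f {} +

        remaining=$(find "$dir" -type f | wc -l)
        rm -rf "$dir"
        ```

        All four files are gone afterward, without one rm per file and without the newline hazard; merlyn's process-count objection and the delimiter problem are both addressed, though the Perl solution remains the most portable of the three.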