Jassica has asked for the wisdom of the Perl Monks concerning the following question:

Hi monks, here is what I am trying to do:
$find = qq(find . -name '*.std*' -o -name 'ULOG.*' -mtime +9 -type +f -exec rm {} \\;) ; system($find);
however, it doesn't keep the single quotes around *.std* and ULOG.*, so I get an error. Is there a better way than what I am doing? Thanks

Replies are listed 'Best First'.
•Re: system command error
by merlyn (Sage) on Apr 03, 2003 at 22:37 UTC
    In addition to the other comments in this thread, you do realize that that parses as: (name matches *.std*) or (name matches ULOG.* and older than 9 days and is a file and I can delete the file)

    In other words, if the file matches *.std*, it'd never do any of the rest, because the "or" is like Perl's "or", and very low precedence (lower than the implied "and" between the rest of the steps).

    Perhaps you wanted:

    find . \( -name '*.std*' -o -name 'ULOG.*' \) -mtime +9 -type f -exec rm {} \;
    and then appropriately escaped from there...
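    For example, one way that escaped command might look from Perl (a rough sketch, assuming you keep the single-string form of system):

    # \\( and \\; become \( and \; inside the double-quoted string,
    # which is what the shell needs so that find itself sees ( ) and ;
    my $find = "find . \\( -name '*.std*' -o -name 'ULOG.*' \\) "
             . "-mtime +9 -type f -exec rm {} \\;";
    system($find) == 0 or warn "find exited with status $?\n";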

    -- Randal L. Schwartz, Perl hacker
    Be sure to read my standard disclaimer if this is a reply.

Re: system command error
by AltBlue (Chaplain) on Apr 03, 2003 at 23:21 UTC
    You could easily implement this thingie in Perl using File::Find, avoiding messing around with system:
    File::Find::find( { wanted => sub {
        return unless -f _ && -M _ > 8;
        unlink if /\.std/ || /^ULOG\./;
    } }, '.' );
    --
    AltBlue.

      One of my complaints about File::Find is its silly optimization (which has caused me tons of grief over the years because it was being used in so many places where it just plain breaks) of checking the nlinks from stat, in hopes of deciding that there are no subdirectories. That means you can't assume that File::Find has just done a stat or lstat on the file in question, so you have to restat the file by name and not use the magic _ as you have done above.

      So you either need to prepend a $ to your first _ or you need to do something to ensure that File::Find doesn't use the silly nlinks optimization:

      $File::Find::dont_use_nlink = 1;
      Note that if you don't disable the silly nlinks optimization, then you'll have to change that _ to $_ (or risk your script breaking in some directories on some systems) and so your script will run slower which just makes me laugh since File::Find's documentation says:
      If you do set $File::Find::dont_use_nlink to 1, you will notice slow-downs.
      I particularly like the "notice" part. Yeah, sure I will. (:
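      A rough sketch of that first option, with the $ prepended so -f $_ restats the file by name and -M _ then reuses that fresh stat:

      File::Find::find( { wanted => sub {
          return unless -f $_ && -M _ > 8;   # -f $_ stats by name; -M _ reuses it
          unlink $_ if /\.std/ || /^ULOG\./;
      } }, '.' );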

      More than once I've tried to make the nlinks optimization something that you have to request be done for you but the emotional attachment to it for some at p5p is just too great. It looks like several improvements to the code have been made regarding this feature so it is less likely to be used when it shouldn't (other than slowing down most correct uses of File::Find). I'm not convinced that it is perfect, however, because I'm pretty sure I've run into file systems where nlinks on directories are not either always 1 or always 2+number_of_subdirectories. So now I'm mostly amused with how the emotional attachment to this has manifested itself in the module documentation.

      Oh well, I don't use File::Find much anymore. For trivial uses, it is usually easier to use /bin/find and for non-trivial uses it is usually easier to roll my own directory traverser than try to figure out how to use awkward call-backs to get done what I want.

                      - tye
Re: system command error
by dga (Hermit) on Apr 03, 2003 at 22:28 UTC

    One way which is a lot more efficient resource-wise:

    find . -name '*.std' -o -name 'ULOG.*' -mtime +9 -type f -print | perl -ne 'chop; unlink;'

    This is a fairly standard idiom and is a published example somewhere though I don't remember where I saw it.

    It would be run from the shell.

    Another option is the File::Find module to do this type of stuff directly from Perl.

      Filenames may contain newlines though.
      find . \( -name '*.std' -o -name 'ULOG.*' \) -mtime +9 -type f -print0 | perl -0777 -F'/\0/' -lane 'unlink @F'

      Makeshifts last the longest.

Re: system command error
by tachyon (Chancellor) on Apr 04, 2003 at 04:39 UTC

    It is much better to use the multi-arg list form of system, system(@args), than the single-arg/stringified version, system( "$arg[1] $arg[2] $arg[3] $arg[4]" ), as you avoid a lot of quoting issues. BTW you could do this with File::Find (as demonstrated already) or with grep and glob:

    unlink $_ for grep { -f $_ and -M _ > 8 and m{\.std|(?:^|/)ULOG\.} } glob("$dir/*");
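    To make the list-form point concrete, here is a rough sketch that passes the corrected find command as separate arguments, so no shell (and therefore no quoting) is involved at all:

    # Each argument goes straight to find: with no shell in between,
    # ( ) {} and ; need no escaping and *.std* is matched by find itself.
    system('find', '.',
           '(', '-name', '*.std*', '-o', '-name', 'ULOG.*', ')',
           '-mtime', '+9', '-type', 'f',
           '-exec', 'rm', '{}', ';') == 0
        or warn "find exited with status $?\n";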

    cheers

    tachyon

    s&&rsenoyhcatreve&&&s&n.+t&"$'$`$\"$\&"&ee&&y&srve&&d&&print

Re: system command error (use File::Find::Rule)
by Aristotle (Chancellor) on Apr 04, 2003 at 10:06 UTC
    use File::Find::Rule;
    unlink File::Find::Rule
        ->name('*.std*', 'ULOG.*')
        ->file
        ->mtime(">9")
        ->in(".");

    Makeshifts last the longest.

Re: system command error
by Improv (Pilgrim) on Apr 04, 2003 at 13:47 UTC
    The direct cause of what's happening is that you're using qq() instead of q(). qq() interpolates its contents, q() does not. Of course, as the other posts point out, there are probably better ways to do this.

      The difference between q() and qq() only applies to the characters $, @, \, and the delimiter(s) (parens in this case) and the difference is rather subtle for the last few. Of those characters, I only see \ used and I see it used twice in a row. Now "\\" eq '\\' (they are both a single backslash character) so changing qq() to q() should make no difference at all.
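      A quick snippet to check that claim:

      print qq(rm {} \\;), "\n";   # prints: rm {} \;
      print  q(rm {} \\;), "\n";   # prints: rm {} \;
      print "\\;" eq '\\;' ? "identical\n" : "different\n";   # prints: identical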

                      - tye