in reply to File::Find considered hard?

The problem is not that it is hard to understand; the problem is that in 99% of all cases I just want a list of files, not some code invoked on each file - that's why File::Find::Rule is so much nicer. How often have you written code like the following?

use File::Find ();
my @files; File::Find::find( sub { push @files, $File::Find::name }, '.' );

The fact that File::Find hands the callback only the local name (in $_) rather than the full path to the file, and that it sacrifices portability for speed ($USE_NLINK), just adds to that...
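
For comparison, here is a minimal File::Find::Rule sketch of the same "just give me the list" task (assuming the module is installed from CPAN):

use File::Find::Rule;

# Declarative: ->file() keeps only plain files, ->in('.') runs the
# traversal and returns the matching paths as a flat list.
my @files = File::Find::Rule->file()->in('.');

# Rules compose, e.g. only the *.pm files under lib/:
my @modules = File::Find::Rule->file()->name('*.pm')->in('lib');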

Re: Re: File::Find considered hard?
by Jenda (Abbot) on Mar 14, 2004 at 19:28 UTC

    Well ... never. I wrote something like

    my @files; find( sub { push @files, $File::Find::name if <some condition> }, '.' );
    a few times though. And usually the resulting list was much smaller than a list that would contain all files and directories. Most of the time, though, I want to actually DO something with the files.

    I do agree that the several package variables and $_ are a bit strange; it would be cleaner if the filename and path were passed to &wanted as parameters, but I don't have a problem with it anyway.
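
    A minimal sketch of such a wrapper, for illustration (find_with_args is a hypothetical name, not anything File::Find itself provides):

    use File::Find ();

    # Hand the callback the full path, the containing directory and the
    # basename as arguments instead of package variables and $_.
    sub find_with_args {
        my ($callback, @dirs) = @_;
        File::Find::find(sub {
            $callback->($File::Find::name, $File::Find::dir, $_);
        }, @dirs);
    }

    # Usage: collect the full paths of all *.log files under '.'
    my @logs;
    find_with_args(sub {
        my ($path, $dir, $basename) = @_;
        push @logs, $path if $basename =~ /\.log\z/;
    }, '.');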

    I do not understand your comment about the USE_NLINK though. From perldoc File::Find:

    You can set the variable $File::Find::dont_use_nlink to 1, if you want to force File::Find to always stat directories. This was used for file systems that do not have an "nlink" count matching the number of sub-directories. Examples are ISO-9660 (CD-ROM), AFS, HPFS (OS/2 file system), FAT (DOS file system) and a couple of others.

    You shouldn't need to set this variable, since File::Find should now detect such file systems on-the-fly and switch itself to using stat. This works even for parts of your file system, like a mounted CD-ROM.

    Jenda
    Always code as if the guy who ends up maintaining your code will be a violent psychopath who knows where you live.
       -- Rick Osborne


      "A bit strange" is exactly the problem for a core module that solves a common problem - hence the comment about the abysmal interface.

      I do remember File::Find from the time when it always used the nlink entry to scan for subdirectories, and when it failed in far too many cases. It's nice that they have changed it now, but I think it is still an issue with Perl 5.6.1 - nowadays I always set dont_use_nlink, unless I forget. Still, for a module that should provide a nice and easy service, this is much too convoluted.
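
      A minimal sketch of that workaround, in case it helps anyone on an affected perl (the variable is documented, as quoted above):

      use File::Find;

      # Force stat()-based subdirectory detection instead of trusting the
      # nlink count reported by the filesystem.
      $File::Find::dont_use_nlink = 1;

      my @dirs;
      find( sub { push @dirs, $File::Find::name if -d }, '.' );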

Re: File::Find considered hard?
by Abigail-II (Bishop) on Mar 14, 2004 at 21:10 UTC
    How often have you written code like the following?
    Never.
    chomp (my @files = `find .`);
    is shorter, immediately clear (at least to me), and works on any platform I would care about.

    Abigail

      Good to know... now I can be rox0rzing all over your systems by creating a file named /home/etcshadow/foo\n/etc/shadow.

      w00t!

      </script-kiddie>

      ------------ :Wq Not an editor command: Wq
        now I can be rox0rzing all over your systems by creating a file
        No, you won't be able to.

        Abigail
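
        For what it's worth, a newline-safe variant of the shell one-liner is possible with NUL-terminated output - a sketch, assuming a find(1) that supports -print0 (GNU and BSD find do):

        # Read NUL-terminated records so an embedded newline in a filename
        # cannot split one path into two list entries.
        my @files = do {
            local $/ = "\0";
            my @f = `find . -print0`;
            chomp @f;    # chomp strips the trailing "\0" (current $/)
            @f;
        };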

      works on any platform I would care about.

      Hrm. Do versions of Windows other than XP spit out the info you'd expect from 'find .'? Such a command on my WinXP box is invalid.

        Do versions of Windows other than XP spit out the info you'd expect from 'find .'?
        1. Yes, assuming you've installed an appropriate Unix toolset.
        2. I don't care at all about Windows.

        Abigail