sjd6 has asked for the wisdom of the Perl Monks concerning the following question:

Monks, I thought I'd better ask those with superior knowledge whether the following problem can be solved before I waste huge amounts of time researching it...

Does anyone know if it is at all possible to create a script that, given a particular folder, scans that folder and any subfolders inside it for files with a particular extension (.txt, for example), opens each one, runs a script already created (to extract some content and write it to a file in the root folder), closes it, and continues scanning for other files with that extension?

I assume, if possible, the code for this will be quite complex. If anyone can give me some pointers, that would be fantastic!

Thanks all

Replies are listed 'Best First'.
Re: file lookup
by broquaint (Abbot) on Oct 09, 2003 at 15:11 UTC
    You're most of the way there with the likes of File::Find::Rule at your side
    use File::Find::Rule;
    for (find(file => name => "*.txt", in => "your_dir_here")) {
        ## do stuff here
    }
    That will iterate over the filenames of all the files in 'your_dir_here' and its subdirectories with the .txt extension. See the File::Find::Rule docs for more information on this most marvellous of modules.
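    A fuller sketch of the whole task along the same lines, assuming File::Find::Rule is installed from CPAN. The demo directory tree and the `/^keep:/` extraction rule are hypothetical stand-ins for the OP's real folder and "script already created":

    ```perl
    use strict;
    use warnings;
    use File::Find::Rule;
    use File::Spec;
    use File::Temp qw(tempdir);
    use File::Path qw(make_path);

    # Build a small demo tree (stand-in for the OP's real folder).
    my $root = tempdir(CLEANUP => 1);
    make_path("$root/sub");
    open my $fh, '>', "$root/sub/a.txt" or die $!;
    print $fh "keep: hello\nignore me\n";
    close $fh;

    # Scan $root and all subfolders for *.txt files.
    my @txt_files = find(file => name => '*.txt', in => $root);

    # Process each file; append "interesting" lines to a file in the root.
    my $out_path = File::Spec->catfile($root, 'extracted.txt');
    open my $out, '>>', $out_path or die "can't open $out_path: $!";
    for my $file (@txt_files) {
        open my $in, '<', $file or die "can't open $file: $!";
        while (my $line = <$in>) {
            print $out $line if $line =~ /^keep:/;   # hypothetical extraction rule
        }
        close $in;
    }
    close $out;

    open my $check, '<', $out_path or die $!;
    print while <$check>;    # -> "keep: hello"
    ```

    Appending with '>>' means repeated runs keep accumulating into the one file in the root folder, which seems to be what the OP wants.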
    HTH

    _________
    broquaint

Re: file lookup
by Corion (Patriarch) on Oct 09, 2003 at 15:12 UTC

    I guess you are seeking to duplicate the functionality already found in the xargs command.

    For Perl, there is the module File::Find::Rule, which makes it very easy and convenient to find all files that match a certain criterion.

    After you have obtained that list of interesting files, you just call Perl's system function for each entry in the list:

    use strict;
    use File::Find::Rule;

    my @interesting_files = File::Find::Rule->file()
                                            ->name('*.txt')
                                            ->in(@ARGV);
    for my $filename (@interesting_files) {
        system("my_program $filename");
    };
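    One caveat worth noting: `system("my_program $filename")` goes through the shell, so a filename containing a space or a shell metacharacter will break it. The list form of system bypasses the shell entirely. A minimal, self-contained demonstration, using the running perl (`$^X`) as a stand-in for the prepared script:

    ```perl
    use strict;
    use warnings;
    use File::Temp qw(tempdir);

    # A filename with a space would break system("my_program $filename").
    my $dir  = tempdir(CLEANUP => 1);
    my $file = "$dir/has space.txt";
    open my $fh, '>', $file or die $!;
    print $fh "x\n";
    close $fh;

    # List form: each element is passed as a separate argument, no shell.
    # $^X is the perl running this script; it stands in for the OP's script.
    my $status = system($^X, '-e', 'exit(-e $ARGV[0] ? 0 : 1)', $file);
    print $status == 0 ? "ok\n" : "not ok\n";
    ```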
    perl -MHTTP::Daemon -MHTTP::Response -MLWP::Simple -e ' ; # The
    $d = new HTTP::Daemon and fork and getprint $d->url and exit;#spider
    ($c = $d->accept())->get_request(); $c->send_response( new #in the
    HTTP::Response(200,$_,$_,qq(Just another Perl hacker\n))); ' # web
Re: file lookup
by Abigail-II (Bishop) on Oct 09, 2003 at 15:17 UTC
    I wouldn't use Perl for that. 'find' will do that perfectly. Let's say you want to process "/path/to/dir", and your prepared script is called "/some/script". Then, from the command line:
    $ find /path/to/dir -name "*.txt" -exec /some/script {} \;
    And yes, find is there for Windows as well.

    Abigail

      If you insist on Perl, you have find2perl.

      Here is the output of find2perl:

      $ find2perl /path/to/dir -name "*.txt" -exec /some/script {}

      #! /usr/local/bin/perl -w
      eval 'exec /usr/local/bin/perl -S $0 ${1+"$@"}'
          if 0; #$running_under_some_shell
      use strict;
      use File::Find ();

      # Set the variable $File::Find::dont_use_nlink if you're using AFS,
      # since AFS cheats.

      # for the convenience of &wanted calls, including -eval statements:
      use vars qw/*name *dir *prune/;
      *name   = *File::Find::name;
      *dir    = *File::Find::dir;
      *prune  = *File::Find::prune;

      # Traverse desired filesystems
      File::Find::find({wanted => \&wanted}, '/path/to/dir');
      exit;

      sub wanted {
          /^.*\.txt\z/s
          && &doexec(0, '/some/script', '{}');
      }

      BEGIN {
          require Cwd;
          my $cwd = Cwd::cwd();
      }

      sub doexec {
          my $ok = shift;
          for my $word (@_) {
              $word =~ s#{}#$name#g;
          }
          if ($ok) {
              my $old = select(STDOUT);
              $| = 1;
              print "@_";
              select($old);
              return 0 unless <STDIN> =~ /^y/;
          }
          chdir $cwd;    #sigh
          system @_;
          chdir $File::Find::dir;
          return !$?;
      }
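      Stripped of the generated boilerplate, the find2perl output boils down to a short core-only File::Find script. A minimal sketch, with a throwaway demo tree standing in for /path/to/dir (the `system` call is shown commented out since /some/script is just the placeholder from above):

      ```perl
      use strict;
      use warnings;
      use File::Find;
      use File::Temp qw(tempdir);
      use File::Path qw(make_path);

      # Demo tree standing in for /path/to/dir.
      my $root = tempdir(CLEANUP => 1);
      make_path("$root/deep/deeper");
      for my $f ("$root/a.txt", "$root/deep/deeper/b.txt", "$root/skip.log") {
          open my $fh, '>', $f or die $!;
          close $fh;
      }

      # Collect every *.txt below $root; find() recurses into subfolders.
      my @found;
      find(sub { push @found, $File::Find::name if /\.txt\z/ }, $root);

      # Each hit would then be handed to the prepared script, e.g.:
      #   system($^X, '/some/script', $_) for @found;
      print scalar(@found), " txt files\n";   # -> "2 txt files"
      ```

      Inside the wanted sub, $_ holds the bare filename and $File::Find::name the full path, which is why the match is on $_ but the push saves $File::Find::name.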

      -T

      use perl; use strict;