TASdvlper has asked for the wisdom of the Perl Monks concerning the following question:

Hello all,

I was wondering: is there a "quick/lightweight" Perl one-liner I can run to search the current directory, and all its sub-directories, for a particular keyword (or words) in each file?

I'm sure I could script it, but I was hoping just to run something from the command line. Not that it's necessarily better than a script.

Basically, I have a lot of code spread across many directories, and sometimes I need to look for a particular variable but have forgotten which files contain that variable name. (Just one example of a use for this.)

Thanks all ....


Replies are listed 'Best First'.
Re: Searching word(s) in multiple file and directories
by Roger (Parson) on Dec 03, 2003 at 19:43 UTC
    You could use the grep utility if you are working on Unix platforms.

    If you are working under Windows, you could use the find utility.

    I have written a simple search.pl script in Perl to search a directory tree for files containing a regex pattern. It has only been tested under Windows.
use strict;
use warnings;
use File::Find;
use Getopt::Long;
use Pod::Usage;
use Carp;

# Parse command line arguments and assign corresponding variables
GetOptions (
    'r|rootdir=s' => \( my $rootdir = "./" ),
    'p|pattern=s' => \( my $pattern = undef ),
    'i'           => \( my $case_insensitive = 0 ),
    'v|verbose'   => \( my $verbose = 0 ),
);

unless ( defined $pattern ) {
    print <<"USAGE";
Usage: $0 [options]
Options:
    -r|--rootdir [dir]      Specify the top directory
    -p|--pattern [pattern]  Specify the pattern to look for (regex)
    -i                      Case-insensitive search
    -v|--verbose            Print more info
USAGE
    exit(0);
}

$pattern = "(?i)" . $pattern if $case_insensitive;
print "looking under $rootdir for { /$pattern/ }\n" if $verbose;

find( { wanted => \&filefilter }, $rootdir );

sub filefilter {
    return if /^\.+$/;      # skip . and ..
    return unless -f $_;    # only look inside plain files
    # find() has chdir'ed into the file's directory, so open by basename ($_)
    open my $fh, '<', $_
        or do { carp "could not open file: $File::Find::name"; return };
    my $file = do { local $/; <$fh> };    # slurp the whole file
    close $fh;
    print ">> ", $File::Find::name, "\n" if $file =~ /$pattern/;
}
    And the output -
P:\Perl>perl search.pl -r ./ -p "(Find|Dir)" -i
>> ./f1.pl
>> ./f15.pl
>> ./f2.pl
>> ./f48.pl
>> ./f6.pl
>> ./fun1.pl
>> ./Fun2.pl
>> ./result.txt
>> ./search.pl
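    For a quick command-line version of the same idea, a one-liner along these lines should also work (an untested sketch using the core File::Find module; replace keyword with whatever you are searching for, and note the quoting is for a Unix shell):

perl -MFile::Find -le 'find(sub { return unless -f; open my $fh, "<", $_ or return; local $/; print $File::Find::name if <$fh> =~ /keyword/ }, ".")'

    It prints the path of every plain file under the current directory whose contents match /keyword/.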
Re: Searching word(s) in multiple file and directories
by flounder99 (Friar) on Dec 03, 2003 at 20:26 UTC
Re: Searching word(s) in multiple file and directories
by Zed_Lopez (Chaplain) on Dec 03, 2003 at 20:29 UTC

    non-Perl unix approach:

    find . -exec fgrep variable_name {} /dev/null \;

    Of course, you can substitute any directory for '.'. You need to surround fgrep's argument with single quotes if it contains shell-interpretable characters like '$', and you can use grep instead of fgrep to search by regexp instead of for plain text.

    I have this in my .bashrc:

    rfind() {
        find . -exec fgrep "$1" {} /dev/null \;
    }
      FWIW, some flavors of grep handle recursion on their own with the -r flag; check your local manpage for confirmation or just try it:
      fgrep -r variable_name .
        Just what I was looking for. Thanks.
      Or to just print the filenames with the matching text:
      find /path -type f -exec grep -l "StringToMatch" "{}" \;
      From there you can pipe the output on and actually do something with it. Note that the -l is a lowercase L.
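      For example, here is a sketch of one way to pipe that list on into a Perl one-liner that renames a variable across the matching files (OldName and NewName are placeholders, -i.bak leaves backup copies, and this assumes filenames without whitespace):

find /path -type f -exec grep -l "OldName" "{}" \; | xargs perl -i.bak -pe 's/OldName/NewName/g'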


      -Waswas