Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

Hello guys,

I have a strange issue with my code that I cannot really understand. I've written a script that searches for a given string in all the files in a given directory. So far so good. The program works only if I enter the directory where the script itself is. If I choose another directory I get a message like "Couldn't open that file perl_ex1.pl: No such file or directory at ex1.pl line 20, <STDIN> line 2." even though that file obviously exists in the given directory. Any clarification would be useful.

Here is my code:
#!/usr/bin/perl
use warnings;
use strict;

print "Write the directory where you want to search: \n";
my $dir = <STDIN>;
chomp $dir;
print "Write the string you want to search for: \n";
my $string = <STDIN>;
chomp $string;

opendir DH, "$dir" or die "Couldn't open that directory $dir: $!";
while ( $_ = readdir(DH) ) {
    next if $_ eq "." or $_ eq "..";
    if ( -d $_ ) {
        print "$_ is a directory\n";
        next;
    }
    open FILE, "$_" or die "Couldn't open that file $_: $!";
    for my $i (<FILE>) {
        if ( $i =~ /($string)/ ) {
            print "I have found |$string| in |$_| file on line:\n$i\n";
            last;
        }
    }
}

Replies are listed 'Best First'.
Re: Perl script does not work on other directories?
by Laurent_R (Canon) on Nov 23, 2014 at 10:46 UTC
    If you use the glob function, instead of readdir, you'll get the file names with the path that you used to search them. The readdir function returns only the file names, without the path.
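    To make that concrete, here is a minimal sketch of the glob approach (the helper name `plain_files_in` is mine, not from the thread):

    ```perl
    use strict;
    use warnings;

    # glob returns each entry as "$dir/name", so the paths can be opened
    # from any working directory; readdir would return bare names only.
    sub plain_files_in {
        my ($dir) = @_;
        return grep { -f $_ } glob "$dir/*";
    }
    ```

    With this, `open my $fh, '<', $path` works on every returned `$path` regardless of the current directory.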

      I added the following right after the opendir line:

       chdir $dir or die "Couldn't change to that directory $dir: $!";

      and it works correctly now. Thanks for the help.

Re: Perl script does not work on other directories?
by 2teez (Vicar) on Nov 23, 2014 at 10:58 UTC

    I think you want to use chdir to change into the directory you want.
    Using opendir, you can do like so:

    use warnings;
    use strict;

    print "Enter a directory: ";
    chomp( my $dir = <STDIN> );
    print "String looked for: ";
    chomp( my $str = <STDIN> );

    chdir $dir     or die $!;
    opendir DH, $dir or die $!;
    while ( readdir(DH) ) {
        next if $_ eq '.' or $_ eq '..';
        unless (-d) {
            open my $fh, '<', $_ or die $!;
            while (<$fh>) {
                if (/\Q$str/) {
                    print $_;
                    last;
                }
            }
        }
    }
    closedir DH or die $!;
    I also think it would be better to open each file immediately as you reach it and check for the desired string right away.

    If you tell me, I'll forget.
    If you show me, I'll remember.
    If you involve me, I'll understand.
    --- Author unknown to me
Re: Perl script does not work on other directories? (readdir pitfalls)
by Anonymous Monk on Nov 23, 2014 at 09:40 UTC
    It's because you used readdir but didn't read the readdir documentation closely enough.

    The solution is simple: don't use readdir ;)

        use Path::Tiny qw/ path /;
        for my $kid ( path( $dir )->children ) {
            ...

    If you read the readdir documentation you can see that you can also chdir.
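    If you do stay with readdir, the usual fix is to prepend the directory to each name before testing or opening it. A sketch of that fix applied to the OP's loop (the sub name and structure are mine, not the OP's):

    ```perl
    use strict;
    use warnings;

    # Return the names of plain files in $dir whose contents match $string.
    # The key line builds "$dir/$name": readdir returns bare names, so
    # every -d test and open must get the directory prepended (or you
    # must chdir into $dir first, as the readdir docs suggest).
    sub search_dir {
        my ( $dir, $string ) = @_;
        my @hits;
        opendir my $dh, $dir or die "Couldn't open directory $dir: $!";
        while ( my $name = readdir $dh ) {
            next if $name eq '.' or $name eq '..';
            my $path = "$dir/$name";    # <-- the piece the OP's code is missing
            next if -d $path;
            open my $fh, '<', $path or die "Couldn't open file $path: $!";
            while ( my $line = <$fh> ) {
                if ( $line =~ /\Q$string\E/ ) {
                    push @hits, $name;
                    last;
                }
            }
            close $fh;
        }
        closedir $dh;
        return @hits;
    }
    ```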

Re: Perl script does not work on other directories?
by CountZero (Bishop) on Nov 23, 2014 at 13:44 UTC
    Run the following one-liner in your CLI within the directory whose files you want to check:
    perl -nE 'say qq(found $1 in $ARGV, line $.) if /(my search string)/; close ARGV if eof;' ./*
    Of course you replace "my search string" by the string or pattern you are looking for.

    CountZero

    "A program should be light and agile, its subroutines connected like a string of pearls. The spirit and intent of the program should be retained throughout. There should be neither too little nor too much, neither needless loops nor useless variables, neither lack of structure nor overwhelming rigidity." - The Tao of Programming, 4.1 - Geoffrey James

    My blog: Imperial Deltronics
Re: Perl script does not work on other directories?
by james28909 (Deacon) on Nov 23, 2014 at 11:03 UTC
    Though I am not sure if this is the right approach, you can do something like the following, or at least get some ideas from it:
    use strict;
    use warnings;
    use diagnostics;
    use File::Slurp;

    my @files  = read_dir( $ARGV[0] );    # read file names in dir into an array
    my $dir    = $ARGV[0];
    my $string = $ARGV[1];

    foreach my $element (@files) {
        open my $file, '<', "$dir/$element";    # open each file
        while (<$file>) {
            if ( $_ =~ $string ) {              # match your string
                print "found $string in $element\n";
            }
        }
        close $file;
    }
    This reads the directory's file names into an array, then opens each of those files and searches it for a match.

    EDIT: In the script in your original post, you may want to look into Cwd. I think that would allow you to get the current working directory AFTER you leave the dir the script is in, and it would be fairly trivial to implement without a major code change.
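    A sketch of what using Cwd could look like here (the variable names and the use of @ARGV are my own illustration, not code from the thread):

    ```perl
    use strict;
    use warnings;
    use Cwd qw(getcwd);

    # Remember where we started, work somewhere else, then come back.
    my $start    = getcwd();           # the current working directory
    my $work_dir = $ARGV[0] // '.';    # hypothetical target directory
    chdir $work_dir or die "Couldn't chdir to $work_dir: $!";
    # ... bare readdir names can now be opened directly ...
    chdir $start or die "Couldn't return to $start: $!";
    ```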
Re: Perl script does not work on other directories?
by james28909 (Deacon) on Nov 24, 2014 at 02:20 UTC
    Here's another way. It still does not use chdir, but it does work from the script's location, and all you have to feed it is the absolute path (or a path relative to the script) as an argument.
    use strict;
    use warnings;
    use diagnostics;
    use File::Slurp;
    #use Cwd;

    my @files;

    get_args();

    sub get_args {
        my $dir;
        my $string;
        print "enter path\n";
        chomp( $dir = <STDIN> );
        print "enter match string\n";
        chomp( $string = <STDIN> );
        print "\n";
        @files = read_dir($dir);
        print $_ for @files;
        traverse( $dir, $string );
    }

    sub traverse {
        foreach my $element (@files) {
            open $file, "<", $dir . '/' . $element;
            while (<$file>) {
                if ( $_ =~ m/$string/i ) {
                    print "found $string in $element\n";
                }
            }
            close $file;
        }
        get_args();
    }
    To me it's more manageable if it is broken up into subroutines; that way you can have one function to get the input and another to actually collect the results.

    EDIT: Here is the newest code. To be honest I have not really looked any further into this, but you should be able to add chdir and list directory contents pretty easily. You could maybe even add switches for searching the cwd for the string you need.
    use strict;
    use warnings;
    use diagnostics;
    use File::Slurp;
    #use Cwd;

    my @files;

    get_args();

    sub get_args {
        print "enter path: ";
        chomp( my $dir = <STDIN> );
        print "enter match string: ";
        chomp( my $string = <STDIN> );
        @files = read_dir($dir);
        print "\n", $_ for @files, "\n";
        traverse( $dir, $string );
    }

    sub traverse {
        my ( $dir, $string ) = @_;
        for my $element (@files) {
            open( my $file, '<', "$dir/$element" ) || next;
            while (<$file>) {
                if ( $_ =~ m/$string/i ) {
                    print "found $string in $element\n";
                }
            }
            close $file;
        }
        print "\n";
        get_args();
    }
      Why write  traverse( $dir, $string ); if you're using fake globals?
        The script complained when I tried to declare them only where I needed them. But yes, you are correct, and I completely missed that. I changed the script on my machine and declared the variables inside their respective subroutines, and it's working fine. Thanks for pointing that out :)
      Here is my last attempt. This will print directories, then files, and you can traverse directories easily. Typing ".." at "enter path" will get you to the parent dir.
      use strict;
      use warnings;
      use diagnostics;
      use File::Slurp;
      use Cwd;

      for (;;) {
          my @files;
          my $dir;
          print "enter path: ";
          chomp( $dir = <STDIN> );
          chdir($dir);
          $dir = getcwd($dir);
          print $dir;
          # print "enter match string: ";
          # chomp( my $string = <STDIN> );
          @files = read_dir($dir);
          foreach my $element (@files) {
              next if -f $element;
              print "$element\n";
          }
          foreach my $element (@files) {
              next if -d $element;
              print "$element\n";
          }
          print "\ncurrent dir is: $dir\n";
          print "\n";
          system("pause");
          system("cls");
      }
        I was trying to figure out how to sort the -d and -f entries without having to use two loops, but I am unsure how to use sort to do such a thing.
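        A single sort can do it in one pass (a sketch; the sub name is mine, and it assumes the names are relative to the current directory, as in the code above):

        ```perl
        use strict;
        use warnings;

        # Directories first, then files, each group alphabetical.
        # -d returns 1 or '', so map it to 1/0 before comparing numerically,
        # otherwise <=> would warn about a non-numeric argument.
        sub dirs_first {
            my @names = @_;
            return sort {
                ( -d $b ? 1 : 0 ) <=> ( -d $a ? 1 : 0 )
                    or lc($a) cmp lc($b)
            } @names;
        }
        ```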