in reply to searching a string in many files

perl -ne 'print "$ARGV: $_" if /10450676/' nc05*/kn

:D

P.S. Please note that based on my own use of perl vs. grep on gigabytes' worth of log files, perl is much, MUCH faster than grep. Another note: unix shells have a limit on the length of the command line you submit, that is, the length of the line after the shell expands your glob 'nc05*/kn'. If you only have 10 or 20 directories, I wouldn't worry; if you have 100 or 1000, you will probably hit that maximum. (Don't worry, I know at least bash will complain.) Get familiar with xargs for that case:

ls nc05*/kn | xargs -n 10 perl -ne 'print "$ARGV: $_" if /10450676/'

Addendum: Updated to make perl print the actual filename just like grep would.

Re: Re: searching a string in many files
by revdiablo (Prior) on Jun 25, 2003 at 21:08 UTC
    Excellent post, but I have one [extremely] minor niggle. The use of -n 10 in your xargs command is most likely superfluous. Most of the time xargs can figure out by itself the maximum number of options it can pack onto a single command line. Setting an arbitrary number (especially one as low as 10) could potentially cause xargs to call perl far more times than necessary.
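    For the curious, the batching difference is easy to see with echo standing in for perl:

```shell
# with no -n, xargs packs as many arguments into one invocation
# as the system's command-line limit allows
printf '%s\n' a b c d | xargs echo
# prints: a b c d          (one echo call)

# -n 2 forces a fresh invocation every 2 arguments
printf '%s\n' a b c d | xargs -n 2 echo
# prints: a b
#         c d              (two echo calls)
```

With -n 10 and thousands of files, that means starting a new perl process (and recompiling the one-liner) hundreds of times instead of a handful.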

      I did not know that about xargs. That's the second new thing I learned today. Boy does my brain need to cool off now. But seriously, ++ and thanks for the tip.

      Later

Re: Re: searching a string in many files
by Anonymous Monk on Jun 25, 2003 at 21:10 UTC