Good solution, but you could run into trouble if the resulting command line is bigger than the maximum size supported by the OS.
If there are a lot of files to be checked, it may be better to check them one at a time. So I'd slightly modify your command line this way (where pattern stands for whatever you are searching for):
find . -type f -exec grep -Hl pattern {} \;
Update: the grep here is GNU grep
Ciao! --bronto
The very nature of Perl to be like natural language--inconsistent and full of dwim and special cases--makes it impossible to know it all without simply memorizing the documentation (which is not complete or totally correct anyway).
--John M. Dlugosz
    Good solution, but you could run into trouble if the resulting command line is bigger than the maximum size supported by the OS.
Of course, you will only run into trouble if your xargs is broken. The point of using xargs is to avoid the problem you are describing.

The disadvantage of using -exec is that find will spawn a grep process for each file found, while with xargs far fewer processes will be spawned.
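
For illustration, the xargs version might look like this (a sketch, not from the original post; pattern is a placeholder for whatever you are searching for, and the -print0/-0 pair is a GNU extension that keeps filenames containing whitespace intact):

    # One grep invocation handles a whole batch of filenames;
    # xargs splits the list into chunks that fit the system limit.
    find . -type f -print0 | xargs -0 grep -l pattern

Newer find implementations also accept -exec grep -l pattern {} +, which batches arguments much like xargs does.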
Abigail
From the xargs manual page:

    The generated command line length will be the sum of the size in bytes of the utility name and each argument treated as strings, including a null byte terminator for each of these strings. The xargs utility will limit the command line length such that when the command line is invoked, the combined argument and environment lists will not exceed {ARG_MAX}-2048 bytes. Within this constraint, if neither the -n nor the -s option is specified, the default command line length will be at least {LINE_MAX}.
Not very clear, indeed. But one can reason by abstraction :-) and assume that the line length limit will be taken care of!
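
To see the actual limit on a given system, getconf can report {ARG_MAX} directly (assuming a POSIX-style system):

    # Maximum combined size, in bytes, of arguments plus environment
    # that exec() will accept; xargs stays 2048 bytes below this.
    getconf ARG_MAX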
Thanks, I learnt something new!
Ciao! --bronto
The very nature of Perl to be like natural language--inconsistent and full of dwim and special cases--makes it impossible to know it all without simply memorizing the documentation (which is not complete or totally correct anyway).
--John M. Dlugosz