in reply to Re: Re: greping for a word
in thread greping for a word
[Inevitable off-topic discussion ensues]
With a large number of files (and with hundreds of directories we can assume there are a large number of files; either that, or someone has designed a very bad filesystem layout), using -exec with find can be substantially slower than using xargs. The reason is that -exec forks once for every file, whereas xargs packs as many arguments onto a single command line as possible. This is not only conceptually cleaner (imho), but, as I said, with lots of files it's noticeably faster.
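For example, applied to the original question (a sketch only; 'someword' and the *.pm glob are placeholders, adjust them for the actual search), the two forms look like this:

    # forks one grep per file: slow when there are many files
    find . -type f -name '*.pm' -exec grep -l 'someword' {} \;

    # packs many filenames into each grep invocation: far fewer forks
    find . -type f -name '*.pm' -print0 | xargs -0 grep -l 'someword'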
Update: Just to test my assertions, I ran a quick benchmark to see what the difference was. As you can see, the directory I ran this in contains 29,388 files. Granted, this benchmark might not be representative of the original poster's situation, but the difference is quite dramatic. Maybe I've done something wrong:
    $ time find ./ -type f | wc -l
    29388

    real    0m0.244s
    user    0m0.034s
    sys     0m0.181s

    $ time find ./ -type f -print0 | xargs -0 ls | wc -l
    29388

    real    0m1.049s
    user    0m0.420s
    sys     0m0.588s

    $ time find ./ -type f -exec ls '{}' \; | wc -l
    29388

    real    0m46.239s
    user    0m16.509s
    sys     0m25.974s
I ran each test multiple times, and posted the lowest time for each.
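As an aside (not part of the benchmark above): POSIX find can also batch arguments itself via the '+' terminator to -exec, which groups files much like xargs does, so something along these lines should land much closer to the xargs timing:

    # -exec ... {} + batches many files per ls invocation instead of forking per file
    find ./ -type f -exec ls '{}' + | wc -l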