You can combine the two approaches using xargs:
    find . -name '*.html' -type f -print0 | \
        xargs -0 -n 50 perl -pi -e 's/foo/bar/g'
This uses find to list all the files you want and xargs to pass them to your perl one-liner. The -n 50 option tells xargs to pass each invocation of perl at most 50 filenames (if you still get "too many arguments" because your paths are really long, lower the number). I haven't benchmarked it to make sure, but I suspect that under most circumstances the overhead of running grep first to find the files that contain the thing you want to replace will make that approach less efficient than just running the replacement on every file find turns up.
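For comparison, here is a minimal sketch of that grep-first variant, assuming GNU grep (its -l flag lists only the names of matching files, and -Z terminates each name with a NUL so the names survive the trip through xargs -0):

    # only rewrite files that actually contain 'foo'
    find . -name '*.html' -type f -print0 | \
        xargs -0 grep -lZ foo | \
        xargs -0 perl -pi -e 's/foo/bar/g'

This reads every candidate file once with grep and then a second time with perl, which is why it usually only pays off when very few files actually match.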
We're not surrounded, we're in a target-rich environment!