I have a large number of files that I need to do a search and replace on. At this point, I'm resigned to using the following:
find . -name '*.html' -type f -exec perl -pi -e 's/foo/bar/g;' \{\} \;
This has two problems that I can think of:
1. Every file acted upon spawns a new perl process through find's -exec option. I was thinking instead to try:
perl -pi -e 's/foo/bar/g;' `find . -name '*.html' -type f`
But I've had problems with that in the past when find returns a very large list, since the expanded command line can exceed the shell's argument-length limit. (A couple of workarounds are sketched after this list.)
2. Files that do not contain the match get operated on anyway. That adds a lot of extra overhead, and their timestamps get changed to boot. Again, I could try:
perl -pi -e 's/foo/bar/g;' `find . -name '*.html' -type f -exec grep -l foo \{\} \;`
But that reintroduces all the same problems as item 1. (The second sketch below sidesteps both.)
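For the process-spawning problem, two standard variants might help, assuming a POSIX find (for the + terminator) and GNU or BSD find/xargs (for -print0/-0); both batch many filenames into each perl invocation instead of one process per file:

# find's "+" terminator batches arguments itself, much like xargs:
find . -name '*.html' -type f -exec perl -pi -e 's/foo/bar/g;' {} +

# Or hand NUL-delimited names to xargs, which also copes with
# whitespace in filenames and never exceeds the argument-list limit:
find . -name '*.html' -type f -print0 | xargs -0 perl -pi -e 's/foo/bar/g;'

Neither variant relies on the shell expanding a backtick list, so the very-large-list failure mode goes away.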
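For the untouched-files problem, GNU grep's -Z flag pairs with -l to emit NUL-terminated filenames, so the filtering can stay inside the same batched pipeline (a sketch; -Z and xargs' -r are GNU extensions):

find . -name '*.html' -type f -print0 \
    | xargs -0 grep -lZ foo \
    | xargs -0r perl -pi -e 's/foo/bar/g;'

Only files that actually contain foo reach perl, so everything else keeps its timestamp, and -r keeps xargs from running perl at all when nothing matches.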
Is there another way to use perl -i on a directory recursively so that only files matching certain criteria are updated?
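One way to keep it all in Perl is the core File::Find module: do the recursion yourself and rewrite a file only when the substitution actually fires. A minimal sketch (it slurps each file, so it assumes the HTML files fit comfortably in memory):

#!/usr/bin/perl
use strict;
use warnings;
use File::Find;

find(sub {
    return unless -f && /\.html\z/;       # regular *.html files only
    open my $in, '<', $_
        or do { warn "can't read $File::Find::name: $!"; return };
    my $text = do { local $/; <$in> };    # slurp the whole file
    close $in;
    return unless $text =~ s/foo/bar/g;   # no match: leave the file alone
    open my $out, '>', $_
        or die "can't rewrite $File::Find::name: $!";
    print $out $text;
    close $out;
}, '.');

Rewriting with '>' truncates the file in place, so ownership and permissions survive, and files without a match are never opened for writing, which leaves their timestamps untouched.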
elbie