in reply to Large scale search and replace with perl -i

Try this:

    find / -name "*.html" -exec perl -pi -e 's/find/replace/gi' {} \;
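
Perl's -i switch will also take an optional backup extension, which is worth considering on a run that touches this many files. A variant of the same command (the .orig extension is just a placeholder):

    # each changed file leaves an untouched copy behind as file.html.orig
    find / -name "*.html" -exec perl -pi.orig -e 's/find/replace/gi' {} \;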

Update: Oops, I reread your question. Hmmm, I'm not sure about ignoring certain files. However, does filtering your find file list through grep really gain you any speed? You are having grep go through all of your files and then having perl go through whatever files grep returns.
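
If you do want to skip certain files, one option (a rough sketch, the path and pattern here are only placeholders) is to let find itself do the filtering before perl ever sees the list:

    # skip anything matching *.bak.html; everything else gets the substitution
    find /var/www -name "*.html" -not -name "*.bak.html" \
        -exec perl -pi -e 's/find/replace/gi' {} \;

On a find without -not, the same test can be written with ! -name instead.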

Neil Watson
watson-wilson.ca

Re: Large scale search and replace with perl -i
by Abigail-II (Bishop) on Apr 14, 2003 at 19:09 UTC
    If there are many files that will not have a match, the grep pass might actually make things faster, because you will save on I/O writes. The perl -i will always write to a new (temporary) file, even if the content turns out to be the same - after all, Perl can't know in advance that there isn't a match. So without the filter you will do more I/O writes, and you will churn through your OS's buffer cache twice as fast.
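
    If you do go the grep route, a sketch along these lines (untested, and assuming GNU find, grep, and xargs for the NUL-separated handling; the path is just an example) hands perl only the files that actually contain the pattern, so unmatched files are never rewritten:

        # only files that actually match /find/i reach perl, so nothing else is rewritten
        find /var/www -name "*.html" -print0 \
            | xargs -0 -r grep -liZ 'find' \
            | xargs -0 -r perl -pi -e 's/find/replace/gi'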

    It's hard to say whether a grep is worthwhile. Without knowing more about the content of the files, I won't dismiss it.

    Abigail