in reply to Large scale search and replace with perl -i

You can combine the two approaches using xargs:

find . -name '*.html' -type f -print0 | \
    xargs -0 -n 50 perl -pi -e 's/foo/bar/g'

This will use find to list all the files you want, and xargs to pass them to your perl script. By specifying the -n 50 option to xargs, each invocation of perl will be passed at most 50 filenames to process (if you still hit the argument limit because your paths are really long, lower the number). I haven't benchmarked it, but I suspect that under most circumstances the overhead of using grep first to find only the files that contain the thing you want to replace will outweigh any savings over just running the replacement on every file you find.
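For comparison, the grep-first variant would look roughly like this (just a sketch; the -Z option, which makes grep -l print NUL-terminated filenames so the second xargs -0 handles awkward paths safely, assumes GNU grep):

find . -name '*.html' -type f -print0 | \
    xargs -0 grep -lZ 'foo' | \
    xargs -0 perl -pi -e 's/foo/bar/g'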


We're not surrounded, we're in a target-rich environment!

Re: Large scale search and replace with perl -i
by Abigail-II (Bishop) on Apr 14, 2003 at 19:02 UTC
    If your xargs is any good, you don't have to use the -n option. xargs will know the limits of your OS, and create argument lists that neither have too many arguments nor let the flattened argument list exceed your OS's limit.
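
    In that case the pipeline reduces to something like this (same substitution as above, just without -n):

    find . -name '*.html' -type f -print0 | xargs -0 perl -pi -e 's/foo/bar/g'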

    Abigail