in reply to Changing data in alot of files
This isn't fancy, but it will work.
This version is not much different from yours. It reads and processes the lines one at a time instead of slurping them into an array, which saves memory by using a small amount and reusing it. Efficient use of memory often improves speed in perl. Doing it that way requires a second filehandle to write the lines to, so I open a temporary file called "$filename!" to hold them. Once a file is done, rename does the file replacement very efficiently.

    for (@files) {
        # Read from the original file, write to a temporary "$_!" beside it.
        open OLDDATA, "< $_"  or warn 'File does not open: ', $! and next;
        open NEWDATA, "> $_!" or warn 'File not open: ', $! and next;

        # Apply the substitutions one line at a time.
        while (<OLDDATA>) {
            s/aaaaa/FFFFFFF/gi;
            s/bbbbb/EEEEEEE/gi;
            s/cccccc/GGGGGGG/gi;
            print NEWDATA $_;
        }
        close NEWDATA or warn $! and next;
        close OLDDATA or warn $! and next;

        # Replace the original file with the edited copy.
        rename "$_!", $_;
    }
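For completeness, here is a minimal sketch of the same approach with @files populated up front and written with three-argument open and lexical filehandles. The glob pattern ('*.txt') and the warning messages are my own assumptions, not part of the code above; any list of filenames will do.

    # Hypothetical: gather every .txt file in the current directory.
    my @files = glob '*.txt';

    for my $file (@files) {
        open my $old, '<', $file    or warn "Can't read $file: $!\n"   and next;
        open my $new, '>', "$file!" or warn "Can't write $file!: $!\n" and next;
        while (<$old>) {
            s/aaaaa/FFFFFFF/gi;
            s/bbbbb/EEEEEEE/gi;
            s/cccccc/GGGGGGG/gi;
            print $new $_;
        }
        close $new or warn "$!\n" and next;
        close $old or warn "$!\n" and next;
        rename "$file!", $file or warn "Can't rename $file!: $!\n";
    }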
Take care that "$filename!" doesn't already exist. You probably won't have to worry about that if these files have some systematic naming, but if it's a problem, the -e file test will help.
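For example, one way to use that test inside the loop, before opening NEWDATA; skipping with a warning is just one possible choice here:

    # Skip any file whose temporary name is already taken.
    if ( -e "$_!" ) {
        warn "Temporary file $_! already exists, skipping $_\n";
        next;
    }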
Update: The DATA filehandle is special in perl, reserved for inlined data after __DATA__, so I changed the name.
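Illustration only, to show why the name is best avoided for your own filehandles: perl opens DATA itself, pointing at whatever follows __DATA__ in the script.

    # perl provides DATA automatically; it reads the text after __DATA__.
    while (my $line = <DATA>) {
        print $line;
    }

    __DATA__
    these lines are read through the built-in DATA filehandle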
After Compline,
Zaxo