arrow has asked for the wisdom of the Perl Monks concerning the following question:
I need to go through a large number of Perl files to change certain paths and URLs contained in them. Unfortunately, I'm not at my home computer, so I don't have the script available right now, but I'll try to post it soon.
Basically, I use readdir to capture all the Perl files in the directory. I then open each file (using a foreach loop over the filenames, feeding each one into an open call). Then I go through every line of the opened file, searching for a specific path or URL template and replacing it with the new one. Then, before going on to the next file, the script writes the edited lines back out. When I tried this, it merely created a blank copy of every Perl file with .txt at the end.
I haven't had much time to play with it, but I was wondering if there is a better, more efficient way of going through all the Perl files and replacing a certain part of a line... Also, don't crucify me for not having the script; I would have posted it, but I've been grounded (don't laugh) and can't use my home computer for a while. Anyway, thanks for any help you may be able to give me...
Just Another Perl Wannabe
Re: Changing lots of files...
by skyknight (Hermit) on Jul 24, 2003 at 03:47 UTC
You're probably doing a lot more work in Perl than you really have to. Try leveraging the power of command-line tools to your advantage. I bet you could do what you want with a single command line.
find /path/to/dir -name '*.pm' | xargs perl -i.bak -n -e 's/original/new/; print;'
The find command is going to spew out a list of files. The xargs command is going to invoke perl with the switches specified, and drop the list of files on the end of the command line. For perl: the -i switch specifies in place editing, and copies the originals to files with the .bak extension added; -e specifies the code you're going to execute; -n specifies an implicit while (<>) { ... } around the stuff specified by -e.
I am constantly amazed by the incredible power that I can wield with a single, well-constructed command line. As an addendum, though, your initial statement, "I need to go through a large number of Perl files to change certain paths and URLs contained in them", makes me curious... Could you have created a common file of constants that all of your scripts included? It sounds like you might have committed the cardinal sin of hard-coding constants all over your code, when a single hard coding and then repeated reference thereto would have been a far more maintainable solution.
or even
... perl -i.bak -pe 's/original/new/'
see perldoc perlrun
-- Hofmator
Indeed, you have done me one better. I shall have to remember that.
s(/old/path)(/new/path)g
to avoid the LTS (leaning toothpick syndrome)
I am sure that that would work, but I wouldn't know; my scripts are all for the internet, so I can only interact with them through a browser.
Just Another Perl Wannabe
Re: Changing lots of files...
by graff (Chancellor) on Jul 24, 2003 at 03:42 UTC
Granting that none of your perl script files is ridiculously huge, something like this should have worked (though I haven't tested this example):
opendir(D, $path);
my @perlfiles = grep /\w+\.pl$/, readdir(D);  # (or whatever works for you)
closedir D;

for my $file (@perlfiles) {
    my $name = "$path/$file";
    unless (open(P, $name)) {
        warn "what happened to $name? $!";
        next;
    }
    my @script = <P>;
    close P;

    my $newsize = 0;
    my $badedit = 0;
    for (@script) {
        my $oldlen = length();
        if ($oldlen) {
            # do your edits here, and:
            my $newlen = length();
            unless ($newlen) {  # maybe you would check other stuff...
                warn "Oops -- bad edit for $name; check $name.new\n";
                $badedit++;
            }
            $newsize += $newlen;
        }
    }

    unless (open(P, ">$name.new")) {
        die "can't seem to open a new version of $name: $!";
    }
    print P for (@script);
    close P;

    if (-s "$name.new" == $newsize and not $badedit) {
        rename "$name.new", $name;
    } else {
        die "couldn't write a complete edited version of $name\n";
    }
}
There might be a few other ways to do it, but for automatic edits of my precious perl code, this is the sort of approach I would prefer. (This could be done more compactly, but again, it's worth typing a little more, just to be clear and simple.)
If you didn't close the input file before writing the edited output, or if you didn't use at least some amount of error checking, etc., well, live and learn...
Re: Changing lots of files...
by Hofmator (Curate) on Jul 24, 2003 at 07:12 UTC
As a side note (it's probably not necessary in your case): for finding files in general, especially if they are spread across subdirectories and sub-subdirectories in a tree, have a look at File::Find.
-- Hofmator
I am aware of that module, and have used it occasionally, but going to the encumbrance of importing a module and specifying callbacks spoils the fun of making the simplest one-liner solution possible. :-)
Re: Changing lots of files...(TIMTOWTDI)
by barrd (Canon) on Jul 24, 2003 at 15:20 UTC
Hi arrow,
You ask:
"but I was wondering if there is a better, more efficient way of going through all the Perl files and replacing a certain part of a line"
The answer is yes... don't go to the bother of going through all your files in the first place ;). What I mean by this is: create a module for yourself that acts like a config file and basically allows you to change any (and all) variables from one location. Here are the steps:
- Create a new text file (call it globalVars.pm, or whatever you like... but it must end in .pm)
- Then type and edit the following to your needs:
package globalVars; # <- the name you chose
use Exporter;
our @ISA = qw(Exporter);
our @EXPORT = qw(
    $PATH_FOO
    $URL_FOO
);

# Some very good descriptive text about a path
our $PATH_FOO = "/var/www/my/site/";

# Some very good descriptive text about a url
our $URL_FOO = "http://www.foo.net/bar/baz.html";

1; # a module must end with a true value
- Now you must work out where on your system to put this file. If from the command line you type perl -e 'print join("\n", @INC);' you will get a listing of all the directories Perl looks in for external modules. Choose one and move the file there(1).
- Here's the hard part :( you must now, using the techniques outlined by the kind monks above, replace all the hard-coded paths and URLs within your scripts with the variable name/s you entered into your config file.
- Then add use globalVars; to the header of all your scripts.
e.g.
#!/usr/bin/perl -w
use strict;
use globalVars;
....
Once you have done that (changed all the hard-coded links into variables and added use globalVars; to all your scripts), it's a simple matter of editing that one text file to globally change all your variables.
It's just another TIMTOWTDI, and while in the short term it may seem like a lot of work, I'm sure you can see the advantage: the next time you need to do this kind of editing, you change one file and all your scripts are magically updated.
HTH ~ (well, it's how I do it anyway... :)
(1) Which one you choose is up to you; everyone seems to have their favourite, and you can always move it later.
Re: Changing lots of files...
by arrow (Friar) on Aug 07, 2003 at 03:05 UTC
I decided in the end to just hard-code the path and URL in the first script, and since my scripts are all CGI scripts, I just put the URL and path in a hidden field... I realize that it's not the best solution, but it works for me... Anyway, thanks for all your suggestions. I'm probably gonna copy graff's code for future use, so my question was not entirely useless... or not
Just Another Perl Wannabe