While I have been using Perl for a few years now, I have never quite come upon anything like this. I need to read in a text file containing a list of users (or maybe a comma separated file with users and a kind of access to be granted them) and then append the information to several output text files.
I am looking for the most efficient way to accomplish this. My code which currently just updates one text file is the base of the application.
The following reads in my text file and puts the lines into an array:

```perl
# input file
open(USERS, $inFile) or die("Cannot Open $inFile\n");
while ($line = <USERS>) {
    chomp($line);
    #print $line;   # debug print of var
    push(@userNames, $line);
}
close(USERS);
```

and the output file:

```perl
open(OUT, ">> $outFile") or die("Cannot Open $outFile\n");  # append to this file
foreach my $USER (@userNames) {
    print OUT "$USER\n";
}
close(OUT);
```
Note: USERS and OUT are ordinary bareword filehandles in Perl.
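As an aside, modern Perl favors lexical filehandles and three-arg open over bareword handles like USERS, since lexical handles close automatically when they go out of scope and can't collide with other package-level names. A minimal, self-contained sketch of the read loop in that style (the file name here is made up for illustration; substitute your own $inFile):

```perl
use strict;
use warnings;

# Hypothetical file name for this sketch; substitute your own $inFile.
my $inFile = 'users.txt';

# Create a small sample input so the example runs on its own.
open(my $seed_fh, '>', $inFile) or die "Cannot open $inFile: $!\n";
print $seed_fh "alice\nbob\n";
close($seed_fh);

# Read it back with a lexical filehandle and three-arg open.
my @userNames;
open(my $in_fh, '<', $inFile) or die "Cannot open $inFile: $!\n";
while (my $line = <$in_fh>) {
    chomp $line;
    push @userNames, $line;
}
close($in_fh);

print "@userNames\n";
```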
Let's say, though, that we want to update 8 different files with the user data (this is a proprietary application with several parts to it, so there are several 'access files').
So it would be very nice if someone could point me in the right direction. My thought is to write a sub that opens an output file and appends the data, then call it in a loop for however many files need updating, passing each filename into the sub. But is that the most efficient way to do this? Thanks in advance for your help!
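For what it's worth, the sub-plus-loop idea can be sketched roughly like this (the file names and sub name are invented for illustration, not from the original post). Opening and closing eight files in sequence is cheap compared to the I/O itself, so efficiency is unlikely to be a concern unless the user list is huge:

```perl
use strict;
use warnings;

# Hypothetical access-file names; replace with your eight real files.
my @outFiles  = ('access_a.txt', 'access_b.txt', 'access_c.txt');
my @userNames = ('alice', 'bob');

# Append every user name to a single output file.
sub append_users {
    my ($outFile, @names) = @_;
    open(my $fh, '>>', $outFile) or die "Cannot open $outFile: $!\n";
    print $fh "$_\n" for @names;
    close($fh);
}

# Call the sub once per file that needs updating.
append_users($_, @userNames) for @outFiles;
```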
In reply to updating multiple text files by jmneedhamco