jmneedhamco has asked for the wisdom of the Perl Monks concerning the following question:

Hi Monks:

While I have been using Perl for a few years now, I have never quite come upon anything like this. I need to read in a text file containing a list of users (or maybe a comma separated file with users and a kind of access to be granted them) and then append the information to several output text files.

I am looking for the most efficient way to accomplish this. My code which currently just updates one text file is the base of the application.

The following reads in my text file and puts the lines into an array:
# input file
open(USERS, $inFile) or die("Cannot Open $inFile\n");
while ($line = <USERS>) {
    chomp($line);
    #print $line; # debug print of var
    push(@userNames, $line);
}
close(USERS);
and the output file:
open(OUT, ">> $outFile") or die("Cannot Open $outFile\n"); # append to this file
foreach my $USER (@userNames) {
    print OUT "$USER\n";
}
close(OUT);

Note: These are the standard filehandles in Perl.

Let's say, though, that we want to update 8 different files with the user data (these belong to a proprietary application that has several parts, so there are several 'access files').

So it would be very nice if someone could point me in the right direction. My thought is to make a sub that opens the output file, then call it in a loop for however many files need to be updated, passing in the filename each time. But... is that the most efficient way to do this? Thanks for your help in advance!
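A minimal sketch of that sub-and-loop idea (the file names and user list below are invented for illustration):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical output files and user names, just for this sketch.
my @outFiles  = ('access1.txt', 'access2.txt', 'access3.txt');
my @userNames = ('alice', 'bob');

# Call the sub once per output file, passing the filename in.
append_users($_, \@userNames) for @outFiles;

sub append_users {
    my ($file, $users) = @_;
    open(my $out, '>>', $file) or die "Cannot open $file: $!\n";
    print $out "$_\n" for @$users;
    close($out);
}
```

For eight files, opening and closing each one once per run is cheap; efficiency is not really a concern at this scale, so the sub-in-a-loop approach is perfectly reasonable.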

Replies are listed 'Best First'.
Re: updating multiple text files
by Anonymous Monk on Mar 10, 2015 at 15:19 UTC
    And that is why people recommend using lexical filehandles. It is sad that global ones are considered 'standard' by some.
use strict;
use warnings;
use autodie;

open my $users, '<', 'users.txt';

my @files;
for my $file ( qw( file1 file2 file3 ) ) {
    open my $fh, '>>', $file;
    push @files, $fh;
}

while ( my $line = <$users> ) {
    print $_ $line for @files;
}

      It was what was taught by firstPerlers, which makes it a de facto standard. (I note here that this does not prevent it from being sad; it merely shows a valid reason why it might be seen as a standard by some.)

      If memory serves, one used to have to turn off strictures in order to avoid global handles, a workaround for which was given in one of my earliest PerlMonks SoPW posts. It was there that I learned stricture control was lexically scoped. :-)
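      A small demonstration of that lexical scoping (the variable name is made up here):

```perl
use strict;
use warnings;

{
    no strict 'vars';    # strictures relaxed only inside this block
    $loose = 'ok';       # undeclared package variable: allowed here
}

# Outside the block, 'use strict' is back in force, so the same
# variable must be fully qualified to be referenced:
print "$main::loose\n";
```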

      In the end, some folks will move forward with new features of a language and others will not; it is driven by needs and values.

      For example, I still use the &subName(); form for calling non-object-oriented subroutines. There are many who do not appreciate that calling sequence, as it is archaic v4 syntax; but it adds clarity for these old eyes and, barring a coding standard to the contrary, that practice will continue -- sad or not -- in this corner of the world.
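      For anyone who hasn't met the ampersand form, both calls below are equivalent when parentheses are used (the sub name is invented for this example):

```perl
use strict;
use warnings;

sub greet { return "hello, $_[0]" }

my $old = &greet('monk');   # Perl 4 style ampersand call
my $new = greet('monk');    # modern style; same result here

# Caveat: '&greet;' with no parentheses would re-use the caller's
# current @_, which the plain 'greet;' form does not do.
print "$old\n$new\n";
```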

      :-)

Re: updating multiple text files
by crusty_collins (Friar) on Mar 10, 2015 at 15:18 UTC
    What I would suggest is making an array with a list of files. (Code untested.)
my @files = qw( file1 file2 );
my @userNames = qw( chris mike );

&WriteMyFiles(\@userNames, \@files);

sub WriteMyFiles {
    my $userNames = shift(@_);
    my $files     = shift(@_);
    foreach my $file ( @$files ) {
        open(OUT, ">> $file") or die("Cannot Open $file\n");
        foreach my $USER ( @$userNames ) {
            print OUT "$USER\n";
        }
        close(OUT);
    }
}
Re: updating multiple text files
by GotToBTru (Prior) on Mar 10, 2015 at 15:14 UTC