I would probably avoid storing each file in an individual array, and instead process the files sequentially, changing the code to something like this (incomplete and obviously untested):

    use strict;
    use warnings;

    my %data;
    for my $file (qw/file_1.txt file_2.txt file_3.txt ... file_n.txt/) {
        open my $FH, "<", $file or die "could not open $file: $!";
        while (<$FH>) {
            chomp;
            my ($sz, $sum, @f) = split /\s+/;
            push @{ $data{$sz} }, $_ for @f;
        }
        close $FH;
    }
    # ...

Or possibly, if the files are passed as arguments to the script:

    use strict;
    use warnings;

    my %data;
    for my $file (@ARGV) {
        open my $FH, "<", $file or die "could not open $file: $!";
        while (<$FH>) {
            # ...
        }
        # ...
    }
    # ...

or even (still assuming the files are passed as arguments):

    use strict;
    use warnings;
    use autodie;

    my %data;
    while (<>) {
        chomp;
        # ...
    }
    # ...

This latest solution can be used even if the argument passed to the script is not a list of files but, say, the directory where they are stored:

    use strict;
    use warnings;
    use autodie;

    my $stat_dir = shift;
    my %data;
    {
        local @ARGV = glob "$stat_dir/*.*";
        while (<>) {
            chomp;
            # ...
        }
        # ...
    }

Admittedly, these last solutions look less robust, and one might want to avoid them in production code. But are they really less robust? If glob returns a list of files, you basically know the files exist; about the only check still missing is for read permissions, and that's no big deal.
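To make the last idiom concrete, here is a minimal, runnable sketch. The record format (a size, a checksum, then one or more file names) and the sample data are made-up assumptions for illustration, and File::Temp is used only so the example can create its own throw-away input files:

```perl
use strict;
use warnings;
use File::Temp qw(tempdir);

# Create a throw-away directory with two sample input files
# (hypothetical data: size, checksum, then one or more file names).
my $dir = tempdir(CLEANUP => 1);
open my $out1, '>', "$dir/file_1.txt" or die "could not open file_1.txt: $!";
print $out1 "100 aaa foo bar\n";
close $out1;
open my $out2, '>', "$dir/file_2.txt" or die "could not open file_2.txt: $!";
print $out2 "100 bbb baz\n200 ccc qux\n";
close $out2;

my %data;
{
    # Localizing @ARGV makes the diamond operator read every
    # globbed file in turn, opening and closing them for us.
    local @ARGV = glob "$dir/*.txt";
    while (<>) {
        chomp;
        my ($sz, $sum, @f) = split /\s+/;
        push @{ $data{$sz} }, $_ for @f;   # group names by size
    }
}

print "$_: @{ $data{$_} }\n" for sort keys %data;
```

Because @ARGV is localized, the script's real argument list is restored when the block exits, and no explicit open/close is needed for the input files.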
In reply to Re^3: Appending arrays into the rows of a 2 dimension array
by Laurent_R
in thread Appending arrays into the rows of a 2 dimension array
by SoftwareGoddess