Sure, you can open them in a loop. You could put your filehandles in an array, and then loop through them as you print each line. As long as your system will let a process have that many open files, something like this should work (untested). It's not at all flexible, but since you seem to know that all your files have the same number of lines and the same format, it doesn't need to be. Also, if you use a CSV module, you may want to make an array of object references rather than bare filehandles, but the looping concept would be the same (see the sketch after the code below).
my @fds;
for (1..200) {
    # open each data file and stash its handle in the array
    # (note: "open my $fds[$_]" is a syntax error; open autovivifies the array element)
    open $fds[$_], '<', "datafile$_" or die "datafile$_: $!";
}

for my $ln (1..948) {
    print "$ln has ";
    for my $fdn (1..200) {
        my $line   = readline $fds[$fdn];   # <$fds[$fdn]> would be parsed as glob(), so use readline
        my $field3 = get_third_field_using_whatever_csv_method($line);
        print $field3;
        print ' ' unless $fdn == 200;
    }
    print "\n";
}
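If you do go the CSV-module route, here's a minimal sketch of what that might look like with Text::CSV. The module choice, the datafile1..datafile200 names, and the 948-line count are just assumptions carried over from the example above; $csv->getline does the field splitting that the placeholder function stands in for.

use strict;
use warnings;
use Text::CSV;

# one parser object shared across all files (they all have the same format)
my $csv = Text::CSV->new({ binary => 1 }) or die Text::CSV->error_diag;

my @fhs;
for my $i (1..200) {
    open $fhs[$i], '<', "datafile$i" or die "datafile$i: $!";
}

for my $ln (1..948) {
    my @thirds;
    for my $i (1..200) {
        my $row = $csv->getline($fhs[$i]);   # arrayref of this line's fields
        push @thirds, $row->[2];             # third field (0-based index 2)
    }
    print "$ln has @thirds\n";
}

You could also push one reader object per file into the array instead of a plain filehandle; either way the nested loop stays the same.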
Aaron B.
My Woefully Neglected Blog, where I occasionally mention Perl.
In reply to Re: opening many files by aaron_baugher
in thread opening many files by wanttoprogram