In reply to: How do I process multiple files in parallel?

If you want to read x files in parallel, you have to open all of them and read from each one in turn:
use strict;
use warnings;
use IO::Handle;

my @files   = qw(t1.txt t2.txt t3.txt);
my @handles = ();

# open a handle for every file
for (@files) {
    push @handles, IO::Handle->new;
    open $handles[-1], "<", $_ or die "cannot open $_!\n";
}

# read one line from each file per pass and print the extracted values side by side
while (1) {
    my @data;
    for my $handle (@handles) {
        my $line = <$handle>;
        exit unless defined $line;   # stop as soon as any file hits EOF
        chomp $line;
        $line =~ /= ([^=]+)/;        # assumes lines of the form "key = value"
        push @data, $1;
    }
    print "@data\n";
}
This script exits cleanly as soon as one of the files reaches EOF.
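If you want to try the loop without creating real files, you can open in-memory filehandles on strings. The sample "key = value" lines here are only my guess at your input format:

use strict;
use warnings;

# hypothetical sample data: each "file" holds key = value lines
my @contents = ("a = 1\na = 4\n", "b = 2\nb = 5\n", "c = 3\nc = 6\n");

my @handles;
for my $content (@contents) {
    open my $fh, "<", \$content or die "cannot open in-memory file!\n";
    push @handles, $fh;
}

while (1) {
    my @data;
    for my $handle (@handles) {
        my $line = <$handle>;
        exit unless defined $line;   # stop at the first EOF
        chomp $line;
        push @data, $1 if $line =~ /= ([^=]+)/;
    }
    print "@data\n";   # prints "1 2 3", then "4 5 6", then exits
}

With that sample data the script prints one line per pass, each holding one value from every file.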

Update: I just noticed you need the data arranged vertically, one array per file:
use strict;
use warnings;
use IO::Handle;
use Data::Dumper;

my @files   = qw(t1.txt t2.txt t3.txt);
my @data    = ([], [], []);   # one sub-array per file
my @handles = ();

for (@files) {
    push @handles, IO::Handle->new;
    open $handles[-1], "<", $_ or die "cannot open $_!\n";
}

# collect the values column-wise: $data[$i] holds everything from file $i
EOF: while (1) {
    my $i = 0;
    for my $handle (@handles) {
        my $line = <$handle>;
        last EOF unless defined $line;   # leave both loops at the first EOF
        chomp $line;
        $line =~ /= ([^=]+)/;
        push @{ $data[$i++] }, $1;
    }
}
print Dumper(\@data);
The code works with any number of files and any number of lines per file.
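On Perl 5.6 or later you do not strictly need IO::Handle: open on a lexical scalar autovivifies a filehandle for you. A minimal variant of the same idea, assuming the same hypothetical t1.txt/t2.txt/t3.txt input, could look like this:

use strict;
use warnings;
use Data::Dumper;

my @files = qw(t1.txt t2.txt t3.txt);
my @data;
my @handles;

# lexical filehandles: open creates them, no IO::Handle needed
for my $file (@files) {
    open my $fh, "<", $file or die "cannot open $file!\n";
    push @handles, $fh;
}

EOF: while (1) {
    my $i = 0;
    for my $handle (@handles) {
        my $line = <$handle>;
        last EOF unless defined $line;
        chomp $line;
        push @{ $data[$i++] }, $1 if $line =~ /= ([^=]+)/;
    }
}
print Dumper(\@data);

The only behavioural difference is that lines without a "=" are skipped instead of pushing a stale $1.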


holli, /regexed monk/