In reply to: append to a line arrays read from N different input files

Here is a quick solution to your problem: it loads the data from every file into one 3D array, then writes the data back out to a separate file for each variable. See below.

use strict;
use warnings;

my @file = qw(file1.txt file2.txt file3.txt);

# Read all files into one big 3D structure:
# $data->[$var][$linenum][$i] holds the value of variable $var
# on line $linenum of file number $i.
my $data = [];
for my $i (0 .. $#file) {
    open my $in, '<', $file[$i] or die "Cannot open $file[$i]: $!";
    while (<$in>) {
        chomp;
        my @line    = split /\s+/;
        my $linenum = shift @line;
        $data->[$_][$linenum][$i] = $line[$_] for 0 .. $#line;
    }
    close $in;
}

# Write one output file per variable.
for my $n (0 .. $#$data) {
    open my $out, '>', "var$n.txt" or die "Cannot open var$n.txt: $!";
    for my $linenum (0 .. $#{ $data->[$n] }) {
        print $out join("\t", $linenum, @{ $data->[$n][$linenum] }), "\n";
    }
    close $out;
}

I fed the above code file1.txt, file2.txt, and file3.txt, all identical to the file below:

0 0 1 2 3 4 5
1 0 1 2 3 4 5
2 0 1 2 3 4 5

This produced the following files:

var0.txt

0 0 0 0
1 0 0 0
2 0 0 0

var1.txt

0 1 1 1
1 1 1 1
2 1 1 1

var2.txt

0 2 2 2
1 2 2 2
2 2 2 2

var3.txt

0 3 3 3
1 3 3 3
2 3 3 3

var4.txt

0 4 4 4
1 4 4 4
2 4 4 4

var5.txt

0 5 5 5
1 5 5 5
2 5 5 5

There is probably a more memory-efficient method, perhaps using PDL. However, unless your files are extremely large, you should be fine.
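If memory does become a concern, a rough sketch of the PDL route might look like the following. This is only an outline, assuming whitespace-separated columns and using rcols/wcols from PDL::IO::Misc; it reads each file's columns as piddles rather than nested Perl arrays:

use strict;
use warnings;
use PDL;
use PDL::IO::Misc;    # provides rcols and wcols

my @file = qw(file1.txt file2.txt file3.txt);

# rcols returns one piddle per column; the first column is the line number.
my @data;
for my $f (@file) {
    my ($linenum, @cols) = rcols $f;
    push @data, \@cols;    # $data[$fileindex][$var] is a column piddle
}

# For each variable, write the line numbers plus that variable's column
# from every file, side by side, one row per line.
for my $n (0 .. $#{ $data[0] }) {
    my $rows = $data[0][$n]->nelem;
    wcols sequence($rows), (map { $_->[$n] } @data), "var$n.txt";
}

Unlike the array-of-arrays version, this keeps the data in compact numeric storage, which matters once the files run to millions of lines.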