From your meagre description of the files, I assume that each file's data is keyed by date & time?
If so, then rather than loading all the data into arrays, accumulate it in a hash:
    my %data;

    open FILE, '<', 'gravity' or die;
    while( <FILE> ) {
        my @fields = split ' ', $_;
        $data{ "@fields[ 0, 1 ]" } = join "\t", @fields;
    }
    close FILE;

    open FILE, '<', 'magnetics' or die;
    while( <FILE> ) {
        my @fields = split ' ', $_;
        ## Pad the hash if we didn't see this date/time in the gravity file
        $data{ "@fields[ 0, 1 ]" } //= join "\t", @fields[ 0, 1 ], ('n/a') x 3;
        $data{ "@fields[ 0, 1 ]" } .= "\t" . join "\t", @fields[ 2 .. $#fields ];
    }
    close FILE;

    open FILE, '<', 'bathymetry' or die;
    while( <FILE> ) {
        my @fields = split ' ', $_;
        ## Pad the hash if we didn't see this date/time before
        ## (How many fields added by the magnetics?)
        $data{ "@fields[ 0, 1 ]" } //= join "\t", @fields[ 0, 1 ], ('n/a') x ???;
        $data{ "@fields[ 0, 1 ]" } .= "\t" . join "\t", @fields[ 2 .. $#fields ];
    }
    close FILE;

    for my $key ( sort keys %data ) {
        print $data{ $key }, "\n";
    }
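If you'd rather not hardcode the pad width for each file (the ??? above), one way is to loop over the files and keep a running count of the data columns seen so far. This is only a sketch of that idea, not tested against your data: `merge_files` is a made-up helper name, and it assumes every line of a given file has the same number of whitespace-separated columns.

```perl
use strict;
use warnings;

# Merge any number of date/time-keyed files, tracking the pad width
# automatically instead of hardcoding it per file. Assumes each file
# has a uniform column count. merge_files is a hypothetical name.
sub merge_files {
    my %data;
    my $cols = 0;    # data columns accumulated from earlier files
    for my $file ( @_ ) {
        open my $fh, '<', $file or die "$file: $!";
        my $added = 0;
        while ( <$fh> ) {
            my @fields = split ' ', $_;
            my $key = "@fields[ 0, 1 ]";
            ## Pad if this date/time wasn't seen in an earlier file
            $data{ $key } //= join "\t", @fields[ 0, 1 ], ('n/a') x $cols;
            $data{ $key } .= "\t" . join "\t", @fields[ 2 .. $#fields ];
            $added = $#fields - 1;
        }
        close $fh;
        $cols += $added;
    }
    return %data;
}

# Usage (with the three files from the post):
#   my %data = merge_files( 'gravity', 'magnetics', 'bathymetry' );
#   print $data{ $_ }, "\n" for sort keys %data;
```

Note this keeps the same limitation as the version above: a date/time present in an early file but missing from a later one won't get 'n/a' padding for the later file's columns.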
Depending upon your date & time formats, you might need a more sophisticated sort.
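For example, if the date field happened to be in DD/MM/YYYY form (an assumption for illustration only; I don't know your actual format), a plain string sort would mis-order the keys, and you could sort via a normalized YYYYMMDD key with a Schwartzian transform:

```perl
use strict;
use warnings;

# Illustrative only: DD/MM/YYYY keys don't sort chronologically as
# strings, so map each key to a sortable 'YYYYMMDD time' form first.
my %data = (
    '02/01/2011 12:00' => "row b",
    '01/02/2011 12:00' => "row c",
    '01/01/2011 12:00' => "row a",
);

my @ordered =
    map  { $_->[1] }
    sort { $a->[0] cmp $b->[0] }
    map  {
        my ( $d, $m, $y, $t ) = $_ =~ m{^(\d+)/(\d+)/(\d+)\s+(\S+)};
        [ sprintf( '%04d%02d%02d %s', $y, $m, $d, $t ), $_ ];
    } keys %data;

print $data{ $_ }, "\n" for @ordered;
```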
In reply to Re: Joining separate data files to make one.
by BrowserUk
in thread Joining separate data files to make one.
by msexton