in reply to rearranging an array of arrays

There is a major problem here. If you have 30 million items that take, say, 40 bytes of memory each (an empty Perl string takes about 28 bytes before you put anything in it), you're talking about 1.2 GB of RAM. If those items are organized into 2 million rows, the additional per-row array overhead could push you past the 2 GB addressing limit for 32-bit code. I would therefore strongly suggest not putting it all into RAM. Instead, insert the data as you read it, with something like this:
# Assume that %file has the filename for each field in your
# table and @fields has the list of field names.
my @fhs;
for my $field (@fields) {
    open(my $fh, "<", $file{$field})
        or die "Can't open '$file{$field}': $!";
    push @fhs, $fh;
}

while (1) {
    my $did_read;
    my @data;
    for my $fh (@fhs) {
        my $rec = <$fh>;
        if (defined($rec)) {
            $did_read++;
            push @data, $rec;
        } else {
            # This field's file ran out early; pad the row.
            push @data, "";
        }
    }
    if ($did_read) {
        insert_data(@data);
    } else {
        # Came to the end of all data streams
        last;
    }
}
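To see the round-robin loop work end to end, here is a self-contained sketch that runs it against two small temporary "field files" of unequal length, so the padding branch gets exercised. The `insert_data` here is a hypothetical stand-in (it just collects rows in memory for demonstration); in your case it would do the actual database insert.

```perl
use strict;
use warnings;
use File::Temp qw(tempfile);

# Hypothetical stand-in for insert_data(): collect rows instead of
# inserting into a database.
my @rows;
sub insert_data { push @rows, [@_] }

# Create two temporary field files of unequal length.
my @fhs;
for my $lines (["a1", "a2", "a3"], ["b1", "b2"]) {
    my ($fh, $name) = tempfile(UNLINK => 1);
    print $fh "$_\n" for @$lines;
    close $fh;
    open my $in, "<", $name or die "Can't open '$name': $!";
    push @fhs, $in;
}

# Same round-robin loop as above: one record per field per pass.
while (1) {
    my $did_read;
    my @data;
    for my $fh (@fhs) {
        my $rec = <$fh>;
        if (defined($rec)) {
            chomp $rec;
            $did_read++;
            push @data, $rec;
        } else {
            push @data, "";   # shorter file ran out; pad the row
        }
    }
    if ($did_read) {
        insert_data(@data);
    } else {
        last;                 # all streams exhausted
    }
}

print scalar(@rows), " rows\n";   # 3 rows: the third has "" for field b
```

At no point does this hold more than one row in memory (beyond what the stand-in accumulates for the demo), which is the whole point: memory use stays flat no matter how many rows the files contain.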