in reply to Optimizing slow restructuring of delimited files

You didn't give much to go on, but my first hypothesis would be disk I/O. If the file is small enough to fit in memory, you could try slurping the entire file into an array and parsing that. Something like
my @filetoread = <INFILE>;    # read in the whole file at once
my $linestooutput = '';       # place to save output until the end
foreach (@filetoread) {
    my @Line = split /\s+/;   # split defaults to $_
    $linestooutput .= join("\t", @Line[@ColumnNumbers]) . "\n";
}
print OUTFILE $linestooutput; # write output all at once

# or even shorter
my @filetoread = <INFILE>;
my $linestooutput = '';
$linestooutput .= join("\t", (split /\s+/)[@ColumnNumbers]) . "\n" foreach (@filetoread);
print OUTFILE $linestooutput;
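If you do go the slurp route, here's a sketch of the whole thing under strict/warnings, with three-arg open and lexical filehandles. The filenames and column numbers are placeholders, not anything from your post:

use strict;
use warnings;

my @ColumnNumbers = (0, 2, 4);   # hypothetical columns to keep

open my $in,  '<', 'input.txt'  or die "Can't read input.txt: $!";
open my $out, '>', 'output.txt' or die "Can't write output.txt: $!";

my @filetoread = <$in>;          # slurp the whole file at once
my $linestooutput = '';
for (@filetoread) {
    $linestooutput .= join("\t", (split /\s+/)[@ColumnNumbers]) . "\n";
}
print $out $linestooutput;       # one big write instead of many small ones

close $in;
close $out;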
I'm not sure about the speed impact of interpolated array slices. I don't imagine that's the issue, but you could try something like this
while (<INFILE>) {
    # get the current line and split it into its columns
    my @Line = split /\s+/, $_;
    # print the selected columns to the output
    my $outline = '';
    $outline .= $Line[$_] . "\t" foreach @ColumnNumbers;
    print OUTFILE $outline, "\n";
}
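Rather than guessing about the slice overhead, you could measure it with the core Benchmark module. A minimal sketch, assuming the data fits in memory (the sample lines and column numbers here are made up for illustration):

use strict;
use warnings;
use Benchmark qw(cmpthese);

my @ColumnNumbers = (0, 2, 4);            # hypothetical column picks
my @lines = ("a b c d e\n") x 10_000;     # hypothetical sample data

cmpthese(-3, {                            # run each for >= 3 CPU seconds
    slice => sub {
        my $out = '';
        $out .= join("\t", (split /\s+/)[@ColumnNumbers]) . "\n" for @lines;
    },
    loop  => sub {
        my $out = '';
        for (@lines) {
            my @Line = split /\s+/;
            my $line = '';
            $line .= $Line[$_] . "\t" for @ColumnNumbers;
            $out .= $line . "\n";
        }
    },
});

cmpthese() prints a rate-comparison table, so you can see at a glance whether the slice or the explicit loop wins on data shaped like yours.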


PJ
use strict; use warnings; use diagnostics;