sesemin has asked for the wisdom of the Perl Monks concerning the following question:
I have written the following script to read the second column of many (~200) tab-delimited files and merge them side by side in an output file. Each input file has three tab-delimited columns and 6.9 million lines. Reading and writing 6.9 million lines this way will take forever. Is there a better, quicker, and more efficient way to do it?
    #!/usr/bin/perl -w
    my (@handles);
    unlink "Results.txt";    # would loop if already present
    for (<*.TI>) {
        open($handles[@handles], $_);
    }
    open(OUTFILE, ">Results.txt");
    my $atleastone = 1;
    while ($atleastone) {
        $atleastone = 0;
        for my $op (@handles) {
            if ($_ = readline($op)) {
                my @col = split;
                $col[1] += 0;    # otherwise you print nothing but a \t if column 2 is undef
                print OUTFILE "$col[1]\t";
                $atleastone = 1;
            }
            else {
                print OUTFILE "0\t";
            }
        }
        print OUTFILE "\n";
    }
    undef @handles;    # closes all files
    close(OUTFILE);
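One direction worth considering is trimming the per-line work rather than the overall approach, since reading every file line by line in parallel is already the right shape for this job. The following is a minimal sketch, not a tested or measured solution: it keeps the *.TI glob, the three-column layout, and the print-0-when-exhausted behavior of the script above, but captures only field 2 with a regex instead of a full split(), builds each row with one join(), and checks open() for errors. Whether the regex actually beats split() on these files is an assumption to be benchmarked.

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Sketch: same side-by-side merge of column 2 from every *.TI file.
    my @handles;
    for my $file (glob '*.TI') {
        open(my $fh, '<', $file) or die "Cannot open $file: $!";
        push @handles, $fh;
    }
    open(my $out, '>', 'Results.txt') or die "Cannot write Results.txt: $!";

    while (1) {
        my $live = 0;    # files that still produced a line this round
        my @row;
        for my $fh (@handles) {
            if (defined(my $line = readline $fh)) {
                # field 2 is the text between the first and second tab
                my ($col2) = $line =~ /^[^\t]*\t([^\t]*)/;
                push @row, (defined $col2 && length $col2) ? $col2 : 0;
                $live++;
            }
            else {
                push @row, 0;    # keep columns aligned once a file runs out
            }
        }
        last unless $live;    # stop instead of emitting a final all-zero row
        print {$out} join("\t", @row), "\n";
    }
    close $out or die "close: $!";

With 200 files of 6.9 million lines the job is largely I/O-bound, so any in-Perl win will be modest; on a Unix box it may also be worth timing cut -f2 on each file followed by paste against whatever Perl loop you settle on.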
Replies are listed 'Best First'.
Re: Merging Many Files Side by Side
by ELISHEVA (Prior) on Feb 19, 2009 at 19:34 UTC

Re: Merging Many Files Side by Side
by atemon (Chaplain) on Feb 19, 2009 at 20:36 UTC

Re: Merging Many Files Side by Side
by repellent (Priest) on Feb 20, 2009 at 00:14 UTC
    by sesemin (Beadle) on Feb 20, 2009 at 01:07 UTC

Re: Merging Many Files Side by Side
by gone2015 (Deacon) on Feb 20, 2009 at 00:37 UTC