Hi,
I have a large .gz file, 80GB in size, that I need to read. Currently I am using IO::Uncompress::Gunzip to read this .gz file.
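For reference, the gunzip handle is created roughly like this (a minimal sketch; the file name is a placeholder):

    use IO::Uncompress::Gunzip qw($GunzipError);

    # Stream the 80GB .gz file line by line without decompressing
    # it to disk first.
    my $INTENSITY_FILE = IO::Uncompress::Gunzip->new('intensities.txt.gz')
        or die "Cannot open gzip stream: $GunzipError\n";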
Each line of the file contains 27,000 tab-separated entries, and I need only particular columns from the source file in the output file. I am currently parsing every line with this code:

    while (defined(my $intensities = $INTENSITY_FILE->getline())) {
        chomp $intensities;
        my @intensities = split /\t/, $intensities;

        # Skip the first three columns, then keep the last two of
        # every subsequent group of three columns.
        my @index_intensities;
        for (my $index = 3; $index < @intensities; $index += 3) {
            push @index_intensities, $index + 1, $index + 2;
        }
        print OUTFILE join("\t", @intensities[@index_intensities]), "\n";
    }

This does the task, and I get an OUTFILE with only the columns I need from the source file. I am looking for another method that could speed up this parsing and get the output into a new file faster. Any comments will be greatly appreciated.

Thanks in advance,
Praveen.
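One direction I am wondering about is letting an external gzip do the decompression in a separate process and building the column-index list only once, since every row has the same width. A rough, untested sketch (file names are placeholders; assumes a Unix-like system with gzip on the PATH):

    #!/usr/bin/perl
    use strict;
    use warnings;

    my $in  = 'intensities.txt.gz';   # placeholder input path
    my $out = 'selected_columns.txt'; # placeholder output path

    # Decompress in a separate gzip process so reading and
    # decompression can overlap; this is often faster than
    # in-process decompression with IO::Uncompress::Gunzip.
    open my $fh_in, '-|', 'gzip', '-dc', $in
        or die "Cannot start gzip on $in: $!\n";
    open my $fh_out, '>', $out
        or die "Cannot open $out: $!\n";

    my @wanted;    # column indices, built once from the first row

    while (my $line = <$fh_in>) {
        chomp $line;
        my @fields = split /\t/, $line, -1;

        # Every row has the same 27,000 entries, so the index list
        # only needs to be computed on the first pass.
        if (!@wanted) {
            for (my $i = 3; $i < @fields; $i += 3) {
                push @wanted, $i + 1, $i + 2;
            }
        }
        print {$fh_out} join("\t", @fields[@wanted]), "\n";
    }
    close $fh_in  or die "gzip failed: $!\n";
    close $fh_out or die "Cannot close $out: $!\n";

Would something along these lines be noticeably faster, or is there a better approach?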