How can I make this script run faster? Right now it takes around 15 minutes. The input file is 500 MB, with 30 columns and a million rows; the columns are tab-separated, and the values (text and numbers) are quoted with "".
#!/usr/local/bin/perl -w
#
$fn1 = '/in.CSV';
open (INST, "$fn1");
open (ABI, ">/out.ins");
while (<INST>) {
    # remap special characters and context-dependent dashes/underscores
    s/\õ/\ä/g;
    s/\"-/\"Ä/g;
    s/\--/\-Ä/g;
    s/\ - /\*-*/g;      # swap " - " out of the way ...
    s/\ -/\ Ä/g;
    s/\*-*/\ - /g;      # ... and swap it back
    s/\"_/\"Ü/g;
    s/\ _/\ Ü/g;
    s/\__/\_Ü/g;
    s/\³/\ü/g;
    s/\§/\õ/g;
    chomp;
    chop;
    @array = ' ';
    @array = split(/\t/);
    if ($array[0] eq "\"Branchno\"") { next; }   # skip the header row
    if ($array[0] eq "\"\"")         { next; }   # skip rows with an empty first field
    $result = join ("|", @array) . "|";
    $result =~ s/\"//g;                          # strip the "" quoting
    print ABI $result, "\n";
}
close (INST);
close (ABI);
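For illustration only, here is a hedged sketch of the usual per-line speed-ups for a loop like this: skip the header/empty rows before doing any other work, do the single-character remapping in one tr/// pass instead of three s///g calls, and turn tabs into pipes with tr/// rather than split/join on a 30-element list. It assumes no row depends on split's dropping of trailing empty fields, that the first field is always followed by a tab, and that NUL bytes never occur in the data (a NUL is used as the temporary placeholder instead of *-*); the context-dependent dash and underscore rules are copied unchanged from the script above.

#!/usr/local/bin/perl -w
# Sketch only, under the assumptions stated above.
use strict;

open(my $in,  '<', '/in.CSV')  or die "open /in.CSV: $!";
open(my $out, '>', '/out.ins') or die "open /out.ins: $!";

while (<$in>) {
    next if /^"Branchno"\t/ || /^""\t/;   # skip header / empty-key rows early

    # One-pass character remapping instead of three s///g calls.
    tr/õ³§/äüõ/;

    # Context-dependent dash/underscore rules, kept as in the original.
    s/"-/"Ä/g;
    s/--/-Ä/g;
    s/ - /\0/g;          # protect " - " with a NUL placeholder
    s/ -/ Ä/g;
    s/\0/ - /g;          # restore it
    s/"_/"Ü/g;
    s/ _/ Ü/g;
    s/__/_Ü/g;

    chomp;
    chop;                # the original also drops the last character

    tr/\t/|/;            # tabs straight to pipes, no split/join
    tr/"//d;             # drop the "" quoting
    print $out $_, "|\n";
}

close $in;
close $out;

The idea is that tr/// and a single tab-to-pipe translation avoid running a dozen regexes and building a 30-element array per row, which is typically the dominant per-row cost here; reading the 500 MB file line by line is usually not the bottleneck.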