There are a number of ways of measuring speed. Ultimately, "time to correct solution" matters much more than "time to execute the program". In this case the (cleaned up) code you supplied ran in 10 seconds on my system. However, the Perl code is about one fifth the length of the C code, so it will have taken less time to write and much less time to debug!
Which do you like better:
# Test data generator
use strict;
use warnings;

open my $outFile, '>', 'data.txt' or die "Can't write data.txt: $!";
print $outFile ("0," x 99) . "$_," . ("0," x 10) . "0\n" for 1 .. 1000000;
close $outFile;

# Sum test
use strict;
use warnings;
use Time::HiRes qw(time);    # import hi-res time() over the core builtin

my $val   = 0;
my $start = time();

open my $inFile, '<', 'data.txt' or die "Can't read data.txt: $!";
while (<$inFile>) {
    /(?:\d+,){99}(\d+),/;    # capture the 100th comma-separated field
    $val += $1;
}
close $inFile;

print "Final: [$val] (elapsed time " . (time() - $start) . " seconds)\n";
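For contrast, here is a minimal sketch of the split-based approach the thread title refers to, summing the 100th comma-separated field. The sub name, the in-memory test data, and the field index are illustrative assumptions, not code from the thread; split builds a full list of fields per line, which is why the anchored regex above tends to be faster.

```perl
use strict;
use warnings;

# Sum the field at 0-based $index on each line using split.
sub sum_field {
    my ($lines, $index) = @_;
    my $total = 0;
    for my $line (@$lines) {
        my @fields = split /,/, $line;
        $total += $fields[$index];
    }
    return $total;
}

# Illustrative data: three lines shaped like the generator's output
my @lines = map { ("0," x 99) . "$_," . ("0," x 10) . "0" } 1 .. 3;
print sum_field(\@lines, 99), "\n";    # 1 + 2 + 3 = 6
```

To benchmark it against the regex version, point it at the `data.txt` file produced by the generator instead of the in-memory list.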
In reply to Re: fast way to split and sum an input file by GrandFather
in thread fast way to split and sum an input file by egunnar