in reply to Optimizing existing Perl code (in practise)
Results using unpack: 01:32:57 01:33:48 (51 seconds)

#!c:/perl/bin/perl -w
use strict;
use POSIX qw(strftime);

my $x;
my $maxint = 200000;
my $start = strftime "%H:%M:%S", localtime;
for ($x = 0; $x < $maxint; $x++) {
    print unpack "H*", "abc";
}
my $finish = strftime "%H:%M:%S", localtime;
print "$start $finish";
Results using printf: 01:31:56 01:32:50 (54 seconds)

#!c:/perl/bin/perl -w
use strict;
use POSIX qw(strftime);

my $x;
my $maxint = 200000;
my $start = strftime "%H:%M:%S", localtime;
for ($x = 0; $x < $maxint; $x++) {
    printf "%x%x%x", ord('a'), ord('b'), ord('c');
}
my $finish = strftime "%H:%M:%S", localtime;
print "$start $finish";
In this case, unpack is the clear winner, although the performance difference doesn't become apparent until after 100000 iterations. So, in my opinion, given that TIMTOWTDI, I would look for a performance differential between the methods and opt for the one that requires the least execution time.
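For that kind of comparison, the core Benchmark module is handier than hand-rolled strftime timestamps. Below is a rough sketch (my addition, not part of the timings above); it benchmarks the string-building forms, unpack versus sprintf, so terminal output doesn't dominate the measurement.

#!c:/perl/bin/perl -w
use strict;
use Benchmark qw(cmpthese);

# Compare the two hex-conversion approaches directly; cmpthese runs each
# labelled sub the given number of times and prints a rate table with
# relative percentages.
cmpthese(200000, {
    'unpack'  => sub { my $hex = unpack "H*", "abc" },
    'sprintf' => sub { my $hex = sprintf "%x%x%x", ord('a'), ord('b'), ord('c') },
});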
The second thing I would check is whether any shelling out can be replaced by an available Perl function. I recently wrote a program that required that the date/time stamps in a log file be updated. For this, I made the mistake of relying on shelling out:

my $time1 = `date '+%H:%M:%S'`;

when I should have used:

my $time1 = strftime "%H:%M:%S", localtime;
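For what it's worth, the same Benchmark approach makes the cost of the shell-out obvious. This is only a sketch and assumes an external date(1) command is available on the system, as my original backtick call did:

#!c:/perl/bin/perl -w
use strict;
use Benchmark qw(cmpthese);
use POSIX qw(strftime);

# The backtick version forks a shell and runs an external program on every
# call; strftime stays inside the Perl process.
cmpthese(10000, {
    'backticks' => sub { my $t = `date '+%H:%M:%S'` },
    'strftime'  => sub { my $t = strftime "%H:%M:%S", localtime },
});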
Hope this helps.

cheers, -semio