Limbic~Region has asked for the wisdom of the Perl Monks concerning the following question:

All:
I am converting numerous shell scripts to Perl. Of course, I want to streamline the code for efficiency. This doesn't necessarily mean less code, though it usually does.

To test how well I have done at making the Perl more efficient, I wanted to benchmark it.
I am wondering what everyone thinks is the most accurate way to do this.
I asked in the CB, and one suggestion was to use Benchmark and make a system call to the shell script.

Can anyone think of a better way?

Thanks - L~R

Replies are listed 'Best First'.
Re: Benchmarking shell script conversions
by Grateful_Dude (Acolyte) on Dec 31, 2002 at 19:46 UTC
    You could always use the time command:
    $ /usr/bin/time ./helloworld.pl
    Hello World
    0.00user 0.01system 0:00.01elapsed 90%CPU (0avgtext+0avgdata 0maxresident)k
    0inputs+0outputs (291major+42minor)pagefaults 0swaps
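    A single timed run can be noisy, so it may be worth looping a few runs and summing the wall-clock time for each version. A rough sketch (the two commands below are placeholders standing in for the real shell and Perl versions of a script):

    ```shell
    #!/bin/sh
    # Rough sketch: run each version a few times and report total
    # wall-clock seconds. "sleep 0" and "true" are placeholders for
    # the real ./script.sh and ./script.pl being compared.
    runs=3
    for cmd in "sleep 0" "true"; do
        total=0
        i=0
        while [ "$i" -lt "$runs" ]; do
            start=$(date +%s)
            $cmd > /dev/null 2>&1
            end=$(date +%s)
            total=$((total + end - start))
            i=$((i + 1))
        done
        echo "$cmd: $total seconds over $runs runs"
    done
    ```

    Note that `date +%s` only resolves to whole seconds, which is fine for long-running scripts but too coarse for quick ones; for those, `/usr/bin/time` on each run gives finer numbers.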
Re: Benchmarking shell script conversions
by hardburn (Abbot) on Dec 31, 2002 at 19:24 UTC

    I'm hearing a bell in the back of my head, along with a quote about how premature optimization is the root of all evil.

    Since you're converting these from shell scripts, I doubt you really need them to be efficient. Make them easy to understand and add efficiency later if you really need it.

    That said, if you want to test how long a certain block of code takes to execute, try this (untested):

    my $start_time = times();
    # Bunch of code here
    my $end_time = times();
    my $run_time = $end_time - $start_time;
    print "Took $run_time sec\n";

    Higher granularity clocks are also available (like Time::HiRes).
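    For sub-second wall-clock timing, Time::HiRes can be dropped in the same way; a sketch wrapping it from the shell (assumes Time::HiRes is available — it is bundled with perl 5.8, and on CPAN otherwise; "sleep 0" is a placeholder for the script under test):

    ```shell
    # Sketch: time an external command with Time::HiRes, whose time()
    # returns floating-point seconds instead of whole seconds.
    out=$(perl -MTime::HiRes=time -e '
        my $start = time();                 # floating-point seconds
        system(@ARGV) == 0 or die "command failed: $?\n";
        printf "Took %.4f sec\n", time() - $start;
    ' -- sleep 0)
    echo "$out"
    ```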