in reply to Performance issue in the loop.

...and my code started to work ~30 percent faster.

My guess is that it is because you have eliminated 1 out of 4 array accesses. But then, with all the caching that modern CPUs and OSes do, it is quite remarkable that it matches the textbook estimate (1/4 = 25%) and even beats it! Perhaps I am missing something ...

How about using a temp variable, declared outside the loops, to cache $sums[ $j - $i ], like: ( $tmp = $sums[ $j - $i ] += $_[ $j ] ) > $max and $max = $tmp; Does it make a difference?
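Here is a minimal sketch of what I mean, wrapped in a sub so it runs on its own. The sub, its name, the loop bounds and the @sums initialisation are my assumptions about your code; only the statement with $tmp is the actual suggestion:

sub max_contig_sum {
    my @sums = (0) x @_;           # one running sum per start offset
    my ( $max, $tmp ) = ( 0, 0 );  # $tmp is declared once, outside the loops
    for my $i ( 0 .. $#_ ) {
        for my $j ( $i .. $#_ ) {
            # the extra read of $sums[ $j - $i ] is avoided whenever $max is updated
            ( $tmp = $sums[ $j - $i ] += $_[ $j ] ) > $max and $max = $tmp;
        }
    }
    return $max;
}

print max_contig_sum( 1, -2, 3, 5, -1 ), "\n";  # prints 8 (the 3+5 run)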

Any ideas why every ~10th run any of these programs runs much slower?
push @max, $max;

this has to extend @max from time to time, and since all the data is random, its final size is unpredictable. But that cannot explain the regularity of the slowdown (re: > 1.5x every ~10th run). On the other hand, the interaction of two or more uniform distributions can create normal-distribution effects (re: Central Limit Theorem).

The least you can do is print the size of @max at the end of each program run and see whether there is any correlation with the running times.
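Something along these lines at the very end of the program would do (a complete but trivial sketch; printing to STDERR and the format are just my choices, @max stands for the array your program builds, and it uses the Time::HiRes timing shown further below):

use strict;
use warnings;
use Time::HiRes qw/time/;

my $start_time = time();
my @max;
# ... the rest of the program, which push-es onto @max ...
printf STDERR "elapsed: %.3f s, final size of \@max: %d\n",
    time() - $start_time, scalar @max;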

If your algorithm allows it, you may want to switch the loop order: first $j and then $i, so that $_[ $j ] is fetched once in the outer loop and you eliminate a lot of array accesses.
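For example, the double loop from the sketch further above could become the following (again only a sketch, with the same assumptions about the surrounding code; for that particular loop the result should be identical, only the order in which the partial sums are visited changes):

for my $j ( 0 .. $#_ ) {
    my $aj = $_[ $j ];                 # fetched once per outer iteration
    for my $i ( 0 .. $j ) {
        ( $tmp = $sums[ $j - $i ] += $aj ) > $max and $max = $tmp;
    }
}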

Regarding timing your program, I use this:

use Time::HiRes qw/time/;
my $start_time = Time::HiRes::time();
...
my $time_taken = Time::HiRes::time() - $start_time; # floating seconds

bw, bliako