It should be noted that many algorithms' running time can be reduced. Sometimes an O(n^2) algorithm can be replaced outright by an O(n log(n)) one (mergesort instead of bubblesort, say), and even without changing algorithms, finding a way to cut the inner loop short saves real work. A famous example is bubblesort:
    # O(n^2): the inner loop always scans the whole array
    for my $i (0..$#a-1) {
        for (0..$#a-1) {
            ($a[$_],$a[$_+1]) = ($a[$_+1],$a[$_]) if ($a[$_+1] < $a[$_]);
        }
    }

    # Still O(n^2) in the worst case, but about half the comparisons:
    # after pass $i the last $i items are already in place, so
    # looping over already-sorted items is not needed.
    for my $i (0..$#a-1) {
        for (0..$#a-1-$i) {
            ($a[$_],$a[$_+1]) = ($a[$_+1],$a[$_]) if ($a[$_+1] < $a[$_]);
        }
    }

(Note that this is a constant-factor improvement, not O(n log(n)); a true O(n log(n)) sort changes the algorithm, not just the loop bounds.)
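Building on the bubblesort above, a common further cut (a sketch of my own, not from the original post) is to track whether a pass made any swaps and bail out early. The worst case stays O(n^2), but already-sorted input now costs a single O(n) pass:

    # Bubblesort with an early exit: a pass that makes no swaps
    # means the array is already sorted, so we can stop.
    sub bubblesort {
        my @a = @_;
        for my $i (0 .. $#a - 1) {
            my $swapped = 0;
            for (0 .. $#a - 1 - $i) {
                if ($a[$_ + 1] < $a[$_]) {
                    ($a[$_], $a[$_ + 1]) = ($a[$_ + 1], $a[$_]);
                    $swapped = 1;
                }
            }
            last unless $swapped;   # no swaps => sorted, stop early
        }
        return @a;
    }

    my @sorted = bubblesort(3, 1, 4, 1, 5);   # (1, 1, 3, 4, 5)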
O(n) algorithms that involve a search via a loop can sometimes be sped up by breaking out of the loop once a "result" is found. The worst case is still O(n), but on average roughly half the iterations are skipped (a true O(log(n)) search needs sorted data and binary search):
    # Always scans the whole list: O(n) every time
    my $truestatus;
    foreach my $item (@list) {
        $truestatus = 1 if (some_condition);
    }

    # Still O(n) in the worst case, but if some_condition holds for
    # at least one $item in @list, we stop at the first match,
    # cutting the average number of iterations roughly in half.
    my $truestatus;
    foreach my $item (@list) {
        if (some_condition) { $truestatus = 1; last }
    }
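If @list happens to be sorted, a genuinely O(log(n)) lookup is possible with binary search, since each comparison halves the remaining range. A minimal sketch (the function name and interface here are my own, not from the thread):

    # Binary search over a sorted array: at most about log2(n)+1
    # comparisons, because each step discards half the candidates.
    sub binary_search {
        my ($target, @sorted) = @_;
        my ($lo, $hi) = (0, $#sorted);
        while ($lo <= $hi) {
            my $mid = int(($lo + $hi) / 2);
            if    ($sorted[$mid] == $target) { return $mid }       # found
            elsif ($sorted[$mid] <  $target) { $lo = $mid + 1 }    # go right
            else                             { $hi = $mid - 1 }    # go left
        }
        return -1;   # not found
    }

    my $idx = binary_search(7, 1, 3, 5, 7, 9);   # 3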
Of course, the secret is knowing where to apply these optimizations, which is often harder than it seems :)
In reply to Re: An informal introduction to O(N) notation
by jryan
in thread An informal introduction to O(N) notation
by dws