Looking at my copy of The Art of Computer Programming, Vol. 2: Seminumerical Algorithms, 3rd ed., Knuth gives the worst case (the number of division steps) as:
Max(GCD) ~= 4.785*log10(N) + 0.6723
where
N is the smaller argument to Euclid's algorithm. The average is much better (or so I thought; see the update below):
Avg(GCD) ~= 0.1775*N
Update: hv pointed out that the approximation for the mean didn't look right. I went back to the source and saw that I had pulled out an intermediate result (I need to read more closely in the future), so that's not the mean.
The mean is given as
Avg(GCD) ~= 1.9405*log10(N)
where
N is the smaller argument to Euclid's algorithm; this ignores a subtracted correction term based on the divisors of N.
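As a quick sanity check, here's a minimal Python sketch (the language and the helper names euclid_steps, worst_case, and avg_steps are mine, not Knuth's) that counts the division steps Euclid's algorithm actually takes and compares the counts against both formulas. Consecutive Fibonacci numbers are the classic worst-case input for Euclid's algorithm (Lamé's theorem), so they make a good test pair:

    import math

    def euclid_steps(u, v):
        """Count the division (modulus) steps Euclid's algorithm takes on (u, v)."""
        steps = 0
        while v:
            u, v = v, u % v
            steps += 1
        return steps

    def worst_case(n):
        # Knuth's worst-case step count; n is the smaller argument
        return 4.785 * math.log10(n) + 0.6723

    def avg_steps(n):
        # Leading term of Knuth's mean step count (the constant and the
        # divisor-based correction term are ignored, as above)
        return 1.9405 * math.log10(n)

    # Consecutive Fibonacci numbers drive Euclid's algorithm to its
    # worst case (Lame's theorem); take the largest pair below 10**6.
    a, b = 1, 2
    while b < 10**6:
        a, b = b, a + b
    print(euclid_steps(b, a), "steps; formula gives %.2f" % worst_case(a))

    # Rough empirical check of the mean for one fixed larger argument:
    n = 99991
    mean = sum(euclid_steps(n, m) for m in range(1, n)) / (n - 1)
    print("empirical mean %.2f vs leading term %.2f" % (mean, avg_steps(n)))

Expect the Fibonacci pair to land essentially on the worst-case formula, while the empirical mean should run somewhat above the leading term, since the dropped correction terms come back into play.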
-QM
--
Quantum Mechanics: The dreams stuff is made of