Here is the usual way we compute powers $x^n$:
def power(x, n):
    res = 1
    for i in range(n):   # multiply by x, n times
        res *= x
    return res
Assuming that x is a float, each multiplication takes a constant amount of time. That means that the complexity of this function is $\mathcal{O}(n)$, since the block repeats $n$ times, and each iteration takes the same amount of time.
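As a quick sanity check (an illustrative snippet, not part of the original function), we can compare the loop-based power to Python's built-in ** operator; for these particular inputs the floating-point results match exactly:

for x, n in [(2.0, 10), (1.5, 4), (7.0, 0)]:
    assert power(x, n) == x ** n
print(power(2.0, 10))   # 1024.0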
Here's an idea for computing $x^n$ quickly: exponentiation by squaring. The key insight is that
$$(...((x^2)^2)^2...)^2 = x^{2^k}$$if the squaring is performed $k$ times. For example, squaring three times gives
$$((x^2)^2)^2 = x^8 = x^{2^3}.$$
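For instance, with concrete numbers, squaring three times computes $3^8$:
$$3^8 = ((3^2)^2)^2 = (9^2)^2 = 81^2 = 6561,$$
using three multiplications instead of the eight that the loop above performs.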
Suppose that $n = 2^k$ for some whole $k$ (i.e., $n$ is a power of two.) We can now use the repeated squaring trick and the fact that $(x^{n/2})^2 = x^n$:
def power(x, n):
    '''Return x^n
    Arguments:
    x -- a float
    n -- a non-negative integer that is a power of 2'''
    if n == 0:
        return 1   # x^0 = 1
    if n == 1:
        return x   # x^1 = x
    half_power = power(x, n//2)
    return half_power * half_power   # using **2 would be cheating
                                     # (kind of.) The whole idea is that
                                     # we can't use **
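For example (illustrative calls, assuming n is a power of two as the docstring requires):

print(power(3.0, 8))    # 6561.0
print(power(2.0, 16))   # 65536.0
print(power(7.0, 1))    # 7.0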
This is just repeated squaring in reverse. Here is the call tree (for convenience, we just keep track of n, since x does not change within recursive calls.)
     1        returns x
     |
     2        returns x^2
     |
     .
     .
     .
     |
    n/8       returns x^(n/8)
     |
    n/4       returns x^(n/4)
     |
    n/2       returns x^(n/2)
     |
     n        returns x^n
How many calls in total are there? We make the initial call, plus as many calls as are necessary to get from n to 1 by dividing by 2 every time (i.e., $\log_2 n$, by the definition of $\log_2$.) In total, we make $\log_2 n + 1$ calls. Each call takes the same amount of time, which means that the complexity here is $\mathcal{O}(\log n)$ -- much faster than $\mathcal{O}(n)$. Here's an example:
     1        returns x
     |
     2        returns x^2
     |
     4        returns x^4
     |
     8        returns x^8
     |
    16        returns x^16
     |
    32        returns x^32
We just had to make $6$ calls to compute $x^{32}$ (compared to 32 iterations of the $\mathcal{O}(n)$ algorithm).
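To see the $\log_2 n + 1$ call count in action, here is a small sketch (power_counted and the global call_count are hypothetical names, added only for illustration):

call_count = 0

def power_counted(x, n):
    '''Like power(x, n), but counts how many calls are made.'''
    global call_count
    call_count += 1
    if n == 0:
        return 1
    if n == 1:
        return x
    half_power = power_counted(x, n // 2)
    return half_power * half_power

power_counted(2.0, 32)
print(call_count)   # 6, i.e. log2(32) + 1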
Here's another true fact: $2^n = 2\times 2^{n-1} = 2^{n-1} + 2^{n-1}$. Let's use it to compute (really slowly) the powers of 2.
def power2(n):
    '''Return 2^n, for n >= 1'''
    if n == 1:
        return 2
    return power2(n-1) + power2(n-1)
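For example:

print(power2(1))    # 2
print(power2(5))    # 32
print(power2(10))   # 1024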
Here is the call tree:
                                                 number of calls

  1  1     1  1    .............    1  1     1  1       2^(n-1)
   .  .     .  .                     .  .     .  .
   .  .     .  .                     .  .     .  .
   \  /     \  /                     \  /     \  /
   n-2      n-2     ...........      n-2      n-2         2^2
     \      /                          \      /
      \    /                            \    /
       n-1          ...........          n-1              2^1
          \                             /
           \                           /
            \                         /
             \                       /
              \                     /
               \                   /
                        n                                  2^0
The computation needed to figure out the complexity is pretty easy: we have 1 call at the bottom level (the call power2(n)), 2 calls at the level above it, then 4 calls, 8 calls, and so on. There are n levels in total, and the bottom level has $2^0$ calls, so the top level (the calls with n equal to 1) has $2^{n-1}$ calls.
The total number of calls is $2^0 + 2^1 + ... + 2^{n-1} = \frac{2^n-1}{2-1} = 2^n-1$. The complexity is therefore $\mathcal{O}(2^n)$.
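We can confirm the $2^n - 1$ count with the same kind of instrumented sketch as before (power2_counted and call_count are again hypothetical names, for illustration only):

call_count = 0

def power2_counted(n):
    '''Like power2(n), but counts how many calls are made.'''
    global call_count
    call_count += 1
    if n == 1:
        return 2
    return power2_counted(n - 1) + power2_counted(n - 1)

power2_counted(10)
print(call_count)   # 1023, i.e. 2^10 - 1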