Let's try to estimate how long a single operation takes in Python. To do that, we will use Python's time.time() function, which returns the number of seconds that have elapsed since midnight UTC on January 1, 1970 (the Unix epoch). Here is an example:

In [1]:
import time
time.time()
Out[1]:
1478927164.2931104

Here is another example:

In [2]:
time.time()
Out[2]:
1478927164.3259952

Whenever it is called, time.time() returns the number of seconds elapsed, at that moment, since January 1, 1970.
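Subtracting two consecutive readings therefore gives the wall-clock time that elapsed between them. Here is a minimal sketch (the half-second sleep is just a stand-in for the work we want to time):

import time

t0 = time.time()   # reading before the work
time.sleep(0.5)    # stand-in for the work being timed
t1 = time.time()   # reading after the work

print(t1 - t0)     # elapsed wall-clock seconds, roughly 0.5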

Let's use this to estimate how long an operation takes on the computer on which this document was generated:

In [3]:
import random

t0 = time.time()
N = 1000000
s = 0

# Approximately 10 elementary operations per iteration
# (note: this is just a rough approximation)
for i in range(N):
    s += random.random()

t1 = time.time()
print((t1-t0)/(10*N))
1.7249608039855957e-08

We ran the loop body N = 1000000 times, and then divided the elapsed number of seconds by 10*N, since we estimate that each iteration takes approximately 10 elementary operations.
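As a cross-check, the same measurement can be made with Python's built-in timeit module, which is designed for exactly this kind of benchmarking. This is a sketch of an equivalent measurement, not the code used to produce the numbers above, and the count of 10 operations per iteration is the same rough approximation:

import random
import timeit

N = 1000000

def loop():
    # Same loop body as above: roughly 10 elementary operations per iteration
    s = 0
    for i in range(N):
        s += random.random()

elapsed = timeit.timeit(loop, number=1)  # run the whole loop once
print(elapsed / (10 * N))                # per-operation estimate, as above

One advantage of timeit is that it disables garbage collection during the measurement, which makes the timing a little more stable.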

How can we verify that we are approximately right? Add some more code that we know contributes two more operations per iteration:

In [4]:
t0 = time.time()
N = 1000000
s = 0
a = 0
# Approximately 12 elementary operations per iteration now
# (note: this is just a rough approximation)
for i in range(N): 
    s += random.random()
    a += s

t1 = time.time()
print((t1-t0)/(10*N))  # still dividing by 10*N, for comparison with the previous estimate
2.496826648712158e-08

We would expect 12 operations per iteration (as opposed to 10) to yield (12/10)*1.72e-08 ≈ 2.07e-08, and we measured 2.50e-08, so we weren't that far off.
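We can express the same sanity check as a small computation, using the two figures measured above (your machine will give different numbers):

per_op_10 = 1.7249608039855957e-08  # estimate from the 10-operation loop
per_op_12 = 2.496826648712158e-08   # estimate from the 12-operation loop

expected = (12 / 10) * per_op_10    # what ~12 ops/iteration should give
print(expected)                     # approximately 2.07e-08
print(per_op_12 / expected)         # ratio of about 1.2: close, but not exact

The agreement is rough rather than exact, which is what we should expect: counting "elementary operations" in Python source code is itself only an approximation.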