Tutorial 10    Lecturer: Craig MacDonald    Tutorial: Mon @ 10:10am    971117
==========================================================================
CSC 108, Fall 1997 Tutorial notes, T10
==========================================================================

Topics
======
- questions on the project
- a recursion example
- some complexity examples

Summary
=======
This week I briefly introduced counting statements, Big-O notation and
recursion, covering linear and binary searching but NOT sorting methods.
Some material is beyond the text, so look at
http://www.cs.toronto.edu/~craig/csc108/lect11/lect11.html for a summary.

RECURSION:

Recall 4.13, factorial (non-recursive).  Note that prod must be declared
and i must start at 1:

    int fact (int n) {
        int prod = 1;
        for (int i = 1; i <= n; i++)
            prod *= i;
        return prod;
    }

Develop the base case and the recursive case, then discuss and trace the
recursive algorithms (either order of the multiplication works):

    int fact (int n) {
        if (n <= 1)
            return 1;                // base case
        return n * fact(n - 1);      // recursive case
    }

    int fact (int n) {
        if (n <= 1)
            return 1;
        return fact(n - 1) * n;
    }

COMPLEXITY:

Consider this pseudocode:

    for (i = 1..n) {
        for (j = i+1..n) {
            STATEMENTS;
        }
    }

How many times is "STATEMENTS" executed?  This can be expressed as a
function of n.  Within the inner loop, "STATEMENTS" executes

        n
      -----
      \
       )    1  =  n - i
      /
      -----
    j = i + 1

times.  Each iteration of the outer loop runs the inner loop a different
number of times, but this can be expressed algebraically each time as
n - i.  So "STATEMENTS" gets executed a number of times proportional to

        n
      -----
      \                        2
       )   (n - i)  =  (1/2) n  - (1/2) n
      /
      -----
      i = 1

             2              2                               2
Since (1/2) n  - (1/2) n < n , this algorithm runs in O( n  ) time.

Even if the algorithm contains other sections of code that run in time
linear in n, it can still be said to run in O(n^2) time.  Why?  Any
section of code that runs in O(n) time will, for large enough n, be
dominated by an n^2 term.  In terms of Big-O notation, the O(n) constant
can be buried in the O(n^2) constant, since for n > 1,

         2    2    2    2
      2 n  = n  + n  > n  + n
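The iterative and recursive factorials above can be checked against each
other.  A minimal runnable sketch in Java (the class name Fact and the
method name factIter are my own labels, not from the tutorial):

```java
public class Fact {
    // Iterative version from the tutorial, with prod declared and i
    // initialised to 1.
    static int factIter(int n) {
        int prod = 1;
        for (int i = 1; i <= n; i++)
            prod *= i;
        return prod;
    }

    // Recursive version: base case n <= 1, recursive case n * fact(n-1).
    static int fact(int n) {
        if (n <= 1)
            return 1;
        return n * fact(n - 1);
    }

    public static void main(String[] args) {
        // Trace of fact(4):
        //   fact(4) = 4 * fact(3)
        //           = 4 * (3 * fact(2))
        //           = 4 * (3 * (2 * fact(1)))
        //           = 4 * 3 * 2 * 1 = 24
        System.out.println(fact(4));      // 24
        System.out.println(factIter(4));  // 24
    }
}
```

Both versions agree on every n, which is a good quick sanity check when
converting a loop to a recursion.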
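The count of "STATEMENTS" executions can also be verified empirically
against the closed form (1/2)n^2 - (1/2)n.  A small Java sketch (the class
name Count and the helper countStatements are illustrative, not from the
tutorial):

```java
public class Count {
    // Count how many times the inner-loop body runs for a given n,
    // using the same loop bounds as the pseudocode above.
    static int countStatements(int n) {
        int count = 0;
        for (int i = 1; i <= n; i++)
            for (int j = i + 1; j <= n; j++)
                count++;  // stands in for STATEMENTS
        return count;
    }

    public static void main(String[] args) {
        int n = 10;
        // The measured count matches the closed form (n^2 - n) / 2.
        System.out.println(countStatements(n));  // 45
        System.out.println((n * n - n) / 2);     // 45
    }
}
```

For n = 10 the inner body runs 9 + 8 + ... + 1 + 0 = 45 times, exactly
(100 - 10) / 2, which is what the summation predicts.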