Lecture 9. Recurrences and Running Time
Recurrences and Running Time
• An equation or inequality that describes a function in terms of its value on smaller inputs, e.g. T(n) = T(n-1) + n
• Recurrences arise when an algorithm contains recursive calls to itself
• What is the actual running time of the algorithm?
• Need to solve the recurrence
Example Recurrences
• T(n) = T(n-1) + n → Θ(n²)
  Recursive algorithm that loops through the input to eliminate one item
• T(n) = T(n/2) + c → Θ(lg n)
  Recursive algorithm that halves the input in one step
• T(n) = T(n/2) + n → Θ(n)
  Recursive algorithm that halves the input but must examine every item in the input
• T(n) = 2T(n/2) + 1 → Θ(n)
  Recursive algorithm that splits the input into 2 halves and does a constant amount of other work
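As a quick sketch of where the first bound comes from (assuming a constant base case T(1)), the recurrence unrolls into an arithmetic series:

\[
T(n) = T(n-1) + n = T(n-2) + (n-1) + n = \cdots = T(1) + \sum_{i=2}^{n} i = \Theta(n^2)
\]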
Methods for Solving Recurrences
• Iteration method
• Substitution method
• Recursion tree method
• Master method
The Iteration Method
• Convert the recurrence into a summation and try to bound it using known series
• Iterate the recurrence until the initial condition is reached
• Use back-substitution to express the recurrence in terms of n and the initial (boundary) condition
The Iteration Method: T(n) = c + T(n/2)
T(n) = c + T(n/2)
     = c + c + T(n/4)       since T(n/2) = c + T(n/4)
     = c + c + c + T(n/8)   since T(n/4) = c + T(n/8)
Assume n = 2^k; after k iterations:
T(n) = c + c + … + c + T(1)   (k times)
     = c lg n + T(1)
     = Θ(lg n)
Iteration Method – Example: T(n) = n + 2T(n/2)
Assume n = 2^k.
T(n) = n + 2T(n/2)
     = n + 2(n/2 + 2T(n/4))   since T(n/2) = n/2 + 2T(n/4)
     = n + n + 4T(n/4)
     = n + n + 4(n/4 + 2T(n/8))
     = n + n + n + 8T(n/8)
     …
     = i·n + 2^i·T(n/2^i)
     = k·n + 2^k·T(1)
     = n lg n + n·T(1)
     = Θ(n lg n)
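As a quick numeric sanity check of this closed form (a sketch outside the lecture; it assumes the base case T(1) = 1):

def T(n):
    """Evaluate T(n) = n + 2*T(n/2) exactly, for n a power of 2."""
    if n == 1:
        return 1
    return n + 2 * T(n // 2)

# For n = 2^k the unrolled form predicts T(n) = n*lg(n) + n*T(1).
for k in range(1, 11):
    n = 2 ** k
    assert T(n) == n * k + n  # n*lg n + n*T(1), with T(1) = 1
print("closed form n*lg n + n matches for n = 2 .. 1024")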
The substitution method
• Guess a solution
• Use induction to prove that the solution works
Substitution method
• Guess a solution: T(n) = O(g(n))
• Induction goal: apply the definition of the asymptotic notation
  T(n) ≤ d·g(n), for some d > 0 and n ≥ n₀
• Induction hypothesis: T(k) ≤ d·g(k) for all k < n
• Prove the induction goal:
  use the induction hypothesis to find some values of the constants d and n₀ for which the induction goal holds (strong induction)
Substitution Method: T(n) = 2T(n/2) + n = O(n lg n)
• We need to show that T(n) ≤ c·n·lg n with an appropriate choice of c
• Inductive hypothesis: assume T(n/2) ≤ c·(n/2)·lg(n/2)
• Substitute back into the recurrence to show that T(n) ≤ c·n·lg n follows when c ≥ 1:
T(n) = 2T(n/2) + n ≤ 2(c·(n/2)·lg(n/2)) + n
     = c·n·lg(n/2) + n
     = c·n·lg n – c·n·lg 2 + n
     = c·n·lg n – c·n + n
     ≤ c·n·lg n   for c ≥ 1
• Therefore T(n) = O(n lg n)
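A small numeric check of the guess (a sketch outside the lecture; it assumes the base case T(1) = 1 and n a power of two; note lg 1 = 0, so the bound cannot start at n = 1):

import math

def T(n):
    """T(n) = 2*T(n/2) + n, with the assumed base case T(1) = 1."""
    if n == 1:
        return 1
    return 2 * T(n // 2) + n

# With T(1) = 1, c = 1 fails at n = 2, but c = 2 and n0 = 2 work.
c = 2
for k in range(1, 16):
    n = 2 ** k
    assert T(n) <= c * n * math.log2(n)
print("T(n) <= 2*n*lg n holds for n = 2 .. 2^15")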
Recursion-tree method
• A recursion tree models the costs (time) of a recursive execution of an algorithm
• Used alone, the recursion-tree method can be unreliable
• It is good for generating guesses, which are then verified with the substitution method
Example of recursion tree
Solve T(n) = T(n/4) + T(n/2) + n².
[Recursion-tree figure: the root costs n², its two children cost (n/4)² and (n/2)², and so on down the tree.]
Each level's cost is 5/16 of the level above it, so the levels sum to a geometric series:
n² + (5/16)n² + (5/16)²n² + … ≤ n² / (1 – 5/16) = (16/11)n²
and therefore T(n) = Θ(n²).
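A numeric check that the tree's series behaves as claimed (a sketch; it assumes a constant base case T(n) = 1 for n ≤ 1):

from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """T(n) = T(n//4) + T(n//2) + n^2, with an assumed base case T(n <= 1) = 1."""
    if n <= 1:
        return 1
    return T(n // 4) + T(n // 2) + n * n

# The recursion tree bounds T(n) by (16/11)*n^2 plus lower-order terms,
# so the ratio T(n)/n^2 should stay bounded near 16/11 ≈ 1.4545.
for k in range(4, 21, 4):
    n = 2 ** k
    print(n, T(n) / (n * n))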
The master method
• When analyzing algorithms, recall that we only care about asymptotic behavior
• Recursive algorithms are no different: rather than solve the recurrence exactly, it is enough to give an asymptotic characterization of its solution
• The main tool for doing this is the master theorem
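For reference, the standard statement of the theorem (as in CLRS): for constants a ≥ 1 and b > 1, and a recurrence of the form T(n) = a·T(n/b) + f(n),

\[
T(n) =
\begin{cases}
\Theta\!\left(n^{\log_b a}\right) & \text{if } f(n) = O\!\left(n^{\log_b a - \varepsilon}\right) \text{ for some } \varepsilon > 0,\\
\Theta\!\left(n^{\log_b a} \lg n\right) & \text{if } f(n) = \Theta\!\left(n^{\log_b a}\right),\\
\Theta(f(n)) & \text{if } f(n) = \Omega\!\left(n^{\log_b a + \varepsilon}\right) \text{ for some } \varepsilon > 0,\\
& \quad\text{and } a f(n/b) \le c f(n) \text{ for some } c < 1 \text{ and all large } n.
\end{cases}
\]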
Counting Sort
CountingSort(A, B, k)
    for i = 1 to k
        C[i] = 0
    for j = 1 to n
        C[A[j]] += 1
    for i = 2 to k
        C[i] = C[i] + C[i-1]
    for j = n downto 1
        B[C[A[j]]] = A[j]
        C[A[j]] -= 1
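The same algorithm in runnable Python (a sketch using 0-based lists, whereas the pseudocode above is 1-based):

def counting_sort(A, k):
    """Sort A, whose elements are drawn from {1, 2, ..., k}, in Theta(n + k) time."""
    n = len(A)
    B = [0] * n            # output array
    C = [0] * (k + 1)      # C[i] counts keys equal to i (index 0 unused)
    for x in A:            # loop 2: count each key
        C[x] += 1
    for i in range(2, k + 1):   # loop 3: prefix sums, C[i] = |{key <= i}|
        C[i] += C[i - 1]
    for x in reversed(A):  # loop 4: right-to-left placement keeps the sort stable
        B[C[x] - 1] = x    # -1 converts the 1-based rank to a 0-based index
        C[x] -= 1
    return B

print(counting_sort([4, 1, 3, 4, 3], k=4))  # [1, 3, 3, 4, 4]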
Sorting in linear time
Counting sort: no comparisons between elements.
• Input: A[1 . . n], where A[j] ∈ {1, 2, …, k}
• Output: B[1 . . n], sorted
• Auxiliary storage: C[1 . . k]
Counting-sort example
A = [4, 1, 3, 4, 3]   (n = 5, k = 4)

Loop 1: for i ← 1 to k do C[i] ← 0
    C = [0, 0, 0, 0]

Loop 2: for j ← 1 to n do C[A[j]] ← C[A[j]] + 1   ⊳ C[i] = |{key = i}|
    j = 1: C = [0, 0, 0, 1]
    j = 2: C = [1, 0, 0, 1]
    j = 3: C = [1, 0, 1, 1]
    j = 4: C = [1, 0, 1, 2]
    j = 5: C = [1, 0, 2, 2]

Loop 3: for i ← 2 to k do C[i] ← C[i] + C[i–1]   ⊳ C[i] = |{key ≤ i}|
    i = 2: C = [1, 1, 2, 2]
    i = 3: C = [1, 1, 3, 2]
    i = 4: C = [1, 1, 3, 5]

Loop 4: for j ← n downto 1 do B[C[A[j]]] ← A[j]; C[A[j]] ← C[A[j]] – 1
    j = 5: B[3] ← 3, C = [1, 1, 2, 5]
    j = 4: B[5] ← 4, C = [1, 1, 2, 4]
    j = 3: B[2] ← 3, C = [1, 1, 1, 4]
    j = 2: B[1] ← 1, C = [0, 1, 1, 4]
    j = 1: B[4] ← 4, C = [0, 1, 1, 3]

    B = [1, 3, 3, 4, 4]
Analysis
for i ← 1 to k do C[i] ← 0                                        Θ(k)
for j ← 1 to n do C[A[j]] ← C[A[j]] + 1                           Θ(n)
for i ← 2 to k do C[i] ← C[i] + C[i–1]                            Θ(k)
for j ← n downto 1 do B[C[A[j]]] ← A[j]; C[A[j]] ← C[A[j]] – 1    Θ(n)
Total: Θ(n + k)
Running time
If k = O(n), then counting sort takes Θ(n) time.
• But sorting takes Ω(n lg n) time!
• Where's the fallacy?
Answer:
• Comparison sorting takes Ω(n lg n) time
• Counting sort is not a comparison sort
• In fact, not a single comparison between elements occurs!
Stable sorting
Counting sort is a stable sort: it preserves the input order among equal elements.
A = [4, 1, 3, 4, 3]  →  B = [1, 3, 3, 4, 4]
(the first 3 in A comes before the second 3 in B, and likewise for the 4s)
Exercise: What other sorts have this property?
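To make stability visible, the same sort can be run on (key, label) records via a key function (a sketch reusing the counting-sort idea; the labels are hypothetical):

def counting_sort_by_key(items, k, key):
    """Stable counting sort of items whose key(item) lies in {1, ..., k}."""
    C = [0] * (k + 1)
    for it in items:              # count each key
        C[key(it)] += 1
    for i in range(2, k + 1):     # prefix sums: C[i] = |{key <= i}|
        C[i] += C[i - 1]
    B = [None] * len(items)
    for it in reversed(items):    # right-to-left pass preserves input order of equal keys
        C[key(it)] -= 1
        B[C[key(it)]] = it
    return B

records = [(4, 'a'), (1, 'b'), (3, 'c'), (4, 'd'), (3, 'e')]
print(counting_sort_by_key(records, 4, key=lambda r: r[0]))
# [(1, 'b'), (3, 'c'), (3, 'e'), (4, 'a'), (4, 'd')] — equal keys keep their input order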