T(n) = aT(n/b) + cn
     = a(aT(n/b^2) + cn/b) + cn
     = a^2 T(n/b^2) + cn(a/b) + cn
     = a^2 T(n/b^2) + cn(a/b + 1)
     = a^2 (aT(n/b^3) + cn/b^2) + cn(a/b + 1)
     = a^3 T(n/b^3) + cn(a^2/b^2) + cn(a/b + 1)
     = a^3 T(n/b^3) + cn(a^2/b^2 + a/b + 1)
     = …
     = a^k T(n/b^k) + cn(a^(k-1)/b^(k-1) + a^(k-2)/b^(k-2) + … + a^2/b^2 + a/b + 1)
So we have
• T(n) = a^k T(n/b^k) + cn(a^(k-1)/b^(k-1) + … + a^2/b^2 + a/b + 1)
• To stop the recursion, we should have
• n/b^k = 1, i.e. n = b^k, so k = log_b n
• With T(1) = c:
  T(n) = a^k T(1) + cn(a^(k-1)/b^(k-1) + … + a^2/b^2 + a/b + 1)
       = c·a^k + cn(a^(k-1)/b^(k-1) + … + a^2/b^2 + a/b + 1)
       = cn·(a^k/b^k) + cn(a^(k-1)/b^(k-1) + … + a^2/b^2 + a/b + 1)   // since n = b^k, a^k = n·a^k/b^k
       = cn(a^k/b^k + … + a^2/b^2 + a/b + 1)
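Not part of the original slides: a minimal Python sketch that evaluates the recurrence directly and compares it against the unrolled sum above, for n an exact power of b and base case T(1) = c. The function names T_recursive and T_closed_form are illustrative.

def T_recursive(n, a, b, c):
    # Direct evaluation of T(n) = a*T(n/b) + c*n with base case T(1) = c.
    if n == 1:
        return c
    return a * T_recursive(n // b, a, b, c) + c * n

def T_closed_form(n, a, b, c):
    # cn * (a^k/b^k + ... + a/b + 1) with k = log_b n (n assumed a power of b).
    k = 0
    m = n
    while m > 1:
        m //= b
        k += 1
    return c * n * sum((a / b) ** i for i in range(k + 1))

if __name__ == "__main__":
    # Check the two forms agree for several (a, b) pairs.
    for a, b in [(2, 2), (1, 2), (3, 2), (4, 3)]:
        n = b ** 6
        assert abs(T_recursive(n, a, b, 1) - T_closed_form(n, a, b, 1)) < 1e-6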
So with k = log_b n
• T(n) = cn(a^k/b^k + … + a^2/b^2 + a/b + 1)
• What if a = b?
• T(n) = cn(1 + 1 + … + 1)   // k + 1 terms
       = cn(k + 1)
       = cn(log_b n + 1)
       = Θ(n log_b n)
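A standard instance of the a = b case (added for illustration, not worked on the original slide): mergesort's recurrence T(n) = 2T(n/2) + cn has a = b = 2, so every term of the sum is 1 and T(n) = cn(log_2 n + 1) = Θ(n lg n).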
So with k = log_b n
• T(n) = cn(a^k/b^k + … + a^2/b^2 + a/b + 1)
• What if a < b?
• Recall that
• x^k + x^(k-1) + … + x + 1 = (x^(k+1) − 1)/(x − 1)
• With x = a/b < 1, this geometric sum is bounded by the constant 1/(1 − x), i.e. it is Θ(1)
• So: T(n) = cn · Θ(1) = Θ(n)
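An illustrative instance of the a < b case (not on the original slide): T(n) = T(n/2) + cn has a = 1 and b = 2, so x = a/b = 1/2 and 1 + 1/2 + … + 1/2^k = (1 − (1/2)^(k+1))/(1 − 1/2) < 2; hence T(n) = Θ(n).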
So with k = log_b n
• T(n) = cn(a^k/b^k + … + a^2/b^2 + a/b + 1)
• What if a > b?
• Now the geometric sum is dominated by its largest term a^k/b^k, so:
• T(n) = cn · Θ(a^k/b^k)
       = cn · Θ(a^(log_b n)/b^(log_b n))
       = cn · Θ(a^(log_b n)/n)          // recall the logarithm fact: a^(log_b n) = n^(log_b a)
       = cn · Θ(n^(log_b a)/n)
       = Θ(cn · n^(log_b a)/n)
       = Θ(n^(log_b a))
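A standard instance of the a > b case (added for illustration, not worked on the original slide): Karatsuba's multiplication recurrence T(n) = 3T(n/2) + cn has a = 3 and b = 2, so T(n) = Θ(n^(log_2 3)) ≈ Θ(n^1.585).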
The Master Method
• Based on the Master theorem.
• "Cookbook" approach for solving recurrences of the form T(n) = aT(n/b) + f(n)
• a ≥ 1 and b > 1 are constants.
• f(n) is asymptotically positive.
• Requires memorization of three cases.
The Master Theorem
• Let a ≥ 1 and b > 1 be constants, let f(n) be a function, and let T(n) be defined on the nonnegative integers by the recurrence T(n) = aT(n/b) + f(n), where we interpret n/b to mean either ⌊n/b⌋ or ⌈n/b⌉. T(n) can be bounded asymptotically in three cases:
• If f(n) = O(n^(log_b a − ε)) for some constant ε > 0, then T(n) = Θ(n^(log_b a)).
• If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) lg n).
• If f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and if a·f(n/b) ≤ c·f(n) for some constant c < 1 and all sufficiently large n, then T(n) = Θ(f(n)).
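Three standard textbook applications of the theorem (added for illustration, not from the original slide):
• T(n) = 9T(n/3) + n: here n^(log_3 9) = n^2 and f(n) = n = O(n^(2 − ε)) with ε = 1, so case 1 gives T(n) = Θ(n^2).
• T(n) = T(2n/3) + 1: here n^(log_(3/2) 1) = n^0 = 1 and f(n) = Θ(1), so case 2 gives T(n) = Θ(lg n).
• T(n) = 3T(n/4) + n lg n: here n^(log_4 3) = O(n^0.793) and f(n) = n lg n = Ω(n^(0.793 + ε)); the regularity condition holds since 3·(n/4)·lg(n/4) ≤ (3/4)·n·lg n, so case 3 gives T(n) = Θ(n lg n).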
The Master Theorem
• Given: a divide and conquer algorithm
• An algorithm that divides a problem of size n into a subproblems, each of size n/b
• Let the cost of each stage (i.e., the work to divide the problem plus the work to combine the solved subproblems) be described by the function f(n)
• Then the Master Theorem gives us a cookbook for the algorithm's running time:
The Master Theorem
• if T(n) = aT(n/b) + f(n) where a ≥ 1 and b > 1
• then
• T(n) = Θ(n^(log_b a)) if f(n) = O(n^(log_b a − ε)) for some constant ε > 0
• T(n) = Θ(n^(log_b a) lg n) if f(n) = Θ(n^(log_b a))
• T(n) = Θ(f(n)) if f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0 and a·f(n/b) ≤ c·f(n) for some constant c < 1
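Not from the original slides: a small Python sketch that mechanizes the cookbook for the common special case f(n) = Θ(n^d), where the comparison reduces to d versus log_b a (and the case 3 regularity condition holds automatically for such f). The function name master_theorem_poly and its output strings are illustrative.

import math

def master_theorem_poly(a, b, d):
    # Classify T(n) = a*T(n/b) + Theta(n^d) using the Master Theorem.
    # Only handles polynomial driving functions f(n) = Theta(n^d).
    crit = math.log(a, b)                  # critical exponent log_b(a)
    if math.isclose(d, crit):              # case 2: f(n) matches n^(log_b a)
        return f"Theta(n^{d} * lg n)"
    elif d < crit:                         # case 1: f(n) is polynomially smaller
        return f"Theta(n^{crit:.3f})"
    else:                                  # case 3: f(n) is polynomially larger
        return f"Theta(n^{d})"

print(master_theorem_poly(2, 2, 1))   # mergesort:     Theta(n^1 * lg n)
print(master_theorem_poly(1, 2, 0))   # binary search: Theta(n^0 * lg n) = Theta(lg n)
print(master_theorem_poly(3, 2, 1))   # Karatsuba:     Theta(n^1.585)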
Understanding the Master Theorem
• In each of the three cases we compare f(n) with n^(log_b a); the solution to the recurrence is determined by the larger of the two functions.
• In case 1, the function n^(log_b a) is the larger, and the solution is T(n) = Θ(n^(log_b a)).
• In case 3, the function f(n) is the larger, and the solution is T(n) = Θ(f(n)).
• In case 2, the two functions are the same size, and the solution is T(n) = Θ(n^(log_b a) lg n) = Θ(f(n) lg n).
Understanding the Master Theorem
• In case 1, not only must f(n) be smaller than n^(log_b a), it must be polynomially smaller. That is, f(n) must be asymptotically smaller than n^(log_b a) by a factor of n^ε for some constant ε > 0.
• In case 3, not only must f(n) be larger than n^(log_b a), it must be polynomially larger and in addition satisfy the "regularity" condition that a·f(n/b) ≤ c·f(n) for some constant c < 1.
Understanding the Master Theorem
• It is important to realize that the three cases do not cover all the possibilities for f(n).
• There is a gap between cases 1 and 2 when f(n) is smaller than n^(log_b a) but not polynomially smaller.
• There is a gap between cases 2 and 3 when f(n) is larger than n^(log_b a) but not polynomially larger.
• If f(n) falls into one of these gaps, or if the regularity condition in case 3 fails to hold, the master method cannot be used to solve the recurrence.
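A standard illustration of the gap (not on the original slides): T(n) = 2T(n/2) + n lg n falls between cases 2 and 3. Here n^(log_b a) = n and f(n) = n lg n is asymptotically larger than n, but the ratio f(n)/n = lg n grows more slowly than n^ε for every constant ε > 0, so f(n) is not polynomially larger and case 3 does not apply.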