This deck covers the running times of MergeSort and other sorting algorithms, as well as counting inversions and the closest pair of points problem. It also introduces the Master Theorem for analyzing recursive algorithms.
Running times continued • Some running times are more difficult to analyze • e.g., those of recursive algorithms • Example: MergeSort
Running times continued Merging two sorted lists Input: Two arrays A = {a1, a2, …, am} and B = {b1, b2, …, bn}, each in increasing order Output: Array C containing A ∪ B, in increasing order • MERGE(A,B) • i=1; j=1; k=1; • am+1=∞; bn+1=∞; • while (k ≤ m+n) do • if (ai < bj) then • ck=ai; i++; • else • ck=bj; j++; • k++; • RETURN C={c1, c2, …, cm+n} Running time ?
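The pseudocode above can be rendered directly in Python. This is an illustrative sketch, not the slides' own code: instead of the infinity sentinels am+1 = bn+1 = ∞, it checks the indices explicitly, which gives the same O(m+n) running time.

```python
# Merge two sorted lists in O(m+n) time, mirroring the MERGE pseudocode.
def merge(a, b):
    c = []
    i = j = 0
    # Each iteration appends exactly one element, so the loop runs m+n times.
    while i < len(a) or j < len(b):
        # Take from a when b is exhausted, or when a's head is smaller.
        if j == len(b) or (i < len(a) and a[i] < b[j]):
            c.append(a[i])
            i += 1
        else:
            c.append(b[j])
            j += 1
    return c
```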
Running times continued Sorting Input: An array X = {x1, x2, …, xn} Output: X sorted in increasing order MergeSort – a divide-and-conquer algorithm • MERGESORT(X,n) • if (n == 1) RETURN X • middle = n/2 (round down) • A = {x1, x2, …, xmiddle} • B = {xmiddle+1, xmiddle+2, …, xn} • As = MERGESORT(A,middle) • Bs = MERGESORT(B,n-middle) • RETURN MERGE(As,Bs)
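A self-contained Python sketch of MERGESORT (illustrative, not the slides' code; the linear-time merge is inlined so the function stands alone, and the base case also accepts the empty array):

```python
# Divide-and-conquer sort following the MERGESORT pseudocode.
def merge_sort(x):
    # Base case: zero or one element is already sorted.
    if len(x) <= 1:
        return x
    middle = len(x) // 2              # round down, as in the pseudocode
    a_sorted = merge_sort(x[:middle])
    b_sorted = merge_sort(x[middle:])
    # Linear-time merge of the two sorted halves (the MERGE step).
    c, i, j = [], 0, 0
    while i < len(a_sorted) or j < len(b_sorted):
        if j == len(b_sorted) or (i < len(a_sorted) and a_sorted[i] <= b_sorted[j]):
            c.append(a_sorted[i])
            i += 1
        else:
            c.append(b_sorted[j])
            j += 1
    return c
```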
A recurrence Running time of MergeSort: T(n) ≤ 2 T(n/2) + cn for n > 1, T(1) ≤ c How to bound T(n) ? → “unrolling the recurrence”
A recurrence Running time of MergeSort: T(n) ≤ 2 T(n/2) + cn for n > 1, T(1) ≤ c How to bound T(n) ? → “substitution / induction”
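Either method gives T(n) = O(n log n). As a quick numeric sanity check (my addition, not from the slides), we can evaluate the recurrence with c = 1 for powers of two and compare it against the closed form n log2 n + n that unrolling yields:

```python
# Evaluate T(n) = 2 T(n/2) + n with T(1) = 1 for n a power of two.
def T(n):
    return 1 if n == 1 else 2 * T(n // 2) + n

# Unrolling predicts T(2^k) = 2^k * k + 2^k, i.e., n log2 n + n.
for k in range(1, 11):
    n = 2 ** k
    assert T(n) == n * k + n
```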
More on sorting Other O(n log n) sorts ? (e.g., HeapSort; QuickSort in expectation) Can we do better than O(n log n) ?
A lower-bound on sorting: Ω(n log n) Every comparison-based sort needs Ω(n log n) comparisons, thus its running time is Ω(n log n).
Divide-and-conquer algorithms Counting inversions Input: A permutation a1,a2,…,an of 1,2,…,n Output: # inversions, i.e. (i,j) pairs where i<j but ai>aj
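Inversions can be counted in O(n log n) by piggybacking on merge sort: whenever an element of the right half is placed before remaining elements of the left half, it forms an inversion with each of them. The following is a standard sketch of this idea, not the slides' own code:

```python
# Count inversions while merge-sorting; returns (sorted list, #inversions).
def count_inversions(x):
    if len(x) <= 1:
        return x, 0
    mid = len(x) // 2
    left, inv_left = count_inversions(x[:mid])
    right, inv_right = count_inversions(x[mid:])
    merged, i, j = [], 0, 0
    inv = inv_left + inv_right
    while i < len(left) or j < len(right):
        if j == len(right) or (i < len(left) and left[i] <= right[j]):
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
            # right[j] precedes the len(left) - i elements still in the
            # left half; each such pair is an inversion.
            inv += len(left) - i
    return merged, inv
```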
Divide-and-conquer algorithms Closest Pair of Points Input: n points in the plane (x1,y1),…,(xn,yn) Output: distance of the closest pair of points, i.e., min over pairs i ≠ j of the distance between (xi,yi) and (xj,yj)
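As a baseline for comparison (a hypothetical helper of my own, not the slides' solution), the problem can be solved by brute force in O(n²) by checking all pairs; the divide-and-conquer algorithm improves this to O(n log n):

```python
import math

# Brute-force closest pair: check all n(n-1)/2 pairs, O(n^2) time.
def closest_pair_bruteforce(points):
    best = math.inf
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            # math.dist computes the Euclidean distance between two points.
            best = min(best, math.dist(points[i], points[j]))
    return best
```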
Master Theorem • Let a ≥ 1 and b > 1 be constants, let f(n) be a function, and suppose that for positive integers we have a recurrence for T of the form • T(n) = a T(n/b) + f(n), • where n/b is rounded either way. • Then, • If f(n) = O(n^(log a/log b − ε)) for some constant ε > 0, then • T(n) = Θ(n^(log a/log b)). • If f(n) = Θ(n^(log a/log b)), then T(n) = Θ(n^(log a/log b) log n). • If f(n) = Ω(n^(log a/log b + ε)) for some constant ε > 0, and if a f(n/b) ≤ c f(n) for some constant c < 1 (and all sufficiently large n), then • T(n) = Θ(f(n)).
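As a worked example (mine, not from the slides), MergeSort's recurrence falls into the second case: with a = 2 and b = 2 we have n^(log a/log b) = n, which matches f(n) exactly.

```latex
% Applying the Master Theorem to MergeSort: a = 2, b = 2, f(n) = n,
% so n^{\log a/\log b} = n^{\log_2 2} = n.
T(n) = 2\,T(n/2) + n,
\qquad f(n) = n = \Theta\!\left(n^{\log_2 2}\right)
\;\Longrightarrow\;
T(n) = \Theta(n \log n) \quad \text{(second case).}
```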