CS 3343: Analysis of Algorithms

Learn the asymptotic notations O, Ω, and Θ, how to analyze non-recursive algorithms, and how to compare growth rates using mathematical definitions and proofs. Understand the significance of logarithms, binary search, and trees in algorithm analysis, and explore examples and properties of the asymptotic notations.

Presentation Transcript


  1. CS 3343: Analysis of Algorithms Lecture 3: Asymptotic Notations, Analyzing non-recursive algorithms

  2. Outline • Review of last lecture • Continue on asymptotic notations • Analyzing non-recursive algorithms

  3. Mathematical definitions • O(g(n)) = {f(n): ∃ positive constants c and n₀ such that 0 ≤ f(n) ≤ c·g(n) ∀ n ≥ n₀} • Ω(g(n)) = {f(n): ∃ positive constants c and n₀ such that 0 ≤ c·g(n) ≤ f(n) ∀ n ≥ n₀} • Θ(g(n)) = {f(n): ∃ positive constants c₁, c₂, and n₀ such that 0 ≤ c₁·g(n) ≤ f(n) ≤ c₂·g(n) ∀ n ≥ n₀}

  4. Big-Oh • Claim: f(n) = 3n² + 10n + 5 ∈ O(n²) • Proof by definition. (Hint: we need to find c and n₀ such that f(n) ≤ c·n² for all n ≥ n₀. You can be sloppy about the constant factors: pick a comfortably large c when proving big-O, or a small one when proving big-Ω.) (Note: you only need to find one concrete pair c, n₀, but the inequality must hold for all n ≥ n₀, so do not just plug in one concrete value of n and show that the inequality holds.) Proof: 3n² + 10n + 5 ≤ 3n² + 10n² + 5 ∀ n ≥ 1, ≤ 3n² + 10n² + 5n² ∀ n ≥ 1, = 18n² ∀ n ≥ 1. If we let c = 18 and n₀ = 1, we have f(n) ≤ c·n² ∀ n ≥ n₀. Therefore, by definition, 3n² + 10n + 5 ∈ O(n²).
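
As a quick numerical sanity check (not a substitute for the proof, which must cover all n ≥ n₀), we can test the chosen constants in Python:

def f(n):
    return 3 * n**2 + 10 * n + 5

c, n0 = 18, 1
# Spot-check f(n) <= c*n^2 over a finite range; the proof above covers all n >= 1.
assert all(f(n) <= c * n**2 for n in range(n0, 10001))
print("f(n) <= 18*n^2 holds for all tested n in [1, 10000]")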

  5. How to prove log n < n • Let f(n) = 1 + log n and g(n) = n. Then f′(n) = 1/n and g′(n) = 1, and f(1) = g(1) = 1. • Because f′(n) ≤ g′(n) ∀ n ≥ 1, by the racetrack principle we have f(n) ≤ g(n) ∀ n ≥ 1, i.e., 1 + log n ≤ n. • Therefore, log n < 1 + log n ≤ n for all n ≥ 1. • From now on, we will use the fact that log n < n ∀ n ≥ 1 without proof.

  6. True or false? • 2n² + 1 = O(n²): T (also Θ) • sqrt(n) = O(log n): F (in fact ω) • log n = O(sqrt(n)): T (also o) • n²(1 + sqrt(n)) = O(n² log n): F (in fact ω) • 3n² + sqrt(n) = O(n²): T (also Θ) • sqrt(n)·log n = O(n): T (also o)

  8. Questions • If f(n) ∈ O(g(n)): • compare f(n) and f(n) + g(n) • compare g(n) and f(n) + g(n) • compare h(n)·f(n) and h(n)·g(n), where h(n) > 0 • How about f(n) ∈ Ω(g(n)) and f(n) ∈ Θ(g(n))?

  9. Asymptotic notations • O: ≤ • o: < • Ω: ≥ • ω: > • Θ: = (in terms of growth rate)

  10. O, Ω, and Θ • The definitions imply a constant n₀ beyond which they are satisfied. • We do not care about small values of n. • To prove big-O (or Ω, Θ) by definition: find constants c (or c₁ and c₂) and n₀ such that the definition is satisfied.

  11. Use limits to compare orders of growth • If lim_{n→∞} f(n)/g(n) = 0, then f(n) ∈ o(g(n)) (and also O(g(n))) • If lim_{n→∞} f(n)/g(n) = c > 0, then f(n) ∈ Θ(g(n)) (and also O(g(n)) and Ω(g(n))) • If lim_{n→∞} f(n)/g(n) = ∞, then f(n) ∈ ω(g(n)) (and also Ω(g(n)))
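
These limit tests are easy to check symbolically, for instance with SymPy (a sketch, assuming SymPy is installed):

from sympy import symbols, limit, log, sqrt, oo

n = symbols('n', positive=True)
print(limit(log(n) / sqrt(n), n, oo))     # 0  => log n is o(sqrt(n))
print(limit((2*n**2 + 1) / n**2, n, oo))  # 2  => 2n^2 + 1 is Theta(n^2)
print(limit(2**n / 3**n, n, oo))          # 0  => 2^n is o(3^n), as on the next slide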

  12. Examples • Compare 2ⁿ and 3ⁿ: lim_{n→∞} 2ⁿ/3ⁿ = lim_{n→∞} (2/3)ⁿ = 0 • Therefore, 2ⁿ ∈ o(3ⁿ) and 3ⁿ ∈ ω(2ⁿ) • How about 2ⁿ and 2ⁿ⁺¹? 2ⁿ/2ⁿ⁺¹ = ½ for all n, therefore 2ⁿ = Θ(2ⁿ⁺¹)

  13. L'Hôpital's rule • Condition: if lim_{n→∞} f(n) and lim_{n→∞} g(n) are both ∞ or both 0, then lim_{n→∞} f(n)/g(n) = lim_{n→∞} f′(n)/g′(n)

  14. Example • Compare n^0.5 and log n: lim_{n→∞} n^0.5 / log n = ? • (n^0.5)′ = 0.5·n^(−0.5) • (log n)′ = 1/n • lim_{n→∞} (0.5·n^(−0.5)) / (1/n) = lim_{n→∞} 0.5·n^0.5 = ∞ • Therefore, log n ∈ o(n^0.5) • In fact, log n ∈ o(n^ε) for any ε > 0

  15. Stirling's formula • n! ≈ √(2πn)·(n/e)ⁿ • Equivalently, n! = Θ(√n·(n/e)ⁿ), where the hidden constant is √(2π)

  16. Examples • Compare 2ⁿ and n!: by Stirling's formula, 2ⁿ/n! → 0 as n → ∞. Therefore, 2ⁿ = o(n!). • Compare nⁿ and n!: nⁿ/n! ≈ eⁿ/√(2πn) → ∞. Therefore, nⁿ = ω(n!). • How about log(n!)?
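
To see these claims numerically, math.lgamma(n + 1) gives ln(n!) without overflow, so we can watch ln(2ⁿ/n!) diverge to −∞ and ln(n!)/(n ln n) approach 1:

import math

for n in (10, 100, 1000):
    ln_fact = math.lgamma(n + 1)          # ln(n!)
    print(n,
          n * math.log(2) - ln_fact,      # ln(2^n / n!): goes to -infinity
          ln_fact / (n * math.log(n)))    # ln(n!) / (n ln n): approaches 1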

  17. Example

  18. Properties of asymptotic notations • Textbook page 51 • Transitivity: f(n) = Θ(g(n)) and g(n) = Θ(h(n)) ⇒ f(n) = Θ(h(n)) (holds for o, O, Ω, and ω as well) • Symmetry: f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n)) • Transpose symmetry: f(n) = O(g(n)) if and only if g(n) = Ω(f(n)); f(n) = o(g(n)) if and only if g(n) = ω(f(n))

  19. About exponential and logarithm functions • Textbook pages 55-56 • It is important to understand what logarithms are and where they come from. • A logarithm is simply an inverse exponential function: saying bˣ = y is equivalent to saying x = log_b y. • Logarithms reflect how many times we can double something until we get to n, or halve something until we get to 1. • log₂ 1 = ? • log₂ 2 = ?

  20. Binary Search • In binary search we throw away half of the remaining keys after each comparison. • How many times can we halve n before getting to 1? • Answer: ⌈lg n⌉
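
A minimal sketch of binary search, instrumented to count probes (about lg n of them in the worst case):

def binary_search(a, key):
    # Return (index of key in sorted list a, number of probes); index is -1 if absent.
    lo, hi, probes = 0, len(a) - 1, 0
    while lo <= hi:
        probes += 1
        mid = (lo + hi) // 2
        if a[mid] == key:
            return mid, probes
        elif a[mid] < key:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, probes

idx, probes = binary_search(list(range(10**6)), -1)
print(idx, probes)   # -1 and about lg(10^6) ≈ 20 probes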

  21. Logarithms and Trees • How tall a binary tree do we need until we have n leaves? • The number of potential leaves doubles with each level. • How many times can we double 1 until we get to n? • Answer: ⌈lg n⌉

  22. Logarithms and Bits • How many numbers can you represent with k bits? • Each bit you add doubles the number of possible bit patterns. • With k bits you can represent 0 through 2ᵏ − 1: a total of 2ᵏ numbers. • How many bits do you need to represent the numbers from 0 to n? • Answer: ⌈lg(n + 1)⌉
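
In Python, int.bit_length() computes exactly this; a quick check of the ⌈lg(n + 1)⌉ formula:

import math

for n in (1, 7, 8, 255, 256, 1000):
    # Bits needed to represent 0..n is ceil(lg(n + 1)) = n.bit_length() for n >= 1.
    assert n.bit_length() == math.ceil(math.log2(n + 1))
    print(n, n.bit_length())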

  23. Logarithms • lg n = log₂ n • ln n = logₑ n, where e ≈ 2.718 • lgᵏ n = (lg n)ᵏ • lg lg n = lg(lg n) = lg⁽²⁾ n • lg⁽ᵏ⁾ n = lg lg … lg n (lg applied k times) • lg² 4 = ? • lg⁽²⁾ 4 = ? • Compare lgᵏ n vs. lg⁽ᵏ⁾ n

  24. Useful rules for logarithms • For all a > 0, b > 0, c > 0, the following hold: • log_b a = log_c a / log_c b = lg a / lg b • log_b aⁿ = n·log_b a • b^(log_b a) = a • log(ab) = log a + log b • lg(2n) = ? • log(a/b) = log a − log b • lg(n/2) = ? • lg(1/n) = ? • log_b a = 1 / log_a b
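
A few of these rules verified numerically (a spot check, not a proof):

import math

a, b, c = 8.0, 2.0, 10.0
assert math.isclose(math.log(a, b), math.log(a, c) / math.log(b, c))  # change of base
assert math.isclose(b ** math.log(a, b), a)                           # b^(log_b a) = a
assert math.isclose(math.log(a * b), math.log(a) + math.log(b))       # log(ab) = log a + log b
assert math.isclose(math.log(a, b), 1 / math.log(b, a))               # log_b a = 1 / log_a b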

  25. Useful rules for exponentials • For all a > 0, b > 0, c > 0, the following hold: • a⁰ = 1 (0⁰ = ?) • a¹ = a • a⁻¹ = 1/a • (aᵐ)ⁿ = aᵐⁿ • (aᵐ)ⁿ = (aⁿ)ᵐ • aᵐ·aⁿ = aᵐ⁺ⁿ

  26. More advanced dominance ranking • n! ≫ 2ⁿ ≫ n³ ≫ n² ≫ n log n ≫ n ≫ √n ≫ log² n ≫ log n ≫ log log n ≫ 1 • Each function in the chain grows strictly faster than everything to its right

  27. Analyzing the complexity of an algorithm

  28. Kinds of analyses • Worst case: provides an upper bound on running time • Best case: not very useful, since an algorithm can always "cheat" on a favorable input • Average case: provides the expected running time • Very useful, but treat with care: what is "average"?

  29. General plan for analyzing the time efficiency of a non-recursive algorithm • Decide on a parameter measuring input size • Identify the most frequently executed line (the basic operation) • Ask: is the worst case the same as the average case? • T(n) = Σᵢ tᵢ • Conclude T(n) = Θ(f(n))

  30. Example repeatedElement(A, n) // determines whether a given array contains repeated elements
  for i = 1 to n-1 {
      for j = i+1 to n {
          if (A[i] == A[j]) return true;
      }
  }
  return false;
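
The same algorithm as a runnable Python sketch (0-indexed; the function name is mine, not the textbook's):

def repeated_element(a):
    # Return True if the array contains a repeated element; O(n^2) in the worst case.
    n = len(a)
    for i in range(n - 1):
        for j in range(i + 1, n):
            if a[i] == a[j]:
                return True
    return False

print(repeated_element([3, 1, 4, 1, 5]))  # True: 1 appears twice
print(repeated_element([3, 1, 4, 2, 5]))  # False: all distinct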

  32. Best case? • Worst case? • Average case?

  33. Best case • A[1] = A[2]: T(n) = Θ(1) • Worst case • No repeated elements: T(n) = (n−1) + (n−2) + … + 1 = n(n−1)/2 = Θ(n²) • Average case? • What do you mean by "average"? • We need more assumptions about the data distribution: how many repeats do we expect in the data? • Average-case analysis often involves probability.
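
Instrumenting the double loop confirms both counts: 1 comparison in the best case, n(n−1)/2 in the worst:

def count_comparisons(a):
    # Same double loop as repeated_element, counting the A[i] == A[j] tests.
    n, count = len(a), 0
    for i in range(n - 1):
        for j in range(i + 1, n):
            count += 1
            if a[i] == a[j]:
                return count
    return count

n = 100
assert count_comparisons(list(range(n))) == n * (n - 1) // 2  # worst case: no repeats
assert count_comparisons([7, 7] + list(range(n))) == 1        # best case: A[1] == A[2]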

  34. Find the order of growth for sums • T(n) = Σᵢ₌₁ⁿ i = Θ(n²) • T(n) = Σᵢ₌₁ⁿ log i = ? • T(n) = Σᵢ₌₁ⁿ n/2ⁱ = ? • T(n) = Σᵢ₌₁ⁿ 2ⁱ = ? • … • How do we find the actual order of growth? • Math… • Textbook Appendix A.1 (pages 1058-60)

  35. Arithmetic series • An arithmetic series is a sequence of numbers such that the difference between any two successive members is a constant, e.g., 1, 2, 3, 4, 5 or 10, 12, 14, 16, 18, 20 • Recursive definition: aₙ = aₙ₋₁ + d, with a₁ given • Closed form (explicit formula): aₙ = a₁ + (n − 1)·d, or, with the index starting at 0, aₙ = a₀ + n·d

  36. Sum of arithmetic series • If a₁, a₂, …, aₙ is an arithmetic series, then Σᵢ₌₁ⁿ aᵢ = n·(a₁ + aₙ)/2 • e.g., 1 + 3 + 5 + 7 + … + 99 = ? (series definition: aᵢ = 2i − 1, so there are 50 terms) • Σᵢ₌₁⁵⁰ aᵢ = 50 × (1 + 99)/2 = 2500
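
Checking the example in Python:

# 1 + 3 + 5 + ... + 99: 50 terms, a_i = 2i - 1.
odds = list(range(1, 100, 2))
assert len(odds) == 50
assert sum(odds) == 50 * (1 + 99) // 2 == 2500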

  37. Geometric series • A geometric series is a sequence of numbers such that the ratio between any two successive members is a constant, e.g., 1, 2, 4, 8, 16, 32 or 10, 20, 40, 80, 160 or 1, ½, ¼, ⅛, 1/16 • Recursive definition: aₙ = r·aₙ₋₁, with a₀ given • Closed form (explicit formula): aₙ = a₀·rⁿ

  38. Sum of geometric series • For r ≠ 1: Σᵢ₌₀ⁿ rⁱ = (rⁿ⁺¹ − 1)/(r − 1) • If r < 1: the sum is bounded by 1/(1 − r), i.e., Θ(1) • If r > 1: the sum is Θ(rⁿ) • If r = 1: the sum is n + 1, i.e., Θ(n)
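
The three regimes, observed numerically:

def geom_sum(r, n):
    # sum_{i=0}^{n} r^i, computed directly.
    return sum(r**i for i in range(n + 1))

for n in (10, 20, 40):
    print(n,
          geom_sum(0.5, n),  # r < 1: approaches 1/(1 - r) = 2, i.e., Theta(1)
          geom_sum(2, n),    # r > 1: grows like 2^(n+1), i.e., Theta(r^n)
          geom_sum(1, n))    # r = 1: exactly n + 1, i.e., Theta(n)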

  40. Important formulas

  41. Sum manipulation rules • Σᵢ c·aᵢ = c·Σᵢ aᵢ • Σᵢ (aᵢ + bᵢ) = Σᵢ aᵢ + Σᵢ bᵢ • Σᵢ₌ₘⁿ aᵢ = Σᵢ₌ₘᵏ aᵢ + Σᵢ₌ₖ₊₁ⁿ aᵢ for m ≤ k < n • Example: Σᵢ₌₁ⁿ n/2ⁱ = n·Σᵢ₌₁ⁿ (½)ⁱ (worked out on the next slide)

  43. Σᵢ₌₁ⁿ n/2ⁱ = n·Σᵢ₌₁ⁿ (½)ⁱ = ? • Using the formula for geometric series: Σᵢ₌₀ⁿ (½)ⁱ = 1 + ½ + ¼ + … + (½)ⁿ < 2 • Therefore the sum is less than 2n, i.e., Θ(n) • Application: algorithms for allocating dynamic memory
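
A quick numeric check that the sum stays just below n, so Θ(n) is tight:

for n in (10, 100, 1000):
    total = sum(n * 0.5**i for i in range(1, n + 1))
    print(n, total)   # equals n * (1 - 0.5^n): just under n, hence Theta(n)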

  44. Σᵢ₌₁ⁿ log i = log 1 + log 2 + … + log n = log(1 × 2 × 3 × … × n) = log n! = Θ(n log n) • Application: selection sort using a priority queue

  45. Recursive definition of sum of series • T(n) = Σᵢ₌₀ⁿ i is equivalent to: T(n) = T(n−1) + n (recurrence), with T(0) = 0 (boundary condition) • T(n) = Σᵢ₌₀ⁿ aⁱ is equivalent to: T(n) = T(n−1) + aⁿ, with T(0) = 1 • A recursive definition is often intuitive and easy to obtain. It is very useful in analyzing recursive algorithms, and some non-recursive algorithms too.
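
For example, the first recurrence, implemented directly, reproduces the closed form n(n + 1)/2:

def T(n):
    # T(n) = T(n-1) + n, with boundary condition T(0) = 0.
    return 0 if n == 0 else T(n - 1) + n

assert T(100) == 100 * 101 // 2   # sum_{i=0}^{100} i = 5050
print(T(100))                     # 5050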

  46. Recursive definition of sum of series • How do we solve such a recurrence or, more generally, a recurrence of the form: • T(n) = a·T(n−b) + f(n), or • T(n) = a·T(n/b) + f(n)?
