Understand algorithm analysis, order of growth, runtime analysis, and asymptotic notations in advanced algorithms. Learn to analyze code efficiency and runtime complexities. Mathematical induction and common functions are also covered.
CMPS 256: Advanced Algorithms and Data Structures Growth of Functions
Credits Slides based on: • Monica Nicolescu @ UNR • F. Abu Salem, J. Boulos, W. Keyrouz, & George Turkiyyah @ AUB
Algorithm Analysis • The amount of resources used by the algorithm • Space • Computational time • Running time: • The number of primitive operations (steps) executed before termination • Order of growth • The leading term of a formula • Expresses the behavior of a function toward infinity
Runtime Analysis • In this course, we will typically use T(n) to refer to worst-case running time, unless otherwise noted • It is difficult to determine T(n) experimentally • Too many possible inputs • Don't know a priori which input leads to worst-case behavior • We will instead determine T(n) theoretically, by analyzing/counting the number of primitive operations in the algorithm's pseudocode
Order of growth
• Alg.: MIN(a[1], …, a[n])
m ← a[1]
for i ← 2 to n
if a[i] < m then m ← a[i]
• Running time: T(n) = 1 [first assignment] + n [loop-header tests] + (n − 1) [if comparisons] + (n − 1) [then-assignments, worst case] = 3n − 1
• Order (rate) of growth:
• The leading term of the formula
• Gives a simple characterization of the algorithm's efficiency
• Expresses the asymptotic behavior of the algorithm
• T(n) grows like n
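The operation count above can be checked by instrumenting the pseudocode. A minimal Python sketch (the function name and counting scheme are illustrative, not from the slides):

```python
def min_with_count(a):
    """MIN from the slide, instrumented to count primitive operations.

    Counts: the initial assignment (1), each loop-header test (n in
    total, including the final failing one), each comparison (n - 1),
    and the then-assignment, charged every iteration to model the
    worst case (n - 1). Total: 1 + n + (n - 1) + (n - 1) = 3n - 1.
    """
    n = len(a)
    ops = 1                      # m <- a[1]
    m = a[0]
    for i in range(1, n):
        ops += 1                 # loop-header test admitting this iteration
        ops += 1                 # comparison a[i] < m
        if a[i] < m:
            m = a[i]
        ops += 1                 # then-assignment (worst-case charge)
    ops += 1                     # final loop-header test that ends the loop
    return m, ops
```

On a strictly decreasing input every charge is real, and the count equals 3n − 1.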
Analyzing Code
• Conditional statements: if C then S1 else S2
time ≤ time(C) + max(time(S1), time(S2))
• Loops: for i ← a1 to a2 do S
time = Σ time(S(i)) over all i from a1 to a2
Analyzing Code
• Nested loops
for i = 1 to n do
for j = 1 to n do
sum = sum + 1
• Running time = number of iterations × time per iteration = n · n constant-time iterations ⟹ Θ(n²)
Analyzing Code
• Nested loops
for i = 1 to n do
for j = i to n do
sum = sum + 1
• Number of iterations = Σ (n − i + 1) for i = 1 to n = n(n + 1)/2 ⟹ Θ(n²)
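The difference between the two nested-loop patterns is easy to see by counting iterations directly; a small Python sketch (function names are illustrative):

```python
def count_square(n):
    # for i = 1 to n: for j = 1 to n  ->  n * n iterations
    count = 0
    for i in range(1, n + 1):
        for j in range(1, n + 1):
            count += 1
    return count


def count_triangle(n):
    # for i = 1 to n: for j = i to n  ->  n + (n-1) + ... + 1 = n(n+1)/2
    count = 0
    for i in range(1, n + 1):
        for j in range(i, n + 1):
            count += 1
    return count
```

Both are Θ(n²), even though the triangular version does roughly half the work.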
Exercise • A more complicated example for ( i=0; i<=n²; i++) for ( j=i; j>=1; j=j/5) sum = sum + 1 T(n) = ?
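One way to build intuition before deriving T(n) is to count the iterations empirically. A Python sketch, assuming j = j/5 is integer division (each pass of the inner loop divides j by 5, so for i ≥ 1 it runs about log₅ i + 1 times, giving T(n) = Θ(n² log n) overall):

```python
def count_iterations(n):
    """Count inner-loop executions of the exercise's nested loops."""
    count = 0
    for i in range(0, n * n + 1):    # i = 0 .. n^2
        j = i
        while j >= 1:                # j = i; j >= 1; j = j / 5
            count += 1
            j //= 5                  # assuming integer division
    return count
```

For n = 10 the outer loop covers i = 0..100; every i ≥ 1 runs the inner loop at least once and at most ⌊log₅ 100⌋ + 1 = 3 times.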
Typical Running Time Functions • 1 (constant running time): • Instructions are executed once or a few times • log N (logarithmic) • A big problem is solved by cutting the original problem down by a constant fraction at each step • N (linear) • A small amount of processing is done on each input element • N log N (log-linear) • A problem is solved by dividing it into smaller problems, solving them independently and combining the solutions
Typical Running Time Functions • N^(1+c), where c is a constant satisfying 0 < c < 1 (super-linear) • N² (quadratic) • Typical for algorithms that process all pairs of data items (double nested loops) • N³ (cubic) • Processing of triples of data (triple nested loops) • N^k, k a constant (polynomial) • k^N (exponential) • Few exponential algorithms are appropriate for practical use
Logarithms
• In algorithm analysis we often use the notation "log n" without specifying the base
• Binary logarithm: lg n = log₂ n
• Natural logarithm: ln n = logₑ n
• Changing the base only changes the value by a constant factor: log_b n = log_a n / log_a b
Some Common Functions
• Floors and ceilings
• ⌊x⌋: the greatest integer less than or equal to x
• ⌈x⌉: the least integer greater than or equal to x
• x − 1 < ⌊x⌋ ≤ x ≤ ⌈x⌉ < x + 1
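These identities are easy to spot-check with the standard library; a small sketch:

```python
import math


def floor_ceil_sandwich(x):
    """Check x - 1 < floor(x) <= x <= ceil(x) < x + 1."""
    return x - 1 < math.floor(x) <= x <= math.ceil(x) < x + 1
```

Note that for negative arguments the floor still rounds toward −∞ (e.g. ⌊−2.3⌋ = −3), which is exactly what the inequality requires.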
Some Simple Summation Formulas
• Arithmetic series: Σ k (k = 1 to n) = n(n + 1)/2
• Geometric series: Σ x^k (k = 0 to n) = (x^(n+1) − 1)/(x − 1), for x ≠ 1
• Special case, |x| < 1: Σ x^k (k = 0 to ∞) = 1/(1 − x)
• Harmonic series: Σ 1/k (k = 1 to n) = ln n + O(1)
• Other important formulae:
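The closed forms can be verified against brute-force sums; a Python sketch (function names are illustrative):

```python
import math


def arithmetic_sum(n):
    return n * (n + 1) // 2                        # sum of 1..n


def geometric_sum(x, n):
    return (x ** (n + 1) - 1) / (x - 1)            # sum of x^0..x^n, x != 1


def harmonic_sum(n):
    return sum(1.0 / k for k in range(1, n + 1))   # = ln n + O(1)
```

The harmonic check below uses the fact that H(n) − ln n converges to Euler's constant γ ≈ 0.577, so H(n) lies between ln n and ln n + 1.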
Mathematical Induction • Used to prove a sequence of statements (S(1), S(2), … S(n)) indexed by positive integers • Proof: • Basis step: prove that the statement is true for n = 1 • Inductive step: assume that S(n) is true and prove that S(n+1) is true for all n ≥ 1 • The key to mathematical induction is to find case n “within” case n+1
Example
• Prove that: 2n + 1 ≤ 2^n for all n ≥ 3
• Basis step:
• n = 3: 2·3 + 1 ≤ 2³, i.e. 7 ≤ 8 TRUE
• Inductive step:
• Assume the inequality is true for n, and prove it for (n + 1):
• Hypothesis: 2n + 1 ≤ 2^n; must prove: 2(n + 1) + 1 ≤ 2^(n+1)
• 2(n + 1) + 1 = (2n + 1) + 2 ≤ 2^n + 2 (by the induction hypothesis) ≤ 2^n + 2^n = 2^(n+1) (since 2 ≤ 2^n for n ≥ 1)
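Induction proves the claim for all n ≥ 3; a quick numeric spot-check of small cases (a sanity check, not a proof):

```python
def inequality_holds(n):
    """True iff 2n + 1 <= 2^n."""
    return 2 * n + 1 <= 2 ** n


# fails below the base case, holds from n = 3 onward
small = [inequality_holds(n) for n in range(1, 6)]
```

The failures at n = 1 and n = 2 show why the basis step starts at n = 3 rather than n = 1.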
Another Example • Prove that: • Basis step: • n = 1: • Inductive step: • Assume inequality is true for n, and prove it is true for (n+1): • Assume and prove:
Asymptotic Notations • A way to describe the behavior of functions in the limit • How we indicate running times of algorithms • Describe the running time of an algorithm as n grows to ∞ • O-notation: asymptotic "less than": f(n) "≤" g(n) • Ω-notation: asymptotic "greater than": f(n) "≥" g(n) • Θ-notation: asymptotic "equality": f(n) "=" g(n)
Asymptotic notations
• O-notation: O(g(n)) = { f(n) : there exist positive constants c and n₀ such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀ }
• Intuitively: O(g(n)) = the set of functions with a smaller or same order of growth as g(n)
• An implicit assumption in order notation is that f(n) is an asymptotically nonnegative function, i.e. f(n) ≥ 0 for sufficiently large n
Examples
• 2n² = O(n³): need 2n² ≤ c·n³, i.e. 2 ≤ c·n; possibly c = 1 and n₀ = 2
• n² = O(n²): n² ≤ c·n² for any c ≥ 1; possibly c = 1 and n₀ = 1
• 1000n² + 1000n = O(n²): 1000n² + 1000n ≤ 1001n² whenever 1000n ≤ n²; possibly c = 1001 and n₀ = 1000
• n = O(n²): n ≤ c·n² iff c·n ≥ 1; possibly c = 1 and n₀ = 1
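Each claim above names a witness pair (c, n₀). A Python sketch of a checker that tests such a pair on a sampled range (evidence, not a proof, since the definition quantifies over all n ≥ n₀; names are illustrative):

```python
def big_o_witness(f, g, c, n0, n_max=10_000):
    """Check 0 <= f(n) <= c * g(n) for every sampled n in [n0, n_max)."""
    return all(0 <= f(n) <= c * g(n) for n in range(n0, n_max))
```

The witnesses from the slide pass, while a false claim such as n² = O(n) fails for any fixed c once n exceeds c.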
Examples
• E.g.: prove that n² ≠ O(n)
• Assume there exist c and n₀ such that n² ≤ c·n for all n ≥ n₀
• Choose n > max(n₀, c): then n² = n·n > c·n — contradiction!
Comments • Our textbook uses "asymptotic notation" and "order notation" interchangeably • Order notation can be interpreted in terms of set membership, e.g., • T(n) = O(f(n)) is equivalent to T(n) ∈ O(f(n)) • O(f(n)) is said to be "the set of all functions for which f(n) is an asymptotic upper bound"
Asymptotic notations (cont.)
• Ω-notation: Ω(g(n)) = { f(n) : there exist positive constants c and n₀ such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n₀ }
• Intuitively: Ω(g(n)) = the set of functions with a larger or same order of growth as g(n)
Examples
• 5n² = Ω(n): need c, n₀ such that 0 ≤ c·n ≤ 5n² for all n ≥ n₀; possibly c = 1 and n₀ = 1
• 100n + 5 ≠ Ω(n²): suppose there exist c, n₀ such that 0 ≤ c·n² ≤ 100n + 5 for all n ≥ n₀; since 100n + 5 ≤ 100n + 5n = 105n (for n ≥ 1), we get c·n² ≤ 105n, i.e. n(c·n − 105) ≤ 0, so n ≤ 105/c — contradiction: n cannot be bounded by a constant
• Also: n = Ω(2n), n³ = Ω(n²), n = Ω(log n)
Asymptotic notations (cont.)
• Θ-notation: Θ(g(n)) = { f(n) : there exist positive constants c₁, c₂, and n₀ such that 0 ≤ c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all n ≥ n₀ }
• Intuitively: Θ(g(n)) = the set of functions with the same order of growth as g(n)
Examples
• n²/2 − n/2 = Θ(n²):
• Upper bound: ½n² − ½n ≤ ½n² for all n ≥ 0 ⟹ c₂ = ½
• Lower bound: for n ≥ 2, ½n ≤ ¼n², so ½n² − ½n ≥ ½n² − ¼n² = ¼n² ⟹ c₁ = ¼
• n ≠ Θ(n²): c₁·n² ≤ n ≤ c₂·n² only holds for n ≤ 1/c₁
• 6n³ ≠ Θ(n²): c₁·n² ≤ 6n³ ≤ c₂·n² only holds for n ≤ c₂/6
• n ≠ Θ(log n): c₁·log n ≤ n ≤ c₂·log n requires c₂ ≥ n/log n for all n ≥ n₀ — impossible
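The two-sided Θ bound can be spot-checked the same way as an O witness; a sketch using the constants derived above (c₁ = ¼, c₂ = ½, n₀ = 2; evidence on a sampled range, not a proof):

```python
def theta_witness(f, g, c1, c2, n0, n_max=10_000):
    """Check c1 * g(n) <= f(n) <= c2 * g(n) on sampled n >= n0."""
    return all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, n_max))
```

A function of strictly smaller order, such as f(n) = n against g(n) = n², escapes the sandwich once n is large enough.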
More on Asymptotic Notations
• There is no unique choice of n₀ and c in proving an asymptotic bound
• Prove that 100n + 5 = O(n²):
• 100n + 5 ≤ 100n + n = 101n ≤ 101n² for all n ≥ 5 ⟹ n₀ = 5 and c = 101 is a solution
• 100n + 5 ≤ 100n + 5n = 105n ≤ 105n² for all n ≥ 1 ⟹ n₀ = 1 and c = 105 is also a solution
• We must find SOME constants c and n₀ that satisfy the asymptotic notation relation
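Both witness pairs can be checked mechanically; a small sketch for 100n + 5 = O(n²) (sampled range, so evidence rather than proof):

```python
def witness_ok(c, n0, n_max=10_000):
    """Check 100n + 5 <= c * n^2 for every sampled n in [n0, n_max)."""
    return all(100 * n + 5 <= c * n * n for n in range(n0, n_max))
```

Both (c, n₀) = (101, 5) and (105, 1) pass, illustrating that the constants are not unique; c = 1 with n₀ = 1 does not.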
Asymptotic Notations - Examples
• Θ-notation
• n²/2 − n/2 = Θ(n²)
• (6n³ + 1)·lg n/(n + 1) = Θ(n²·lg n)
• n ≠ Θ(n²)
• Ω-notation
• n = Ω(2n)
• n³ = Ω(n²)
• n = Ω(log n)
• n ≠ Ω(n²)
• O-notation
• 2n² = O(n³)
• n² = O(n²)
• n³ ≠ O(n·lg n)
Comparisons of Functions
• Theorem: f(n) = Θ(g(n)) ⟺ f(n) = O(g(n)) and f(n) = Ω(g(n))
• Transitivity:
• f(n) = Θ(g(n)) and g(n) = Θ(h(n)) ⟹ f(n) = Θ(h(n))
• Same for O and Ω
• Reflexivity:
• f(n) = Θ(f(n))
• Same for O and Ω
• Symmetry:
• f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n))
• Transpose symmetry:
• f(n) = O(g(n)) if and only if g(n) = Ω(f(n))
Asymptotic Notations - Examples
• For each of the following pairs of functions, either f(n) is O(g(n)), f(n) is Ω(g(n)), or f(n) = Θ(g(n)). Determine which relationship is correct.
• f(n) = log n²; g(n) = log n + 5 ⟹ f(n) = Θ(g(n))
• f(n) = n; g(n) = log n² ⟹ f(n) = Ω(g(n))
• f(n) = log log n; g(n) = log n ⟹ f(n) = O(g(n))
• f(n) = n; g(n) = log² n ⟹ f(n) = Ω(g(n))
• f(n) = n log n + n; g(n) = log n ⟹ f(n) = Ω(g(n))
• f(n) = 10; g(n) = log 10 ⟹ f(n) = Θ(g(n))
• f(n) = 2^n; g(n) = 10n² ⟹ f(n) = Ω(g(n))
• f(n) = 2^n; g(n) = 3^n ⟹ f(n) = O(g(n))
Asymptotic Notations in Equations
• On the right-hand side
• Θ(n²) stands for some anonymous function in the set Θ(n²)
• 2n² + 3n + 1 = 2n² + Θ(n) means: there exists a function f(n) ∈ Θ(n) with the desired properties such that 2n² + 3n + 1 = 2n² + f(n)
• On the left-hand side
• 2n² + Θ(n) = Θ(n²) means: no matter how the anonymous function is chosen on the left-hand side, there is a way to choose the anonymous function on the right-hand side to make the equation valid
Limits and Comparisons of Functions Using limits for comparing orders of growth:
• If lim (n→∞) f(n)/g(n) = c with 0 < c < ∞, then f(n) = Θ(g(n))
• If the limit is 0, then f(n) = O(g(n)) but f(n) ≠ Ω(g(n)); if the limit is ∞, then f(n) = Ω(g(n)) but f(n) ≠ O(g(n))
• Example: compare ½n(n − 1) and n²: lim (n→∞) ½n(n − 1)/n² = ½, so ½n(n − 1) = Θ(n²)
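The limit test can be approximated numerically by evaluating the ratio at a large n; a quick sketch for the example above:

```python
def growth_ratio(f, g, n):
    """Evaluate f(n)/g(n) at a single large n to estimate the limit."""
    return f(n) / g(n)


# f(n) = n(n - 1)/2 versus g(n) = n^2: the ratio tends to 1/2,
# so f(n) = Theta(n^2)
f = lambda n: n * (n - 1) / 2
g = lambda n: n * n
```

A ratio that drifts toward 0 instead signals strict O, as for f(n) = n against g(n) = n².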
Limits and Comparisons of Functions
• L'Hôpital's rule: if f(n) → ∞ and g(n) → ∞, then lim (n→∞) f(n)/g(n) = lim (n→∞) f′(n)/g′(n), provided the latter limit exists
• compare lg n and
Some Common Functions
• Factorials: n! = 1 · 2 · … · n
• Stirling's approximation: n! = √(2πn) · (n/e)^n · (1 + Θ(1/n))
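Stirling's approximation is accurate even for modest n; a quick Python check (the Θ(1/n) correction term is dropped, so the sketch slightly underestimates n!):

```python
import math


def stirling(n):
    """Stirling's approximation: n! ~ sqrt(2*pi*n) * (n/e)^n."""
    return math.sqrt(2 * math.pi * n) * (n / math.e) ** n
```

Consistent with the Θ(1/n) error term, the relative error is already under 1% at n = 20 and keeps shrinking as n grows.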
Exercises
• Is this true: f(n) = O(g(n)) ⟹ f(n) + g(n) = Ω(f(n))? If so, give a proof that uses the formal definitions of O, Ω, and Θ
• Order these functions asymptotically from slowest to fastest: