Asymptotic Analysis Chapter 3
Given several algorithms for a given problem … • How do we decide which is “the best”? • What do we even mean by “the best”? • Do we mean the “most efficient”?
What do we mean by “the most efficient”? Time? Space? Both?
How to determine execution time? • Empirically • Theoretically • How does execution time change as the size of the problem grows very large (or approaches a limit, in the calculus sense)? • What is the Cost Function for an algorithm? • What is the Growth Rate of an algorithm?
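As a minimal sketch of the empirical approach (the workload method work(n) is a placeholder, not from the slides), time the algorithm at several input sizes and watch how the measurements grow:

public class TimeIt {
    // Placeholder workload standing in for the algorithm under test.
    static long work(int n) {
        long sum = 0;
        for (int i = 0; i < n; i++)
            sum += i;
        return sum;
    }

    public static void main(String[] args) {
        for (int n : new int[] {1000, 10000, 100000, 1000000}) {
            long start = System.nanoTime();      // wall-clock timestamp, ns
            work(n);                             // run the algorithm once
            long elapsed = System.nanoTime() - start;
            System.out.println("n = " + n + ": " + elapsed + " ns");
        }
    }
}

Doubling n and checking whether the measured time roughly doubles or quadruples hints at the growth rate; the theoretical definitions below make that idea precise.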
Upper Bound or big-Oh Notation: O( f(n) ) For T(n) a non-negatively valued function, T(n) is in set O( f(n) ) if there exist two positive constants c and n0 such that T(n) ≤ c f(n) for all n > n0
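For example, T(n) = 3n + 2 is in O(n): choose c = 5 and n0 = 1; then 3n + 2 ≤ 5n for all n > 1, since 2 ≤ 2n.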
Lower Bound or big-Omega Notation:Ω( f(n) ) For T(n) a non-negatively valued function, T(n) is in set Ω( f(n) ) if there exist two positive constants c and n0 such that T(n) ≥ c f(n) for all n > n0
big-Theta Notation: Θ( f(n) ) For T(n) a non-negatively valued function, T(n) is Θ( f(n) ) if T(n) is in set Ω( f(n) ) and T(n) is in set O( f(n) )
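Continuing the example, T(n) = 3n + 2 is also in Ω(n): choose c = 3 and n0 = 1, since 3n + 2 ≥ 3n for all n > 1. Being in both O(n) and Ω(n), T(n) is Θ(n).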
Apply these definitions to a sample cost function T(n) and candidate functions f(n) by graphing them:
From the graphs, the lower bounds: T(n) ∊ Ω(1), T(n) ∊ Ω(log₂ n), T(n) ∊ Ω(n), and T(n) ∊ Ω(n²). And the upper bounds: T(n) ∊ O(n³) and T(n) ∊ O(n²).
Conclusion: Since T(n) ∊ Ω(n²) and T(n) ∊ O(n²), we say that T(n) is Θ(n²). Remember that Θ requires both the upper and the lower bound.
So, let’s analyze some code segments to come up with their cost functions and growth rates.
Simplification rules (p. 67)
1. If f(n) is Θ(g(n)) and g(n) is Θ(h(n)), then f(n) is Θ(h(n)).
2. If f(n) is Θ(k g(n)) for any constant k > 0, then f(n) is Θ(g(n)).
3. If f1(n) is Θ(g1(n)) and f2(n) is Θ(g2(n)), then f1(n) + f2(n) is Θ(max(g1(n), g2(n))).
4. If f1(n) is Θ(g1(n)) and f2(n) is Θ(g2(n)), then f1(n) f2(n) is Θ(g1(n) g2(n)).
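For example, take T(n) = 3n² + 5n log n + 7. By rule 2, 3n² is Θ(n²), 5n log n is Θ(n log n), and 7 is Θ(1); applying rule 3 twice, the sum is Θ(max(n², n log n, 1)) = Θ(n²).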
Example 3.9, p. 69 a = b; Θ(1)
Example 3.10, p. 69
sum = 0;
for (i=1; i<=n; i++)
    sum += n;
Θ(n)
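The loop executes its Θ(1) body exactly n times, so by rule 4 the cost is n · Θ(1) = Θ(n).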
Example 3.12, p. 70
sum1 = 0;
for (i=1; i<=n; i++)       // do n times
    for (j=1; j<=n; j++)   // do n times
        sum1++;
Θ(n²)

sum2 = 0;
for (i=1; i<=n; i++)       // do n times
    for (j=1; j<=i; j++)   // do i times
        sum2++;
Θ(n²)
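The second fragment is also Θ(n²) even though its inner loop does less work: the body executes 1 + 2 + … + n = n(n+1)/2 times, and n(n+1)/2 = n²/2 + n/2 is Θ(n²) by rules 2 and 3.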
Example 3.11, p. 69
sum = 0;
for (i=1; i<=n; i++)       // do n times
    for (j=1; j<=i; j++)   // do i times
        sum++;
for (k=0; k<n; k++)        // do n times
    A[k] = k;
Θ(n²)
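By rule 3, the Θ(n²) double loop dominates the Θ(n) single loop that follows it, so the whole fragment is Θ(n²).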
Example 3.13, p. 70
sum1 = 0;
for (k=1; k<=n; k*=2)      // do log n times
    for (j=1; j<=n; j++)   // do n times
        sum1++;
Θ(n log n)

sum2 = 0;
for (k=1; k<=n; k*=2)      // do log n times
    for (j=1; j<=k; j++)   // do k times
        sum2++;
Θ(n)
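The second fragment is only Θ(n) because the inner loop’s cost doubles on each pass: with k taking the values 1, 2, 4, …, 2^⌊log n⌋, the body executes 1 + 2 + 4 + … + 2^⌊log n⌋ = 2^(⌊log n⌋+1) − 1 < 2n times, which is Θ(n).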
Sequential Search for value k (unsorted array)
/** @return Position of value k in array A if found, A.length otherwise */
static int find(int[] A, int k) {
    for (int i=0; i<A.length; i++)   // For each element
        if (A[i] == k)               //   if A[i] is k
            return i;                //     found it, so return i
    return A.length;                 // didn't find it, so return length
}
Are the best, worst, and average cases different? Yes!
Best: Θ(1)   Worst: Θ(n)   Average: Θ(n)
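For the average case: if k is equally likely to be in any of the n positions, the expected number of comparisons is (1 + 2 + … + n)/n = (n+1)/2, which is still Θ(n).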
Largest Value Sequential Search (unsorted array)
/** @return Position of largest value in array A */
static int largest(int[] A) {
    int currlarge = 0;               // Holds largest element position
    for (int i=1; i<A.length; i++)   // For each element
        if (A[currlarge] < A[i])     //   if A[i] is larger
            currlarge = i;           //     remember its position
    return currlarge;                // Return largest position
}
Are the best, worst, and average cases different? No!
Best: Θ(n)   Worst: Θ(n)   Average: Θ(n)
Binary Search (sorted array): Θ(log n)
/** @return The position of an element in sorted array A with value k.
    If k is not in A, return A.length. */
static int binary(int[] A, int k) {
    int l = -1;
    int r = A.length;                // l and r are beyond array bounds
    while (l+1 != r) {               // Stop when l and r meet
        int i = (l+r)/2;             // Calc middle of remaining subarray
        if (k < A[i]) r = i;         // In left half, move r down
        if (k == A[i]) return i;     // Found it
        if (k > A[i]) l = i;         // In right half, move l up
    }
    return A.length;                 // Search value not in A
}
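Why Θ(log n): each pass of the while loop does Θ(1) work and halves the remaining subarray, giving the recurrence T(n) = T(n/2) + Θ(1), which solves to Θ(log n).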
Largest Value Search (sorted array) Constant growth Θ(1)
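A minimal sketch, assuming the array is sorted in ascending order (the slide gives no code for this case): the largest value sits at the end, so its position can be returned without any loop.

/** @return Position of largest value in sorted (ascending) array A.
    Assumes A is non-empty. */
static int largestSorted(int[] A) {
    return A.length - 1;   // last element is the largest: Θ(1)
}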