  1. Chapter 2 Program Performance – Part 2

  2. Step Counts • Instead of accounting for the time spent on chosen operations, the step-count method accounts for the time spent in all parts of the program/function • Program step: loosely defined as a syntactically or semantically meaningful segment of a program whose execution time is independent of the instance characteristics • Examples of single steps: return a + b * c / (a - b) * 4; and x = y;

  3. Use a global variable to count program steps • For the array-summing function (see the sketch below), count = 2n + 3
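
A minimal sketch of the function being counted (the slide's code is an image in the original; names and style are assumed, in the spirit of the chapter's other examples):

    int count = 0;  // global step counter

    template <class T>
    T Sum(T a[], int n)
    {
        T tsum = 0;
        count++;                       // one step for tsum = 0
        for (int i = 0; i < n; i++) {
            count++;                   // one step per loop-condition test
            tsum += a[i];
            count++;                   // one step per addition/assignment
        }
        count++;                       // one step for the final, failing loop test
        count++;                       // one step for the return
        return tsum;
    }

Each loop iteration contributes 2 steps (2n in all), and the initialization, final loop test, and return contribute 3 more: count = 2n + 3.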

  4. Counting steps in a recursive function • tRsum(0) = 2 • tRsum(n) = 2 + tRsum(n−1), n > 0 • Expanding: tRsum(n) = 2 + 2 + tRsum(n−2) = … = 2(n + 1), n ≥ 0
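
A sketch of the recursive version being counted (again reconstructed, since the slide's code is an image):

    template <class T>
    T Rsum(T a[], int n)
    {
        // each call costs 2 steps: the if-test plus one return
        if (n > 0)
            return Rsum(a, n - 1) + a[n - 1];
        return 0;
    }

There are n + 1 calls in all, each contributing 2 steps, which gives tRsum(n) = 2(n + 1).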

  5. Matrix Addition
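
The slide's code is an image in the original; a minimal sketch of the addition being analyzed (names assumed):

    template <class T>
    void Add(T** a, T** b, T** c, int rows, int cols)
    {
        for (int i = 0; i < rows; i++)
            for (int j = 0; j < cols; j++)
                c[i][j] = a[i][j] + b[i][j];  // one step per element
    }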

  6. Count steps in Matrix Addition • The outer for executes rows + 1 times, the inner for executes rows·(cols + 1) times, and the assignment executes rows·cols times • count = (rows + 1) + rows·(cols + 1) + rows·cols = 2·rows·cols + 2·rows + 1

  7. Using a Step Table: Sum
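
The table on this slide is an image; a reconstruction for Sum, where s/e is the number of steps per execution of the statement:

    Statement                        s/e   Frequency   Total steps
    T tsum = 0;                      1     1           1
    for (int i = 0; i < n; i++)      1     n + 1       n + 1
        tsum += a[i];                1     n           n
    return tsum;                     1     1           1
    Total                                              2n + 3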

  8. Rsum

  9. Matrix Addition

  10. Matrix Transpose
  #include <algorithm>  // for std::swap

  template <class T>
  void transpose(T** a, int rows)
  {
      for (int i = 0; i < rows; i++)
          for (int j = i + 1; j < rows; j++)
              std::swap(a[i][j], a[j][i]);  // swap across the main diagonal
  }

  11. Matrix Transpose
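
The analysis on this slide is an image in the original; the count follows from the loop bounds: for each i, the inner loop body runs rows − 1 − i times, so swap executes Σ_{i=0}^{rows−1} (rows − 1 − i) = rows·(rows − 1)/2 times, and the step count grows as the square of rows.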

  12. Inefficient way to compute the prefix sums: b[j] = Sum(a, j + 1) for j = 0, 1, …, n−1 • Note: the number of s/e for Sum() varies depending on its parameters
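
A sketch of this inefficient version (the slide's code is an image; Sum is the 2n + 3-step function from earlier, and the names are assumed):

    template <class T>
    void PrefixSums(T a[], T b[], int n)
    {
        for (int j = 0; j < n; j++)
            b[j] = Sum(a, j + 1);  // recomputes each partial sum from scratch
    }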

  13. Steps Per Execution • Sum(a, n) requires 2n + 3 steps • Sum(a, j + 1) requires 2(j + 1) + 3 = 2j + 5 steps • The assignment b[j] = Sum(a, j + 1) adds one step, giving 2j + 6 steps • Total: Σ_{j=0}^{n−1} (2j + 6) = n(n − 1) + 6n = n(n + 5)

  14. Prefix sums

  15. Sequential Search - Best case

  16. Sequential Search - Worst case
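
Slides 15 and 16 show the code and counts as images; a minimal sketch of the search being analyzed (name assumed):

    // return the position of x in a[0..n-1], or -1 if x is absent
    template <class T>
    int SeqSearch(T a[], int n, const T& x)
    {
        int i;
        for (i = 0; i < n && a[i] != x; i++)
            ;  // scan until x is found or the array is exhausted
        return (i == n) ? -1 : i;
    }

Best case: x == a[0], so the step count is constant. Worst case: x is a[n−1] or absent, so the loop runs n times and the count grows linearly in n.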

  17. Average for successful searches • x has equal probability of being any one of the n elements of a • The average follows from the step count when x is a[j], summed over all j (see the note after the next slide)

  18. Average for successful searches
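
The counts on slides 17 and 18 are images. Assuming the convention that finding x at position j costs a number of steps linear in j, say c₁·j + c₂, the average over the n equally likely positions is (1/n) Σ_{j=0}^{n−1} (c₁·j + c₂) = c₁·(n − 1)/2 + c₂, so on average a successful search examines about half the array.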

  19. Insertion in a Sorted Array – Best Case

  20. Insertion – Worst Case
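
Slides 19 and 20 show the insertion code and counts as images; a minimal sketch (name assumed):

    // insert x into the sorted array a[0..n-1]; a must have room for n + 1 elements
    template <class T>
    void Insert(T a[], int& n, const T& x)
    {
        int i;
        for (i = n - 1; i >= 0 && x < a[i]; i--)
            a[i + 1] = a[i];  // shift larger elements one slot right
        a[i + 1] = x;
        n++;
    }

Best case: x belongs at the end (position n), so nothing shifts. Worst case: x belongs at position 0, so all n elements shift.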

  21. Insertion - Average • The step count for inserting into position j is 2n − 2j + 3 • Averaging over the n + 1 equally likely positions: (1/(n + 1)) Σ_{j=0}^{n} (2n − 2j + 3) = (n + 1)(n + 3)/(n + 1) = n + 3

  22. Asymptotic Notation • Objectives of performance evaluation: • Compare the time complexities of two programs that perform the same function • Predict the growth in run time as the instance characteristics change • Neither the operation-count method nor the step-count method is accurate for these objectives • Op count: counts some operations and ignores others • Step count: the definition of a step is inexact

  23. Asymptotic Notation • Consider two programs: • Program A with complexity c₁n² + c₂n • Program B with complexity c₃n • Program B is faster than program A for sufficiently large values of n • For small values of n, either could be faster, and it may not matter anyway • There is a break-even point for n beyond which B is always faster than A

  24. Asymptotic Notation • Describes the behavior of the space and time complexities of programs for LARGE instance characteristics • Establishes a relative order among functions by comparing their relative rates of growth • Allows us to make meaningful, though inexact, statements about the complexity of programs

  25. Mathematical background • T(n) denotes the time or space complexity of a program • Big-Oh: the growth rate of T(n) is ≤ that of f(n) • T(n) = O(f(n)) iff constants c and n₀ exist such that T(n) ≤ c·f(n) whenever n ≥ n₀ • f is an upper-bound function for T • Example: "algorithm A is O(n²)" means that, for data sets big enough (n ≥ n₀), algorithm A executes fewer than c·n² steps (c a positive constant)

  26. The Idea • Example: 1000n is larger than n² for small values of n, but n² grows at a faster rate, so n² eventually becomes the larger function • Here T(n) = 1000n, f(n) = n², n₀ = 1000, and c = 1 • T(n) ≤ c·f(n) for n ≥ n₀ • Thus we say that 1000n = O(n²) • Note that we can get a tighter upper bound

  27. Example • Suppose T(n) = 10n² + 4n + 2 • For n ≥ 2, T(n) ≤ 10n² + 5n • For n ≥ 5, T(n) ≤ 11n² • So T(n) = O(n²)

  28. Big-Oh Ratio Theorem • T(n) = O(f(n)) iff lim_{n→∞} T(n)/f(n) ≤ c for some finite constant c • f(n) dominates T(n)

  29. Examples • Suppose T(n) = 10n² + 4n + 2 • T(n)/n² = 10 + 4/n + 2/n² • lim_{n→∞} T(n)/n² = 10, a finite constant • So T(n) = O(n²)

  30. Common Orders of Magnitude

  Function    Name
  1           Constant
  log n       Logarithmic
  log² n      Log-squared
  n log n
  n²          Quadratic
  n³          Cubic
  2ⁿ          Exponential
  n!          Factorial

  31. Loose Bounds • Suppose T(n) = 10n² + 4n + 2 • 10n² + 4n + 2 ≤ 11n³ for n ≥ 2, so T(n) = O(n³) • This bound is loose; we need the smallest upper bound, O(n²)

  32. Polynomials • If T(n) = aₘnᵐ + … + a₁n + a₀, then T(n) = O(nᵐ)

  33. Omega Notation – Lower Bound • Omega: T(n) = Ω(g(n)) iff constants c and n₀ exist such that T(n) ≥ c·g(n) for all n ≥ n₀ • Establishes a lower bound • E.g., T(n) = c₁n² + c₂n (c₁, c₂ > 0) • c₁n² + c₂n ≥ c₁n² for all n ≥ 1 • So T(n) ≥ c₁·n² for all n ≥ 1 • T(n) is Ω(n²) • Note: T(n) is also Ω(n) and Ω(1); we need the largest lower bound

  34. Omega Ratio Theorem • T(n) = Ω(f(n)) iff lim_{n→∞} f(n)/T(n) ≤ c for some finite constant c

  35. Lower Bound of Polynomials • If T(n) = aₘnᵐ + … + a₁n + a₀ (aₘ > 0), then T(n) = Ω(nᵐ) • T(n) = n⁴ + 3500n³ + 400n² + 1 • T(n) is Ω(n⁴)

  36. Theta Notation • Theta: when O and Ω meet, we indicate that with Θ notation • Definition: T(n) = Θ(h(n)) iff constants c₁, c₂ and n₀ exist such that c₁·h(n) ≤ T(n) ≤ c₂·h(n) for all n ≥ n₀ • Equivalently, T(n) = Θ(h(n)) iff T(n) = O(h(n)) and T(n) = Ω(h(n)) • E.g., T(n) = 3n + 8: 3n ≤ 3n + 8 ≤ 11n for n ≥ 1, so T(n) = Θ(n) • T(n) = 20·log₂n + 8 = Θ(log₂n): log₂n ≤ 20·log₂n + 8 ≤ 21·log₂n for all n ≥ 256

  37. Theta Notation (cont.) • T(n) = 1000n • T(n) = O(n²), but T(n) ≠ Θ(n²) because T(n) ≠ Ω(n²)

  38. Theta of Polynomials • If T(n) = aₘnᵐ + … + a₁n + a₀ (aₘ > 0), then T(n) = Θ(nᵐ)

  39. Little-o Notation • Little-oh: the growth rate of T(n) is strictly < that of p(n) • T(n) = o(p(n)) iff T(n) = O(p(n)) and T(n) ≠ Ω(p(n)) • T(n) = 1000n: T(n) = o(n²), since 1000n = O(n²) but 1000n ≠ Ω(n²)

  40. Simplifying Rules • If f(n) is O(g(n)) and g(n) is O(h(n)), then f(n) is O(h(n)) • If f(n) is O(k·g(n)) for any constant k > 0, then f(n) is O(g(n)) • If f₁(n) = O(g₁(n)) and f₂(n) = O(g₂(n)), then (a) (f₁ + f₂)(n) = O(max(g₁(n), g₂(n))) and (b) f₁(n) · f₂(n) = O(g₁(n) · g₂(n))

  41. Some Points • DO NOT include constants or low-order terms inside a Big-Oh • For example, T(n) = O(2n²) and T(n) = O(n² + n) are both the same as T(n) = O(n²)

  42. Examples • Example 1: a = b; This assignment takes constant time, so it is Θ(1) • Example 2: sum = 0; for (i = 0; i <= n; i++) sum += n; The time complexity is Θ(n)

  43. Examples (cont.)
  a = 0;
  for (i = 1; i <= n; i++)
      for (j = 1; j <= n; j++)
          a++;
  • Time complexity is Θ(n²)

  44. Examples (cont.)
  a = 0;
  for (i = 1; i <= n; i++)
      for (j = 1; j <= i; j++)
          a++;
  • The a++ statement executes Σ_{i=1}^{n} i = n(n + 1)/2 times • Time complexity is Θ(n²)

  45. Examples (cont.)
  a = 0;                        // Θ(1)
  for (i = 1; i <= n; i++)
      for (j = 1; j <= i; j++)
          a++;                  // Θ(n²)
  for (k = 1; k <= n; k++)
      A[k] = k - 1;             // Θ(n)
  • Time complexity is Θ(1) + Θ(n²) + Θ(n) = Θ(n²)

  46. Examples (cont.) • Not all doubly nested loops execute n² times
  a = 0;
  for (i = 1; i <= n; i++)
      for (j = 1; j <= n; j *= 2)
          a++;
  • The inner loop executes Θ(log₂n) times, the outer loop n times • Time complexity is Θ(n log₂n)

  47. First determine the asymptotic complexity of each statement, then add them up

  48. Asymptotic complexity of Rsum
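
Since each of the n + 1 recursive calls does a constant amount of work, Rsum is Θ(n).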

  49. Asymptotic complexity of Matrix Addition
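
The step count 2·rows·cols + 2·rows + 1 derived earlier is dominated by the rows·cols term, so matrix addition is Θ(rows·cols).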

  50. Asymptotic complexity of Transpose
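
From the count after slide 11, swap executes rows·(rows − 1)/2 times, so transpose is Θ(rows²).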
