
Intro to Analysis of Algorithms


Presentation Transcript


  1. Intro to Analysis of Algorithms

  2. Algorithm • “A sequence of unambiguous instructions for solving a problem, i.e., for obtaining a required output for any legitimate input in a finite amount of time.” • Named for Al Khwarizmi, who laid out basic procedures for arithmetic functions. (Read about him!)

  3. Analysis of Algorithms • Correctness • Generality • Optimality • Simplicity • Time Efficiency • Space Efficiency

  4. Measuring Efficiency • What is the basic unit for measuring input size? (n) • What is the basic unit of resource? • Time: count of a basic operation • Space: memory units used • Best, worst, or average case? • Find its efficiency class

  5. Why do we care? • Let’s look at Fibonacci numbers: 1, 1, 2, 3, 5, 8, 13, …

  6. Fibonacci Sequence • We want to compute the nth number in the sequence. (F3 = 2, for example.)

  7. This definition can be translated directly into code – a recursive method. • How many additions does it take to compute Fn?

  8. Which is better?

     function fib1(n)
       if n = 0: return 0
       if n = 1: return 1
       return fib1(n-1) + fib1(n-2)

     function fib2(n)
       create an array f[0...n]
       f[0] = 0, f[1] = 1
       for i = 2...n:
         f[i] = f[i-1] + f[i-2]
       return f[n]
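
A direct Python transcription of the two routines (a sketch based on the pseudocode above; variable names are mine):

     def fib1(n):
         # Naive recursion: recomputes the same subproblems over and over.
         if n == 0:
             return 0
         if n == 1:
             return 1
         return fib1(n - 1) + fib1(n - 2)

     def fib2(n):
         # Bottom-up: fill an array f[0..n] once, about n additions.
         if n == 0:
             return 0
         f = [0] * (n + 1)
         f[1] = 1
         for i in range(2, n + 1):
             f[i] = f[i - 1] + f[i - 2]
         return f[n]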

  9. Consider calculating F200. The fib1 method takes over 2^138 steps. • Computers can do several billion instructions per second. • Suppose we have a supercomputer that does 40 trillion instructions per second.

  10. Consider calculating F200. The fib1 method takes over 2^138 steps. • Computers can do several billion instructions per second. • Suppose we have a supercomputer that does 40 trillion instructions per second. • Even on this machine, fib1(200) takes at least 2^92 seconds, or 10^18 centuries, long after the expected end of our sun!!!

  11. Consider calculating F200. The fib1 method takes over 2^138 steps. • Computers can do several billion instructions per second. • Suppose we have a supercomputer that does 40 trillion instructions per second. • Even on this machine, fib1(200) takes at least 2^92 seconds, or 10^18 centuries, long after the expected end of our sun!!! • But, fib2(200) would take less than a billionth of a second to compute!!!
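
A quick back-of-the-envelope check of those numbers in Python (the 40-trillion-instructions-per-second rate is the slide's assumption):

     import math

     # Compute F200 iteratively.
     a, b = 0, 1
     for _ in range(200):
         a, b = b, a + b
     F200 = a

     print(math.log2(F200))                    # ~137.8, so fib1 needs at least about 2^138 steps
     seconds = F200 / 40e12                    # steps / (instructions per second)
     centuries = seconds / (100 * 365.25 * 24 * 3600)
     print(f"{seconds:.1e} s = {centuries:.1e} centuries")   # roughly 7e27 s, about 2e18 centuries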

  12. 10^6 instructions/sec – runtimes. [The slide's table of running times for the common efficiency classes is not reproduced in the transcript.]
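
A comparable table can be generated with a short script; this is a sketch (the choice of functions and input sizes is mine, the 10^6 instructions/sec rate is the slide's):

     import math

     SPEED = 10**6  # instructions per second, as on the slide
     growth = {
         "lg n":   lambda n: math.log2(n),
         "n":      lambda n: n,
         "n lg n": lambda n: n * math.log2(n),
         "n^2":    lambda n: n**2,
         "n^3":    lambda n: n**3,
         "2^n":    lambda n: 2**n,
     }
     for n in (10, 20, 50, 100):
         cells = ", ".join(f"{name}: {f(n) / SPEED:.2e} s" for name, f in growth.items())
         print(f"n = {n}:  {cells}")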

  13. Some helpful mathematics • 1 + 2 + 3 + 4 + … + N = N(N+1)/2 = N^2/2 + N/2, which is O(N^2) • N + N + N + … + N (a total of N times) = N·N = N^2, which is O(N^2) • 1 + 2 + 4 + … + 2^N = 2^(N+1) – 1 = 2·2^N – 1, which is O(2^N)
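
A quick check of the three closed forms for a few small values of N (illustrative only, not from the slides):

     # Verify the closed forms numerically.
     for N in (1, 5, 10, 16):
         assert sum(range(1, N + 1)) == N * (N + 1) // 2           # 1 + 2 + ... + N
         assert sum(N for _ in range(N)) == N * N                  # N added N times
         assert sum(2**i for i in range(N + 1)) == 2**(N + 1) - 1  # 1 + 2 + 4 + ... + 2^N
     print("all three identities hold")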

  14. Basics of Efficiency • Big-oh – an upper bound on the efficiency class • Efficiency classes don't worry about constants • A cubic is worse than a quadratic, a quartic worse than a cubic… • Getting a big-oh analysis of non-recursive code is pretty easy

  15. What is the input size? • What is the unit of time? • What is the big-oh analysis?

     Maximum(A[1..n])
       max <- A[1]
       for i <- 1 to n
         if A[i] > max
           max <- A[i]
       return max

  16. 1 assignment
      n times: 1 compare, maybe 1 assignment, 1 addition to i
      1 return
      ______________________
      1 + n(3) + 1 = 3n + 2 operations

     Maximum(A[1..n])
       max <- A[1]
       for i <- 1 to n
         if A[i] > max
           max <- A[i]
       return max
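
The same scan in runnable Python (0-indexed, as Python lists are); it does a constant amount of work per element, so the operation count stays linear in n:

     def maximum(a):
         # One pass, keeping the largest value seen so far: O(n).
         max_val = a[0]
         for i in range(1, len(a)):
             if a[i] > max_val:
                 max_val = a[i]
         return max_val

     print(maximum([3, 1, 4, 1, 5, 9, 2, 6]))  # 9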

  17. The algorithm is O(3n+2), which is O(n). • We only care about the efficiency class. Why?

  18. The algorithm is O(3n+2), which is O(n). • We only care about the efficiency class. Why? • At some point, every parabola (n^2) overtakes any line (n). We only really care about large input.

  19. Efficiency classes • So we really just care about the leading term, which determines the shape of the graph. • This means that for non-recursive algorithms, what matters is the loops.

  20. Analyze this…

     AllUnique(A[1..n])
       for i <- 1 to n
         for j <- 1 to n
           if A[i] = A[j] and i ≠ j
             return false
       return true

     Best case? Worst case? Average case?

  21. Analyze this…

     AllUnique(A[1..n])
       for i <- 1 to n
         for j <- 1 to n
           if A[i] = A[j] and i ≠ j
             return false
       return true

     Average case? Quit halfway through: O(n·n/2), which is still O(n^2). Often, average case = worst case.

  22. AllUnique(A[1..n])
        for i <- 1 to n
          for j <- i+1 to n
            if A[i] = A[j]
              return false
        return true

  23. AllUnique(A[1..n])
        for i <- 1 to n
          for j <- i+1 to n
            if A[i] = A[j]
              return false
        return true

      Comparisons: (n-1) + (n-2) + … + 2 + 1 = n(n-1)/2, which is still O(n^2)
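
Both versions in runnable Python (a sketch; the function names are mine):

     def all_unique_v1(a):
         # Compare every ordered pair: n*n comparisons, O(n^2).
         n = len(a)
         for i in range(n):
             for j in range(n):
                 if a[i] == a[j] and i != j:
                     return False
         return True

     def all_unique_v2(a):
         # Compare each unordered pair once: n(n-1)/2 comparisons, still O(n^2).
         n = len(a)
         for i in range(n):
             for j in range(i + 1, n):
                 if a[i] == a[j]:
                     return False
         return True

     print(all_unique_v1([1, 2, 3]), all_unique_v2([1, 2, 1]))  # True False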

  24. MatrixMultiply(A[n×n], B[n×n])
        Initialize empty C[n×n]
        for i <- 1 to n
          for j <- 1 to n
            C[i,j] <- 0
            for k <- 1 to n
              C[i,j] <- C[i,j] + A[i,k]*B[k,j]
        return C
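
In Python: three nested loops over n give about n^3 multiplications, i.e., O(n^3) (a sketch using plain lists of lists):

     def matrix_multiply(A, B):
         # Classic triple loop: C[i][j] = sum over k of A[i][k] * B[k][j].
         n = len(A)
         C = [[0] * n for _ in range(n)]
         for i in range(n):
             for j in range(n):
                 for k in range(n):
                     C[i][j] += A[i][k] * B[k][j]
         return C

     print(matrix_multiply([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]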

  25. MethodA(n)
        answer <- 1
        for i <- 1 to n/2:
          answer <- answer + i
        for i <- 1 to n:
          answer <- answer + 1
        return answer

      MethodB(n)
        answer <- 1
        for i <- 1 to lg n:
          answer <- answer + MethodA(n)
        return answer
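
A Python transcription with the complexities noted (MethodA runs two sequential loops, so O(n); MethodB calls it about lg n times, so O(n lg n)):

     import math

     def method_a(n):
         # n/2 iterations plus n iterations: roughly 1.5n steps, O(n).
         answer = 1
         for i in range(1, n // 2 + 1):
             answer += i
         for _ in range(n):
             answer += 1
         return answer

     def method_b(n):
         # About lg n iterations, each calling method_a(n): O(n lg n) total.
         answer = 1
         for _ in range(int(math.log2(n))):
             answer += method_a(n)
         return answer

     print(method_a(8), method_b(8))  # 19 58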

  26. Big-oh Definition • Let f(n) and g(n) be functions from positive integers to positive reals. We say f = O(g) if there exists some constant c > 0 such that f(n) ≤ cg(n) for all n.

  27. Big-oh Definition • Let f(n) and g(n) be functions from positive integers to positive reals. We say f = O(g) if there exists some constant c > 0 such that f(n) ≤ cg(n) for all n. Say what?

  28. Big-O • f(n) = O(g(n)): think f(n) ≤ g(n), asymptotically and up to a constant factor. • n = O(n^2) because I can find a constant to multiply the parabola by so that it lies completely above the line. I can't do that in reverse.
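
A small numeric illustration of that claim (the constants are illustrative):

     # n = O(n^2): with c = 1, n <= c * n^2 for every n >= 1.
     assert all(n <= 1 * n * n for n in range(1, 1001))

     # n^2 is not O(n): for any fixed c, n^2 > c * n as soon as n > c.
     c = 1_000_000
     n = c + 1
     assert n * n > c * n
     print("the line stays below the parabola; the reverse fails")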

  29. Know the shapes: exponential, quadratic, linear, logarithmic, constant. [Graph comparing the growth curves.]

  30. Theta Notation • Theta = tight bound • Multiply the function by two constants: one multiple stays (asymptotically) above, the other stays below. • What it means: if you have a parabola, you can always find parabolas that stay above and below the given parabola. Same for lines, and other curves. It just tells us the efficiency category.

  31. Practice – True or False? • ½n(n-1) = Ө(n) • if f(n) = Ө(g(n)), then g(n) = Ө(f(n)) • -n – 100000 = Ө(n)

  32. O - notation • f = O(g) if f grows slower than or the same as g • That is, asymptotically, g is not below f • g is an upper bound • Think: O is “≤”

  33. Practice – True or False? • n = O(n^2) • n^3 = O(n^2) • .00000001n^3 = O(n^2) • 100n + 5 = O(n^2) • ½ n(n-1) = O(n^2) • n^4 + n + 1 = O(n^2)

  34. Ω-Notation • The opposite of big-oh • f = Ω(g) means f is always above or equal to g (asymptotically). • It gives an asymptotic lower bound. • Think: Ω is "≥"

  35. Practice – True or False? • n^3 = Ω(n^2) • ½ n(n-1) = Ω(n^2) • 100n + 5 = Ω(n^2)

  36. o and ω • f = O(g) means g is above or equal to f • f = o(g) means g is strictly above f – in other words, a better, tighter upper bound exists • ω is analogous for Ω – a tighter lower bound is available

  37. Function Growth Rates

  38. Polynomials are Easy – What about other functions? • constant (1): Very few examples of algorithms that don't grow with input size • logarithmic (lg n): Usually result of cutting input size by a constant factor each time through the loop • linear (n): Look at each input element a constant number of times • n lg n: Divide and conquer • quadratic (n^2): Two embedded loops • cubic (n^3): Three embedded loops • exponential (2^n): Generate all subsets of the input elements • factorial (n!): Generate all permutations of the input • n^n: Generate all permutations of the input, allowing repetitions

  39. Other standard functions • Polylog – log^3 n means (log n)^3 • Any log grows slower than any polynomial: log^a n = o(n^b), for any a, b > 0 • Any polynomial grows slower than any exponential with base c > 1: n^b = o(c^n) for any c > 1 • n! = o(n^n) • n! = ω(2^n) • lg(n!) = Ө(n lg n)
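
A quick numeric look at the last fact: math.lgamma(n + 1) returns ln(n!), so dividing by ln 2 gives lg(n!) (illustrative only):

     import math

     for n in (10, 100, 10**4, 10**6):
         lg_fact = math.lgamma(n + 1) / math.log(2)   # lg(n!)
         ratio = lg_fact / (n * math.log2(n))
         print(n, round(ratio, 3))   # the ratio creeps toward 1, consistent with lg(n!) = Ө(n lg n)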

  40. Relative Growths – O, Ω, or Ө?
      lg^2 n vs 2n
      n^4 + 3n·2^n vs √n
      n^(2/3) vs 4^n
      4^(n/2) vs n + 2^n·3^n
      100n + lg n vs n + (lg n)^2
      lg(n^2) vs lg(n^3)
