i206 Lecture 7: Analysis of Algorithms, continued. Marti Hearst, Spring 2012
Algorithms
• A clearly defined procedure for solving a problem.
• A clearly described procedure is made up of simple, clearly defined steps.
• The procedure can be complex, but the steps must be simple and unambiguous.
Problem Solving Strategies
• Try to work the problem backwards
  • Reverse-engineer
  • Once you know it can be done, it is much easier to do
  • What are some examples?
• Stepwise refinement
  • Break the problem into several sub-problems
  • Solve each sub-problem separately
  • Produces a modular structure
• Look for a related problem that has been solved before
  • Software design patterns
Example of Stepwise Refinement: Spider-Man
• Peter Parker’s goal: Make Mary Jane fall in love with him
  • To accomplish this goal: Get a cool car
    • To accomplish this goal: Get $3000
      • To accomplish this goal: Appear in a wrestling match …
• Each goal requires completing just one subgoal
Example of Stepwise Refinement: Star Wars Episode IV
• Luke Skywalker’s goal: Make Princess Leia fall in love with him (they weren’t siblings yet)
• To accomplish this goal: Rescue her from the Death Star
• To accomplish that goal:
  1. Land on the Death Star
  2. Remove her from her prison cell
  3. Disable the tractor beam
  4. Get her onto the Millennium Falcon
• To accomplish subgoal (2):
  • Obtain Stormtrooper uniforms
  • Have the Wookiee pose as an arrested wild animal
  • Find the location of the cell
  • Have R2D2 communicate the coordinates
• To accomplish subgoal (3):
  • Have the last living Jedi walk across the catwalks
• To accomplish subgoal (4):
  • Run down the hall
  • Survive in the garbage chute
  • Fight the garbage monster
  • Have R2D2 stop the compaction …
“Divide and Conquer” Algorithms
• Break the problem into smaller pieces
• Recursively solve the sub-problems
• Combine the results into a full solution
• If structured properly, this can lead to a much faster solution.
“Divide and Conquer” Algorithms
• Problem: Cut up the licorice into 64 pieces!
• How do we do this in:
  • O(n)
  • O(log n)
  • O(sqrt n)
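One way to make the three bounds concrete is to count cuts under different cutting strategies. This is an illustrative sketch: the function names and the assumption that a single cut can pass through a whole stack of pieces are mine, not from the slides.

```python
import math

def cuts_one_at_a_time(n):
    """Slice one piece off the end per cut: n - 1 cuts, O(n)."""
    return n - 1

def cuts_halving_stack(n):
    """Stack all current pieces and cut the whole stack in half each
    round (one cut halves every piece): log2(n) cuts, O(log n).
    Assumes n is a power of two."""
    cuts, pieces = 0, 1
    while pieces < n:
        pieces *= 2   # one stacked cut doubles the piece count
        cuts += 1
    return cuts

def cuts_grid(n):
    """Cut into sqrt(n) strips (sqrt(n) - 1 cuts), then lay the strips
    side by side and make sqrt(n) - 1 cuts across all of them at once:
    2*(sqrt(n) - 1) cuts, O(sqrt n). Assumes n is a perfect square."""
    k = math.isqrt(n)
    return (k - 1) + (k - 1)

print(cuts_one_at_a_time(64))   # 63
print(cuts_halving_stack(64))   # 6
print(cuts_grid(64))            # 14
```

For n = 64 the three strategies cost 63, 6, and 14 cuts, matching the O(n), O(log n), and O(sqrt n) orderings.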
Santa’s Socks
• Problem:
  • The elves have packed all 1024 boxes with skates
  • But Santa’s dirty smelly socks fell into one of them
  • We have to find the box with the dirty socks! But it’s almost midnight!!!!
Santa’s Socks • http://www.youtube.com/watch?v=wVPCT1VjySA
Analysis of Algorithms • Characterizing the running times of algorithms and data structure operations • Secondarily, characterizing the space usage of algorithms and data structures
Algorithm Complexity
• Worst case complexity: the function defined by the maximum number of steps taken on any instance of size n
• Best case complexity: the function defined by the minimum number of steps taken on any instance of size n
• Average case complexity: the function defined by the average number of steps taken on any instance of size n
Adapted from http://www.cs.sunysb.edu/~algorith/lectures-good/node1.html
Best, Worst, and Average Case Complexity
[Figure: number of steps vs. input size N, with curves for worst case, average case, and best case complexity]
Adapted from http://www.cs.sunysb.edu/~algorith/lectures-good/node1.html
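As a concrete illustration of these three cases (my example, not from the slides), consider a linear scan that counts its comparisons: the best case finds the target immediately, the worst case scans all n elements, and the average sits between them.

```python
def linear_search_steps(arr, target):
    """Scan left to right; return (found, number of comparisons)."""
    steps = 0
    for x in arr:
        steps += 1
        if x == target:
            return True, steps
    return False, steps

arr = list(range(1, 11))               # n = 10
print(linear_search_steps(arr, 1))     # best case: (True, 1)
print(linear_search_steps(arr, 10))    # worst successful case: (True, 10)
print(linear_search_steps(arr, 99))    # miss, also n steps: (False, 10)
```

For a target equally likely to be in any position, the average successful search takes (n + 1)/2 comparisons.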
Monotonicity
• mon·o·tone: a succession of sounds or words uttered in a single tone of voice
• mon·o·ton·ic (mathematics): designating sequences whose successive members consistently increase or decrease but do not oscillate in relative value
• A function is monotonically increasing if whenever a > b then f(a) >= f(b).
• A function is strictly increasing if whenever a > b then f(a) > f(b).
• Decreasing is defined similarly.
• Nonmonotonic means neither monotonically increasing nor decreasing.
Monotonic and Nonmonotonic • https://en.wikipedia.org/wiki/Monotonic_function
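These definitions can be checked mechanically over a finite sequence; a minimal sketch (helper names are mine):

```python
def is_monotone_increasing(seq):
    """Non-strict: each element is >= the one before it."""
    return all(a <= b for a, b in zip(seq, seq[1:]))

def is_strictly_increasing(seq):
    """Strict: each element is > the one before it."""
    return all(a < b for a, b in zip(seq, seq[1:]))

print(is_monotone_increasing([1, 2, 2, 5]))  # True: never decreases
print(is_strictly_increasing([1, 2, 2, 5]))  # False: repeats a value
print(is_monotone_increasing([3, 1, 4]))     # False: oscillates
```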
The Seven Common Functions
• Constant: f(n) = c
  • E.g., adding two numbers, assigning a value to some variable, comparing two numbers, and other basic operations
• Useful free web tool: Wolfram|Alpha
The Seven Common Functions Linear: f(n) = n • Do a single basic operation for each of n elements, e.g., reading n elements into computer memory, comparing a number x to each element of an array of size n
The Seven Common Functions
• Quadratic: f(n) = n²
  • E.g., nested loops with inner and outer loops
Brookshear Figure 5.19
The Seven Common Functions
• Cubic: f(n) = n³
The Seven Common Functions
• More generally, all of the above are polynomial functions:
  • f(n) = a0 + a1n + a2n² + a3n³ + … + adnᵈ
  • where the ai’s are constants, called the coefficients of the polynomial, and d is its degree
Brookshear Figure 5.19
The Seven Common Functions
• Logarithmic: f(n) = log_b n
  • Intuition: the number of times we can divide n by b until we get a number less than or equal to 1
• The base is omitted for the case of b = 2, which is the most common in computing:
  • log n = log₂ n
Brookshear Figure 5.20
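The "divide until you reach 1" intuition can be sketched directly as a loop (a hypothetical helper, not course code):

```python
def log_floor(n, b=2):
    """Count how many times n can be divided by b before it is <= 1."""
    count = 0
    while n > 1:
        n //= b    # integer division, mirroring the intuition above
        count += 1
    return count

print(log_floor(1024))      # 10, since 2**10 == 1024
print(log_floor(1000, 10))  # 3, since 10**3 == 1000
```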
The Seven Common Functions n-log-n function: f(n) = n log n • Product of linear and logarithmic
Comparing Growth Rates
• 1 < log n < n^(1/2) < n < n log n < n² < n³ < bⁿ
http://www.cs.pomona.edu/~marshall/courses/2002/spring/cs50/BigO/
John Chuang
Polynomials Only the dominant terms of a polynomial matter in the long run. Lower-order terms fade to insignificance as the problem size increases. http://www.cs.pomona.edu/~marshall/courses/2002/spring/cs50/BigO/ John Chuang
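A quick numeric check of this claim, using a made-up polynomial f(n) = 3n² + 5n + 7: the ratio of f(n) to its dominant term 3n² approaches 1 as n grows.

```python
def f(n):
    return 3 * n**2 + 5 * n + 7   # hypothetical polynomial, not from the slides

for n in (10, 1000, 100000):
    ratio = f(n) / (3 * n**2)     # compare against the dominant term alone
    print(n, round(ratio, 6))
```

At n = 10 the lower-order terms still contribute noticeably; by n = 100000 the ratio is within a few hundred-thousandths of 1.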
Definition of Big-Oh
• A running time is O(g(n)) if there exist constants n0 > 0 and c > 0 such that for all problem sizes n > n0, the running time for a problem of size n is at most c·g(n).
• In other words, c·g(n) is an upper bound on the running time for sufficiently large n.
• Example: the function f(n) = 8n – 2 is O(n)
• The running time of algorithm arrayMax is O(n)
http://www.cs.dartmouth.edu/~farid/teaching/cs15/cs5/lectures/0519/0519.html
More formally
• Let f(n) and g(n) be functions mapping nonnegative integers to real numbers.
• f(n) is O(g(n)) if there exist positive constants n0 and c such that for all n >= n0, f(n) <= c·g(n)
• Other ways to say this:
  • f(n) is order g(n)
  • f(n) is big-Oh of g(n)
  • f(n) is Oh of g(n)
  • f(n) ∈ O(g(n)) (set notation)
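The example f(n) = 8n – 2 from the slides can be checked against this definition with witness constants c = 8 and n0 = 1 (constants chosen here for illustration; 8n – 2 <= 8n for all n >= 1):

```python
def f(n):
    return 8 * n - 2          # the example function from the slides

c, n0 = 8, 1                  # witness constants for the definition
assert all(f(n) <= c * n for n in range(n0, 10_000))
print("f(n) = 8n - 2 is O(n) with c = 8, n0 = 1")
```

Any finite check like this only samples the claim, of course; the inequality 8n – 2 <= 8n holds algebraically for every n >= 1.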
Comparing Running Times Adapted from Goodrich & Tamassia
Analysis Example: Phonebook
• Given:
  • A physical phone book, organized in alphabetical order
  • A name you want to look up
  • An algorithm in which you search through the book sequentially, from first page to last
• What is the order of:
  • The best case running time?
  • The worst case running time?
  • The average case running time?
• What is:
  • A better algorithm?
  • The worst case running time for this algorithm?
Analysis Example (Phonebook)
• This better algorithm is called Binary Search
• What is its running time?
  • First you look in the middle of n elements
  • Then you look in the middle of n/2 = (1/2)·n elements
  • Then you look in the middle of (1/2)·(1/2)·n elements
  • …
  • Continue until there is only 1 element left
• Say you did this m times: (1/2)·(1/2)·…·(1/2)·n
• Then the number of repetitions is the smallest integer m such that (1/2)^m · n <= 1
Analyzing Binary Search
• In the worst case, the number of repetitions is the smallest integer m such that (1/2)^m · n <= 1
• We can rewrite this as follows:
  • Multiply both sides by 2^m: n <= 2^m
  • Take the log of both sides: log₂ n <= m
• Since m is the worst case time, the algorithm is O(log n)
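A sketch of binary search with a probe counter, to see the O(log n) bound in action (this implementation is mine, not from the lecture):

```python
def binary_search(arr, target):
    """Search a sorted list; return (index or -1, number of probes)."""
    lo, hi = 0, len(arr) - 1
    probes = 0
    while lo <= hi:
        mid = (lo + hi) // 2
        probes += 1                 # one "look in the middle"
        if arr[mid] == target:
            return mid, probes
        elif arr[mid] < target:
            lo = mid + 1            # discard the left half
        else:
            hi = mid - 1            # discard the right half
    return -1, probes

arr = list(range(1024))             # n = 1024, log2(1024) = 10
idx, probes = binary_search(arr, 700)
print(idx, probes)                  # found at 700, in at most ~11 probes
```

Even the worst case on 1024 elements needs only about log₂ 1024 = 10 probes, versus up to 1024 steps for the sequential scan.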
Hint for Homework Problem 1
• Hint 1: Write a program that computes the number of gifts for each value of N just to get a feeling for how it works.
• Hint 2: Last time we showed how triangles of dots prove:
  • 1 + 2 + 3 + ... + n = n(n + 1)/2
  • Similarly, to add up the number of gifts, a closed-form version of the total is:
  • (n/6)(n + 1)(n + 2)
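Following Hint 1, a small brute-force check of the closed form. This assumes the gift count follows the cumulative "Twelve Days of Christmas" pattern the hint suggests (on day d you receive 1 + 2 + ... + d gifts); verify against your actual problem statement.

```python
def triangle(n):
    return n * (n + 1) // 2            # 1 + 2 + ... + n, from Hint 2

def total_gifts(n):
    # Brute force: on day d you receive triangle(d) gifts
    return sum(triangle(d) for d in range(1, n + 1))

def closed_form(n):
    return n * (n + 1) * (n + 2) // 6  # the hinted (n/6)(n+1)(n+2)

for n in range(1, 13):
    assert total_gifts(n) == closed_form(n)
print(total_gifts(12))  # 364
```

The brute-force sum and the closed form agree for every n checked, which is the "feeling for how it works" the hint is after.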
Summary: Analysis of Algorithms
• A method for determining, in an abstract way, the asymptotic running time of an algorithm
  • Here asymptotic means as n gets very large
• Useful for comparing algorithms
• Useful also for determining tractability
  • Meaning, a way to determine whether a problem is intractable (infeasible to solve in practice for large inputs)
  • Exponential time algorithms are usually intractable.
• We’ll revisit these ideas throughout the rest of the course.