CSE 3358 Note Set 2 Data Structures and Algorithms
Overview: • What are we measuring and why? • Computational complexity introduction • Big-O Notation
Problem • For a given problem there are different ways to solve it – differing algorithms. • Consider the problem of searching: what are some possible searching algorithms?
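To make the question concrete, here is a minimal sketch (my own illustration, not taken from the slides) of two common answers in C++: a linear search, which works on any ordering, and a binary search, which requires sorted data. The function names and the use of std::vector are assumptions made for illustration.

#include <vector>

// Linear search: examine each element in turn; works on unsorted data.
int linearSearch(const std::vector<int>& a, int key) {
    for (int i = 0; i < static_cast<int>(a.size()); ++i)
        if (a[i] == key) return i;   // found at index i
    return -1;                       // not found
}

// Binary search: repeatedly halve the search range; requires sorted data.
int binarySearch(const std::vector<int>& a, int key) {
    int lo = 0, hi = static_cast<int>(a.size()) - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;
        if (a[mid] == key) return mid;
        if (a[mid] < key)  lo = mid + 1;
        else               hi = mid - 1;
    }
    return -1;                       // not found
}

The two differ sharply in cost: the linear search may examine every element, while the binary search halves the remaining range at each step – exactly the kind of difference the rest of these notes quantifies.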
Efficiency • We have a limited amount of resources to use in solving a problem. • Resource examples: time, space, … • We can use any metric to compare various algorithms intended to solve the same problem. • Using some metrics for some problems might not be enlightening…
Computational Complexity • Computational complexity – the amount of effort needed to apply an algorithm, or how costly it is. • The two most common metrics (and the ones we'll use): • Time (most common) • Space • The time to execute an algorithm on a particular data set is system dependent. Why?
Seconds? Microseconds? Nanoseconds? • Do not use the above when talking about algorithms unless specific to a particular machine at a particular time. • Why not?
What will we use? • Logical units that express the relationship between the size n of a data set and the amount of time t required to process the data. • For algorithm X on a data set of size n, T(n) = amount of time needed to execute X on that data set. • Problem: the ordering of the values in the data set can affect T(n). What to do?
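As an illustration (assumed, not from the slides), "time" can be counted in machine-independent basic operations. The sketch below counts how many comparisons a linear search makes; the count is at most n, but the exact value depends on where (or whether) the key appears – which is exactly the ordering problem raised above.

#include <vector>

// Count the comparisons a linear search performs on a data set of size n.
// At most n comparisons are made, but the exact count depends on the data.
int linearSearchCounted(const std::vector<int>& a, int key, long& comparisons) {
    comparisons = 0;
    for (int i = 0; i < static_cast<int>(a.size()); ++i) {
        ++comparisons;               // one basic operation per element examined
        if (a[i] == key) return i;
    }
    return -1;                       // key absent: all n elements were examined
}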
Three Cases • Best Case Analysis: • when the number of steps needed to complete the algorithm on a data set is minimized • e.g. sorting an already-sorted list • Worst Case Analysis: • when the maximum possible number of steps is needed to solve the problem for a data set of a given size • Average Case Analysis: • the expected number of steps, averaged over all possible data sets of size n (often assuming each is equally likely)
Example • The problem is SORTING (ascending) • Data Set: 5 3 9 8, n = _______ • Best Case: • Worst Case: • Average Case:
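The slide leaves the algorithm open; as a hedged illustration, the sketch below assumes insertion sort and counts comparisons, so the three cases can be seen directly: the same n = 4 values cost the least when already sorted and the most when reverse-sorted.

#include <iostream>
#include <vector>

// Illustration only: the slides do not fix a sorting algorithm, so insertion
// sort is assumed here. Counting comparisons shows how the cost of sorting the
// same n = 4 data set depends on its initial ordering.
long insertionSortComparisons(std::vector<int> a) {
    long comparisons = 0;
    for (size_t i = 1; i < a.size(); ++i) {
        int key = a[i];
        size_t j = i;
        while (j > 0) {
            ++comparisons;                 // compare key with a[j-1]
            if (a[j - 1] <= key) break;    // already in place: exits early
            a[j] = a[j - 1];               // shift larger element right
            --j;
        }
        a[j] = key;
    }
    return comparisons;
}

int main() {
    std::cout << insertionSortComparisons({5, 3, 9, 8}) << '\n';  // the slide's data set
    std::cout << insertionSortComparisons({3, 5, 8, 9}) << '\n';  // already sorted: best case
    std::cout << insertionSortComparisons({9, 8, 5, 3}) << '\n';  // reverse sorted: worst case
    return 0;
}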
Asymptotic Notation: Big-O • Definition: f(n) is O(g(n)) if there exist positive numbers c and N such that f(n) <= c*g(n) for all n >= N. • Can be read as: f is big-O of g if there is a positive number c such that f is not larger than c*g for all sufficiently large n (i.e., for all n larger than some number N). • In other words: g(n) is an upper bound on the value of f(n), or, in the long run, f grows at most as fast as g.
Asymptotic Notation: Big-O Graphic Example
Asymptotic Notation: Big-O • Definition (restated): f(n) is O(g(n)) if there exist positive numbers c and N such that f(n) <= c*g(n) for all n >= N. • Example: for f(n) = 2n² + 3n + 1, f(n) is O(n²).
Asymptotic Notation: Big-O • Example: show that 2n² + 3n + 1 is O(n²). • By the definition of Big-O, we need 2n² + 3n + 1 <= c*n² for all n >= N. • So we must find a c and an N such that the inequality holds for all n >= N. How?
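One way to answer (an illustrative choice, not necessarily the one given in lecture): for all n >= 1 we have 3n <= 3n² and 1 <= n², so

    2n² + 3n + 1 <= 2n² + 3n² + n² = 6n².

Taking c = 6 and N = 1 therefore satisfies the definition, and 2n² + 3n + 1 is O(n²). Many other pairs work as well, e.g. c = 3 with N = 4.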
Asymptotic Notation: Big-O • Reality Check: • We're interested in what happens to the number of operations needed to solve a problem as the size of the input grows toward infinity. • We're not too interested in what happens with small data sets.
Notes on Notation • Very common to see f(n) = O(g(n)) • Not completely accurate – the = here is not a symmetric relation • Technically, O(g(n)) is a set of functions. • Set definition: O(g(n)) = { f(n) : there exist constants c > 0 and N > 0 such that 0 <= f(n) <= c*g(n) for all n >= N } • When we say f(n) = O(g(n)), we really mean that f(n) ∈ O(g(n)).
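For example (added for illustration), n = O(n²) holds with c = 1 and N = 1, but n² = O(n) fails: no constant c can make n² <= c*n once n > c. This asymmetry is why writing ∈ is more precise than =.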
Asymptotic Notation: Big-O • Show that n² = O(n³).
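One possible answer (my sketch, not necessarily the lecture's): for all n >= 1, n² <= n³, so c = 1 and N = 1 satisfy the definition and n² ∈ O(n³). Note the bound is true but loose – n³ is an upper bound on n², not a tight one.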