DATA STRUCTURES AND ALGORITHMS Lecture Notes 1 Prepared by İnanç TAHRALI
ROAD MAP • What is an algorithm ? • What is a data structure ? • Analysis of An Algorithm • Asymptotic Notations • Big Oh Notation • Omega Notation • Theta Notation • Little o Notation • Rules about Asymptotic Notations
What is An Algorithm ? • Definition : A finite, clearly specified sequence of instructions to be followed to solve a problem.
What is An Algorithm ? Problem : Write a program to calculate the sum of cubes 1³ + 2³ + … + N³. Pseudocode:
PartialSum ← 0
i ← 1
while (i ≥ 1) and (i ≤ N)
    PartialSum ← PartialSum + i·i·i
    increase i by 1
return PartialSum
C implementation:
int Sum (int N)
{
    int PartialSum = 0;
    for (int i = 1; i <= N; i++)
        PartialSum += i * i * i;
    return PartialSum;
}
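A quick usage sketch (the driver main below is assumed, added only for illustration):

#include <stdio.h>

int Sum (int N)
{
    int PartialSum = 0;
    for (int i = 1; i <= N; i++)
        PartialSum += i * i * i;    /* add i cubed */
    return PartialSum;
}

int main (void)
{
    printf("%d\n", Sum(3));    /* prints 36, since 1 + 8 + 27 = 36 */
    return 0;
}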
Properties of an Algorithm • Effectiveness • simple • can be carried out with pen and paper • Definiteness • clear • the meaning of each step is unique • Correctness • gives the right answer for all possible cases • Finiteness • stops after a finite number of steps, in reasonable time
Is function F an algorithm?
int F (int m)
{
    int n = m;
    while (n > 1)
    {
        if (n % 2 == 0)    /* n is even */
            n = n / 2;
        else
            n = 3*n + 1;
    }
    return m;
}
What is a Data Structure ? • Definition : An organization and representation of data • representation • data can be stored in different ways according to its type • signed, unsigned, etc. • example : integer representation in memory • organization • the way of storing data changes according to the organization • ordered, unordered, tree • example : what if you have more than one integer ?
Properties of a Data Structure ? • Efficient utilization of the storage medium • Efficient algorithms for • creation • manipulation (insertion/deletion) • data retrieval (Find) • A well-designed data structure uses as few resources as possible • execution time • memory space
The Process of Algorithm Development • Design • divide & conquer, greedy, dynamic programming • Validation • check whether it is correct • Analysis • determine the properties of the algorithm • Implementation • Testing • check whether it works for all possible cases
Analysis of Algorithm • Analysis investigates • What are the properties of the algorithm? • in terms of time and space • How good is the algorithm ? • according to those properties • How does it compare with others? • not always exact • Is it the best that can be done? • difficult !
Mathematical Background • Assume the running-time functions of two algorithms are known ! For input size N: Running time of Algorithm A = T_A(N) = 1000·N Running time of Algorithm B = T_B(N) = N² Which one is faster ?
Mathematical Background Suppose the running times of algorithms A and B are measured in µsec. So which algorithm is faster ?
Mathematical Background If N < 1000, then T_A(N) > T_B(N); otherwise T_B(N) > T_A(N). Neither is faster for every input size, so we compare their relative growth.
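A minimal sketch to see the crossover numerically (the chosen input sizes and table layout are assumed, purely for illustration):

#include <stdio.h>

int main (void)
{
    /* input sizes around the crossover point N = 1000 */
    long long sizes[] = {10, 100, 1000, 10000, 100000};

    printf("%10s %15s %15s\n", "N", "T_A = 1000*N", "T_B = N^2");
    for (int i = 0; i < 5; i++)
    {
        long long N = sizes[i];
        printf("%10lld %15lld %15lld\n", N, 1000 * N, N * N);
    }
    return 0;
}

For N < 1000 the quadratic algorithm B is actually cheaper; beyond N = 1000 algorithm A wins, and the gap keeps widening.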
Mathematical Background • Is it always possible to have definite results? NO ! The measured running times of algorithms change with the platform, the properties of the computer, etc. We use asymptotic notations (O, Ω, θ, o) to • compare relative growth rates • compare the algorithms themselves, independent of platform
Big Oh Notation (O) Provides an "upper bound" for the function T in terms of f • Definition : T(N) = O(f(N)) if there are positive constants c and n₀ such that T(N) ≤ c·f(N) when N ≥ n₀ • T(N) grows no faster than f(N) • the growth rate of T(N) is less than or equal to the growth rate of f(N) for large N • f(N) is an upper bound on T(N) • not fully correct !
Big Oh Notation (O) • Analysis of Algorithm A: T_A(N) = 1000·N = O(N), since 1000·N ≤ c·N holds for all N ≥ n₀ with c = 2000 and n₀ = 1.
Examples • 7n + 5 = O(n): for c = 8 and n₀ = 5, 7n + 5 ≤ 8n whenever n ≥ n₀ = 5 • 7n + 5 = O(n²): for c = 7 and n₀ = 2, 7n + 5 ≤ 7n² whenever n ≥ n₀ • 7n² + 3n = O(n) ? (No: for any constant c, 7n² + 3n > c·n as soon as n > c/7.)
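The witness pair (c, n₀) in these examples can be checked numerically. A minimal sketch, assuming a simple brute-force checker over a finite range (so it can refute a claimed witness, but never prove one):

#include <stdio.h>

/* Returns 1 if T(n) <= c * f(n) for every integer n in [n0, nmax], else 0.
   Only a finite range is tested, so a result of 1 is evidence, not a proof. */
int witness_holds(double (*T)(double), double (*f)(double),
                  double c, int n0, int nmax)
{
    for (int n = n0; n <= nmax; n++)
        if (T(n) > c * f(n))
            return 0;
    return 1;
}

double T1(double n) { return 7*n + 5; }   /* T(n) = 7n + 5 */
double fn(double n) { return n; }         /* f(n) = n      */

int main (void)
{
    /* 7n + 5 = O(n) with witnesses c = 8, n0 = 5 */
    printf("%d\n", witness_holds(T1, fn, 8.0, 5, 1000000));   /* prints 1 */
    /* c = 7 is not a valid witness, since 7n + 5 > 7n for every n */
    printf("%d\n", witness_holds(T1, fn, 7.0, 5, 1000000));   /* prints 0 */
    return 0;
}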
Advantages of O Notation • It is possible to compare two algorithms by their running times • Constants can be ignored; units are not important: O(7n²) = O(n²) • Lower order terms are ignored: O(n³ + 7n² + 3) = O(n³)
Running Times of Algorithms A and B T_A(N) = 1000·N = O(N) T_B(N) = N² = O(N²) A is asymptotically faster than B !
Omega Notation (Ω) • Definition : T(N) = Ω(f(N)) if there are positive constants c and n₀ such that T(N) ≥ c·f(N) when N ≥ n₀ • T(N) grows no slower than f(N) • the growth rate of T(N) is greater than or equal to the growth rate of f(N) for large N • f(N) is a lower bound on T(N) • not fully correct !
Omega Notation Example: • √n = Ω(lg n) for c = 1 and n₀ = 16: for all n ≥ 16, c·lg n ≤ √n
Omega Notation • Theorem: f(N) = O(g(N)) <=> g(N) = Ω(f(N)) Proof: f(N) = O(g(N)) means f(N) ≤ c₁·g(N) for some constant c₁ > 0 and all large N. Dividing both sides by c₁ gives (1/c₁)·f(N) ≤ g(N), i.e. g(N) ≥ c₂·f(N) with c₂ = 1/c₁ > 0, which is exactly g(N) = Ω(f(N)). The reverse direction is symmetric.
Omega Notation • 7n² + 3n + 5 = O(n⁴) • 7n² + 3n + 5 = O(n³) • 7n² + 3n + 5 = O(n²) • 7n² + 3n + 5 = Ω(n²) • 7n² + 3n + 5 = Ω(n) • 7n² + 3n + 5 = Ω(1) n² and 7n² + 3n + 5 grow at the same rate: 7n² + 3n + 5 = O(n²) = Ω(n²) = θ(n²)
Theta Notation (θ) • Definition : T(N) = θ(h(N)) if and only if T(N) = O(h(N)) and T(N) = Ω(h(N)) • T(N) grows as fast as h(N) • the growth rates of T(N) and h(N) are equal for large N • h(N) is a tight bound on T(N) • not fully correct !
Theta Notation (θ) • Example : T(N) = 3N² T(N) = O(N⁴) T(N) = O(N³) T(N) = θ(N²) best (the tightest bound)
Little O Notation (o) • Definition : T(N) = o(p(N)) if T(N) = O(p(N)) and T(N) ≠ θ(p(N)) • p(N) grows strictly faster than T(N) • the growth rate of T(N) is strictly less than the growth rate of p(N) for large N • p(N) is an upper bound on T(N) (but not a tight one) • not fully correct !
Little O Notation (o) • Example : T(N) = 3N² T(N) = o(N⁴) T(N) = o(N³) T(N) = θ(N²)
RULES • RULE 1: if T₁(N) = O(f(N)) and T₂(N) = O(g(N)) then a) T₁(N) + T₂(N) = max(O(f(N)), O(g(N))) b) T₁(N) * T₂(N) = O(f(N) * g(N)) Can you prove these ? Are they also true for θ notation ? What about Ω notation?
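To see where Rule 1 shows up in code, here is a minimal sketch (the function and its names are assumed, purely for illustration): two consecutive loops add their costs, so the O(N) pass is dominated by the O(N²) nested loops (rule a), and the nested loops multiply to O(N·N) = O(N²) (rule b).

#include <stdio.h>

/* Counts ordered pairs (i, j) with a[i] + a[j] == target.
   The single loop costs O(N); the nested loops cost O(N) * O(N) = O(N^2) (rule b).
   The total is max(O(N), O(N^2)) = O(N^2) (rule a). */
long count_pairs(const int a[], int N, int target)
{
    long sum = 0;
    for (int i = 0; i < N; i++)        /* O(N): one pass over the array */
        sum += a[i];
    (void) sum;                        /* dominated term, kept only to illustrate rule a */

    long pairs = 0;
    for (int i = 0; i < N; i++)        /* O(N) iterations ...          */
        for (int j = 0; j < N; j++)    /* ... each doing O(N) work     */
            if (a[i] + a[j] == target)
                pairs++;
    return pairs;
}

int main (void)
{
    int a[] = {1, 2, 3, 4};
    printf("%ld\n", count_pairs(a, 4, 5));   /* (1,4), (4,1), (2,3), (3,2) -> prints 4 */
    return 0;
}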
RULES • RULE 2: if T(N) is a polynomial of degree k, T(N) = a_k·N^k + a_(k-1)·N^(k-1) + … + a_1·N + a_0 with a_k > 0, then T(N) = θ(N^k) • example : 4N³ + 2N + 1 = θ(N³)
RULES • RULE 3: log^k N = o(N) for any constant k • the logarithm grows very slowly !
Some Common Functions • c = o(log N) => c = O(log N) but c ≠ Ω(log N) • log N = o(log²N) • log²N = o(N) • N = o(N log N) • N = o(N²) • N² = o(N³) • N³ = o(2^N)
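A minimal sketch to make the ordering concrete (the chosen input sizes and table layout are assumed; 2^N is left out because it overflows for even modest N):

#include <stdio.h>
#include <math.h>

int main (void)
{
    printf("%10s %10s %14s %16s %20s\n", "N", "log2(N)", "N*log2(N)", "N^2", "N^3");
    for (double N = 16; N <= 65536; N *= 16)    /* N = 16, 256, 4096, 65536 */
        printf("%10.0f %10.1f %14.0f %16.0f %20.0f\n",
               N, log2(N), N * log2(N), N * N, N * N * N);
    return 0;
}

Each column is eventually dwarfed by the one to its right, matching the o(·) chain above.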
Example : T(N) = 4N² • T(N) = O(2N²) correct but bad style; T(N) = O(N²) drop the constants • T(N) = O(N² + N) correct but bad style; T(N) = O(N²) ignore low-order terms
Example : f(N) = 7N² g(N) = N² + N both are θ(N²), so f and g grow at the same rate
Example : f(N) = N·log N g(N) = N^1.5 Compare log N with N^0.5, i.e. (squaring both) compare log²N with N. Since log²N = o(N) by Rule 3, log N = o(N^0.5), and multiplying by N gives N·log N = o(N^1.5).
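As a numeric illustration (a minimal sketch; the chosen values of N are assumed), the ratio f(N)/g(N) = N·log N / N^1.5 = log N / √N shrinks toward 0 as N grows, which is exactly what f(N) = o(g(N)) says:

#include <stdio.h>
#include <math.h>

int main (void)
{
    /* ratio (N log N) / N^1.5 = log2(N) / sqrt(N) -> 0 as N grows */
    for (double N = 1e2; N <= 1e10; N *= 100)
        printf("N = %12.0f   ratio = %.6f\n", N, log2(N) / sqrt(N));
    return 0;
}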