Probability & Stochastic Processes CH 1 Experiments, Models, & Probabilities
Goal of Lecture • Understanding & Handling Random Signals (Processes) • Random Signal vs. Deterministic Signal • Random Signal ← Random Variable ← Probability Theory ← Set Theory
CH 1.1 Set Theory • <definition> • Set = a collection of things (elements) • Essential Conditions: everyone must agree on the following 2 decisions • 1) An element is included in a set or not. • 2) Two elements are the same or not. • <notations> • Set: A, B, C, … capital letters • Element: x, y, z, … small letters • Naming the elements: A = {x, y, z}, C = {students in this class} • Giving a rule: A = {x² | x = 1, 2, 3, …}
CH 1.1 Set Theory • <definitions> • Subset: A ⊂ B iff every element of A is also an element of B • Equal: A = B iff A ⊂ B and B ⊂ A • Universal (Super) Set S: the set of all things that we are currently considering • Null Set ∅: a set which has no element; it is a subset of any set, i.e. ∅ ⊂ A for every A • We simply define it for convenience.
CH 1.1 Set Theory • Union of sets A & B: A ∪ B = {x | x ∈ A or x ∈ B} • Intersection of sets A & B: A ∩ B = {x | x ∈ A and x ∈ B}
CH 1.1 Set Theory • Complement of a set A: Ac = {x ∈ S | x ∉ A} • Difference between sets A & B: A − B = {x | x ∈ A and x ∉ B} = A ∩ Bc
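The set operations above map directly onto Python's built-in set type. The following minimal sketch is my own illustration (the example sets A, B and universal set S are arbitrary), not part of the lecture:

```python
# Minimal sketch of the set operations defined above using Python's built-in
# set type. S plays the role of the universal set; A and B are arbitrary.
S = {1, 2, 3, 4, 5, 6}

A = {1, 2, 3}
B = {2, 4, 6}

print(A | B)    # union A ∪ B                -> {1, 2, 3, 4, 6}
print(A & B)    # intersection A ∩ B         -> {2}
print(S - A)    # complement Ac = S − A      -> {4, 5, 6}
print(A - B)    # difference A − B = A ∩ Bc  -> {1, 3}
print(A <= S)   # subset test A ⊂ S          -> True
```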
CH 1.1 Set Theory • Notes • Sets A & B are disjoint iff A ∩ B = ∅
CH 1.1 Set Theory • A collection of sets A1, A2, …, AN is mutually exclusive (ME) iff Ai ∩ Aj = ∅ for all i ≠ j • A collection of sets A1, A2, …, AN is collectively exhaustive (CE) iff A1 ∪ A2 ∪ … ∪ AN = S • A collection of sets A1, A2, …, AN can be both ME & CE; such a collection partitions S.
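As a small illustration (not from the slides; the helper names is_mutually_exclusive and is_collectively_exhaustive are my own), the ME and CE conditions can be checked mechanically for finite sets:

```python
from itertools import combinations

def is_mutually_exclusive(sets):
    # ME: every pair of sets has an empty intersection (Ai ∩ Aj = ∅ for i ≠ j)
    return all(not (a & b) for a, b in combinations(sets, 2))

def is_collectively_exhaustive(sets, S):
    # CE: the union of all the sets equals the universal set S
    union = set()
    for a in sets:
        union |= a
    return union == S

S = {1, 2, 3, 4, 5, 6}
A = [{1, 2}, {3, 4}, {5, 6}]               # both ME and CE: a partition of S
print(is_mutually_exclusive(A))            # True
print(is_collectively_exhaustive(A, S))    # True
```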
CH 1.2 Applying Set Theory to Probability • What is Probability? • ① Tossing a coin → H or T (prob. ½ under the equally likely assumption) • ② Tossing a coin three times → HHH, HHT, …, TTT (prob. 1/8 each) • ③ Tossing a coin three times, observe # of H → 0, 1, 2, 3 (prob. 1/8, 3/8, 3/8, 1/8) • An experiment consists of • 1) Procedure: tossing a coin (three times) • 2) Observation: H or T, # of H, … → the outcomes are decided. • 3) Model: equally likely, … → the probability of a set of outcomes (an event) is decided. • <def 1.1> • Outcome (of an experiment) ≡ any possible observation of the experiment
CH 1.2 Applying Set Theory to Probability • <def 1.2> • Sample Space (of an experiment) ≡ the finest-grain (all possible distinguishable outcomes are identified separately), mutually exclusive, collectively exhaustive set of all possible outcomes • Ex) ① S = {H, T} • ② S = {HHH, HHT, HTH, HTT, THH, THT, TTH, TTT} • ③ S = {0, 1, 2, 3}
CH 1.2 Applying Set Theory to Probability • <def 1.3> • Event (of an experiment) ≡ a set of outcomes • Ex 1.6: Experiment: procedure – rolling a die; observation – # of dots • Sample Space S = {1, 2, 3, 4, 5, 6} • Events E1 = {# ≥ 4} = {4, 5, 6}, E2 = {even #} = {2, 4, 6} • Ex 1.7: Making a phone call & observing the duration • Sample Space S = {x | x ≥ 0, real} • Events E1 = {x | x ≥ 5, real}
CH 1.2 Applying Set Theory to Probability • <def 1.4> • Event Space ≡ a collectively exhaustive (CE) & mutually exclusive (ME) set of events • Both the Sample Space & an Event Space are ME & CE. • Difference: • Sample Space – a set of finest-grain outcomes • Event Space – a set of events (each event is itself a set of outcomes)
CH 1.2 Applying Set Theory to Probability Ex 1.9 & 1.10: Tossing 4 different coins, S = {hhhh, hhht, hhth, …, tttt}, 16 outcomes. Define events Bi ≡ {outcomes with i heads}; then B0 = {tttt}, B1 = {httt, thtt, ttht, ttth}, B2 = {hhtt, htht, htth, thht, thth, tthh}, B3 = {hhht, hhth, hthh, thhh}, B4 = {hhhh}. B = {B0, B1, B2, B3, B4} → event space. If C1 = {outcomes with more than 1 h} & C2 = {outcomes with less than 2 h's}, then C = {C1, C2} → event space. If C2 = {outcomes with less than 3 h's} → not an event space: not ME (outcomes with exactly 2 h's are in both C1 and C2). If C2 = {outcomes without h} → not an event space: not CE (outcomes with exactly 1 h are in neither).
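A quick enumeration sketch for Ex 1.9 & 1.10 (my own check, not part of the text): listing the 16 outcomes and grouping them by the number of heads confirms that {B0, …, B4} is ME and CE.

```python
from itertools import product

# Enumerate the 16 outcomes of tossing 4 coins and partition by # of heads.
outcomes = [''.join(flips) for flips in product('ht', repeat=4)]
B = {i: {s for s in outcomes if s.count('h') == i} for i in range(5)}

print(len(outcomes))                       # 16
print(sorted(B[1]))                        # the 4 outcomes with exactly one h
print(sum(len(Bi) for Bi in B.values()))   # 16: the Bi are ME and CE
```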
CH 1.2 Applying Set Theory to Probability • QZ 1.2: Phone calls: v (voice) or d (data) • Observe 3 consecutive calls → outcomes: vvv, vvd, vdv, vdd, dvv, dvd, ddv, ddd (8 cases) • Event A1: first call is v → {vvv, vvd, vdv, vdd} • Event B1: first call is d → {dvv, dvd, ddv, ddd} • Are A1 and B1 ME? CE? → yes, yes • B = {A1, B1} is an event space • Event A2: 2nd call is v → {vvv, vvd, dvv, dvd}
CH 1.2 Applying Set Theory to Probability (b) Event A3: all calls are the same → {vvv, ddd}; Event B3: v and d alternate → {vdv, dvd}. Are A3 and B3 ME? CE? → yes, no (e.g., vvd belongs to neither, so {A3, B3} is not collectively exhaustive)
CH 1.3 Probability Axioms • What is Probability? • When an experiment (procedure, observation, & model) is defined, the probability of an event (a set of outcomes) may be formulated as the relative frequency of that event: P[A] ≈ N_A(n)/n, where N_A(n) is the number of times event A occurs in n independent trials of the experiment, with the approximation becoming exact as n → ∞.
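The relative-frequency interpretation can be illustrated with a short simulation (an assumed illustration, not part of the slides): as the number of trials grows, the fraction of trials in which an event occurs settles near its probability.

```python
import random

# Estimate P[heads] for a fair coin by relative frequency N_A(n) / n.
random.seed(0)
n = 100_000
n_heads = sum(1 for _ in range(n) if random.random() < 0.5)
print(n_heads / n)   # close to 0.5 for large n
```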
CH 1.3 Probability Axioms • Notes: the axioms of probability • Axiom 1: for any event A, P[A] ≥ 0 • Axiom 2: P[S] = 1 • Axiom 3: for any countable collection A1, A2, … of mutually exclusive events, P[A1 ∪ A2 ∪ …] = P[A1] + P[A2] + …
CH 1.3 Probability Axioms • Notes: consequences of the axioms • P[∅] = 0 • P[Ac] = 1 − P[A] • If A ⊂ B, then P[A] ≤ P[B] • P[A ∪ B] = P[A] + P[B] − P[A ∩ B]
CH 1.3 Probability Axioms • Notes • <def> • Equally Likely Outcomes: if the sample space consists of n equally likely outcomes s1, …, sn, then P[{si}] = 1/n for each i, and P[E] = |E|/n for any event E
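For equally likely outcomes, event probabilities reduce to counting. The sketch below is my own example with a fair die; it also checks the consequence P[A ∪ B] = P[A] + P[B] − P[A ∩ B].

```python
# Equally likely outcomes: P[E] = |E| / |S| (fair six-sided die).
S = {1, 2, 3, 4, 5, 6}
E1 = {4, 5, 6}                 # at least 4 dots
E2 = {2, 4, 6}                 # even number of dots
P = lambda E: len(E) / len(S)

print(P(E1), P(E2))                     # 0.5 0.5
print(P(E1 | E2))                       # ≈ 0.667
print(P(E1) + P(E2) - P(E1 & E2))       # ≈ 0.667, matches P[E1 ∪ E2]
```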
CH 1.5 Conditional Probability • <def 1.6> • The conditional probability of event A given event B is P[A|B] = P[A ∩ B] / P[B], defined for P[B] > 0 • When event B is given, an occurrence of event A becomes an occurrence of A ∩ B
CH 1.5 Conditional Probability Ex 1.18: Roll two four-sided dice. X1 & X2 = the numbers of dots, all outcomes equally likely. Universal set S = {(X1, X2) | X1, X2 = 1, 2, 3, 4}
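A counting sketch in the spirit of Ex 1.18 (the particular events A and B below are my own choices, since the slide stops at the setup): with 16 equally likely outcomes, a conditional probability is just a ratio of counts.

```python
from itertools import product

S = set(product(range(1, 5), repeat=2))     # 16 equally likely outcomes (X1, X2)
B = {s for s in S if s[0] + s[1] >= 6}      # conditioning event: sum at least 6
A = {s for s in S if s[0] == 4}             # event of interest: first die shows 4

P_B = len(B) / len(S)
P_AB = len(A & B) / len(S)
print(P_AB / P_B)    # P[A | B] = P[A ∩ B] / P[B] = (3/16) / (6/16) = 0.5
```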
CH 1.6 Independence • <def 1.7> Events A & B are independent iff P[AB] = P[A] P[B] • → P[A|B] = P[A], P[B|A] = P[B] • <def 1.9> n events A1, A2, …, An are indep. iff every collection of (n−1) of them is indep. & P[A1 A2 … An] = P[A1] P[A2] … P[An] • Notes: ME means P[AB] = 0, which is different from independence. Ex 1.21: Three lights, each red or green, all combinations equally likely. S = {rrr, rrg, …, ggg}, events Ri = {i-th light r}, Gi = {i-th light g}. R2 = {rrr, rrg, grr, grg} → P[R2] = 4/8 = ½
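To connect Def 1.7 with Ex 1.21, the sketch below (my own check; the event G1 = {1st light g} is chosen just for illustration) verifies P[R2 ∩ G1] = P[R2] P[G1] over the 8 equally likely outcomes.

```python
from itertools import product

S = [''.join(c) for c in product('rg', repeat=3)]   # rrr, rrg, ..., ggg
R2 = {s for s in S if s[1] == 'r'}                  # 2nd light red
G1 = {s for s in S if s[0] == 'g'}                  # 1st light green
P = lambda E: len(E) / len(S)

print(P(R2 & G1), P(R2) * P(G1))   # 0.25 0.25 -> R2 and G1 are independent
```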
CH 1.7 Sequential Experiments & Tree Diagram • A tree diagram is very useful for sequential experiments. Ex 1.27: 2 coins, coin 1: biased with P[H] = ¾; coin 2: unbiased, i.e. P[H] = ½. Pick a coin (each equally likely) and flip it. P[C1|H] = ?, P[C1|T] = ?
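Working through the tree for Ex 1.27 with the numbers stated above (a worked check, presented as a sketch): the total probability of H is 5/8, and Bayes' rule then gives P[C1|H] = 3/5 and P[C1|T] = 1/3.

```python
# Bayes' rule for Ex 1.27: which coin was picked, given the flip's outcome?
P_C1, P_C2 = 0.5, 0.5            # each coin picked with equal probability
P_H_C1, P_H_C2 = 0.75, 0.5       # P[H | C1] (biased), P[H | C2] (fair)

P_H = P_H_C1 * P_C1 + P_H_C2 * P_C2        # total probability of H: 5/8
P_C1_H = P_H_C1 * P_C1 / P_H               # (3/8) / (5/8) = 3/5
P_C1_T = (1 - P_H_C1) * P_C1 / (1 - P_H)   # (1/8) / (3/8) = 1/3
print(P_H, P_C1_H, P_C1_T)                 # 0.625 0.6 0.333...
```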
CH 1.8 Counting Methods • Experiments A & B: # of outcomes is n & k respectively; # of outcomes for the sequential experiment: n × k • Permutation: # of ways of selecting an ordered set of k objects among n objects = n!/(n−k)! • Combination: # of ways of selecting an unordered set of k objects among n objects = n!/(k!(n−k)!)
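In Python 3.8+ the permutation and combination counts above are available in the standard library (a small illustration, not from the slides):

```python
import math

n, k = 5, 3
print(math.perm(n, k))    # ordered selections:   5 * 4 * 3 = 60
print(math.comb(n, k))    # unordered selections: 60 / 3!   = 10
print(math.perm(n, k) == math.comb(n, k) * math.factorial(k))   # True
```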
CH 1.8 Counting Methods • There are m different objects. The # of ways of selecting an object (and returning it) n times is m^n. • Perform n independent trials, each a success (with prob. p) or a failure; prob. of exactly n0 failures?
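The standard answer to the question above is the binomial probability C(n, n0) (1 − p)^n0 p^(n − n0). The sketch below (my own check, with arbitrary n, n0, p) compares this formula against a simulation.

```python
import math
import random

n, n0, p = 10, 3, 0.7        # n trials, want exactly n0 failures, success prob p
exact = math.comb(n, n0) * (1 - p)**n0 * p**(n - n0)

random.seed(1)
trials = 200_000
hits = sum(1 for _ in range(trials)
           if sum(random.random() >= p for _ in range(n)) == n0)   # count failures
print(exact, hits / trials)   # the two values should be close (≈ 0.267)
```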