Learn the mathematical framework for handling uncertainties, including probabilistic models, random variables, distributions, expectation, variance, and special random variables.
Appendix A: Probability Theory
Probability theory provides a mathematical framework for dealing with uncertainties. Many processes inevitably involve uncertainties, which may result from error, noise, imprecision, incompleteness, distortion, etc.
A.1 Probabilistic Models
-- A probabilistic model is a mathematical description of a process under an uncertain situation (i.e., a random process).
Elements of a probability model
• Random process: a process whose outcome o is not predictable in advance.
• Sample space S: the set of all possible outcomes.
• Event E: any subset of S.
• Probability law P: assigns to an event the collective probability of its outcomes, e.g., P(E) = Σ_{o ∈ E} P(o).
Probability: the relative frequency of occurrence of o, or the degree of belief in o (e.g., the probability of a war).
Example: A coin is tossed 3 times. Let the probability of a head on an individual toss be p.
Sample space: S = {HHH, HHT, HTH, HTT, THH, THT, TTH, TTT}.
Probability law: P(o) = p^h (1 − p)^(3 − h), where h is the number of heads in outcome o, e.g., P(HHT) = p^2 (1 − p).
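A minimal sketch of this example in Python (not part of the slides; the value p = 0.6 is assumed purely for illustration), enumerating the sample space and checking that the probability law sums to 1:

```python
# Enumerate the 3-toss sample space and its probability law
# P(o) = p^h (1 - p)^(3 - h), where h = number of heads in outcome o.
from itertools import product

p = 0.6  # assumed head probability, for illustration only

sample_space = ["".join(o) for o in product("HT", repeat=3)]
prob_law = {o: p**o.count("H") * (1 - p)**o.count("T") for o in sample_space}

print(sample_space)            # ['HHH', 'HHT', ..., 'TTT']
print(sum(prob_law.values()))  # 1.0, i.e., P(S) = 1
print(prob_law["HHT"])         # p^2 (1 - p)
```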
Properties of probability laws: Let A, B, C be events.
i) P(A^c) = 1 − P(A), and P(∅) = 0.
ii) If A ⊆ B, then P(A) ≤ P(B).
iii) P(A ∪ B) = P(A) + P(B) − P(A ∩ B), hence P(A ∪ B) ≤ P(A) + P(B).
iv) P(A ∪ B ∪ C) = P(A) + P(A^c ∩ B) + P(A^c ∩ B^c ∩ C).
Axioms of probability
i) Nonnegativity: P(E) ≥ 0 for every event E.
ii) Additivity: if E and F are disjoint, then P(E ∪ F) = P(E) + P(F).
iii) Normalization: P(S) = 1.
Conditional Probability
P(F | E) = P(E ∩ F) / P(E), i.e., the proportion of F in E (defined when P(E) > 0).
Bayes’ Formula
• Total Probability Theorem: if A_1, ..., A_n form a partition of S (i.e., the A_i are disjoint and their union is S), then for any event B,
P(B) = Σ_i P(A_i) P(B | A_i).
With such a partition, Bayes’ formula becomes
P(A_i | B) = P(A_i) P(B | A_i) / P(B) = P(A_i) P(B | A_i) / Σ_j P(A_j) P(B | A_j).
Multiplication Rule:
P(A_1 ∩ A_2 ∩ ... ∩ A_n) = P(A_1) P(A_2 | A_1) P(A_3 | A_1 ∩ A_2) ... P(A_n | A_1 ∩ ... ∩ A_(n−1)).
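A small numerical sketch of the total probability theorem and Bayes’ formula (the priors and likelihoods below are hypothetical, chosen only to exercise the formulas):

```python
# Total probability and Bayes' formula for a partition A_1, ..., A_n and event B.
priors = [0.5, 0.3, 0.2]       # P(A_i); must sum to 1
likelihoods = [0.9, 0.5, 0.1]  # P(B | A_i)

p_B = sum(pa * pb for pa, pb in zip(priors, likelihoods))            # total probability
posteriors = [pa * pb / p_B for pa, pb in zip(priors, likelihoods)]  # Bayes' formula

print(p_B)
print(posteriors, sum(posteriors))  # posteriors sum to 1
```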
“Disjoint” simplifies unions: if A and B are disjoint, P(A ∪ B) = P(A) + P(B).
“Independent” simplifies intersections: if A and B are independent, P(A ∩ B) = P(A) P(B).
Example: Two successive rolls of a 4-sided die in which all 16 possible outcomes have probability 1/16.
Event {max of the two rolls is 2} = {(1,2), (2,1), (2,2)}, with probability 3/16.
Event {min of the two rolls is 2} = {(2,2), (2,3), (3,2), (2,4), (4,2)}, with probability 5/16.
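A sketch of this example in Python, reading the two listed sets as the events {max of the rolls = 2} and {min of the rolls = 2} (that reading is an assumption here):

```python
# Enumerate the 16 equally likely outcomes of two rolls of a 4-sided die.
from itertools import product
from fractions import Fraction

outcomes = list(product(range(1, 5), repeat=2))
p = Fraction(1, 16)

max_is_2 = [o for o in outcomes if max(o) == 2]
min_is_2 = [o for o in outcomes if min(o) == 2]

print(max_is_2, len(max_is_2) * p)  # [(1, 2), (2, 1), (2, 2)], 3/16
print(min_is_2, len(min_is_2) * p)  # [(2, 2), (2, 3), (2, 4), (3, 2), (4, 2)], 5/16
```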
A.2 Random Variables
-- A random variable is a function that assigns to each outcome a number, which may represent a label or a quantity.
• Example (label): Coin tossing. Define the random variable X = 1 if the outcome is a head and X = 0 if it is a tail.
• Example (quantity): A coin is tossed 3 times (head: +$1; tail: -$1). Define the random variable X as the net amount earned.
A.2.1 Cumulative Distribution Function (CDF), Probability Density Function (PDF), and Probability Mass Function (PMF)
The cumulative distribution function (CDF) F of a random variable X at a real number a is F(a) = P(X ≤ a).
Continuous X: probability density function (PDF) f, with F(a) = ∫_{−∞}^{a} f(x) dx.
Discrete X: probability mass function (PMF) p, with p(a) = P(X = a) and F(a) = Σ_{x ≤ a} p(x).
Example: A coin is tossed 3 times (head: +$1; tail: -$1), and X is the net amount earned. With head probability p, the PMF is
P(X = 3) = p^3, P(X = 1) = 3p^2(1 − p), P(X = −1) = 3p(1 − p)^2, P(X = −3) = (1 − p)^3.
CDF: F(a) = 0 for a < −3; (1 − p)^3 for −3 ≤ a < −1; (1 − p)^3 + 3p(1 − p)^2 for −1 ≤ a < 1; 1 − p^3 for 1 ≤ a < 3; and 1 for a ≥ 3.
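A short sketch computing this PMF and CDF by enumeration (the fair-coin value p = 0.5 is assumed for the printed numbers):

```python
# PMF and CDF of the net amount X earned over 3 tosses (+$1 per head, -$1 per tail).
from itertools import product
from collections import defaultdict

p = 0.5  # assumed head probability
pmf = defaultdict(float)
for o in product("HT", repeat=3):
    x = o.count("H") - o.count("T")  # net earned for outcome o
    pmf[x] += p**o.count("H") * (1 - p)**o.count("T")

def cdf(a):
    return sum(prob for x, prob in pmf.items() if x <= a)

print(dict(sorted(pmf.items())))  # {-3: 0.125, -1: 0.375, 1: 0.375, 3: 0.125}
print(cdf(-1), cdf(0), cdf(3))    # 0.5, 0.5, 1.0 (a right-continuous step function)
```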
A.2.2 Joint CDF and Joint PDF
-- involving multiple random variables
Example: A coin is tossed 3 times. RVs: X (head: +$1; tail: -$1); Y (head: +$2; tail: $0).
Joint CDF: F(a, b) = P(X ≤ a, Y ≤ b), where f(x, y) is the joint PDF and F(a, b) = ∫_{−∞}^{a} ∫_{−∞}^{b} f(x, y) dy dx.
Marginal CDF: F_X(a) = P(X ≤ a) = F(a, ∞).
Marginal PDF: f_X(x) = ∫_{−∞}^{∞} f(x, y) dy.
• Marginal functions eliminate the influence of some of the random variables.
Conditional distribution: f_{X|Y}(x | y) = f(x, y) / f_Y(y). Given the joint PDF f(x, y), the marginal PDF f_Y(y) can be calculated, and in turn the conditional distribution f_{X|Y}(x | y) can be computed.
Multiplication rule: f(x, y) = f_{X|Y}(x | y) f_Y(y) = f_{Y|X}(y | x) f_X(x).
Proof: By the definition of the conditional distribution, f_{X|Y}(x | y) f_Y(y) = [f(x, y) / f_Y(y)] f_Y(y) = f(x, y); the second factorization follows in the same way.
Bayes’ rule: f_{X|Y}(x | y) = f_{Y|X}(y | x) f_X(x) / f_Y(y).
Proof: By the multiplication rule, f_{X|Y}(x | y) f_Y(y) = f(x, y) = f_{Y|X}(y | x) f_X(x); dividing both sides by f_Y(y) gives the result.
• If X and Y are independent, f(x, y) = f_X(x) f_Y(y), and hence f_{X|Y}(x | y) = f_X(x).
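A discrete sketch of these relations (the joint PMF below is hypothetical, not the coin example from the slides), checking the marginal, conditional, multiplication-rule, and Bayes’-rule identities numerically:

```python
# A small joint PMF p(x, y) for x in {0, 1} and y in {0, 1, 2}.
joint = {
    (0, 0): 0.10, (0, 1): 0.20, (0, 2): 0.10,
    (1, 0): 0.25, (1, 1): 0.05, (1, 2): 0.30,
}

p_x = {x: sum(p for (xx, _), p in joint.items() if xx == x) for x in (0, 1)}
p_y = {y: sum(p for (_, yy), p in joint.items() if yy == y) for y in (0, 1, 2)}

x, y = 1, 2
p_x_given_y = joint[(x, y)] / p_y[y]  # conditional distribution
p_y_given_x = joint[(x, y)] / p_x[x]

print(p_x_given_y, p_y_given_x * p_x[x] / p_y[y])         # equal: Bayes' rule
print(abs(joint[(x, y)] - p_x_given_y * p_y[y]) < 1e-12)  # multiplication rule
```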
* What is the relationship between events and random variables (rv’s)?
-- An rv induces events: each statement about an rv, e.g., {X = a}, is an event, namely the set of outcomes that the rv maps to that value.
* What is the difference between ordinary variables (ov’s) and random variables (rv’s)?
-- rv’s are associated with probability distributions, while ov’s are not.
A.2.5 Expectation
Expectation: E[X] = Σ_x x p(x) for a discrete rv, or E[X] = ∫ x f(x) dx for a continuous rv.
Example: for the 3-toss coin example with a fair coin, E[X] = 3(1/8) + 1(3/8) + (−1)(3/8) + (−3)(1/8) = 0.
Properties of expectation:
i) E[aX + b] = a E[X] + b;
ii) E[X + Y] = E[X] + E[Y];
iii) if X and Y are independent, E[XY] = E[X] E[Y].
Proof: (Assignment)
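A Monte Carlo sketch of these properties (assuming numpy is available; sample averages only approximate the exact expectations):

```python
# Check linearity of expectation and the independence product rule by simulation.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 2.0, size=200_000)       # E[X] = 1
Y = rng.exponential(scale=0.5, size=200_000)  # E[Y] = 0.5, independent of X

print(np.mean(3 * X + 1), 3 * np.mean(X) + 1)   # E[aX + b] = a E[X] + b
print(np.mean(X + Y), np.mean(X) + np.mean(Y))  # E[X + Y] = E[X] + E[Y]
print(np.mean(X * Y), np.mean(X) * np.mean(Y))  # independence: E[XY] = E[X] E[Y]
```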
A.2.6 Variance
Variance: Var(X) = E[(X − E[X])^2] = E[X^2] − (E[X])^2; σ = √Var(X) is the standard deviation.
• Properties of variance:
i) Var(aX + b) = a^2 Var(X);
ii) if X and Y are independent, Var(X + Y) = Var(X) + Var(Y).
Covariance: Cov(X, Y) = E[(X − E[X])(Y − E[Y])] = E[XY] − E[X] E[Y].
(Assignment)
vi) If X_1, ..., X_n are independent, the covariance of any two of them is zero: Cov(X_i, X_j) = 0 for i ≠ j.
Correlation: ρ(X, Y) = Cov(X, Y) / (σ_X σ_Y), with −1 ≤ ρ(X, Y) ≤ 1.
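A sketch of covariance and correlation with numpy (the construction of Y from X is an illustrative assumption):

```python
# Sample covariance and correlation; Y depends on X, Z is independent of X.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=100_000)
Y = 2.0 * X + rng.normal(size=100_000)  # Cov(X, Y) = 2, rho = 2 / sqrt(5)
Z = rng.normal(size=100_000)            # independent of X: Cov(X, Z) ~ 0

print(np.cov(X, Y)[0, 1], np.corrcoef(X, Y)[0, 1])  # ~2.0, ~0.894
print(np.cov(X, Z)[0, 1])                           # ~0.0
```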
A.3 Special Random Variables
A.3.1 Discrete Distributions
Bernoulli distribution
• Random variable X takes the value 0 or 1, e.g., coin tossing. Let p be the probability that X = 1.
• Bernoulli probability: P(X = x) = p^x (1 − p)^(1 − x), x ∈ {0, 1}.
Binomial distribution
-- N identical independent Bernoulli trials
• Random variable X represents the number of 1s.
• Binomial probability: P(X = k) = (N choose k) p^k (1 − p)^(N − k), k = 0, 1, ..., N.
Multinomial distribution
-- N iid trials, each of which takes one of K states
Multinomial probability: P(N_1 = n_1, ..., N_K = n_K) = [N! / (n_1! ... n_K!)] p_1^(n_1) ... p_K^(n_K), where N_k is the number of trials resulting in state k, p_k is the probability of state k, and Σ_k n_k = N.
Geometric distribution
-- rv X represents the number of Bernoulli tosses needed for a head to come up for the first time
Geometric probability: P(X = k) = (1 − p)^(k − 1) p, k = 1, 2, ...
Poisson distribution
-- rv X represents a count, e.g., i) the number of typos in a book with a total of n words, ii) the number of cars involved in accidents in a city
Poisson probability: P(X = k) = e^(−λ) λ^k / k!, k = 0, 1, 2, ..., where λ is the expected number of occurrences.
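A sampling sketch of these discrete distributions with numpy, comparing sample means to the textbook values (all parameter values are assumed for illustration):

```python
# Bernoulli, binomial, geometric, and Poisson samples and their means.
import numpy as np

rng = np.random.default_rng(2)
p, N, lam, n = 0.3, 10, 4.0, 200_000

bern = rng.binomial(1, p, size=n)   # Bernoulli(p)
binom = rng.binomial(N, p, size=n)  # Binomial(N, p)
geom = rng.geometric(p, size=n)     # Geometric(p): trials until first success
pois = rng.poisson(lam, size=n)     # Poisson(lambda)

print(bern.mean(), p)       # ~p
print(binom.mean(), N * p)  # ~Np
print(geom.mean(), 1 / p)   # ~1/p
print(pois.mean(), lam)     # ~lambda
```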
A.3.2 Continuous Distributions
Uniform distribution:
-- X takes values uniformly over the interval [a, b]
Uniform density: f(x) = 1 / (b − a) for a ≤ x ≤ b, and 0 otherwise.
Normal (Gaussian) distribution:
Normal density: f(x) = (1 / (√(2π) σ)) exp(−(x − μ)^2 / (2σ^2)), with mean μ and variance σ^2.
Exponential distribution:
Exponential density: f(x) = λ e^(−λx) for x ≥ 0, and 0 otherwise.
Rayleigh distribution:
Rayleigh density: f(x) = (x / σ^2) exp(−x^2 / (2σ^2)) for x ≥ 0, and 0 otherwise.
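A sampling sketch of these continuous distributions with numpy, checking the sample means against the known values (uniform: (a + b)/2; normal: μ; exponential: 1/λ; Rayleigh: σ√(π/2)); all parameter values are assumed for illustration:

```python
# Draw from the uniform, normal, exponential, and Rayleigh distributions.
import numpy as np

rng = np.random.default_rng(3)
n = 200_000
a, b, mu, sigma, lam, s = 0.0, 2.0, 1.0, 0.5, 2.0, 1.5

print(rng.uniform(a, b, n).mean(), (a + b) / 2)
print(rng.normal(mu, sigma, n).mean(), mu)
print(rng.exponential(1 / lam, n).mean(), 1 / lam)  # numpy takes scale = 1/lambda
print(rng.rayleigh(s, n).mean(), s * np.sqrt(np.pi / 2))
```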
A.3.3 Hybrid Distributions
Theorem 1: Let f be a differentiable, strictly increasing or strictly decreasing function defined on an interval I. Let X be a continuous random variable having density p_X, and let Y = f(X) have density p_Y. Then
p_Y(y) = p_X(f^(−1)(y)) |d f^(−1)(y) / dy|.
Proof: Let F_X and F_Y be the CDFs of the random variables X and Y. If f is strictly increasing, then F_Y(y) = P(f(X) ≤ y) = P(X ≤ f^(−1)(y)) = F_X(f^(−1)(y)); differentiating with respect to y gives p_Y(y) = p_X(f^(−1)(y)) d f^(−1)(y)/dy. If f is strictly decreasing, the inequality reverses and a minus sign appears, which the absolute value accounts for.
Example: Z-normalization. If X has the normal density with mean μ and variance σ^2, then Z = (X − μ) / σ has the standard normal density N(0, 1).
Proof: Apply Theorem 1 with f(x) = (x − μ) / σ, so f^(−1)(z) = σz + μ and |d f^(−1)(z)/dz| = σ. Then p_Z(z) = σ p_X(σz + μ) = (1 / √(2π)) exp(−z^2 / 2).
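A numeric sketch of this example (assuming numpy): standardizing normal samples should give approximately zero mean, unit variance, and a histogram close to the standard normal density.

```python
# Z-normalization of N(mu, sigma^2) samples.
import numpy as np

rng = np.random.default_rng(4)
mu, sigma = 5.0, 2.0  # assumed parameters, for illustration
X = rng.normal(mu, sigma, size=200_000)

Z = (X - mu) / sigma
print(Z.mean(), Z.std())  # ~0, ~1

# Compare the histogram of Z with the standard normal density.
hist, edges = np.histogram(Z, bins=50, range=(-4, 4), density=True)
centers = (edges[:-1] + edges[1:]) / 2
std_normal = np.exp(-centers**2 / 2) / np.sqrt(2 * np.pi)
print(np.max(np.abs(hist - std_normal)))  # small discrepancy
```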
Theorem 2: Let X_1, ..., X_n be independent random variables having the respective normal densities N(μ_i, σ_i^2). Then X = X_1 + ... + X_n has the normal density N(Σ_i μ_i, Σ_i σ_i^2).
Proof: Assume first that n = 2. Since X_1 and X_2 are independent, the density of X_1 + X_2 is the convolution of their densities, which evaluates to the normal density with mean μ_1 + μ_2 and variance σ_1^2 + σ_2^2; the general case follows by induction.
Example: if X_1 ~ N(0, 1) and X_2 ~ N(1, 4) are independent, then X_1 + X_2 has the normal density N(1, 5).
Theorem 3: Central Limit Theorem. Let X_1, ..., X_n be iid random variables with mean μ and variance σ^2. Then, as n → ∞, the distribution of (X_1 + ... + X_n − nμ) / (σ√n) approaches the standard normal distribution N(0, 1).
Example:
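As a stand-in illustration (a simulation sketch assuming numpy, not the slide's original example): standardized sums of iid uniform variables, which have mean 1/2 and variance 1/12, look approximately standard normal for moderately large n.

```python
# Central Limit Theorem by simulation with uniform(0, 1) summands.
import numpy as np

rng = np.random.default_rng(5)
n, trials = 50, 100_000
mu, sigma = 0.5, np.sqrt(1 / 12)

sums = rng.uniform(0, 1, size=(trials, n)).sum(axis=1)
standardized = (sums - n * mu) / (sigma * np.sqrt(n))

print(standardized.mean(), standardized.std())  # ~0, ~1
print(np.mean(standardized <= 1.0))             # ~0.841 = Phi(1)
```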
Chi-square distribution
-- If Z_1, ..., Z_k are independent standard normal random variables, then X = Z_1^2 + ... + Z_k^2 has the chi-square distribution with k degrees of freedom.
Chi-square density: f(x) = x^(k/2 − 1) e^(−x/2) / (2^(k/2) Γ(k/2)) for x > 0.
Properties: E[X] = k, Var(X) = 2k.
Gamma density: f(x) = λ^α x^(α − 1) e^(−λx) / Γ(α) for x > 0, with shape α and rate λ.
The chi-square density with k degrees of freedom is the gamma density with α = k/2 and λ = 1/2.
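A sampling sketch of this relationship (assuming numpy; note that numpy's gamma generator is parameterized by scale = 1/λ):

```python
# A sum of k squared standard normals matches Gamma(k/2, rate 1/2), i.e. chi-square(k).
import numpy as np

rng = np.random.default_rng(6)
k, n = 5, 200_000

chi2 = (rng.normal(size=(n, k)) ** 2).sum(axis=1)  # sum of k squared N(0, 1)
gamma = rng.gamma(shape=k / 2, scale=2.0, size=n)  # Gamma(k/2, rate 1/2)

print(chi2.mean(), chi2.var(), k, 2 * k)  # mean ~k, variance ~2k
print(gamma.mean(), gamma.var())          # matches the chi-square moments
```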
Erlang density: the gamma density with an integer shape parameter α = m, f(x) = λ^m x^(m − 1) e^(−λx) / (m − 1)! for x > 0.
Beta density: f(x) = x^(α − 1) (1 − x)^(β − 1) / B(α, β) for 0 ≤ x ≤ 1, where B(α, β) = Γ(α) Γ(β) / Γ(α + β).
Dirichlet density: the multivariate generalization of the beta density, f(x_1, ..., x_K) = [Γ(Σ_k α_k) / Π_k Γ(α_k)] Π_k x_k^(α_k − 1), with x_k ≥ 0 and Σ_k x_k = 1.
t density: f(x) = [Γ((ν + 1)/2) / (√(νπ) Γ(ν/2))] (1 + x^2/ν)^(−(ν + 1)/2), with ν degrees of freedom.
F density: the density of (X_1/d_1) / (X_2/d_2), where X_1 and X_2 are independent chi-square random variables with d_1 and d_2 degrees of freedom; f(x) = [1 / B(d_1/2, d_2/2)] (d_1/d_2)^(d_1/2) x^(d_1/2 − 1) (1 + d_1 x/d_2)^(−(d_1 + d_2)/2) for x > 0.
Summary
• Probability Model
• Properties of probability laws