Stats 241.3 Probability Theory Summary
Axioms of Probability. A probability measure P is defined on S by defining, for each event E, a value P[E] with the following properties: • P[E] ≥ 0, for each E. • P[S] = 1. • If A1, A2, … are mutually exclusive events (Ai ∩ Aj = ∅ for i ≠ j), then P[A1 ∪ A2 ∪ …] = P[A1] + P[A2] + …
Finite uniform probability space. Many examples fall into this category: • a finite number of outcomes, • all outcomes equally likely. In this case P[E] = n(E)/n(S), so to handle these problems we have to be able to count: count n(E) and n(S).
Basic Rule of counting. Suppose we carry out k operations in sequence. Let n1 = the number of ways the first operation can be performed, and ni = the number of ways the ith operation can be performed once the first (i − 1) operations have been completed, i = 2, 3, …, k. Then N = n1 n2 … nk = the number of ways the k operations can be performed in sequence.
Basic Counting Formulae • Permutations: How many ways can you order n objects? n! • Permutations of size k (≤ n): How many ways can you choose k objects from n objects in a specific order? n!/(n − k)! = n(n − 1)…(n − k + 1)
Combinations of size k (≤ n): A combination of size k chosen from n objects is a subset of size k where the order of selection is irrelevant. The number of ways you can choose a combination of size k from n objects (order of selection irrelevant) is C(n, k) = n!/(k!(n − k)!).
Important Notes • In combinations, ordering is irrelevant. Different orderings result in the same combination. • In permutations, order is relevant. Different orderings result in different permutations.
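These counting formulas can be checked directly with Python's math module (available since Python 3.8), which has perm and comb built in; a minimal sketch:

```python
import math

n, k = 5, 3

# Permutations of all n objects: n!
assert math.factorial(n) == 120

# Permutations of size k from n (ordered): n! / (n - k)! = 5 * 4 * 3
perms = math.perm(n, k)

# Combinations of size k from n (unordered): n! / (k! (n - k)!)
combs = math.comb(n, k)

# Each combination of size k corresponds to k! orderings,
# which is exactly why order matters for one count and not the other.
assert perms == combs * math.factorial(k)

print(perms, combs)  # 60 10
```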
The additive rule P[A ∪ B] = P[A] + P[B] − P[A ∩ B], and if A ∩ B = ∅ then P[A ∪ B] = P[A] + P[B].
The additive rule for more than two events P[A1 ∪ A2 ∪ … ∪ Ak] = Σi P[Ai] − Σi<j P[Ai ∩ Aj] + Σi<j<l P[Ai ∩ Aj ∩ Al] − … + (−1)^(k+1) P[A1 ∩ A2 ∩ … ∩ Ak], and if Ai ∩ Aj = ∅ for all i ≠ j, then P[A1 ∪ A2 ∪ … ∪ Ak] = P[A1] + P[A2] + … + P[Ak].
The Rule for complements: for any event E, P[Ē] = 1 − P[E].
Conditional Probability, Independence and The Multiplicative Rule. The conditional probability of A given B is P[A | B] = P[A ∩ B]/P[B], provided P[B] > 0.
The multiplicative rule of probability P[A ∩ B] = P[A] P[B | A] = P[B] P[A | B], and if A and B are independent, P[A ∩ B] = P[A] P[B]. This is the definition of independence.
Definition: The set of k events A1, A2, …, Ak are called mutually independent if: P[Ai1 ∩ Ai2 ∩ … ∩ Aim] = P[Ai1] P[Ai2] … P[Aim] for every subset {i1, i2, …, im} of {1, 2, …, k}. i.e. for k = 3, A1, A2, A3 are mutually independent if: P[A1 ∩ A2] = P[A1] P[A2], P[A1 ∩ A3] = P[A1] P[A3], P[A2 ∩ A3] = P[A2] P[A3], and P[A1 ∩ A2 ∩ A3] = P[A1] P[A2] P[A3].
Definition: The set of k events A1, A2, …, Ak are called pairwise independent if: P[Ai ∩ Aj] = P[Ai] P[Aj] for all i and j. i.e. for k = 3, A1, A2, A3 are pairwise independent if: P[A1 ∩ A2] = P[A1] P[A2], P[A1 ∩ A3] = P[A1] P[A3], P[A2 ∩ A3] = P[A2] P[A3]. It is not necessarily true that P[A1 ∩ A2 ∩ A3] = P[A1] P[A2] P[A3].
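The classical example of events that are pairwise but not mutually independent uses two fair coin tosses. The enumeration below (a sketch using only the standard library) checks it exactly with Fraction arithmetic:

```python
from fractions import Fraction
from itertools import product

# Sample space: two fair coin tosses; each of the 4 outcomes has probability 1/4.
S = list(product("HT", repeat=2))

def P(event):
    """Probability of an event (a predicate on outcomes) under the uniform measure."""
    return Fraction(sum(1 for w in S if event(w)), len(S))

def A1(w): return w[0] == "H"      # first toss is heads
def A2(w): return w[1] == "H"      # second toss is heads
def A3(w): return w[0] == w[1]     # both tosses show the same face

# Pairwise independent: every pair factors.
assert P(lambda w: A1(w) and A2(w)) == P(A1) * P(A2)
assert P(lambda w: A1(w) and A3(w)) == P(A1) * P(A3)
assert P(lambda w: A2(w) and A3(w)) == P(A2) * P(A3)

# Not mutually independent: the triple intersection fails to factor.
assert P(lambda w: A1(w) and A2(w) and A3(w)) == Fraction(1, 4)
assert P(A1) * P(A2) * P(A3) == Fraction(1, 8)
```

Knowing any two of these events determines the third, which is exactly the dependence that the pairwise checks cannot see.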
A generalization of Bayes Rule. Let A1, A2, …, Ak denote a set of events such that Ai ∩ Aj = ∅ for all i ≠ j and A1 ∪ A2 ∪ … ∪ Ak = S (the Ai partition S). Then for any event B with P[B] > 0, P[Ai | B] = P[Ai] P[B | Ai] / (P[A1] P[B | A1] + P[A2] P[B | A2] + … + P[Ak] P[B | Ak]).
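A small numeric sketch of this rule with a two-event partition. The priors and conditional probabilities below are made-up illustrative values, not from the course:

```python
from fractions import Fraction

# Hypothetical partition of S: A1 and A2, with A1 ∪ A2 = S, A1 ∩ A2 = ∅.
prior = {"A1": Fraction(1, 100), "A2": Fraction(99, 100)}

# Hypothetical conditional probabilities P[B | Ai] for some observed event B.
likelihood = {"A1": Fraction(95, 100), "A2": Fraction(5, 100)}

# Denominator: P[B] by summing P[Ai] P[B | Ai] over the partition.
p_b = sum(prior[a] * likelihood[a] for a in prior)

# Bayes rule: P[A1 | B] = P[A1] P[B | A1] / P[B]
posterior_a1 = prior["A1"] * likelihood["A1"] / p_b
print(posterior_a1)  # 19/118
```

Note how a rare event A1 remains fairly unlikely even after observing evidence B that strongly favors it; the small prior dominates.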
Random Variables: an important concept in probability
A random variable, X, is a numerical quantity whose value is determined by a random experiment.
Definition – The probability function, p(x), of a random variable, X. For any random variable, X, and any real number, x, we define p(x) = P[X = x], where {X = x} = the set of all outcomes (event) with X = x. For continuous random variables p(x) = 0 for all values of x.
Definition – The cumulative distribution function, F(x), of a random variable, X. For any random variable, X, and any real number, x, we define F(x) = P[X ≤ x], where {X ≤ x} = the set of all outcomes (event) with X ≤ x.
Discrete Random Variables. For a discrete random variable X the probability distribution is described by the probability function p(x), which has the following properties: • p(x) ≥ 0 for all x. • Σx p(x) = 1, summing over all values x that X can take. • P[X ∈ A] = Σx∈A p(x).
Graph: the probability function p(x) of a discrete random variable.
Continuous random variables. For a continuous random variable X the probability distribution is described by the probability density function f(x), which has the following properties: • f(x) ≥ 0. • ∫−∞^∞ f(x) dx = 1. • P[a ≤ X ≤ b] = ∫a^b f(x) dx, the area under the graph of f between a and b.
Graph: the probability density function f(x) of a continuous random variable.
The distribution function F(x). This is defined for any random variable, X: F(x) = P[X ≤ x]. Properties • F(−∞) = 0 and F(∞) = 1. • F(x) is non-decreasing (i.e. if x1 < x2 then F(x1) ≤ F(x2)). • F(b) − F(a) = P[a < X ≤ b].
Here p(x) = P[X = x] = F(x) − F(x−), where F(x−) is the limit of F from the left at x. • If p(x) = 0 for all x (i.e. X is continuous) then F(x) is continuous.
For Discrete Random Variables, F(x) is a non-decreasing step function with F(x) = Σu ≤ x p(u); the jump in F at x has height p(x).
For Continuous Random Variables, F(x) is a non-decreasing continuous function with F(x) = ∫−∞^x f(u) du. To find the probability density function, f(x), one first finds F(x), then f(x) = F′(x), the slope of F at x.
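The relationship f(x) = F′(x) can be sketched numerically. Here F is the cdf of a Uniform(0, 2) variable (chosen only as an example) and f is recovered as the slope of F via a central difference:

```python
def F(x):
    """cdf of a Uniform(0, 2) random variable."""
    return 0.0 if x < 0 else (1.0 if x > 2 else x / 2)

def f(x, h=1e-6):
    """Approximate f(x) = F'(x) by a central difference."""
    return (F(x + h) - F(x - h)) / (2 * h)

# Inside (0, 2) the density is the constant 1/(b - a) = 1/2; outside it is 0.
assert abs(f(1.0) - 0.5) < 1e-6
assert abs(f(3.0)) < 1e-6
```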
Suppose that we have an experiment that has two outcomes: Success (S) and Failure (F). These terms are used in reliability testing. Suppose that p is the probability of success (S) and q = 1 − p is the probability of failure (F). This experiment is sometimes called a Bernoulli Trial. Let X = 1 if the outcome is S and X = 0 if the outcome is F. Then P[X = 1] = p and P[X = 0] = q.
The probability distribution with probability function p(1) = p, p(0) = q = 1 − p is called the Bernoulli distribution.
We observe a Bernoulli trial (S, F) n times. Let X denote the number of successes in the n trials. Then X has a binomial distribution, i.e. p(x) = P[X = x] = C(n, x) p^x q^(n − x), x = 0, 1, …, n, where • p = the probability of success (S), and • q = 1 − p = the probability of failure (F).
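A minimal sketch of the binomial probability function, checking that it sums to 1 over x = 0, …, n as any probability function must:

```python
import math

def binom_pmf(x, n, p):
    """P[X = x] = C(n, x) p^x (1 - p)^(n - x), x = 0, 1, ..., n."""
    return math.comb(n, x) * p**x * (1 - p)**(n - x)

n, p = 10, 0.3
assert abs(sum(binom_pmf(x, n, p) for x in range(n + 1)) - 1.0) < 1e-12
```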
The Poisson distribution • Suppose events are occurring randomly and uniformly in time. • Let X be the number of events occurring in a fixed period of time. Then X will have a Poisson distribution with parameter λ: p(x) = P[X = x] = (λ^x / x!) e^(−λ), x = 0, 1, 2, …
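Similarly for the Poisson probability function. The check below also confirms numerically that the mean of the distribution equals λ, truncating the infinite sums at 60 terms, which is safe for λ = 3:

```python
import math

def poisson_pmf(x, lam):
    """P[X = x] = lam^x e^(-lam) / x!, x = 0, 1, 2, ..."""
    return lam**x * math.exp(-lam) / math.factorial(x)

lam = 3.0
probs = [poisson_pmf(x, lam) for x in range(60)]
assert abs(sum(probs) - 1.0) < 1e-12                         # sums to 1
assert abs(sum(x * p for x, p in enumerate(probs)) - lam) < 1e-9  # mean is lam
```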
The Geometric distribution Suppose a Bernoulli trial (S, F) is repeated until a success occurs. Let X = the trial on which the first success (S) occurs. The probability function of X is: p(x) = P[X = x] = (1 − p)^(x − 1) p = q^(x − 1) p, x = 1, 2, 3, …
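For the geometric distribution, summing p(x) over the first m trials should give 1 − q^m, since P[X > m] = q^m (no success in any of the first m trials). A quick sketch:

```python
def geom_pmf(x, p):
    """P[X = x] = (1 - p)^(x - 1) * p, x = 1, 2, 3, ..."""
    return (1 - p)**(x - 1) * p

p, m = 0.25, 20
partial = sum(geom_pmf(x, p) for x in range(1, m + 1))
assert abs(partial - (1 - (1 - p)**m)) < 1e-12  # P[X <= m] = 1 - q^m
```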
The Negative Binomial distribution Suppose a Bernoulli trial (S, F) is repeated until k successes occur. Let X = the trial on which the kth success (S) occurs. The probability function of X is: p(x) = P[X = x] = C(x − 1, k − 1) p^k q^(x − k), x = k, k + 1, k + 2, …
The Hypergeometric distribution Suppose we have a population containing N objects. Suppose the elements of the population are partitioned into two groups. Let a = the number of elements in group A and let b = the number of elements in the other group (group B). Note N = a + b. Now suppose that n elements are selected from the population at random. Let X denote the number of selected elements that come from group A. The probability distribution of X is p(x) = P[X = x] = C(a, x) C(b, n − x) / C(N, n).
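A sketch of the hypergeometric probability function. The assertions check that it sums to 1 over the feasible values of x (which is Vandermonde's identity) and that the mean comes out to n·a/N:

```python
import math

def hypergeom_pmf(x, a, b, n):
    """P[X = x] = C(a, x) C(b, n - x) / C(a + b, n)."""
    return math.comb(a, x) * math.comb(b, n - x) / math.comb(a + b, n)

a, b, n = 5, 7, 4
# x can range only over values where both binomial coefficients are nonzero.
xs = range(max(0, n - b), min(n, a) + 1)
probs = {x: hypergeom_pmf(x, a, b, n) for x in xs}

assert abs(sum(probs.values()) - 1.0) < 1e-12
assert abs(sum(x * p for x, p in probs.items()) - n * a / (a + b)) < 1e-9
```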
Continuous Distributions. The Uniform distribution from a to b: f(x) = 1/(b − a) for a ≤ x ≤ b, and f(x) = 0 otherwise.