Computing probabilities by conditioning Let E denote some event. Define a random variable X by
X = 1 if E occurs, X = 0 if E does not occur.
Then P(E) = E[X], and conditioning on any random variable Y gives
P(E) = E[E[X|Y]] = Σy P(E|Y=y) P(Y=y) (Y discrete)
P(E) = ∫ P(E|Y=y) fY(y) dy (Y continuous)
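As a quick sanity check of the conditioning formula, here is a small sketch (the die-roll setup and the success probabilities y/10 are illustrative, not from the slides): we compute P(E) = Σy P(E|Y=y) P(Y=y) exactly and compare it with a Monte Carlo estimate.

```python
import random

random.seed(0)

# Condition on Y, a fair die roll; given Y = y, event E occurs with
# probability y/10 (illustrative numbers).  Exact value by conditioning:
# P(E) = sum_y P(E | Y = y) P(Y = y).
p_exact = sum((y / 10) * (1 / 6) for y in range(1, 7))  # = 0.35

# Monte Carlo check of the same probability.
n = 100_000
hits = 0
for _ in range(n):
    y = random.randint(1, 6)          # draw Y
    if random.random() < y / 10:      # does E occur given Y = y?
        hits += 1
p_mc = hits / n

print(p_exact, p_mc)   # the two values agree to about two decimals
```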
Example 1: Let X and Y be two independent continuous random variables with densities fX and fY. What is P(X<Y)?
Conditioning on Y:
P(X<Y) = ∫ P(X<Y | Y=y) fY(y) dy = ∫ P(X<y) fY(y) dy = ∫ FX(y) fY(y) dy
(independence lets us drop the conditioning in the middle step).
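The formula P(X<Y) = ∫ FX(y) fY(y) dy can be checked numerically. In this sketch the choice X ~ Exp(1), Y ~ Exp(2) is an assumption made for illustration, since then the closed form P(X<Y) = λX/(λX+λY) = 1/3 is available for comparison.

```python
import math

# Check P(X < Y) = ∫ F_X(y) f_Y(y) dy for X ~ Exp(1), Y ~ Exp(2);
# the known closed form is lam_x / (lam_x + lam_y) = 1/3.
lam_x, lam_y = 1.0, 2.0
F_X = lambda y: 1 - math.exp(-lam_x * y)          # CDF of X
f_Y = lambda y: lam_y * math.exp(-lam_y * y)      # density of Y

# Trapezoidal integration over [0, 40]; the tail beyond 40 is negligible.
n, b = 200_000, 40.0
h = b / n
integral = sum(F_X(i * h) * f_Y(i * h) for i in range(1, n)) * h
integral += 0.5 * h * (F_X(0) * f_Y(0) + F_X(b) * f_Y(b))

print(integral)   # ≈ 1/3
```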
Example 2: Let X and Y be two independent continuous random variables with densities fX and fY. What is the distribution of X+Y?
Conditioning on Y:
FX+Y(a) = P(X+Y ≤ a) = ∫ P(X+Y ≤ a | Y=y) fY(y) dy = ∫ FX(a−y) fY(y) dy
Differentiating gives the convolution formula fX+Y(a) = ∫ fX(a−y) fY(y) dy.
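The convolution formula can also be verified numerically. As an illustrative assumption (not from the slides), take X, Y ~ Uniform(0, 1): the sum then has the triangular density f(a) = a on [0, 1] and f(a) = 2 − a on [1, 2], which the numerical convolution should reproduce.

```python
# f_{X+Y}(a) = ∫ f_X(a - y) f_Y(y) dy, evaluated by midpoint integration
# for X, Y ~ Uniform(0, 1); the result is the triangular density.
f = lambda x: 1.0 if 0.0 <= x <= 1.0 else 0.0   # Uniform(0,1) density

def conv_density(a, n=20_000):
    # integrate f_X(a - y) f_Y(y) over y in [0, 1] by the midpoint rule
    h = 1.0 / n
    return sum(f(a - (i + 0.5) * h) * f((i + 0.5) * h) for i in range(n)) * h

for a in (0.5, 1.0, 1.5):
    print(a, conv_density(a))   # ≈ 0.5, 1.0, 0.5 (triangular shape)
```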
Example 3 (thinning of a Poisson): Suppose X ~ Poisson(λ) and {Ui} are i.i.d. Bernoulli(p), independent of X. Conditioning on X shows that the thinned count U1 + ... + UX is Poisson(λp).
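A simulation makes the thinning result concrete. This is only a sketch with illustrative parameters (λ = 4, p = 0.3): the kept count should behave like Poisson(λp) = Poisson(1.2), so its sample mean and variance should both be near 1.2.

```python
import math
import random

random.seed(1)

# Thinning: X ~ Poisson(lam); each of the X items is kept independently
# with probability p.  The kept count U_1 + ... + U_X is Poisson(lam * p).
lam, p, n = 4.0, 0.3, 50_000

def poisson(lam):
    # Knuth's method: multiply uniforms until the product drops below e^{-lam}
    L, k, prod = math.exp(-lam), 0, 1.0
    while True:
        prod *= random.random()
        if prod <= L:
            return k
        k += 1

kept = []
for _ in range(n):
    x = poisson(lam)
    kept.append(sum(1 for _ in range(x) if random.random() < p))

mean = sum(kept) / n
var = sum((k - mean) ** 2 for k in kept) / n
print(mean, var)   # both ≈ lam * p = 1.2, as a Poisson's mean equals its variance
```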
Stochastic Processes • A stochastic process {X(t), t ∈ T} is a collection of random variables • For each value of t, there is a corresponding random variable X(t) (the state of the system at time t) • When t takes on discrete values (e.g., t = 1, 2, ...), we have a discrete-time stochastic process (the notation Xn, n = 1, 2, ..., is often used instead) • When t takes on continuous values, we have a continuous-time stochastic process
Example 1: X(t) is the number of customers waiting in line at time t to check their luggage at an airline counter (a continuous-time stochastic process) • Example 2: Xn is the number of laptops a computer store sells in week n. • Example 3: Xn = 1 if it rains on the nth day of the month and Xn = 0 otherwise.
Example 4: A gambler wins $1 with probability p and loses $1 with probability 1−p. She starts with $N and quits if she reaches either $M or $0. Xn is the amount of money the gambler has after playing n rounds.
• P(Xn=i+1 | Xn−1=i, Xn−2=i−1, ..., X0=N) = P(Xn=i+1 | Xn−1=i) = p (i ≠ 0, M)
• P(Xn=i−1 | Xn−1=i, Xn−2=i−1, ..., X0=N) = P(Xn=i−1 | Xn−1=i) = 1−p (i ≠ 0, M)
Writing Pi,i+1 = P(Xn=i+1 | Xn−1=i) and Pi,i−1 = P(Xn=i−1 | Xn−1=i):
• Pi,i+1 = p and Pi,i−1 = 1−p for i ≠ 0, M
• P0,0 = 1 and PM,M = 1 (0 and M are called absorbing states)
• Pi,j = 0 otherwise
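The gambler's-ruin chain is easy to simulate. In this sketch the values N = 3, M = 10, p = 1/2 are illustrative; for the fair game p = 1/2, the probability of reaching $M before $0 is known to be N/M, which the simulation should approximate.

```python
import random

random.seed(2)

# Gambler's ruin: start with $N, win $1 w.p. p, lose $1 w.p. 1-p,
# stop on hitting the absorbing states $0 or $M.
# For p = 1/2 the probability of reaching M before 0 is N / M.
N, M, p, trials = 3, 10, 0.5, 40_000

wins = 0
for _ in range(trials):
    x = N
    while 0 < x < M:                 # play until absorbed at 0 or M
        x += 1 if random.random() < p else -1
    wins += (x == M)

print(wins / trials)   # ≈ N / M = 0.3
```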
Markov Chains • {Xn: n = 0, 1, 2, ...} is a discrete-time stochastic process • If Xn = i, the process is said to be in state i at time n • {i: i = 0, 1, 2, ...} is the state space • If P(Xn+1=j | Xn=i, Xn−1=in−1, ..., X0=i0) = P(Xn+1=j | Xn=i) = Pij, the process is said to be a Discrete-Time Markov Chain (DTMC) • Pij is the transition probability from state i to state j
Example 1: The probability that it will rain tomorrow depends only on whether it rains today:
P(rain tomorrow | rain today) = a
P(rain tomorrow | no rain today) = b
State 0 = rain, state 1 = no rain
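For this two-state chain, repeatedly multiplying the transition matrix shows the rows converging to the stationary distribution, whose rain probability works out to b / (1 − a + b). The sketch below uses illustrative values a = 0.7, b = 0.4 (they are not given in the slides).

```python
# Two-state rain chain: state 0 = rain, state 1 = no rain, with
# P(rain tomorrow | rain today) = a, P(rain tomorrow | no rain today) = b.
a, b = 0.7, 0.4                      # illustrative values
P = [[a, 1 - a],
     [b, 1 - b]]

def matmul(A, B):
    # 2x2 matrix product
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

Pn = P
for _ in range(50):                  # P^51: rows approach the stationary dist.
    Pn = matmul(Pn, P)

pi_rain = b / (1 - a + b)            # stationary probability of rain
print(Pn[0][0], pi_rain)             # both ≈ 0.5714...
```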
Example 2 (random walk): A Markov chain whose state space is the set of integers 0, ±1, ±2, ..., with Pi,i+1 = p = 1 − Pi,i−1 for all i and some 0 < p < 1, is said to be a random walk.
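A short simulation illustrates the drift of a random walk: each step has mean 2p − 1, so after n steps from 0 the expected position is n(2p − 1). The parameters below (p = 0.6, 100 steps) are illustrative assumptions.

```python
import random

random.seed(3)

# Simple random walk on the integers: P_{i,i+1} = p, P_{i,i-1} = 1 - p.
# After n steps from 0, the expected position is n * (2p - 1).
p, n_steps, trials = 0.6, 100, 20_000

total = 0
for _ in range(trials):
    x = 0
    for _ in range(n_steps):
        x += 1 if random.random() < p else -1   # one ±1 step
    total += x

print(total / trials)   # ≈ n_steps * (2p - 1) = 20
```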
Defining a DTMC To define a DTMC, we need to • Specify the states • Verify the Markov property • Obtain the (stationary) transition probability matrix P