Assignment 3 • Chapter 3: Problems 7, 11, 14 • Chapter 4: Problems 5, 6, 14 • Due date: Monday, March 15, 2004
Example: Inventory System. Inventory at a store is reviewed daily. If inventory drops below 3 units, an order is placed with the supplier, which is delivered the next day. The order size should bring the inventory position up to 6 units. Daily demand D is i.i.d. with distribution P(D = 0) = 1/3, P(D = 1) = 1/3, P(D = 2) = 1/3. Let Xn denote the inventory level on the nth day. Is the process {Xn} a Markov chain? Assume we start with 6 units.
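To make the dynamics concrete, here is a minimal simulation sketch of this inventory policy (the function and variable names are my own, not from the slides): if the level at the last review was below 3, the overnight order raises it to 6 before the day's demand occurs.

```python
import random

# Minimal sketch of the inventory example (illustrative names).
# State Xn = inventory level at the end of day n.

def next_inventory(x):
    # If yesterday's level was below 3, the order arrives overnight
    # and raises the level to 6 before today's demand occurs.
    start_of_day = 6 if x < 3 else x
    demand = random.choice([0, 1, 2])   # D is i.i.d. with P(D = d) = 1/3
    return start_of_day - demand        # demand is at most 2, so level stays >= 1

x = 6   # start with 6 units
for day in range(1, 11):
    x = next_inventory(x)
    print(f"day {day}: inventory level {x}")
```

Since the next level depends only on the current level (plus fresh demand) and not on the earlier history, {Xn} is a Markov chain.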
Markov Chains • {Xn: n = 0, 1, 2, ...} is a discrete time stochastic process • If Xn = i, the process is said to be in state i at time n • {i: i = 0, 1, 2, ...} is the state space • If P(Xn+1 = j | Xn = i, Xn-1 = in-1, ..., X0 = i0) = P(Xn+1 = j | Xn = i) = Pij, the process is said to be a Discrete Time Markov Chain (DTMC) • Pij is the transition probability from state i to state j
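As a sketch of how a DTMC evolves, the following snippet simulates a chain from an arbitrary transition matrix P, where P[i][j] = Pij (function name and example matrix are illustrative, not from the slides):

```python
import random

# Sketch: simulate a DTMC from a transition matrix P, where
# P[i][j] = P(X_{n+1} = j | X_n = i). Each row of P must sum to 1.

def simulate_dtmc(P, x0, steps):
    path = [x0]
    for _ in range(steps):
        i = path[-1]
        # Draw the next state j with probability P[i][j].
        j = random.choices(range(len(P)), weights=P[i])[0]
        path.append(j)
    return path

# Example: a two-state chain.
print(simulate_dtmc([[0.7, 0.3], [0.4, 0.6]], 0, 10))
```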
Example 1: The probability that it will rain tomorrow depends only on whether it rains today or not: P(rain tomorrow | rain today) = a, P(rain tomorrow | no rain today) = b. State 0 = rain, State 1 = no rain.
Example 4: A gambler wins $1 with probability p and loses $1 with probability 1 - p. She starts with $N and quits if she reaches either $M or $0. Xn is the amount of money the gambler has after playing n rounds. • P(Xn = i+1 | Xn-1 = i, Xn-2 = in-2, ..., X0 = N) = P(Xn = i+1 | Xn-1 = i) = p (i ≠ 0, M) • P(Xn = i-1 | Xn-1 = i, Xn-2 = in-2, ..., X0 = N) = P(Xn = i-1 | Xn-1 = i) = 1 - p (i ≠ 0, M) • Pi,i+1 = P(Xn = i+1 | Xn-1 = i); Pi,i-1 = P(Xn = i-1 | Xn-1 = i)
• Pi,i+1 = p; Pi,i-1 = 1 - p for i ≠ 0, M • P0,0 = 1; PM,M = 1 (0 and M are called absorbing states) • Pi,j = 0, otherwise
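A sketch of the full transition matrix for this chain, with states 0, 1, ..., M (the function name is my own):

```python
# Sketch: the (M+1) x (M+1) transition matrix for the gambler's ruin chain.

def gamblers_ruin_matrix(M, p):
    P = [[0.0] * (M + 1) for _ in range(M + 1)]
    P[0][0] = 1.0     # $0: absorbing (ruin)
    P[M][M] = 1.0     # $M: absorbing (goal reached)
    for i in range(1, M):
        P[i][i + 1] = p        # win $1
        P[i][i - 1] = 1 - p    # lose $1
    return P

# Every row sums to 1, as a transition matrix must.
print(gamblers_ruin_matrix(4, 0.5))
```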
Random walk: A Markov chain whose state space is the set of integers 0, ±1, ±2, ..., and whose transition probabilities satisfy Pi,i+1 = p = 1 - Pi,i-1 for every state i, where 0 < p < 1, is said to be a random walk.
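A short simulation sketch of one sample path (illustrative names; each step goes up with probability p and down with probability 1 - p):

```python
import random

# Sketch: one sample path of a random walk starting at x0.

def random_walk_path(p, steps, x0=0):
    x, path = x0, [x0]
    for _ in range(steps):
        x += 1 if random.random() < p else -1   # up w.p. p, down w.p. 1 - p
        path.append(x)
    return path

print(random_walk_path(0.5, 20))
```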
Example 1 (continued): The probability that it will rain tomorrow depends only on whether it rains today or not: P(rain tomorrow | rain today) = a, P(rain tomorrow | no rain today) = b. State 0 = rain, State 1 = no rain. What is the probability that it will rain four days from today, given that it is raining today? Let a = 0.7 and b = 0.4.
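By the Chapman-Kolmogorov equations, the four-day probability is the (0, 0) entry of the fourth power of the one-step transition matrix. A sketch of the computation using numpy:

```python
import numpy as np

# Rows and columns are ordered (state 0 = rain, state 1 = no rain).
P = np.array([[0.7, 0.3],    # from rain:    P(rain tomorrow) = a = 0.7
              [0.4, 0.6]])   # from no rain: P(rain tomorrow) = b = 0.4

P4 = np.linalg.matrix_power(P, 4)
print(P4[0, 0])   # P(rain four days from now | rain today) = 0.5749
```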
A Markov chain with transition probability matrix P is irreducible if all of its states communicate, i.e., every state can be reached from every other state.
Recurrent and transient states • fi: the probability that, starting in state i, the process will eventually re-enter state i • State i is recurrent if fi = 1 and transient if fi < 1
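For a transient state, fi < 1, and fi can be estimated by simulation. A rough Monte Carlo sketch for the random walk above, truncating each run at max_steps so the estimate undershoots slightly (names are my own):

```python
import random

# Sketch: Monte Carlo estimate of f_0 for a random walk with parameter p.
# Runs are truncated at max_steps, so the estimate is slightly low.

def estimate_return_prob(p, trials=20_000, max_steps=500):
    returns = 0
    for _ in range(trials):
        x = 1 if random.random() < p else -1     # first step away from 0
        for _ in range(max_steps):
            if x == 0:
                returns += 1
                break
            x += 1 if random.random() < p else -1
    return returns / trials

# Known result: for a simple random walk, f_0 = 1 - |2p - 1|,
# e.g. f_0 = 0.8 when p = 0.4, so state 0 is transient.
print(estimate_return_prob(0.4))
```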