Markov Processes and Birth-Death Processes



Presentation Transcript


  1. Markov Processes and Birth-Death Processes J. M. Akinpelu

  2. Exponential Distribution • Definition. A continuous random variable X has an exponential distribution with parameter $\lambda > 0$ if its probability density function is given by $f(x) = \lambda e^{-\lambda x}$ for $x \ge 0$ (and $f(x) = 0$ for $x < 0$). • Its distribution function is given by $F(x) = 1 - e^{-\lambda x}$ for $x \ge 0$.
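Not part of the original slides: a quick numerical check of these two formulas (a minimal sketch assuming NumPy and SciPy; the rate value lam = 1.5 is an arbitrary choice).

```python
import numpy as np
from scipy.integrate import cumulative_trapezoid

lam = 1.5                                  # arbitrary example rate lambda > 0
x = np.linspace(0.0, 10.0, 10001)
pdf = lam * np.exp(-lam * x)               # f(x) = lambda e^{-lambda x}, x >= 0

# Integrating the density numerically should reproduce the
# distribution function F(x) = 1 - e^{-lambda x}.
F_numeric = cumulative_trapezoid(pdf, x, initial=0.0)
F_exact = 1.0 - np.exp(-lam * x)
print(np.max(np.abs(F_numeric - F_exact)))  # small (discretization error only)
```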

  3. Exponential Distribution • Theorem 1. A continuous R.V. X is exponentially distributed if and only if, for $s, t \ge 0$, $P\{X > s + t \mid X > s\} = P\{X > t\}$, (1) • or equivalently, $P\{X > s + t\} = P\{X > s\}\,P\{X > t\}$. • A random variable with this property is said to be memoryless.
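A Monte Carlo illustration of the memoryless property (an added sketch, assuming NumPy; lam, s, and t are arbitrary example values):

```python
import numpy as np

rng = np.random.default_rng(0)
lam, s, t = 1.5, 0.7, 1.2                 # arbitrary rate and time offsets
X = rng.exponential(scale=1.0 / lam, size=1_000_000)

# Memorylessness: P{X > s + t | X > s} should match P{X > t} = e^{-lam t}.
lhs = np.mean(X[X > s] > s + t)
rhs = np.mean(X > t)
print(lhs, rhs, np.exp(-lam * t))          # all agree up to sampling noise
```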

  4. Exponential Distribution • Proof: If X is exponentially distributed, (1) follows readily. Now assume (1). Define $F(x) = P\{X \le x\}$, $f(x) = F'(x)$, and $G(x) = P\{X > x\} = 1 - F(x)$. It follows that $G'(x) = -f(x)$. Now fix x. For $h \ge 0$, (1) gives $G(x + h) = G(x)\,G(h)$. • This implies that, taking the derivative w.r.t. x, $G'(x + h) = G'(x)\,G(h)$.

  5. Exponential Distribution • Letting x = 0 and integrating both sides from 0 to t gives $G'(h) = G'(0)\,G(h)$; writing $\lambda = -G'(0) = f(0)$ and integrating $G'(h)/G(h) = -\lambda$ from 0 to t (with $G(0) = 1$) yields $\ln G(t) = -\lambda t$, i.e., $G(t) = e^{-\lambda t}$, so X is exponentially distributed with parameter $\lambda$.

  6. Exponential Distribution • Theorem 2. A R.V. X is exponentially distributed with parameter $\lambda$ if and only if, as $h \downarrow 0$, $P\{X \le t + h \mid X > t\} = \lambda h + o(h)$.

  7. Exponential Distribution • Proof: Let X be exponentially distributed; then for $h \ge 0$, $P\{X \le t + h \mid X > t\} = P\{X \le h\} = 1 - e^{-\lambda h} = \lambda h + o(h)$. • The converse is left as an exercise.
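A tiny numerical check of the $\lambda h + o(h)$ behavior (added sketch, assuming NumPy; the rate is arbitrary):

```python
import numpy as np

lam = 2.0                                  # arbitrary rate
for h in [1e-1, 1e-2, 1e-3, 1e-4]:
    # P{X <= h} = 1 - e^{-lam h} = lam*h + o(h): the ratio tends to lam.
    print(h, (1.0 - np.exp(-lam * h)) / h)
```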

  8. Exponential Distribution • [Figure: graph of the distribution function $F(t) = 1 - e^{-\lambda t}$; its slope (rate) near $t = 0$ is $\approx \lambda$.]

  9. Markov Process • A continuous-time stochastic process $\{X_t,\ t \ge 0\}$ with state space E is called a Markov process provided that $P\{X_{s+t} = j \mid X_s = i,\ X_u,\ 0 \le u < s\} = P\{X_{s+t} = j \mid X_s = i\}$ • for all states $i, j \in E$ and all $s, t \ge 0$.

  10. Markov Process • We restrict ourselves to Markov processes for which the state space is E = {0, 1, 2, …}, and such that the conditional probabilities $P_{ij}(t) = P\{X_{s+t} = j \mid X_s = i\}$ • are independent of s. Such a Markov process is called time-homogeneous. • $P_{ij}(t)$ is called the transition function of the Markov process X.

  11. Markov Process - Example • Let X be a Markov process with $P_{ij}(t) = e^{-\lambda t}\,\dfrac{(\lambda t)^{j-i}}{(j-i)!}$ for $j \ge i$, and $P_{ij}(t) = 0$ otherwise, • for some $\lambda > 0$. X is a Poisson process.
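Not in the original deck: a simulation check of this transition function (assuming NumPy and SciPy; lam, t, and the start state i are arbitrary example values). The jump count over [0, t] is Poisson($\lambda t$), which is exactly $P_{ij}(t)$ evaluated at $j - i$:

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(1)
lam, t, i = 3.0, 2.0, 5                    # arbitrary rate, horizon, start state

# For a Poisson process, the number of jumps in [0, t] is Poisson(lam*t),
# so P_ij(t) should equal the Poisson pmf evaluated at j - i.
jumps = rng.poisson(lam * t, size=500_000)
for j in range(i, i + 4):
    print(j, np.mean(jumps == j - i), poisson.pmf(j - i, lam * t))
```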

  12. Chapman-Kolmogorov Equations • Theorem 3. For $i, j \in E$ and $t, s \ge 0$, $P_{ij}(t+s) = \sum_{k \in E} P_{ik}(t)\,P_{kj}(s)$.
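An added numerical illustration (assuming SciPy; the two-state generator G and times are arbitrary): for a finite-state process with generator G, the transition function can be written $P(t) = e^{Gt}$, and Chapman-Kolmogorov becomes the semigroup property of the matrix exponential.

```python
import numpy as np
from scipy.linalg import expm

# For a finite-state process with generator G, the transition function is
# P(t) = expm(G t); Chapman-Kolmogorov then says P(t+s) = P(t) P(s).
G = np.array([[-2.0, 2.0],
              [1.0, -1.0]])                # arbitrary two-state rates
t, s = 0.4, 1.1
lhs = expm(G * (t + s))
rhs = expm(G * t) @ expm(G * s)
print(np.max(np.abs(lhs - rhs)))           # ~1e-16, machine precision
```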

  13. Xt() 7 S4 6 S2 5 4 S3 3 S1 S5 2 S0 1 0 t T0 T2 T3 T4 T5 T1 Realization of a Markov Process

  14. Time Spent in a State • Theorem 4. Let $t \ge 0$, let n satisfy $T_n \le t < T_{n+1}$, and let $W_t = T_{n+1} - t$ (the time remaining in the current state). Let $i \in E$, $u \ge 0$, and define $G(u) = P\{W_t > u \mid X_t = i\}$. • Then $G(u) = e^{-\lambda_i u}$ for some rate $\lambda_i \ge 0$. • Note: This implies that the distribution of time remaining in a state is exponentially distributed, regardless of the time already spent in that state.

  15. Time Spent in a State • Proof: We first note that, due to the time homogeneity of X, G(u) is independent of t. If we fix i, then we have $G(u + v) = P\{W_t > u \mid X_t = i\}\,P\{W_{t+u} > v \mid X_{t+u} = i\} = G(u)\,G(v)$, • so by Theorem 1, the remaining time $W_t$ is exponentially distributed.

  16. An Alternative Characterization of a Markov Process • Theorem 5. Let $X = \{X_t,\ t \ge 0\}$ be a Markov process. Let $T_0, T_1, \ldots$ be the successive state transition times and let $S_0, S_1, \ldots$ be the successive states visited by X. There exist numbers $\lambda_i$ such that for any non-negative integer n, for any $j \in E$, and $t > 0$, $P\{S_{n+1} = j,\ T_{n+1} - T_n > t \mid S_0, \ldots, S_n = i;\ T_0, \ldots, T_n\} = Q_{ij}\,e^{-\lambda_i t}$, • where $Q = (Q_{ij})$ is a Markov chain transition matrix with $Q_{ii} = 0$ and $\sum_j Q_{ij} = 1$.

  17. An Alternative Characterization of a Markov Process • This implies that the successive states visited by a Markov process form a Markov chain with transition matrix Q. • A Markov process is irreducible recurrent if its underlying Markov chain is irreducible recurrent.
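Not in the original deck: a minimal simulation sketch built directly on this characterization (assuming NumPy; the jump chain Q, holding rates, and all numeric values are arbitrary examples). Hold an Exp($\lambda_i$) time in each state, then jump according to row i of Q:

```python
import numpy as np

def simulate(Q, lam, x0, t_end, rng):
    """Simulate a Markov process from its jump chain Q and holding rates lam:
    stay in state x for an Exp(lam[x]) time, then jump according to row Q[x]."""
    t, x, path = 0.0, x0, [(0.0, x0)]
    while True:
        t += rng.exponential(1.0 / lam[x])       # sojourn time in state x
        if t >= t_end:
            return path
        x = rng.choice(len(lam), p=Q[x])         # next state S_{n+1}
        path.append((t, x))

rng = np.random.default_rng(2)
Q = np.array([[0.0, 0.5, 0.5],                   # arbitrary jump chain
              [1.0, 0.0, 0.0],
              [0.3, 0.7, 0.0]])
lam = np.array([1.0, 2.0, 0.5])                  # arbitrary holding rates
print(simulate(Q, lam, 0, 5.0, rng)[:5])         # first few (time, state) pairs
```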

  18. Kolmogorov Equations • Theorem 6. $P'_{ij}(t) = \sum_{k \ne i} \lambda_i Q_{ik} P_{kj}(t) - \lambda_i P_{ij}(t)$ (Backward), • and, under suitable regularity conditions, $P'_{ij}(t) = \sum_{k \ne j} P_{ik}(t)\,\lambda_k Q_{kj} - \lambda_j P_{ij}(t)$ (Forward). • These are Kolmogorov's Backward and Forward Equations.
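An added numerical sketch (assuming NumPy/SciPy; the generator is an arbitrary example): in matrix form the forward equation reads $P'(t) = P(t)G$ with $G_{ij} = \lambda_i Q_{ij}$ for $j \ne i$ and $G_{ii} = -\lambda_i$, so a crude Euler integration should approach the exact solution $P(t) = e^{Gt}$.

```python
import numpy as np
from scipy.linalg import expm

G = np.array([[-1.0, 1.0, 0.0],
              [0.5, -1.5, 1.0],
              [0.0, 2.0, -2.0]])               # arbitrary generator
h, steps = 1e-4, 10_000                        # integrate out to t = 1
P = np.eye(3)
for _ in range(steps):
    P = P + h * (P @ G)                        # one Euler step of P'(t) = P(t) G
print(np.max(np.abs(P - expm(G))))             # small (Euler error ~ h)
```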

  19. Kolmogorov Equations • Proof (Forward Equation): For $t, h \ge 0$, $P_{ij}(t + h) = \sum_k P_{ik}(t)\,P_{kj}(h)$. • Hence $\dfrac{P_{ij}(t+h) - P_{ij}(t)}{h} = \sum_{k \ne j} P_{ik}(t)\,\dfrac{P_{kj}(h)}{h} - P_{ij}(t)\,\dfrac{1 - P_{jj}(h)}{h}$. • Taking the limit as $h \to 0$, we get our result.

  20. Limiting Probabilities • Theorem 7. If a Markov process is irreducible recurrent, then limiting probabilities $P_j = \lim_{t \to \infty} P_{ij}(t)$ • exist independent of i, and satisfy $\lambda_j P_j = \sum_{k \ne j} P_k\,\lambda_k Q_{kj}$ • for all j. These are referred to as "balance equations". Together with the condition $\sum_j P_j = 1$, • they uniquely determine the limiting distribution.
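An added sketch of solving the balance equations numerically (assuming NumPy; the generator is the same arbitrary example as above): write them as $\pi G = 0$ and append the normalization $\sum_j \pi_j = 1$ as an extra equation.

```python
import numpy as np

G = np.array([[-1.0, 1.0, 0.0],
              [0.5, -1.5, 1.0],
              [0.0, 2.0, -2.0]])               # arbitrary irreducible generator
A = np.vstack([G.T, np.ones(3)])               # balance equations + normalization
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(pi, pi @ G)                              # pi @ G ~ 0, pi sums to 1
```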

  21. Birth-Death Processes • Definition. A birth-death process $\{X(t),\ t \ge 0\}$ is a Markov process such that, if the process is in state j, then the only transitions allowed are to state j + 1 or to state j – 1 (if j > 0). • It follows that there exist non-negative values $\lambda_j$ and $\mu_j$, j = 0, 1, 2, …, (called the birth rates and death rates) so that, as $h \downarrow 0$, $P\{X(t+h) = j+1 \mid X(t) = j\} = \lambda_j h + o(h)$, $P\{X(t+h) = j-1 \mid X(t) = j\} = \mu_j h + o(h)$, and $P\{X(t+h) = j \mid X(t) = j\} = 1 - (\lambda_j + \mu_j)h + o(h)$.

  22. j-1 j j j+1 j-1 j j+1 Birth and Death Rates • Note: • The expected time in state j before entering state j+1 is 1/j; the expected time in state j before entering state j‒1 is 1/j. • The rate corresponding to state j is vj = j + j.

  23. Differential-Difference Equations for a Birth-Death Process • It follows that, if $P_j(t) = P\{X(t) = j\}$, then $P'_0(t) = -\lambda_0 P_0(t) + \mu_1 P_1(t)$, and $P'_j(t) = \lambda_{j-1} P_{j-1}(t) - (\lambda_j + \mu_j) P_j(t) + \mu_{j+1} P_{j+1}(t)$ for $j \ge 1$. • Together with the state distribution at time 0, this completely describes the behavior of the birth-death process.
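Not in the original deck: a sketch integrating these equations on a state space truncated at N states (assuming NumPy/SciPy; the constant rates and truncation level are arbitrary choices).

```python
import numpy as np
from scipy.integrate import solve_ivp

N = 20                                   # truncation level for the state space
lam = np.full(N, 1.0)                    # birth rates lambda_j (arbitrary)
mu = np.full(N, 1.5)                     # death rates mu_j (mu_0 unused)

def rhs(t, P):
    """Right-hand side of the differential-difference equations."""
    dP = np.zeros(N)
    for j in range(N):
        out = lam[j] * (j < N - 1) + mu[j] * (j > 0)   # total flow out of j
        dP[j] -= out * P[j]
        if j > 0:
            dP[j] += lam[j - 1] * P[j - 1]             # birth into j
        if j < N - 1:
            dP[j] += mu[j + 1] * P[j + 1]              # death into j
    return dP

P0 = np.zeros(N); P0[0] = 1.0            # start in state 0
sol = solve_ivp(rhs, (0.0, 10.0), P0, rtol=1e-8, atol=1e-10)
print(sol.y[:5, -1])                     # P_j(10) for j = 0..4
```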

  24. Birth-Death Processes - Example • Pure birth process with constant birth rate: $\lambda_j = \lambda > 0$, $\mu_j = 0$ for all j. Assume that $P_0(0) = 1$. Then solving the difference-differential equations for this process gives $P_j(t) = e^{-\lambda t}\,\dfrac{(\lambda t)^j}{j!}$, the Poisson distribution with mean $\lambda t$.

  25. Birth-Death Processes - Example • Pure death process with proportional death rate: $\lambda_j = 0$ for all j, $\mu_j = j\mu > 0$ for $1 \le j \le N$, $\mu_j = 0$ otherwise, and $P_N(0) = 1$. Then solving the difference-differential equations for this process gives $P_j(t) = \binom{N}{j} e^{-j\mu t}\left(1 - e^{-\mu t}\right)^{N-j}$, $0 \le j \le N$, a binomial distribution with "survival" probability $e^{-\mu t}$.
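An added simulation check of this binomial form (assuming NumPy/SciPy; N, mu, t are arbitrary): each of the N individuals independently survives past time t with probability $e^{-\mu t}$, so the state at time t is Binomial$(N, e^{-\mu t})$.

```python
import numpy as np
from scipy.stats import binom

rng = np.random.default_rng(3)
N, mu, t = 10, 0.8, 1.3                   # arbitrary parameters
# Each of N individuals dies independently at rate mu, so the number still
# alive at time t is Binomial(N, e^{-mu t}); compare with the closed form.
lifetimes = rng.exponential(1.0 / mu, size=(200_000, N))
alive = (lifetimes > t).sum(axis=1)
for j in [N, N - 1, N - 2]:
    print(j, np.mean(alive == j), binom.pmf(j, N, np.exp(-mu * t)))
```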

  26. Limiting Probabilities • Now assume that limiting probabilities $P_j$ exist. They must satisfy: $\lambda_0 P_0 = \mu_1 P_1$, and $(\lambda_j + \mu_j) P_j = \lambda_{j-1} P_{j-1} + \mu_{j+1} P_{j+1}$ for $j \ge 1$, • or $\lambda_j P_j = \mu_{j+1} P_{j+1}$, $j \ge 0$. (*)

  27. Limiting Probabilities • These are the balance equations for a birth-death process. Together with the condition $\sum_{j=0}^{\infty} P_j = 1$, • they uniquely define the limiting probabilities.

  28. Limiting Probabilities • From (*), one can prove by induction that $P_j = P_0 \prod_{i=0}^{j-1} \dfrac{\lambda_i}{\mu_{i+1}}$ for $j \ge 1$, where $P_0 = \left[1 + \sum_{j=1}^{\infty} \prod_{i=0}^{j-1} \dfrac{\lambda_i}{\mu_{i+1}}\right]^{-1}$.
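An added sketch computing this product form (assuming NumPy; constant rates lam and mu and the truncation level N are arbitrary choices, and with constant rates the result should match the geometric closed form $(1-\lambda/\mu)(\lambda/\mu)^j$):

```python
import numpy as np

lam, mu, N = 1.0, 1.5, 50                 # constant rates, truncate sums at N terms
ratios = np.full(N, lam / mu)             # lambda_i / mu_{i+1}, here constant
prods = np.concatenate([[1.0], np.cumprod(ratios)])
P = prods / prods.sum()                   # P_j = P_0 * product, P_0 = 1/sum
print(P[:5])
print((1 - lam / mu) * (lam / mu) ** np.arange(5))   # geometric closed form
```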

  29. When Do Limiting Probabilities Exist? • Define $S = \sum_{j=1}^{\infty} \prod_{i=0}^{j-1} \dfrac{\lambda_i}{\mu_{i+1}}$. • It is easy to show that the limiting probabilities exist if $S < \infty$. (This is equivalent to the condition $P_0 > 0$.) Furthermore, all of the states are positive recurrent, i.e., ergodic. If $S = \infty$, then either all of the states are null recurrent or all of the states are transient, and limiting probabilities do not exist.
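As a concrete instance (an added illustration, not in the original deck): for constant rates $\lambda_j = \lambda$ and $\mu_j = \mu$ (the M/M/1 queue), $S = \sum_{j \ge 1} (\lambda/\mu)^j$, which is finite if and only if $\lambda < \mu$; in that case the limiting probabilities are geometric, $P_j = (1 - \lambda/\mu)(\lambda/\mu)^j$.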

  30. Flow Balance Method • Draw a closed boundary around state j: [Figure: boundary enclosing state j in the transition diagram.] • "flow in = flow out" Global balance equation: $(\lambda_j + \mu_j)P_j = \lambda_{j-1}P_{j-1} + \mu_{j+1}P_{j+1}$.

  31. Flow Balance Method • Draw a closed boundary between state j and state j–1: [Figure: boundary cutting the arcs between states j−1 and j.] Detailed balance equation: $\lambda_{j-1}P_{j-1} = \mu_j P_j$.

  32. Example • Machine repair problem. Suppose there are m machines serviced by one repairman. Each machine runs without failure, independently of all others, for an exponential time with mean $1/\lambda$. When it fails, it waits until the repairman can come to repair it, and the repair itself takes an exponentially distributed amount of time with mean $1/\mu$. Once repaired, the machine is as good as new. • What is the probability that j machines are failed?

  33. j-1=(m‒j+1) j=(m‒j) j j+1 j‒1 j= j+1= Example • Let Pj be the steady-state probability of j failed machines.

  34. j-1=(m‒j+1) j=(m‒j) j j+1 j‒1 j= j+1= Example

  35. Example • How would this example change if there were m (or more) repairmen?

  36. Homework • No homework this week due to test next week.

  37. References • Erhan Cinlar, Introduction to Stochastic Processes, Prentice-Hall, Inc., 1975. • Leonard Kleinrock, Queueing Systems, Volume I: Theory, John Wiley & Sons, 1975. • Sheldon M. Ross, Introduction to Probability Models, Ninth Edition, Elsevier Inc., 2007.
