
Learn about stochastic processes, Markov chains, and queuing networks in computer and network performance modeling. Examples include the machine repairman model, Little's Law, Gambler's Ruin Problem, and Poisson Process. Explore state transition diagrams, calculate steady-state probabilities, and solve complex modeling scenarios.


Presentation Transcript


  1. CDA6530: Performance Models of Computers and Networks – Examples of Stochastic Process, Markov Chain, M/M/* Queue

  2. Queuing Network: Machine Repairman Model • c machines • Each fails at rate λ (expo. distr.) • Single repairman, repair rate μ (expo. distr.) • Define: N(t) – no. of machines working • 0 ≤ N(t) ≤ c
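  N(t) is a birth-death chain: in state n, one of the n working machines fails at total rate nλ, and for n < c the single repairman completes a repair at rate μ. A minimal sketch of the balance equations under that reading (π_n denotes the steady-state probability of n working machines):

    % state n = number of working machines, 0 <= n <= c
    \begin{align*}
      q_{n,\,n-1} &= n\lambda      && \text{(one of the $n$ working machines fails)} \\
      q_{n,\,n+1} &= \mu           && \text{(repairman finishes a repair, } n < c) \\
      n\lambda\,\pi_n &= \mu\,\pi_{n-1}
                                   && \text{(balance across the cut between $n-1$ and $n$)} \\
      \pi_n &= \frac{(\mu/\lambda)^n}{n!}\,\pi_0, \qquad
      \pi_0 = \Bigl[\sum_{k=0}^{c} \frac{(\mu/\lambda)^k}{k!}\Bigr]^{-1}
    \end{align*}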

  3. Utilization rate? • = P(repairman busy) = 1 − πc • E[N]? • We can use E[N] = Σn=0..c n·πn • Complicated
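  A small numeric sketch of that direct computation, under assumed example values of λ, μ, and c (the steady-state probabilities follow the birth-death solution sketched above):

    from math import factorial

    # Assumed example parameters: failure rate, repair rate, number of machines.
    lam, mu, c = 1.0, 4.0, 5

    # Steady-state probabilities of the chain on n = 0..c working machines:
    # pi_n is proportional to (mu/lam)^n / n!  (single repairman, failure rate n*lam in state n).
    weights = [(mu / lam) ** n / factorial(n) for n in range(c + 1)]
    total = sum(weights)
    pi = [w / total for w in weights]

    utilization = 1 - pi[c]                            # P(repairman busy) = 1 - pi_c
    EN_direct = sum(n * pi[n] for n in range(c + 1))   # the "complicated" weighted sum
    print(f"utilization = {utilization:.4f}, E[N] = {EN_direct:.4f}")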

  4. E[N] Alternative: Little’s Law • Little’s law: N = λT • Here: E[N] = arrival rate · up time • Arrival rate: repair completion rate = μ(1 − πc) • Up time: expo., E[T] = 1/λ • Thus E[N] = μ(1 − πc)/λ
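  The same number via Little's law, reusing the assumed parameter values from the sketch above (a repaired machine "arrives" into the working set, so the arrival rate is the repair completion rate):

    from math import factorial

    lam, mu, c = 1.0, 4.0, 5                     # same assumed example values as above
    weights = [(mu / lam) ** n / factorial(n) for n in range(c + 1)]
    pi_c = weights[c] / sum(weights)             # P(all c machines are working)

    # Little's law: E[N] = (arrival rate into "working") * (mean up time)
    #             = mu * P(repairman busy) * (1 / lam)
    EN_little = mu * (1 - pi_c) / lam
    print(f"E[N] via Little's law = {EN_little:.4f}")   # matches the direct sum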

  5. Markov Chain: Gambler’s Ruin Problem • At each play of the game, a gambler wins one unit with probability p and loses one unit with probability q = 1 − p. Assuming that successive plays are independent, what is the probability that, starting with i units, the gambler’s fortune will reach N before reaching 0?

  6. Define Xn: the gambler’s fortune at time n. • Xn is a Markov chain • Question: • How many states? • State transition diagram? • Note: there is no steady state for this Markov chain! • Do not try to use balance equations here

  7. Pi: prob. that starting with fortune i, the gambler reaches N eventually. • No consideration of how many steps • Can treat it as allowing infinitely many steps • Boundary conditions: P0 = 0, PN = 1 • Construct a recursive equation • Condition on the first transition • Pi = p·Pi+1 + q·Pi-1 • Law of total probability • This style of derivation is commonly used for infinite-step questions • Similar to the routing failure example 1 in lecture notes “random”
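  Solving that recursion with the boundary conditions P0 = 0 and PN = 1 gives the standard closed form; a small sketch with assumed example values of p, N, and i:

    def ruin_reach_N(p: float, N: int, i: int) -> float:
        """Probability that the fortune reaches N before 0, starting from i."""
        q = 1 - p
        if p == 0.5:
            return i / N                      # fair game: Pi = i/N
        r = q / p
        return (1 - r ** i) / (1 - r ** N)    # closed form of Pi = p*Pi+1 + q*Pi-1

    print(ruin_reach_N(p=0.4, N=10, i=5))     # about 0.116 for this unfavorable game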

  8. Another Markov Chain Example • Three white and three black balls are distributed in two urns in such a way that each urn contains three balls. At each step, we draw one ball from each urn, and place the ball drawn from the first urn into the second urn, and the ball drawn from the second urn into the first urn. Let Xn denote the state of the system after the n-th step. • We say that the system is in state i if the first urn contains i white balls. Draw state transition diagram. • Calculate steady-state prob.
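  A sketch of the steady-state computation for this chain, assuming state i = number of white balls in the first urn (four states, 0..3): from state i, the first urn holds i white and 3−i black balls and the second urn the reverse, so one swap moves i to i+1 with probability ((3−i)/3)^2 and i to i−1 with probability (i/3)^2.

    import numpy as np

    P = np.zeros((4, 4))                       # transition matrix over states 0..3
    for i in range(4):
        up, down = ((3 - i) / 3) ** 2, (i / 3) ** 2
        if i < 3:
            P[i, i + 1] = up                   # draw black from urn 1, white from urn 2
        if i > 0:
            P[i, i - 1] = down                 # draw white from urn 1, black from urn 2
        P[i, i] = 1 - up - down                # the two drawn balls have the same color

    # Steady-state probabilities: solve pi = pi P together with sum(pi) = 1.
    A = np.vstack([P.T - np.eye(4), np.ones(4)])
    b = np.array([0.0, 0.0, 0.0, 0.0, 1.0])
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    print(pi)                                  # [0.05, 0.45, 0.45, 0.05] = (1, 9, 9, 1)/20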

  9. Poisson Process • Patients arrive at the doctor's office according to a Poisson process with rate λ = 1/10 per minute. The doctor will not see a patient until at least three patients are in the waiting room. • What is the probability that nobody is admitted to see the doctor in the first hour?
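  A sketch of the calculation: in the first hour the number of arrivals is Poisson with mean λt = (1/10)·60 = 6, and nobody is admitted exactly when fewer than three patients have arrived.

    from math import exp, factorial

    mean = 6.0                                 # lambda * t = (1/10) * 60 arrivals per hour
    # Nobody is admitted in the first hour iff N(60) <= 2.
    p_nobody = sum(exp(-mean) * mean ** k / factorial(k) for k in range(3))
    print(f"P(nobody admitted in the first hour) = {p_nobody:.4f}")   # about 0.062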
