Queueing Systems I: Chap. 2 Some Important Random Processes S.Y. Yang
What is a “Queue”? • Any system where jobs/customers/users arrive looking for service and depart once service is provided.
Notation • Cn • the n-th customer to enter the system • N(t) • the number of customers in the system at time t • U(t) • the unfinished work in the system at time t, i.e., the remaining time required to empty the system of all customers present at time t • If U(t) > 0 the system is said to be busy; only when U(t) = 0 is the system said to be idle.
Arrivals and departures • α(t): number of arrivals in (0,t) • δ(t): number of departures in (0,t) • N(t): the number in the system at time t = α(t) − δ(t) • γ(t): the total time all customers have spent in the system during (0,t) • λt: average arrival rate in (0,t) = α(t)/t • Tt: average system time per customer in (0,t) = γ(t)/α(t) • N̄t: average number of customers during (0,t) = γ(t)/t
Little’s Theorem • N̄t = γ(t)/t = (γ(t)/α(t))(α(t)/t) = Tt λt • Let t → ∞: N̄ = λT • The average number of customers in a queueing system is equal to the average arrival rate of customers to that system, times the average time spent in that system.
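As a sanity check, the algebra above can be verified numerically. The sketch below (plain Python; the rates lam = 0.8 and mu = 1.0 and the function name simulate_mm1 are illustrative choices, not from the slides) simulates a single-server FIFO queue with Poisson arrivals and exponential service, then confirms that N̄t = λt Tt holds as an exact identity for any finite horizon:

```python
import random

def simulate_mm1(lam=0.8, mu=1.0, n_jobs=5000, seed=1):
    """Single-server FIFO queue: Poisson arrivals (rate lam), exponential
    service (rate mu).  Returns (t, alpha, gamma): the horizon, the number
    of arrivals alpha(t), and gamma(t), the total time all customers have
    spent in the system during (0, t)."""
    rng = random.Random(seed)
    t_arr = 0.0      # arrival time of the current customer
    free_at = 0.0    # time at which the server next becomes free
    gamma = 0.0
    for _ in range(n_jobs):
        t_arr += rng.expovariate(lam)          # exponential interarrival
        start = max(t_arr, free_at)            # wait if the server is busy
        free_at = start + rng.expovariate(mu)  # exponential service
        gamma += free_at - t_arr               # this customer's system time
    return free_at, n_jobs, gamma              # horizon = last departure

t, alpha, gamma = simulate_mm1()
lam_t = alpha / t    # average arrival rate in (0, t)
T_t = gamma / alpha  # average system time per customer
N_t = gamma / t      # average number in system
# Little's theorem as an identity: N_t = lam_t * T_t
assert abs(N_t - lam_t * T_t) < 1e-9
```

Note that the identity holds exactly for every sample path and horizon; only the limit t → ∞ is needed to turn it into the statement N̄ = λT about long-run averages.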
ρ (utilization factor) • ρ = (average arrival rate of customers) × (average service time) = λx̄ • In the case of multiple servers • ρ = E[fraction of busy servers]
Discrete-time Markov Chains • Definition: the sequence of random variables X1, X2, … forms a discrete-time Markov chain if P[Xn = j | X1 = i1, X2 = i2, …, Xn−1 = in−1] = P[Xn = j | Xn−1 = in−1] • The right side of the above equation is referred to as the (one-step) transition probability.
Homogeneous Markov chain • If the transition probabilities pij = P[Xn = j | Xn−1 = i] turn out to be independent of n, then we have what is referred to as a homogeneous Markov chain.
Irreducible Markov Chain • We say that a Markov chain is irreducible if every state can be reached from every other state; that is, for each pair of states (Ei and Ej) there exists an integer m (which may depend upon i and j) such that pij^(m) > 0.
More definitions about Markov chains • Closed • Absorbing state • Reducible • Recurrent/Transient • Periodic/Aperiodic • Mean recurrence time • Recurrent null/Recurrent nonnull • πj(n) = P[system in state j at the n-th step]
Irreducible MC: Theorems • Theorem 1: The states of an irreducible Markov chain are either all transient, all recurrent nonnull, or all recurrent null. If periodic, then all states have the same period. • Theorem 2: In an irreducible and aperiodic homogeneous Markov chain the limiting probabilities πj = lim n→∞ πj(n) always exist and are independent of the initial state probability distribution.
Moreover, either • A. all states are transient or all states are recurrent null, in which case πj = 0 for all j and there exists no stationary distribution, or • B. all states are recurrent nonnull, in which case πj > 0 for all j, the set {πj} is a stationary probability distribution, and the πj are uniquely determined by Σj πj = 1 and πj = Σi πi pij.
Ergodicity • A state is said to be ergodic if it is aperiodic, recurrent, and nonnull. • Moreover, a Markov chain is said to be ergodic if the probability distribution {πj(n)} as a function of n always converges to a limiting stationary distribution {πj}, which is independent of the initial state distribution.
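Ergodicity is easy to illustrate numerically: for an irreducible aperiodic chain, iterating π ← πP from any starting distribution converges to the same limiting vector, which is stationary. A minimal sketch (the 3-state matrix P below is made up for illustration):

```python
def step(pi, P):
    """One step of the recursion pi <- pi P for a row vector pi."""
    n = len(P)
    return [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]

# A hypothetical 3-state irreducible, aperiodic transition matrix.
P = [[0.5, 0.3, 0.2],
     [0.2, 0.6, 0.2],
     [0.1, 0.4, 0.5]]

def limit(pi, steps=200):
    """Iterate pi(n) = pi(n-1) P until (practically) converged."""
    for _ in range(steps):
        pi = step(pi, P)
    return pi

a = limit([1.0, 0.0, 0.0])  # start in state 0
b = limit([0.0, 0.0, 1.0])  # start in state 2
# Ergodic: the limit is independent of the initial distribution ...
assert all(abs(x - y) < 1e-10 for x, y in zip(a, b))
# ... and the limit is stationary: pi P = pi.
assert all(abs(x - y) < 1e-10 for x, y in zip(step(a, P), a))
```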
Transition probability matrix • Transition probability matrix P = [pij] • Probability vector π(n) = [π0(n), π1(n), π2(n), …]
Transient behavior of the system • The probability vector at time n: π(n) = π(n−1)P = π(0)P^n
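The transient recursion can be checked with a small sketch (the 2-state matrix below is a made-up example): iterating the one-step recursion π(n) = π(n−1)P ten times agrees with applying the n-step matrix P^10, here computed by repeated squaring.

```python
def mat_mul(A, B):
    """Multiply matrices given as lists of rows."""
    n, m, p = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(m)) for j in range(p)]
            for i in range(n)]

def mat_pow(P, n):
    """P**n by repeated squaring."""
    size = len(P)
    R = [[1.0 if i == j else 0.0 for j in range(size)] for i in range(size)]
    while n:
        if n & 1:
            R = mat_mul(R, P)
        P = mat_mul(P, P)
        n >>= 1
    return R

P = [[0.9, 0.1],
     [0.4, 0.6]]   # a hypothetical 2-state chain
pi0 = [1.0, 0.0]   # start in state 0

# pi(n) via the one-step recursion pi(n) = pi(n-1) P ...
pi = pi0
for _ in range(10):
    pi = mat_mul([pi], P)[0]
# ... equals pi(0) P^10 computed directly.
direct = mat_mul([pi0], mat_pow(P, 10))[0]
assert all(abs(x - y) < 1e-12 for x, y in zip(pi, direct))
```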
General solution using z-transforms • Define the vector transform Π(z) = Σn≥0 π(n) z^n. • Applying the recursion π(n) = π(n−1)P term by term gives Π(z) − π(0) = zΠ(z)P, so Π(z) = π(0)[I − zP]^(−1). • Inverting the transform component by component then yields π(n) for all n.
Memoryless Property of MC • P[system remains in Ei for exactly m additional steps, given that it has just entered Ei] = (1 − pii) pii^m • Geometric distribution!
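A quick simulation confirms the geometric form. With self-transition probability pii = 0.6 (an arbitrary choice for illustration), the empirical distribution of the number of additional steps spent in the state matches the pmf (1 − pii) pii^m:

```python
import random

def sojourn_steps(p_ii, rng):
    """Number of additional steps the chain stays in Ei after entering it,
    when each step keeps it there with probability p_ii."""
    m = 0
    while rng.random() < p_ii:
        m += 1
    return m

rng = random.Random(7)
p = 0.6
n = 200_000
counts = {}
for _ in range(n):
    m = sojourn_steps(p, rng)
    counts[m] = counts.get(m, 0) + 1

# Compare the empirical frequencies with the geometric pmf from the slide.
for m in range(4):
    empirical = counts.get(m, 0) / n
    predicted = (1 - p) * p ** m
    assert abs(empirical - predicted) < 0.01
```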
General (Nonhomogeneous) MC • Multistep transition probabilities pij(m,n) = P[Xn = j | Xm = i] • One-step transition probability matrix P(n) = [pij(n−1,n)] • Multistep transition probability matrix H(m,n) = [pij(m,n)]
Chapman-Kolmogorov equation • pij(m,n) = Σk pik(m,q) pkj(q,n) for m ≤ q ≤ n • Rewritten in matrix form as H(m,n) = H(m,q)H(q,n) • Let q = n−1: H(m,n) = H(m,n−1)P(n) (forward equation) • Let q = m+1: H(m,n) = P(m+1)H(m+1,n) (backward equation) • Time-dependent probabilities: π(n) = π(m)H(m,n)
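The matrix form of the Chapman-Kolmogorov equation can be verified directly for a nonhomogeneous chain. In the sketch below, P(n) is a made-up time-varying one-step matrix; H(m,n) is built as the product P(m+1)⋯P(n), and the identity H(m,n) = H(m,q)H(q,n) then holds for any intermediate q:

```python
def mat_mul(A, B):
    """Multiply matrices given as lists of rows."""
    n, m, p = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(m)) for j in range(p)]
            for i in range(n)]

def P(n):
    """Hypothetical time-varying one-step matrix: mixing weakens with n."""
    a = 1.0 / (n + 2)
    return [[1 - a, a],
            [a, 1 - a]]

def H(m, n):
    """Multistep matrix H(m, n) = P(m+1) P(m+2) ... P(n); H(m, m) = I."""
    R = [[1.0, 0.0], [0.0, 1.0]]
    for k in range(m + 1, n + 1):
        R = mat_mul(R, P(k))
    return R

# Chapman-Kolmogorov: H(m, n) = H(m, q) H(q, n) for any m <= q <= n.
lhs = H(2, 9)
rhs = mat_mul(H(2, 5), H(5, 9))
assert all(abs(lhs[i][j] - rhs[i][j]) < 1e-12
           for i in range(2) for j in range(2))
```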
Continuous-time MC • Definition: X(t) is a continuous-time Markov chain if, for t1 < t2 < … < tn, P[X(tn) = j | X(tn−1) = in−1, …, X(t1) = i1] = P[X(tn) = j | X(tn−1) = in−1] • Memoryless property: the future depends on the past only through the current state.
Continuous-time MC • The memoryless property leads to the exponential distribution: the pdf of the time the process spends in state Ei is exponential, with rate parameter equal to the total rate at which the process leaves Ei.
Continuous-time C-K equation • Definition of the time-dependent transition probability: pij(s,t) = P[X(t) = j | X(s) = i] • Continuous-time MC C-K equation: pij(s,t) = Σk pik(s,u) pkj(u,t) for s ≤ u ≤ t • Represented in matrix form as H(s,t) = H(s,u)H(u,t)
continued • If we define Q(t) = lim Δt→0 [H(t, t+Δt) − I]/Δt • Q(t) is called the transition rate matrix.
continued • Forward C-K equation: ∂H(s,t)/∂t = H(s,t)Q(t) • Backward C-K equation: ∂H(s,t)/∂s = −Q(s)H(s,t)
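The forward equation can be integrated numerically. The sketch below uses a hypothetical homogeneous 2-state chain with rates a = 2 (E0 → E1) and b = 3 (E1 → E0); Euler steps on dH/dt = HQ reproduce the standard closed form for this chain, p00(t) = b/(a+b) + a/(a+b)·e^(−(a+b)t):

```python
import math

a, b = 2.0, 3.0         # hypothetical rates: E0 -> E1 and E1 -> E0
Q = [[-a, a],
     [b, -b]]           # transition rate matrix (rows sum to zero)

def forward_euler(t_end, dt=1e-4):
    """Integrate the forward equation dH/dt = H Q from H(0) = I."""
    H = [[1.0, 0.0], [0.0, 1.0]]
    for _ in range(int(t_end / dt)):
        H = [[H[i][j] + dt * sum(H[i][k] * Q[k][j] for k in range(2))
              for j in range(2)] for i in range(2)]
    return H

H = forward_euler(1.0)
# Closed form for the 2-state chain: p00(t) = b/(a+b) + a/(a+b) e^{-(a+b)t}
exact = b / (a + b) + a / (a + b) * math.exp(-(a + b) * 1.0)
assert abs(H[0][0] - exact) < 1e-3
# Each row of H(0, t) remains a probability distribution.
assert abs(sum(H[0]) - 1.0) < 1e-9
```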
Birth-Death Process • A Markov process in which transitions from state Ek are permitted only to the neighboring states Ek+1, Ek, and Ek−1. • Birth: a transition from Ek to Ek+1. • Death: a transition from Ek to Ek−1. • Birth rate: λk • Death rate: μk
continued • The birth and death rates are independent of time and depend only on the state Ek. • A birth-death process is therefore a continuous-time homogeneous MC.
What do we wish to solve for? • The probability that the population size is k at some time t: Pk(t) = P[N(t) = k]
Pure birth process (Poisson process) • μk = 0 for all k (no deaths). • With a constant birth rate λk = λ for all k, the pure birth process is the Poisson process.
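Simulating a pure birth process with a constant rate (λ = 2 and t = 1.5 are arbitrary choices for this sketch) and counting arrivals in (0, t) reproduces the Poisson pmf P[N(t) = k] = (λt)^k e^(−λt)/k!, consistent with the exponential sojourn times noted earlier:

```python
import math
import random

def pure_birth_count(lam, t, rng):
    """Count births in (0, t) for a pure birth process with constant
    rate lam, by accumulating exponential interarrival times."""
    n, clock = 0, rng.expovariate(lam)
    while clock <= t:
        n += 1
        clock += rng.expovariate(lam)
    return n

rng = random.Random(3)
lam, t, trials = 2.0, 1.5, 100_000
counts = {}
for _ in range(trials):
    k = pure_birth_count(lam, t, rng)
    counts[k] = counts.get(k, 0) + 1

# Compare empirical frequencies against the Poisson pmf with mean lam*t.
for k in range(5):
    empirical = counts.get(k, 0) / trials
    predicted = (lam * t) ** k * math.exp(-lam * t) / math.factorial(k)
    assert abs(empirical - predicted) < 0.01
```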
Summary • Notations • Arrivals and Departures • Little’s Theorem • Discrete-time Markov Chain • Continuous-time Markov Chain
References • Queueing Systems, Volume I: Theory, Leonard Kleinrock, John Wiley & Sons, 1975, http://www.lk.cs.ucla.edu/ • Lecture notes from http://vega.icu.ac.kr/~bnec/ written by Professor J.K. Choi. • Lecture notes from http://home.iitk.ac.in/~skb/ee679/ee679.html written by Professor S.K. Bose.