MATH 3033, based on Dekking et al., A Modern Introduction to Probability and Statistics, 2007. Slides by Gautam Shankar. Format by Tim Birbeck. Instructor: Longin Jan Latecki. C12: The Poisson process
12.1 – Random Points
• The Poisson process model often applies in situations where there is a very large population, and each member of the population has a very small probability of producing a point of the process.
• Examples of random points: arrival times of email messages at a server, the times at which asteroids hit the earth, arrival times of radioactive particles at a Geiger counter, the times at which your computer crashes, the times at which electronic components fail, and arrival times of people at a pump in an oasis.
12.2 – Taking a closer look at random arrivals
• Example: telephone call arrival times. Calls arrive at random times X1, X2, X3, …
• Homogeneity (a.k.a. weak stationarity): the rate λ at which arrivals occur is constant over time; in a subinterval of length u, the expected number of telephone calls is λu.
• Independence: the numbers of arrivals in disjoint time intervals are independent random variables.
• N(I) = total number of calls in an interval I; write Nt = N([0, t]). By homogeneity, E[Nt] = λt.
• Divide the interval [0, t] into n intervals, each of length t/n.
12.2 – Taking a closer look at random arrivals
• When n is large enough (n > λt), every interval Ij,n = ((j−1)t/n, jt/n] will contain either 0 or 1 arrival.
• For such a large n, let Rj = number of arrivals in the time interval Ij,n; then Rj has a Ber(pj) distribution for some pj. Recall that for a Bernoulli random variable, E[Rj] = 0 · (1 − pj) + 1 · pj = pj.
• By the homogeneity assumption (see previous slide), for each j: pj = λ · length of Ij,n = λt/n.
• Total number of calls: Nt = R1 + R2 + … + Rn.
• By the independence assumption (see previous slide), the Rj are independent random variables, so Nt has a Bin(n, p) distribution with p = λt/n (see the numerical sketch below).
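To make the limiting step concrete, here is a minimal numerical sketch (not from the slides; the values λ = 2, t = 3 and the cutoff k ≤ 10 are arbitrary illustrative choices) comparing the Bin(n, λt/n) probabilities with the Pois(λt) probabilities as n grows:

```python
from math import comb, exp, factorial

lam, t = 2.0, 3.0          # illustrative arrival rate and time horizon
mu = lam * t               # expected number of calls in [0, t]

def binom_pmf(k, n, p):
    """P(Bin(n, p) = k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, mu):
    """P(Pois(mu) = k)."""
    return exp(-mu) * mu**k / factorial(k)

# As n grows, the Bin(n, lam*t/n) probabilities approach the Pois(lam*t) probabilities.
for n in (10, 100, 1000, 10000):
    p = mu / n
    max_diff = max(abs(binom_pmf(k, n, p) - poisson_pmf(k, mu)) for k in range(11))
    print(f"n = {n:>6}: max |Bin - Pois| over k <= 10 is {max_diff:.6f}")
```

The printed differences shrink toward zero, which is exactly the limit that motivates the Poisson distribution on the next slide.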
12.2 – Taking a closer look at random arrivals
Definition: a discrete random variable X has a Poisson distribution with parameter µ, where µ > 0, if its probability mass function p is given by
p(k) = P(X = k) = (µ^k / k!) e^(−µ) for k = 0, 1, 2, …
We denote this distribution by Pois(µ).
The expectation and variance of a Poisson distribution: let X have a Poisson distribution with parameter µ; then E[X] = µ and Var(X) = µ.
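As a quick sanity check (an illustrative sketch, not part of the slides; µ = 4 and the summation cutoff of 200 are arbitrary choices), the expectation and variance can be computed directly from the probability mass function:

```python
from math import exp, factorial

mu = 4.0  # illustrative parameter

def poisson_pmf(k, mu):
    """P(X = k) for X ~ Pois(mu)."""
    return exp(-mu) * mu**k / factorial(k)

# Truncate the infinite sums at k = 199, where the remaining mass is negligible.
ks = range(200)
mean = sum(k * poisson_pmf(k, mu) for k in ks)
second_moment = sum(k**2 * poisson_pmf(k, mu) for k in ks)
var = second_moment - mean**2

print(f"E[X]   ≈ {mean:.6f}  (should be {mu})")
print(f"Var(X) ≈ {var:.6f}  (should be {mu})")
```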
12.3 – The one-dimensional Poisson process
Interarrival times: the differences Ti = Xi − Xi−1 are called interarrival times. Since T1 = X1, the event {T1 > t} occurs exactly when there are no arrivals in (0, t]. This implies that
P(T1 > t) = P(Nt = 0) = e^(−λt), so P(T1 ≤ t) = 1 − e^(−λt).
Therefore T1 has an exponential distribution with parameter λ.
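The exponential tail can also be seen empirically. The sketch below (illustrative only; λ = 2, t = 0.5, the 500 subintervals and 10000 repetitions are arbitrary choices) simulates the Bernoulli-per-subinterval construction of Section 12.2 and estimates P(T1 > t):

```python
import random
from math import exp

lam, t = 2.0, 0.5      # illustrative rate and time point
n = 500                # number of subintervals covering [0, t]
p = lam * t / n        # per-subinterval arrival probability
runs = 10000           # number of simulated repetitions

no_arrival_count = 0
for _ in range(runs):
    # The first arrival lands after t exactly when all n subintervals are empty.
    if all(random.random() >= p for _ in range(n)):
        no_arrival_count += 1

estimate = no_arrival_count / runs
print(f"estimated P(T1 > {t}) = {estimate:.4f}")
print(f"exact e^(-lambda*t)   = {exp(-lam * t):.4f}")
```

The empirical frequency agrees (up to simulation and discretization error) with e^(−λt), i.e. with the Exp(λ) tail of T1.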
12.3 – The one-dimensional Poisson process
T1 and T2 are independent, and P(T2 > t) = e^(−λt), so T2 also has an Exp(λ) distribution.
The one-dimensional Poisson process with intensity λ is a sequence X1, X2, X3, … of random variables having the property that the interarrival times X1, X2 − X1, X3 − X2, … are independent random variables, each with an Exp(λ) distribution.
Nt is equal to the number of Xi that are smaller than (or equal to) t.
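A minimal simulation sketch of this definition (not from the book; λ = 0.5, t = 10 and 10000 runs are arbitrary illustrative choices) builds the process by summing independent Exp(λ) interarrival times and checks that the resulting count Nt has mean and variance close to λt:

```python
import random

lam, t = 0.5, 10.0   # illustrative intensity and time horizon
runs = 10000

def poisson_process_count(lam, t):
    """Simulate arrival times as cumulative sums of Exp(lam) interarrival
    times and return Nt, the number of arrivals in [0, t]."""
    arrival_time = 0.0
    count = 0
    while True:
        arrival_time += random.expovariate(lam)  # next interarrival time
        if arrival_time > t:
            return count
        count += 1

counts = [poisson_process_count(lam, t) for _ in range(runs)]
mean = sum(counts) / runs
var = sum((c - mean) ** 2 for c in counts) / runs
print(f"simulated E[Nt]  ≈ {mean:.3f}  (theory: lambda*t = {lam * t})")
print(f"simulated Var(Nt) ≈ {var:.3f}  (theory: lambda*t = {lam * t})")
```

Both the mean and the variance come out near λt, consistent with Nt having a Pois(λt) distribution.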