Learn about stochastic processes, their classification, and characterization. Understand discrete-state, discrete-time, continuous-time, discrete-space, continuous-space, and further classifications. Explore Markov processes and their properties.
EE255/CPS226 Stochastic Processes — Dept. of Electrical & Computer Engineering, Duke University. Email: bbm@ee.duke.edu, kst@ee.duke.edu
What is a stochastic process?
• Stochastic process: a family of rvs {X(t) | t ∈ T}, where T is an index set that may be discrete or continuous.
• Values assumed by X(t) are called states.
• State space I: the set of all possible states.
• Example: cosmic radio noise measured at a set of antennas {a1, a2, .., ak}, sampled at some time t1.
Stochastic Process Characterization
• Sample space S: the set of antennas.
• Sampling the output of all antennas at time t1 defines a rv X(t1).
• In general, at a fixed time t = t1 we can define Xt1(s) = X(t1, s), i.e., the rv X(t1); similarly we can define X(t2), .., X(tk).
• X(t1) can be characterized by its distribution function, and the collection (X(t1), .., X(tk)) can be characterized jointly by its CDF (see the sketch below).
• Discrete and continuous cases:
• The time parameter t (index set T) may be discrete or continuous.
• The state space I may be discrete or continuous.
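A minimal sketch of the distribution functions referred to above, written in standard notation (the symbol F and the index k are the usual textbook conventions, not reproduced verbatim from the slide):

```latex
\[
F(x; t) = P[X(t) \le x]
\]
\[
F(x_1,\dots,x_k;\, t_1,\dots,t_k) = P[X(t_1) \le x_1,\ \dots,\ X(t_k) \le x_k]
\]
```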
Classification of Stochastic Processes
• Four classes of stochastic processes, according to whether state space and time are each discrete or continuous:
• Discrete-state process ⟹ often called a chain (e.g., the DJIA index level observed at any time).
• Discrete-time process ⟹ a stochastic sequence {Xn | n ∈ T} (e.g., probing a system every 10 ms).
Example: a Queuing System
• Inter-arrival times Y1, Y2, … (mutually independent, with distribution FY).
• Service times S1, S2, … (mutually independent, with distribution FS).
• Notation for a queuing system (Kendall notation): FY / FS / m, where m is the number of servers.
• Possible arrival/service time distribution types are:
• M: memoryless (i.e., EXP)
• D: deterministic
• G: general distribution
• Ek: k-stage Erlang, etc.
• M/M/1: memoryless (EXP) inter-arrival and service times with a single service station (a small sampling sketch follows this list).
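A minimal illustration (not from the original slides) of what the distribution labels above mean in practice; the rate, Erlang order, and sample count are arbitrary choices for the example:

```python
import numpy as np

rng = np.random.default_rng(0)
rate = 2.0   # assumed arrival rate, arbitrary for illustration
n = 5        # number of inter-arrival times to draw

# M: memoryless => EXP(rate) inter-arrival times
m_samples = rng.exponential(1.0 / rate, size=n)

# D: deterministic => every gap equals the mean 1/rate
d_samples = np.full(n, 1.0 / rate)

# Ek: k-stage Erlang => sum of k independent EXP(k*rate) stages (same mean 1/rate)
k = 3
ek_samples = rng.gamma(shape=k, scale=1.0 / (k * rate), size=n)

print(m_samples, d_samples, ek_samples, sep="\n")
```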
Discrete/Continuous Stochastic Processes
• Nk: number of jobs in the system at the time of the kth job's departure.
• The stochastic process {Nk | k = 1, 2, …} is a discrete-time, discrete-state process (k is discrete, and Nk takes discrete values).
Continuous Time, Discrete Space
• X(t): number of jobs in the system at time t.
• {X(t) | t ∈ T} forms a continuous-time, discrete-state stochastic process (t is continuous, and X(t) takes discrete values).
Discrete Time, Continuous Space
• Wk: waiting time of the kth job.
• {Wk | k ∈ T} forms a discrete-time, continuous-state stochastic process (k is discrete, and Wk takes continuous values).
Continuous Time, Continuous Space
• Y(t): total service time of all jobs in the system at time t.
• {Y(t) | t ∈ T} forms a continuous-time, continuous-state stochastic process (both t and Y(t) are continuous).
• A small simulation illustrating these four queueing processes follows.
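A minimal M/M/1 simulation sketch (not from the original slides; the rates lam and mu and the job count are arbitrary assumptions) showing how the processes Nk, X(t), and Wk above can all be extracted from one single-server FIFO queue:

```python
import numpy as np

rng = np.random.default_rng(1)
lam, mu, n_jobs = 1.0, 1.5, 2000   # assumed arrival/service rates, arbitrary

inter = rng.exponential(1 / lam, n_jobs)    # Y_i: EXP(lam) inter-arrival times
service = rng.exponential(1 / mu, n_jobs)   # S_i: EXP(mu) service times
arrival = np.cumsum(inter)

# FIFO single server: service of job k starts when it arrives or when job k-1 leaves
departure = np.empty(n_jobs)
prev = 0.0
for k in range(n_jobs):
    start = max(arrival[k], prev)
    departure[k] = start + service[k]
    prev = departure[k]

# W_k: waiting time of the kth job (discrete time, continuous state)
W = departure - arrival - service

# N_k: jobs left behind at the kth departure (discrete time, discrete state)
N = np.array([np.sum((arrival < departure[k]) & (departure > departure[k]))
              for k in range(n_jobs)])

# X(t): number of jobs in the system at time t (continuous time, discrete state)
def X(t):
    return int(np.sum((arrival <= t) & (departure > t)))

print("mean wait:", W.mean(), "mean N_k:", N.mean(), "X(mid):", X(arrival[-1] / 2))
```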
Further Classification
• First-order distribution: F(x; t) = P[X(t) ≤ x]; second-order distribution: F(x1, x2; t1, t2) = P[X(t1) ≤ x1, X(t2) ≤ x2].
• Similarly, we can define the nth-order distribution F(x1, .., xn; t1, .., tn).
• In general, the nth-order distribution is difficult to compute.
Further Classification (contd.)
• Can the nth-order distribution computations be simplified?
• Yes, under some simplifying assumptions:
• Stationarity (strict): F(x; t) = F(x; t + τ) for all τ ⟹ all moments are time-invariant.
• Independence: as a consequence of independence we can define a renewal process, a discrete-time independent process {Xn | n = 1, 2, …} in which X1, X2, .. are iid, non-negative rvs (e.g., times between repair/replacement after a failure). A Markov process removes the independence restriction.
• Markov process: a stochastic process {X(t) | t ∈ T} is Markov if, for any t0 < t1 < … < tn < t, the conditional distribution of X(t) given the past depends only on the most recent value X(tn) (formal statement below).
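The Markov property named above, written out in its standard textbook form (reconstructed, not copied verbatim from the slide):

```latex
\[
P[X(t) \le x \mid X(t_n) = x_n,\ X(t_{n-1}) = x_{n-1},\ \dots,\ X(t_0) = x_0]
  \;=\; P[X(t) \le x \mid X(t_n) = x_n].
\]
```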
Markov Process
• Mostly we will deal with discrete-state Markov processes, i.e., Markov chains.
• In some situations, a Markov process may also exhibit invariance w.r.t. the time origin, i.e., time-homogeneity.
• Time-homogeneity does not imply stationarity: the conditional pdf may be time-invariant while the joint pdf is not.
• A homogeneous Markov process is completely summarized by its current state, independent of how it reached that state.
• Let Y be the time spent in a given state (analyzed on the next slide).
Markov Process – Sojourn Time
• Y is also called the sojourn time.
• For a homogeneous, continuous-time Markov chain, the memoryless property implies that the sojourn time in a state follows an exponential (EXP) distribution (a sketch of the argument follows).
• A semi-Markov process is one in which the sojourn time in a state need not be EXP distributed.
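A sketch of the standard memorylessness argument behind the EXP sojourn-time result (reconstructed from the usual derivation, not copied from the slide):

```latex
\[
P[Y > s + t \mid Y > t] = P[Y > s]
\;\Longrightarrow\;
G_Y(s + t) = G_Y(s)\,G_Y(t),
\qquad G_Y(y) = P[Y > y],
\]
```
and the only right-continuous solution of this functional equation is G_Y(y) = e^(−λy), i.e., Y ~ EXP(λ).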
Renewal Counting Process
• Renewal counting process N(t): the number of renewals (repairs, replacements, arrivals) in (0, t]; a continuous-time process.
• If the time intervals between successive renewals are EXP distributed, then N(t) is a Poisson process.
Stationarity Properties
• Strict-sense stationarity: all finite-order distributions are invariant to a shift of the time origin.
• Stationarity in the mean: E[X(t)] = E[X] (constant).
• In general, if the mean is constant and the autocorrelation E[X(t1)X(t2)] depends only on the difference t2 − t1, then the process is said to be wide-sense stationary (conditions written out below).
• Strict-sense stationarity ⟹ wide-sense stationarity (the converse need not hold).
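The wide-sense stationarity conditions referred to above, written out (standard form; the autocorrelation symbol R_X is a usual convention, not from the original slide text):

```latex
\[
E[X(t)] = \mu \ \text{(constant)},
\qquad
R_X(t_1, t_2) = E[X(t_1)\,X(t_2)] = R_X(t_2 - t_1).
\]
```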
Bernoulli Process
• A sequence of Bernoulli rvs {Yi | i = 1, 2, 3, ..}, Yi = 1 or 0, with the Yi's independent, forms a Bernoulli process.
• E[Yi] = p; E[Yi²] = p; Var[Yi] = p(1 − p).
• Define another stochastic process {Sn | n = 1, 2, 3, ..}, where Sn = Y1 + Y2 + … + Yn (i.e., Sn is the sequence of partial sums).
• Sn = Sn−1 + Yn (recursive form), so
• P[Sn = k | Sn−1 = k] = P[Yn = 0] = 1 − p, and
• P[Sn = k | Sn−1 = k − 1] = P[Yn = 1] = p.
• {Sn | n = 1, 2, 3, ..} forms a binomial process: P[Sn = k] = C(n, k) p^k (1 − p)^(n−k), k = 0, 1, .., n (a small simulation check follows).
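A minimal simulation sketch (not from the original slides; p, n, and the number of trials are arbitrary assumptions) comparing the empirical distribution of the partial sums Sn with the binomial pmf stated above:

```python
import numpy as np
from scipy.stats import binom

rng = np.random.default_rng(2)
p, n, trials = 0.3, 20, 100_000   # assumed parameters, arbitrary for illustration

# Each row is one realization of Y_1..Y_n; S_n is the partial sum of each row
Y = rng.random((trials, n)) < p
S_n = Y.sum(axis=1)

# Compare the empirical pmf of S_n with the binomial pmf C(n,k) p^k (1-p)^(n-k)
for k in range(6):
    emp = np.mean(S_n == k)
    print(k, round(emp, 4), round(binom.pmf(k, n, p), 4))
```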
Binomial Process Properties
• Viewing successes in a Bernoulli process as arrivals, define the discrete rv T1: the number of trials up to & including the 1st success (arrival).
• T1 is the first-order inter-arrival time and has a geometric distribution: P[T1 = i] = p(1 − p)^(i−1), i = 1, 2, …; E[T1] = 1/p; Var[T1] = (1 − p)/p².
• The geometric distribution has the memoryless property: P[T1 = m + i | T1 > m] = p(1 − p)^(i−1) = P[T1 = i].
• Since we treat an arrival as a success in {Sn}, the occupancy time in state Sn is memoryless.
• Generalization to the rth-order inter-arrival time Tr: the number of trials up to & including the rth arrival.
• Distribution of Tr: the r-fold convolution of T1's distribution (see the sketch below).
• Non-homogeneous Bernoulli process: the success probability may vary from trial to trial.
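A minimal numerical sketch (not from the original slides; p, r, and the truncation imax are arbitrary assumptions) computing Tr's pmf as the r-fold convolution of T1's geometric pmf and checking it against the closed form C(i−1, r−1) p^r (1−p)^(i−r), i.e., the negative binomial (Pascal) pmf:

```python
import numpy as np
from math import comb

p, r, imax = 0.3, 3, 40   # assumed parameters, arbitrary for illustration

# pmf of T1 (geometric, truncated to support 1..imax): P[T1 = i] = p (1-p)^(i-1)
i = np.arange(1, imax + 1)
geom = p * (1 - p) ** (i - 1)

# r-fold convolution of T1's pmf gives the pmf of Tr (support starts at r)
pmf_Tr = geom.copy()
for _ in range(r - 1):
    pmf_Tr = np.convolve(pmf_Tr, geom)

# Compare a few terms with the closed form C(i-1, r-1) p^r (1-p)^(i-r)
for k in range(r, r + 5):
    closed = comb(k - 1, r - 1) * p**r * (1 - p) ** (k - r)
    print(k, round(pmf_Tr[k - r], 6), round(closed, 6))
```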
Poisson Process
• A continuous-time, discrete-state process.
• N(t): number of events occurring in (0, t]. Events may be:
• packets arriving at a router port
• incoming telephone calls at a switch
• jobs arriving at a file/compute server
• components failing in the interval (0, t]
• Events occur successively, and the intervals between successive events are iid rvs, each following EXP(λ).
• λ: average arrival rate (1/λ: average time between arrivals), or λ: average failure rate (1/λ: average time between failures).
Poisson Process (contd.)
• N(t) forms a Poisson process provided:
• N(0) = 0;
• events within non-overlapping intervals are independent;
• in a very small interval of length h, at most one event may occur, with probability approximately λh.
• Letting pn(t) = P[N(t) = n], these conditions lead to pn(t) = e^(−λt) (λt)^n / n!, n = 0, 1, 2, …
• Hence, for a Poisson process, inter-arrival times follow the EXP(λ) (memoryless) distribution. Such a Poisson process is non-stationary.
• Mean = Var = λt. What about E[N(t)/t] as t → ∞? Since E[N(t)] = λt, it equals λ (a simulation check follows).
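A minimal simulation sketch (not from the original slides; the rate lam, horizon t, and trial count are arbitrary assumptions) that builds N(t) from iid EXP(λ) inter-arrival times and checks mean = variance = λt and the Poisson pmf:

```python
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(3)
lam, t, trials = 2.0, 5.0, 50_000   # assumed rate and horizon, arbitrary

# Build one realization of N(t): count EXP(lam) inter-arrival times falling in (0, t]
def count_arrivals():
    total, n = 0.0, 0
    while True:
        total += rng.exponential(1 / lam)
        if total > t:
            return n
        n += 1

counts = np.array([count_arrivals() for _ in range(trials)])
print("mean", counts.mean(), "var", counts.var(), "lambda*t", lam * t)

# Compare the empirical pmf with p_n(t) = exp(-lam*t) (lam*t)^n / n!
for n in range(5):
    print(n, round(np.mean(counts == n), 4),
          round(exp(-lam * t) * (lam * t) ** n / factorial(n), 4))
```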
Merged Multiple Poisson Process Streams
• Consider a system in which k independent Poisson streams, with rates λ1, λ2, .., λk, are merged into a single stream.
• The merged stream is again a Poisson process, with rate λ = λ1 + λ2 + .. + λk.
• Proof: use the z-transform (probability generating function) of each stream, letting α = λt (a sketch follows).
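A reconstruction of the generating-function argument the slide alludes to (standard derivation, not copied from the slide):

```latex
\[
G_{N_i(t)}(z) = E\!\left[z^{N_i(t)}\right]
  = \sum_{n=0}^{\infty} e^{-\lambda_i t}\frac{(\lambda_i t)^n}{n!}\,z^n
  = e^{-\lambda_i t (1-z)} .
\]
\[
G_{N(t)}(z) = \prod_{i=1}^{k} G_{N_i(t)}(z)
  = e^{-\left(\sum_i \lambda_i\right) t\,(1-z)},
\]
```
which is the PGF of a Poisson rv with parameter α = λt, where λ = λ1 + .. + λk.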
Decomposing a Poisson Process Stream
• Decompose a Poisson process into multiple streams: each arrival is independently assigned to stream i with probability pi (p1 + p2 + .. + pk = 1).
• N arrivals are decomposed into {n1, n2, .., nk}; N = n1 + n2 + .. + nk.
• The conditional pmf of (n1, .., nk) given N is multinomial.
• Since N itself is Poisson, the unconditional pmf shows that each stream i is a Poisson process with rate λ pi (see the sketch below).
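A reconstruction of the conditional/unconditional pmf computation suggested above (the standard Poisson thinning argument; the routing probabilities p_i are the usual assumption for this decomposition):

```latex
\[
P[n_1,\dots,n_k \mid N = n] = \frac{n!}{n_1!\cdots n_k!}\, p_1^{n_1}\cdots p_k^{n_k},
\qquad
P[N = n] = e^{-\lambda t}\frac{(\lambda t)^n}{n!} .
\]
\[
P[N_1(t) = n_1,\dots,N_k(t) = n_k]
  = \prod_{i=1}^{k} e^{-\lambda p_i t}\,\frac{(\lambda p_i t)^{n_i}}{n_i!},
\]
```
so each sub-stream Ni(t) is Poisson with rate λ pi, and the sub-streams are mutually independent.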
Renewal Counting Process
• Poisson process ⟺ EXP(λ) distributed inter-arrival times.
• What if the EXP assumption is removed? ⟹ a renewal process.
• Renewal process: {Xi | i = 1, 2, …}, where the Xi's are iid (not necessarily EXP) non-negative rvs.
• Xi: time gap between the (i−1)st and the ith event (X1 is the time to the first event).
• Sk = X1 + X2 + .. + Xk: time to the occurrence of the kth event.
• N(t), the renewal counting process, is a discrete-state, continuous-time stochastic process; N(t) denotes the number of renewals in the interval (0, t].
Renewal Counting Process (contd.)
• For N(t), what is P[N(t) = n]?
• {N(t) ≥ n} occurs iff the nth renewal takes place by time t, i.e., Sn ≤ t (account for the equality).
• If the nth renewal occurs at time tn < t, more renewals may still occur in the interval (tn, t].
• Hence P[N(t) = n] = P[Sn ≤ t] − P[Sn+1 ≤ t] (written out below).
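The resulting expression, reconstructed from the standard renewal-theory argument (Fn denotes the n-fold convolution of the inter-renewal CDF, matching the notation used on the renewal-density slide below):

```latex
\[
P[N(t) = n] = P[S_n \le t] - P[S_{n+1} \le t] = F_n(t) - F_{n+1}(t),
\qquad
F_n(t) = P[X_1 + \cdots + X_n \le t].
\]
```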
Renewal Counting Process Expectation
• Let m(t) = E[N(t)]. Then m(t) is the mean number of renewals (arrivals) in (0, t]; m(t) is called the renewal function.
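A reconstruction of the usual expressions for the renewal function (standard results, stated here under the Fn notation introduced above):

```latex
\[
m(t) = E[N(t)] = \sum_{n=1}^{\infty} n\,P[N(t) = n]
      = \sum_{n=1}^{\infty} P[N(t) \ge n]
      = \sum_{n=1}^{\infty} F_n(t),
\]
\[
\text{and } m(t) \text{ satisfies the renewal equation } \;
m(t) = F(t) + \int_0^t m(t - x)\, dF(x).
\]
```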
Renewal Density Function
• Renewal density function: d(t) = dm(t)/dt.
• For example, if the renewal interval X is EXP(λ), then d(t) = λ, t ≥ 0, and m(t) = λt, t ≥ 0.
• P[N(t) = n] = Fn(t) − Fn+1(t), where Fn(t) is the n-stage Erlang(λ) CDF; this difference works out to e^(−λt) (λt)^n / n!, i.e., the Poisson process pmf (a numerical check follows).
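A minimal numerical check (not from the original slides; the rate lam and time t are arbitrary assumptions) that for EXP(λ) renewal intervals the difference of Erlang CDFs equals the Poisson pmf:

```python
from math import exp, factorial
from scipy.stats import gamma

lam, t = 2.0, 3.0   # assumed rate and time point, arbitrary for illustration

# F_n(t): n-stage Erlang(lam) CDF = Gamma(shape=n, scale=1/lam) CDF; F_0(t) = 1
def F(n, x):
    return gamma.cdf(x, a=n, scale=1 / lam) if n > 0 else 1.0

# Check that F_n(t) - F_{n+1}(t) equals the Poisson pmf e^{-lam t}(lam t)^n / n!
for n in range(6):
    diff = F(n, t) - F(n + 1, t)
    pois = exp(-lam * t) * (lam * t) ** n / factorial(n)
    print(n, round(diff, 6), round(pois, 6))
```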
Availability Analysis
• Availability is defined as the ability of a system to provide the desired service.
• If there are no repairs/replacements, availability = reliability.
• If repairs are possible, then the above definition is pessimistic.
• The system alternates between up times T1, T2, .. and down (repair) times D1, D2, ..; a failure-repair cycle Xi forms a renewal interval.
• MTBF = E[Di + Ti+1] = E[Ti + Di] = E[Xi] = MTTF + MTTR.
Availability Analysis (contd.)
• Two mutually exclusive situations leave the system up at time t:
• The system does not fail before time t ⟹ this contributes R(t).
• The system fails, but a repair (renewal) is completed at some time x ≤ t and the renewed system stays up through t.
• Therefore, A(t) is the sum of these two probabilities.
Availability Expression
• dA(x): incremental availability.
• dA(x) = P(a renewal (repair completion) occurs in the interval (x, x + dx] and the renewed lifetime exceeds t − x).
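Putting the two pieces together gives the usual instantaneous-availability renewal expression (a reconstruction; here m(x) is the renewal function of the repair-completion process, so dm(x) is the expected number of repair completions in (x, x + dx]):

```latex
\[
A(t) = R(t) + \int_0^t R(t - x)\, dm(x).
\]
```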
Availability Expression (contd.)
• A(t) can also be expressed in the Laplace domain (see the sketch below).
• Since R(t) = 1 − W(t), we have L_R(s) = 1/s − L_W(s) = 1/s − L_w(s)/s, where W (with density w) is the lifetime distribution.
• What happens when t becomes very large? R(t) → 0; A(t), however, does not vanish and instead approaches a steady-state value (next slide).
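A hedged reconstruction of the Laplace-domain expression, assuming the alternating-renewal structure above with lifetime density w(t) and repair-time density g(t) (the same densities used in the example slide at the end):

```latex
\[
L_A(s) = L_R(s)\,\bigl[1 + L_d(s)\bigr],
\qquad
L_d(s) = \frac{L_w(s)\,L_g(s)}{1 - L_w(s)\,L_g(s)},
\]
\[
\text{so} \quad
L_A(s) = \frac{L_R(s)}{1 - L_w(s)\,L_g(s)}
       = \frac{1/s - L_w(s)/s}{1 - L_w(s)\,L_g(s)} .
\]
```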
Availability, MTTF and MTTR
• Steady-state availability: A = lim (t → ∞) A(t) = lim (s → 0) s·L_A(s) (by the final-value theorem).
• For small values of s, L_w(s) ≈ 1 − s·MTTF and L_g(s) ≈ 1 − s·MTTR, which leads to A = MTTF / (MTTF + MTTR) (worked out below).
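A reconstruction of the small-s expansion the slide refers to, assuming the Laplace-domain form of A(t) sketched earlier:

```latex
\[
s\,L_A(s) = \frac{1 - L_w(s)}{1 - L_w(s)\,L_g(s)}
\approx \frac{s\,\mathrm{MTTF}}{1 - (1 - s\,\mathrm{MTTF})(1 - s\,\mathrm{MTTR})}
\xrightarrow[s \to 0]{} \frac{\mathrm{MTTF}}{\mathrm{MTTF} + \mathrm{MTTR}},
\]
\[
\text{hence } A = \frac{\mathrm{MTTF}}{\mathrm{MTTF} + \mathrm{MTTR}} = \frac{\mathrm{MTTF}}{\mathrm{MTBF}}.
\]
```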
Availability Example
• Assuming EXP densities for both the failure and repair times, i.e., w(t) = λe^(−λt) and g(t) = μe^(−μt), the Laplace-domain expression can be inverted in closed form (sketch below).
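A hedged reconstruction of the example's closed form, obtained by substituting L_w(s) = λ/(s + λ) and L_g(s) = μ/(s + μ) into the Laplace-domain availability sketched above and inverting (the final expression is the standard two-state up/down result):

```latex
\[
L_A(s) = \frac{1/s - \lambda/[s(s+\lambda)]}{1 - \dfrac{\lambda}{s+\lambda}\cdot\dfrac{\mu}{s+\mu}}
       = \frac{s + \mu}{s\,(s + \lambda + \mu)},
\]
\[
A(t) = \frac{\mu}{\lambda + \mu} + \frac{\lambda}{\lambda + \mu}\, e^{-(\lambda + \mu)t},
\qquad
A = \lim_{t \to \infty} A(t) = \frac{\mu}{\lambda + \mu}
  = \frac{\mathrm{MTTF}}{\mathrm{MTTF} + \mathrm{MTTR}},
\]
```
with MTTF = 1/λ and MTTR = 1/μ.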