
Transformation Techniques

Transformation Techniques. In probability theory, various transformation techniques are used to simplify moment calculations. We will discuss four of these functions: the probability generating function, the moment generating function, the characteristic function, and the Laplace transform of the probability density function.


Presentation Transcript


  1. Transformation Techniques
  • In probability theory, various transformation techniques are used to simplify moment calculations. We will discuss four of these functions:
  • Probability Generating Function
  • Moment Generating Function
  • Characteristic Function
  • Laplace Transform of the probability density function

  2. Probability Generating Function
  • A tool that simplifies computations for problems involving non-negative integer-valued discrete random variables.
  • Let X be a non-negative integer-valued random variable with $P(X = k) = p_k$. The probability generating function (PGF) of X is defined by
  $G_X(z) = E[z^X] = \sum_{k=0}^{\infty} p_k z^k = p_0 + p_1 z + p_2 z^2 + \dots + p_k z^k + \dots$
  • z is a complex number with $|z| \le 1$, which guarantees convergence of the series.
  • $G_X(z)$ is nothing more than the z-transform of the sequence $p_k$ (with $z$ in place of $z^{-1}$).
  • $G_X(1) = \sum_k p_k = 1$
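To make this concrete, here is a minimal Python sketch (assuming SymPy is available; the three-point pmf is a made-up example): the PGF is just a polynomial whose coefficient of $z^k$ is $p_k$, with $G(1) = 1$.

```python
# A minimal sketch: a PGF as a polynomial in z whose coefficients are the p_k.
import sympy as sp

z = sp.symbols('z')
pk = [sp.Rational(1, 5), sp.Rational(1, 2), sp.Rational(3, 10)]  # hypothetical p_0, p_1, p_2

G = sum(p * z**k for k, p in enumerate(pk))   # G(z) = sum_k p_k z^k
print(G)                                      # 3*z**2/10 + z/2 + 1/5
print(G.subs(z, 1))                           # 1, since the probabilities sum to one
print(sp.Poly(G, z).all_coeffs()[::-1])       # recover [p_0, p_1, p_2]
```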

  3. Generating Functions
  • Let K be a non-negative integer-valued random variable with probability distribution $p_j = P[K = j]$ for $j = 0, 1, 2, \dots$
  • $g(z) = p_0 + p_1 z + p_2 z^2 + p_3 z^3 + \dots$, the power series with probability $p_j$ as the coefficient of $z^j$, is the probability generating function of the random variable K.
  • A few properties: $g(1) = 1$, since $\sum_j p_j = 1$; z is a complex number, and the series converges for $|z| \le 1$.
  • Expected value: $E[K] = \sum_{j=0}^{\infty} j\, p_j$. Since $\frac{d}{dz} g(z) = \sum_{j=1}^{\infty} j\, p_j z^{j-1}$, evaluating at $z = 1$ gives $E[K] = g^{(1)}(1)$.
  • Similarly, $V[K] = g^{(2)}(1) + g^{(1)}(1) - [g^{(1)}(1)]^2$
  • Reference: Robert B. Cooper, Introduction to Queueing Theory
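A quick check of these derivative formulas (SymPy assumed), using the Poisson PGF $g(z) = e^{-a(1-z)}$ that is derived on a later slide; both the mean and the variance should come out equal to $a$:

```python
# A minimal sketch: verify E[K] = g'(1) and V[K] = g''(1) + g'(1) - g'(1)^2.
import sympy as sp

z = sp.symbols('z')
a = sp.symbols('a', positive=True)
g = sp.exp(-a * (1 - z))                 # Poisson PGF (derived on a later slide)

g1 = sp.diff(g, z).subs(z, 1)            # g'(1)
g2 = sp.diff(g, z, 2).subs(z, 1)         # g''(1)
print(g1)                                # a  -> E[K]
print(sp.simplify(g2 + g1 - g1**2))      # a  -> V[K]
```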

  4. Moment Generating Function
  • $m_g(t)$, the moment generating function, is the expected value of the function $e^{tX}$, where t is a real variable and X is the random variable:
  $m_g(t) = E[e^{tX}] = \sum_{X_i \in R_X} p(X_i)\, e^{t X_i}$ (discrete case) $= \int_{R_X} f(x)\, e^{tx}\, dx$ (continuous case)
  • If $m_g(t)$ exists for all real values of t in some small interval $(-\delta, \delta)$, $\delta > 0$, about the origin, it can be shown that the probability distribution function can be obtained from $m_g(t)$. We assume $m_g(t)$ exists in such a small region about the origin.

  5. Moment Generating Function (2)
  • $e^{tX} = 1 + tX + \frac{t^2 X^2}{2!} + \frac{t^3 X^3}{3!} + \dots$
  • Assume X is a continuous random variable. Then
  $m_g(t) = E[e^{tX}] = \int_{R_X} e^{tx} f(x)\, dx = \int_{R_X} \sum_{i=0}^{\infty} \frac{t^i x^i}{i!}\, f(x)\, dx = \sum_{i=0}^{\infty} \frac{t^i}{i!} \int_{R_X} x^i f(x)\, dx$
  $= \sum_{i=0}^{\infty} \frac{t^i}{i!}\, E[X^i] = E[X^0] + t\, E[X^1] + \frac{t^2}{2!}\, E[X^2] + \dots$

  6. Moment Generating Function (3)
  • $m_g(t) = E[X^0] + t\, E[X^1] + \frac{t^2}{2!}\, E[X^2] + \dots$
  • $m_g^{(1)}(t) = E[X^1] + t\, E[X^2] + \frac{t^2}{2!}\, E[X^3] + \dots$
  • $m_g^{(2)}(t) = E[X^2] + t\, E[X^3] + \frac{t^2}{2!}\, E[X^4] + \dots$
  • At t = 0: $m_g^{(1)}(0) = E[X^1]$ and $m_g^{(2)}(0) = E[X^2]$
  • $Var[X] = E[X^2] - [E[X]]^2 = m_g^{(2)}(0) - [m_g^{(1)}(0)]^2$
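As a sanity check, a minimal SymPy sketch (the standard normal density is chosen purely for illustration): compute the MGF by integration, then differentiate at t = 0 to recover the moments.

```python
# A minimal sketch: MGF of a standard normal variable, moments at t = 0.
import sympy as sp

t, x = sp.symbols('t x', real=True)
pdf = sp.exp(-x**2 / 2) / sp.sqrt(2 * sp.pi)       # N(0, 1) density

mgf = sp.simplify(sp.integrate(sp.exp(t * x) * pdf, (x, -sp.oo, sp.oo)))
print(mgf)                                         # exp(t**2/2)

EX = sp.diff(mgf, t).subs(t, 0)                    # m_g^(1)(0) = E[X] = 0
EX2 = sp.diff(mgf, t, 2).subs(t, 0)                # m_g^(2)(0) = E[X^2] = 1
print(EX, EX2, EX2 - EX**2)                        # variance = 1
```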

  7. Characteristic Function
  • The characteristic function of a random variable X is
  $\phi_X(u) = E[e^{juX}] = \int_{-\infty}^{\infty} e^{jux} f_X(x)\, dx$, where $j = \sqrt{-1}$ and u is an arbitrary real variable.
  • Note: except for the sign of the exponent, the characteristic function is the Fourier transform of the pdf of X.
  • $\phi_X(u) = \int_{-\infty}^{\infty} f_X(x)\left[1 + jux + \frac{(jux)^2}{2!} + \frac{(jux)^3}{3!} + \dots\right] dx = 1 + ju\, E[X] + \frac{(ju)^2}{2!}\, E[X^2] + \frac{(ju)^3}{3!}\, E[X^3] + \dots$
  • Let u = 0. Then $\phi_X(0) = 1$, and
  $\phi_X^{(1)}(0) = \left.\frac{d\phi_X(u)}{du}\right|_{u=0} = j\, E[X]$, $\phi_X^{(2)}(0) = \left.\frac{d^2\phi_X(u)}{du^2}\right|_{u=0} = j^2 E[X^2]$
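The same idea works here; a minimal SymPy sketch (again using the standard normal purely as an illustration), recovering moments as $E[X^n] = j^{-n}\phi_X^{(n)}(0)$:

```python
# A minimal sketch: characteristic function of N(0, 1) and its moments.
import sympy as sp

u, x = sp.symbols('u x', real=True)
pdf = sp.exp(-x**2 / 2) / sp.sqrt(2 * sp.pi)       # N(0, 1) density

phi = sp.simplify(sp.integrate(sp.exp(sp.I * u * x) * pdf, (x, -sp.oo, sp.oo)))
print(phi)                                         # exp(-u**2/2)

for n in (1, 2, 3, 4):
    moment = sp.diff(phi, u, n).subs(u, 0) / sp.I**n   # j^{-n} phi^(n)(0)
    print(n, sp.simplify(moment))                      # 0, 1, 0, 3
```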

  8. Laplace Transform
  • Let the CDF of a traffic arrival process be A(x), where X is the random variable for the inter-arrival time between two customers: $A(x) = P[X \le x]$.
  • The pdf (probability density function) is denoted by a(x).
  • The Laplace transform of a(x) is denoted by $A^*(s)$ and is given by $A^*(s) = E[e^{-sX}] = \int_{-\infty}^{\infty} e^{-sx} a(x)\, dx$
  • Since such random variables deal with non-negative values, the transform becomes $A^*(s) = \int_{0}^{\infty} e^{-sx} a(x)\, dx$
  • The same technique used for the moment generating function or the characteristic function shows that $A^{*(n)}(0) = (-1)^n E[X^n]$
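A minimal SymPy sketch of the moment rule, using a uniform(0, 1) inter-arrival density as a made-up example; since $A^*(s)$ is 0/0 at s = 0 here, the derivatives are evaluated as limits:

```python
# A minimal sketch: A*^(n)(0) = (-1)^n E[X^n] for a uniform(0, 1) pdf.
import sympy as sp

s, x = sp.symbols('s x', positive=True)

# a(x) = 1 on [0, 1]: A*(s) = integral_0^1 e^{-sx} dx = (1 - e^{-s})/s
Astar = sp.integrate(sp.exp(-s * x), (x, 0, 1))

for n in (1, 2, 3):
    moment = (-1)**n * sp.limit(sp.diff(Astar, s, n), s, 0)
    print(n, moment)        # 1/2, 1/3, 1/4 = E[X^n] for uniform(0, 1)
```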

  9. Example
  • For a continuous random variable, the pdf is given as follows:
  $f_X(x) = \lambda e^{-\lambda x}$ for $x > 0$, and $f_X(x) = 0$ otherwise.
  • Laplace transform: $A^*(s) = \dfrac{\lambda}{\lambda + s}$
  • Characteristic function: $\phi_X(u) = \dfrac{\lambda}{\lambda - ju}$
  • Moment generating function: $m_g(v) = \dfrac{\lambda}{\lambda - v}$
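These three entries can be derived by direct integration; a minimal SymPy sketch (conds='none' drops the convergence conditions, e.g. $v < \lambda$ for the MGF):

```python
# A minimal sketch: the three transforms of f(x) = lam*exp(-lam*x), x > 0.
import sympy as sp

x, s, u, v = sp.symbols('x s u v', positive=True)
lam = sp.symbols('lambda', positive=True)
f = lam * sp.exp(-lam * x)                    # the pdf above

laplace = sp.integrate(sp.exp(-s * x) * f, (x, 0, sp.oo))
charfun = sp.integrate(sp.exp(sp.I * u * x) * f, (x, 0, sp.oo), conds='none')
mgf = sp.integrate(sp.exp(v * x) * f, (x, 0, sp.oo), conds='none')

print(sp.simplify(laplace))   # lambda/(lambda + s)
print(sp.simplify(charfun))   # lambda/(lambda - I*u)
print(sp.simplify(mgf))       # lambda/(lambda - v)
```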

  10. Expected Value
  • Laplace transform: $E[X] = (-1) A^{*(1)}(0) = -\dfrac{d}{ds}\left[\dfrac{\lambda}{\lambda + s}\right]_{s=0} = -\left[\dfrac{-\lambda}{(\lambda + s)^2}\right]_{s=0} = \dfrac{\lambda}{\lambda^2} = \dfrac{1}{\lambda}$
  • Characteristic function: $E[X] = j^{-1}\phi_X^{(1)}(0) = j^{-1}\dfrac{d}{du}\left[\dfrac{\lambda}{\lambda - ju}\right]_{u=0} = j^{-1}\left[\dfrac{\lambda j}{(\lambda - ju)^2}\right]_{u=0} = \dfrac{\lambda}{\lambda^2} = \dfrac{1}{\lambda}$
  • Moment generating function: $E[X] = m_g^{(1)}(0) = \dfrac{d}{dv}\left[\dfrac{\lambda}{\lambda - v}\right]_{v=0} = \left[\dfrac{\lambda}{(\lambda - v)^2}\right]_{v=0} = \dfrac{\lambda}{\lambda^2} = \dfrac{1}{\lambda}$

  11. Variance
  • Laplace transform: $E[X^2] = (-1)^2 A^{*(2)}(0) = \dfrac{d^2}{ds^2}\left[\dfrac{\lambda}{\lambda + s}\right]_{s=0} = \left[\dfrac{2\lambda}{(\lambda + s)^3}\right]_{s=0} = \dfrac{2\lambda}{\lambda^3} = \dfrac{2}{\lambda^2}$
  • $Var[X] = E[X^2] - [E[X]]^2 = \dfrac{2}{\lambda^2} - \left[\dfrac{1}{\lambda}\right]^2 = \dfrac{1}{\lambda^2}$
  • Characteristic function: $E[X^2] = j^{-2}\phi_X^{(2)}(0) = j^{-2}\dfrac{d^2}{du^2}\left[\dfrac{\lambda}{\lambda - ju}\right]_{u=0} = j^{-2}\left[\dfrac{2\lambda j^2}{(\lambda - ju)^3}\right]_{u=0} = \dfrac{2\lambda}{\lambda^3} = \dfrac{2}{\lambda^2}$
  • Moment generating function: $E[X^2] = m_g^{(2)}(0) = \dfrac{d^2}{dv^2}\left[\dfrac{\lambda}{\lambda - v}\right]_{v=0} = \left[\dfrac{2\lambda}{(\lambda - v)^3}\right]_{v=0} = \dfrac{2\lambda}{\lambda^3} = \dfrac{2}{\lambda^2}$
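A minimal SymPy sketch verifying these two slides: each of the three transforms of the exponential distribution yields $E[X] = 1/\lambda$ and $E[X^2] = 2/\lambda^2$, hence $Var[X] = 1/\lambda^2$.

```python
# A minimal sketch: moments of the exponential from all three transforms.
import sympy as sp

s, u, v = sp.symbols('s u v', real=True)
lam = sp.symbols('lambda', positive=True)

Astar = lam / (lam + s)            # Laplace transform
phi = lam / (lam - sp.I * u)       # characteristic function
mgf = lam / (lam - v)              # moment generating function

EX = [-sp.diff(Astar, s).subs(s, 0),                 # (-1) A*'(0)
      sp.diff(phi, u).subs(u, 0) / sp.I,             # j^{-1} phi'(0)
      sp.diff(mgf, v).subs(v, 0)]                    # m_g'(0)
EX2 = [sp.diff(Astar, s, 2).subs(s, 0),              # (-1)^2 A*''(0)
       sp.diff(phi, u, 2).subs(u, 0) / sp.I**2,      # j^{-2} phi''(0)
       sp.diff(mgf, v, 2).subs(v, 0)]                # m_g''(0)

print([sp.simplify(e) for e in EX])                  # 1/lambda, three times
print([sp.simplify(e) for e in EX2])                 # 2/lambda**2, three times
print(sp.simplify(EX2[0] - EX[0]**2))                # Var[X] = 1/lambda**2
```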

  12. Sum of Random Variables
  • $K_1$ and $K_2$ are two independent random variables with generating functions $g_1(z)$ and $g_2(z)$. Find the probability distribution $P\{K = k\}$, where $K = K_1 + K_2$.
  • $P\{K = k\} = \sum_{j=0}^{k} P\{K_1 = j\}\, P\{K_2 = k - j\}$
  • $g_1(z) = \sum_{j=0}^{\infty} P\{K_1 = j\}\, z^j$ and $g_2(z) = \sum_{j=0}^{\infty} P\{K_2 = j\}\, z^j$
  • $g_1(z)\, g_2(z) = \sum_{k=0}^{\infty} \left[\sum_{j=0}^{k} P\{K_1 = j\}\, P\{K_2 = k - j\}\right] z^k$
  • If K has generating function g(z), then $g(z) = \sum_{k=0}^{\infty} P\{K = k\}\, z^k = \sum_{k=0}^{\infty} \left[\sum_{j=0}^{k} P\{K_1 = j\}\, P\{K_2 = k - j\}\right] z^k$, and therefore $g(z) = g_1(z)\, g_2(z)$.
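Numerically, multiplying generating functions is polynomial multiplication, which is exactly the convolution of the pmfs; a minimal NumPy sketch with two made-up four-point uniform pmfs:

```python
# A minimal sketch: PGF product = convolution of pmfs, done two ways.
import numpy as np

p1 = np.full(4, 0.25)     # pmf of K1, uniform on {0, 1, 2, 3}
p2 = np.full(4, 0.25)     # pmf of K2

# Way 1: P{K = k} = sum_j P{K1 = j} P{K2 = k - j}  (direct convolution)
pk_conv = np.convolve(p1, p2)

# Way 2: multiply the PGFs as polynomials and read off the coefficients
g1 = np.polynomial.Polynomial(p1)    # g1(z) = sum_j p1[j] z^j
g2 = np.polynomial.Polynomial(p2)
pk_poly = (g1 * g2).coef

print(np.allclose(pk_conv, pk_poly))  # True
print(pk_conv)                        # triangular pmf on {0, ..., 6}
```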

  13. Example: Bernoulli Distribution
  • Bernoulli distribution: X = 0 with probability q, X = 1 with probability p, where p + q = 1.
  • $g(z) = q + pz$, so $g'(1) = p$ and $g''(1) = 0$.
  • $E[X] = g'(1) = p$ and $V[X] = g''(1) + g'(1) - [g'(1)]^2 = p - p^2 = p(1 - p) = pq$
  • A coin is tossed n times, with $X_j = 0$ for tails and $X_j = 1$ for heads. What is the probability of k heads in n tosses?
  • $S_n$ is the sum of n independent Bernoulli random variables: $S_n = X_1 + X_2 + \dots + X_n$.
  • The GF of one toss is $g(z) = q + pz$, so the GF of $S_n$ is
  $\sum_{k=0}^{\infty} P\{S_n = k\}\, z^k = g(z)\, g(z) \cdots g(z) = [g(z)]^n = (q + pz)^n = \sum_{k=0}^{n} \binom{n}{k} (pz)^k q^{n-k}$
  • Binomial distribution: $P\{S_n = k\} = \binom{n}{k} p^k q^{n-k}$ for $k = 0, \dots, n$, and $P\{S_n = k\} = 0$ for $k > n$.
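A minimal SymPy sketch (n = 4 is an arbitrary choice) expanding the PGF and reading the binomial probabilities off the coefficients of $z^k$:

```python
# A minimal sketch: the coefficients of (q + p z)^n are C(n, k) p^k q^(n-k).
import sympy as sp

z, p = sp.symbols('z p')
q, n = 1 - p, 4

G = sp.expand((q + p * z)**n)                 # PGF of S_n, a polynomial in z
for k in range(n + 1):
    coeff = G.coeff(z, k)                     # coefficient of z^k = P{S_n = k}
    expected = sp.binomial(n, k) * p**k * q**(n - k)
    print(k, sp.simplify(coeff - expected) == 0)   # True for every k
```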

  14. Example: Poisson Distribution
  • Poisson distribution: $P[N(t) = j] = \dfrac{(\lambda t)^j}{j!}\, e^{-\lambda t}$ for $j = 0, 1, 2, \dots$
  • Generating function: $g(z) = \sum_{j=0}^{\infty} \dfrac{(\lambda t)^j}{j!}\, e^{-\lambda t} z^j = e^{-\lambda t} \sum_{j=0}^{\infty} \dfrac{(\lambda t z)^j}{j!} = e^{-\lambda t}\, e^{\lambda t z} = e^{-\lambda t(1-z)}$
  • Expectation: $g'(z) = \lambda t\, e^{-\lambda t(1-z)}$, so $E[N(t)] = g'(1) = \lambda t$
  • Variance: $g''(z) = (\lambda t)^2 e^{-\lambda t(1-z)}$ and $g''(1) = (\lambda t)^2$, so $V[N(t)] = g''(1) + g'(1) - [g'(1)]^2 = \lambda t$
  • Sum of Poisson distributions with rates $\lambda_1$ and $\lambda_2$: $g(z) = e^{-\lambda_1 t(1-z)}\, e^{-\lambda_2 t(1-z)} = e^{-(\lambda_1 + \lambda_2) t(1-z)}$, a Poisson distribution with $\lambda = \lambda_1 + \lambda_2$.
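A minimal SymPy sketch deriving the Poisson PGF from its pmf and confirming that the product of two Poisson PGFs is again of Poisson form:

```python
# A minimal sketch: Poisson PGF by summation, and closure under addition.
import sympy as sp

z = sp.symbols('z')
j = sp.symbols('j', integer=True, nonnegative=True)
t, l1, l2 = sp.symbols('t lambda_1 lambda_2', positive=True)

# g1(z) = sum_j [(l1 t)^j / j!] e^{-l1 t} z^j
g1 = sp.summation((l1 * t)**j / sp.factorial(j) * sp.exp(-l1 * t) * z**j,
                  (j, 0, sp.oo))
print(sp.simplify(g1))            # exp(lambda_1*t*(z - 1)), i.e. e^{-l1 t (1-z)}

g2 = sp.exp(-l2 * t * (1 - z))
print(sp.simplify(g1 * g2))       # exp((lambda_1 + lambda_2)*t*(z - 1))
```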

  15. M/M/1 System: Use of the GF for Probabilities
  • Birth and death equations:
  $0 = -(\lambda + \mu)\, p_n + \mu\, p_{n+1} + \lambda\, p_{n-1}$ for $n \ge 1$
  $0 = -\lambda p_0 + \mu p_1$
  • Rearranging: $p_{n+1} = \dfrac{\lambda + \mu}{\mu}\, p_n - \dfrac{\lambda}{\mu}\, p_{n-1}$ and $p_1 = \dfrac{\lambda}{\mu}\, p_0$
  • If $\rho = \lambda/\mu$: $p_{n+1} = (\rho + 1)\, p_n - \rho\, p_{n-1}$ for $n \ge 1$, and $p_1 = \rho p_0$
  • Use the GF to solve this equation. Multiply by $z^n$:
  $z^n p_{n+1} = (\rho + 1)\, z^n p_n - \rho\, z^n p_{n-1}$, i.e. $z^{-1} p_{n+1} z^{n+1} = (\rho + 1)\, p_n z^n - \rho z\, p_{n-1} z^{n-1}$
  • Summing over $n \ge 1$: $z^{-1} \sum_{n=1}^{\infty} p_{n+1} z^{n+1} = (\rho + 1) \sum_{n=1}^{\infty} p_n z^n - \rho z \sum_{n=1}^{\infty} p_{n-1} z^{n-1}$
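Before solving, one can check the answer the GF will produce; a minimal SymPy sketch verifying that $p_n = (1-\rho)\rho^n$ (found on the next slide) satisfies this recurrence:

```python
# A minimal sketch: p_{n+1} = (rho + 1) p_n - rho p_{n-1} for p_n = (1-rho)rho^n.
import sympy as sp

n = sp.symbols('n', integer=True, positive=True)
rho = sp.symbols('rho', positive=True)

p = lambda k: (1 - rho) * rho**k       # candidate solution from the next slide

residual = p(n + 1) - ((rho + 1) * p(n) - rho * p(n - 1))
print(sp.simplify(residual))           # 0: the recurrence holds for n >= 1
```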

  16. GF for Probabilities
  • Adding the missing boundary terms and shifting indices, with $P(z) = \sum_{n=0}^{\infty} p_n z^n$:
  $\sum_{n=1}^{\infty} p_{n+1} z^{n+1} = P(z) - p_1 z - p_0$, $\sum_{n=1}^{\infty} p_n z^n = P(z) - p_0$, $\sum_{n=1}^{\infty} p_{n-1} z^{n-1} = P(z)$
  • Hence $z^{-1}[P(z) - p_1 z - p_0] = (\rho + 1)[P(z) - p_0] - \rho z P(z)$
  • Substituting $p_1 = \rho p_0$:
  $z^{-1}[P(z) - \rho p_0 z - p_0] = (\rho + 1)[P(z) - p_0] - \rho z P(z)$
  $z^{-1} P(z) - \rho p_0 - z^{-1} p_0 = \rho P(z) - \rho p_0 + P(z) - p_0 - \rho z P(z)$
  • Solving for P(z): $P(z) = \dfrac{p_0}{1 - \rho z}$
  • To find $p_0$ we use the boundary condition $P(1) = 1$: $P(1) = \dfrac{p_0}{1 - \rho} = 1$, so $p_0 = 1 - \rho$ and $P(z) = \dfrac{1 - \rho}{1 - \rho z}$
  • Since $\dfrac{1}{1 - \rho z} = 1 + \rho z + (\rho z)^2 + \dots$, we have $P(z) = \sum_{n=0}^{\infty} (1 - \rho)\, \rho^n z^n$, and therefore $p_n = (1 - \rho)\, \rho^n$
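A minimal SymPy sketch expanding $P(z) = (1-\rho)/(1-\rho z)$ as a power series to confirm $p_n = (1-\rho)\rho^n$, plus the familiar M/M/1 mean $P'(1) = \rho/(1-\rho)$:

```python
# A minimal sketch: series coefficients of P(z) give the M/M/1 probabilities.
import sympy as sp

z = sp.symbols('z')
rho = sp.symbols('rho', positive=True)

P = (1 - rho) / (1 - rho * z)
series = sp.expand(sp.series(P, z, 0, 5).removeO())

for n in range(5):
    pn = series.coeff(z, n)
    print(n, sp.simplify(pn - (1 - rho) * rho**n) == 0)   # True: p_n = (1-rho)rho^n

print(sp.simplify(sp.diff(P, z).subs(z, 1)))   # mean number in system: rho/(1 - rho)
```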
