Use of moment generating functions
• Using the moment generating functions of X, Y, Z, …, determine the moment generating function of W = h(X, Y, Z, …).
• Identify the distribution of W from its moment generating function.
• This procedure works well for sums, linear combinations, etc.
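When W is a sum of independent random variables, the first step is immediate, because independence lets the expectation of the product factor; in LaTeX:

\[
M_W(t) = E\!\left[e^{t(X+Y)}\right]
= E\!\left[e^{tX}\right]E\!\left[e^{tY}\right]
= M_X(t)\,M_Y(t),
\qquad W = X + Y,\ X, Y \text{ independent.}
\]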
Theorem Let X and Y denote independent random variables, each having a gamma distribution with parameters (λ, α1) and (λ, α2) respectively. Then W = X + Y has a gamma distribution with parameters (λ, α1 + α2). Proof:
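A sketch of the standard MGF argument, using the gamma moment generating function M(t) = (λ/(λ − t))^α for t < λ:

\begin{align*}
M_W(t) = M_X(t)\,M_Y(t)
= \left(\frac{\lambda}{\lambda - t}\right)^{\alpha_1}
  \left(\frac{\lambda}{\lambda - t}\right)^{\alpha_2}
= \left(\frac{\lambda}{\lambda - t}\right)^{\alpha_1 + \alpha_2},
\qquad t < \lambda .
\end{align*}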
Recognizing that this is the moment generating function of the gamma distribution with parameters (λ, α1 + α2), we conclude that W = X + Y has a gamma distribution with parameters (λ, α1 + α2).
Theorem (extension to n RVs) Let x1, x2, …, xn denote n independent random variables, each having a gamma distribution with parameters (λ, αi), i = 1, 2, …, n. Then W = x1 + x2 + … + xn has a gamma distribution with parameters (λ, α1 + α2 + … + αn). Proof:
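The same argument as before, now with n independent factors:

\begin{align*}
M_W(t) = \prod_{i=1}^{n} M_{x_i}(t)
= \prod_{i=1}^{n}\left(\frac{\lambda}{\lambda - t}\right)^{\alpha_i}
= \left(\frac{\lambda}{\lambda - t}\right)^{\alpha_1 + \alpha_2 + \cdots + \alpha_n},
\qquad t < \lambda .
\end{align*}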
Therefore, recognizing that this is the moment generating function of the gamma distribution with parameters (λ, α1 + α2 + … + αn), we conclude that W = x1 + x2 + … + xn has a gamma distribution with parameters (λ, α1 + α2 + … + αn).
Theorem Suppose that x is a random variable having a gamma distribution with parameters (λ, α). Then, for a > 0, W = ax has a gamma distribution with parameters (λ/a, α). Proof:
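A sketch of the MGF computation for the scaled variable:

\begin{align*}
M_W(t) = E\!\left[e^{t(ax)}\right] = M_x(at)
= \left(\frac{\lambda}{\lambda - at}\right)^{\alpha}
= \left(\frac{\lambda/a}{\lambda/a - t}\right)^{\alpha},
\qquad t < \lambda/a ,
\end{align*}

which is the moment generating function of the gamma distribution with parameters (λ/a, α).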
Special Cases
• Let X and Y be independent random variables having χ² distributions with ν1 and ν2 degrees of freedom respectively; then X + Y has a χ² distribution with ν1 + ν2 degrees of freedom.
• Let x1, x2, …, xn be independent random variables having χ² distributions with ν1, ν2, …, νn degrees of freedom respectively; then x1 + x2 + … + xn has a χ² distribution with ν1 + … + νn degrees of freedom.
Both of these properties follow from the fact that a χ² random variable with ν degrees of freedom is a gamma random variable with λ = ½ and α = ν/2.
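Both follow directly from the addition theorem for gamma random variables; for the first case:

\[
X \sim \text{gamma}\!\left(\tfrac{1}{2}, \tfrac{\nu_1}{2}\right),\quad
Y \sim \text{gamma}\!\left(\tfrac{1}{2}, \tfrac{\nu_2}{2}\right)
\;\Longrightarrow\;
X + Y \sim \text{gamma}\!\left(\tfrac{1}{2}, \tfrac{\nu_1 + \nu_2}{2}\right),
\]

which is the χ² distribution with ν1 + ν2 degrees of freedom.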
Recall If z has a Standard Normal distribution then z² has a χ² distribution with 1 degree of freedom. Thus if z1, z2, …, zn are independent random variables, each having a Standard Normal distribution, then z1² + z2² + … + zn² has a χ² distribution with n degrees of freedom.
Theorem Suppose that U1 and U2 are independent random variables and that U = U1 + U2. Suppose that U1 and U have χ² distributions with ν1 and ν degrees of freedom respectively (ν1 < ν). Then U2 has a χ² distribution with ν2 = ν − ν1 degrees of freedom. Proof:
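A sketch of the argument, using the χ² moment generating function M(t) = (1 − 2t)^(−ν/2) for t < ½ (the gamma MGF with λ = ½ and α = ν/2):

\begin{align*}
M_U(t) = M_{U_1}(t)\,M_{U_2}(t)
\;\Longrightarrow\;
M_{U_2}(t) = \frac{M_U(t)}{M_{U_1}(t)}
= \frac{(1-2t)^{-\nu/2}}{(1-2t)^{-\nu_1/2}}
= (1-2t)^{-(\nu-\nu_1)/2},
\end{align*}

which is the moment generating function of the χ² distribution with ν − ν1 degrees of freedom.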
Special Cases
• Setting a = 0 gives the computing formula.
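These special cases presumably refer to the standard algebraic identity for sums of squared deviations; a sketch under that assumption:

\[
\sum_{i=1}^{n}(x_i - a)^2
= \sum_{i=1}^{n}(x_i - \bar{x})^2 + n(\bar{x} - a)^2 .
\]

Setting a = 0 gives the computing formula ∑(xi − x̄)² = ∑xi² − nx̄²; setting a = μ gives the decomposition used for the sample variance below.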
Distribution of the sample variance Let x1, x2, …, xn denote a sample from the normal distribution with mean μ and variance σ². Let s² = ∑(xi − x̄)²/(n − 1) denote the sample variance, and let U = ∑(xi − μ)²/σ². Then U has a χ² distribution with n degrees of freedom.
Note: by the identity above with a = μ, U = ∑(xi − x̄)²/σ² + n(x̄ − μ)²/σ² = U1 + U2; that is, U = U2 + U1 has a χ² distribution with n degrees of freedom.
We also know that x̄ has a normal distribution with mean μ and variance σ²/n. Thus z = √n(x̄ − μ)/σ has a Standard Normal distribution, and z² = n(x̄ − μ)²/σ² = U2 has a χ² distribution with 1 degree of freedom.
If we can show that U1 and U2 are independent, then by the theorem above U1 = ∑(xi − x̄)²/σ² = (n − 1)s²/σ² has a χ² distribution with n − 1 degrees of freedom. The final task would be to show that U1 and U2 (equivalently, x̄ and s²) are independent.
Summary Let x1, x2, …, xn denote a sample from the normal distribution with mean μ and variance σ². Then
• x̄ has a normal distribution with mean μ and variance σ²/n, and
• (n − 1)s²/σ² = ∑(xi − x̄)²/σ² has a χ² distribution with ν = n − 1 degrees of freedom.
The Transformation Method Theorem Let X denote a random variable with probability density function f(x) and let U = h(X). Assume that h(x) is either strictly increasing or strictly decreasing; then the probability density of U is
g(u) = f(h⁻¹(u)) |d h⁻¹(u)/du|.
Proof Use the distribution function method.
Step 1: Find the distribution function, G(u).
Step 2: Differentiate G(u) to find the probability density function g(u).
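A sketch of the two steps when h is strictly increasing (the strictly decreasing case is analogous and accounts for the absolute value in the theorem):

\begin{align*}
G(u) &= P[U \le u] = P[h(X) \le u] = P\!\left[X \le h^{-1}(u)\right] = F\!\left(h^{-1}(u)\right),\\
g(u) &= G'(u) = f\!\left(h^{-1}(u)\right)\frac{d\,h^{-1}(u)}{du}.
\end{align*}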
Example Suppose that X has a Normal distribution with mean μ and variance σ². Find the distribution of U = h(X) = e^X. Solution:
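A worked sketch using the transformation theorem with h(x) = e^x, so that h⁻¹(u) = ln u for u > 0:

\[
g(u) = f(\ln u)\,\frac{1}{u}
= \frac{1}{u\,\sigma\sqrt{2\pi}}\,
\exp\!\left\{-\frac{(\ln u - \mu)^2}{2\sigma^2}\right\},
\qquad u > 0 .
\]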
This distribution is called the log-normal distribution.
The Transformation Method (many variables)
Theorem Let x1, x2, …, xn denote random variables with joint probability density function f(x1, x2, …, xn). Let
u1 = h1(x1, x2, …, xn),
u2 = h2(x1, x2, …, xn),
…,
un = hn(x1, x2, …, xn)
define an invertible transformation from the x's to the u's.
Then the joint probability density function of u1, u2, …, un is given by
g(u1, u2, …, un) = f(x1, x2, …, xn) |J|,
where each xi is expressed in terms of u1, u2, …, un through the inverse transformation and
J = ∂(x1, x2, …, xn)/∂(u1, u2, …, un) = det[∂xi/∂uj]
is the Jacobian of the transformation.
Example Suppose that x1, x2 are independent with density functions f1(x1) and f2(x2). Find the distribution of
u1 = x1 + x2,
u2 = x1 − x2.
Solving for x1 and x2, we get the inverse transformation
x1 = (u1 + u2)/2, x2 = (u1 − u2)/2.
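The Jacobian of this inverse transformation:

\[
J = \det\!\begin{pmatrix}
\partial x_1/\partial u_1 & \partial x_1/\partial u_2\\
\partial x_2/\partial u_1 & \partial x_2/\partial u_2
\end{pmatrix}
= \det\!\begin{pmatrix} \tfrac{1}{2} & \tfrac{1}{2}\\ \tfrac{1}{2} & -\tfrac{1}{2}\end{pmatrix}
= -\tfrac{1}{2},
\qquad |J| = \tfrac{1}{2}.
\]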
The joint density of x1, x2 is f(x1, x2) = f1(x1) f2(x2). Hence the joint density of u1 and u2 is
g(u1, u2) = f1((u1 + u2)/2) f2((u1 − u2)/2) · ½.
From the joint density g(u1, u2) we can determine the marginal distribution of u1 = x1 + x2 by integrating out u2.
Hence, substituting x = (u1 + u2)/2,
g1(u1) = ∫ g(u1, u2) du2 = ∫ f1(x) f2(u1 − x) dx,
with both integrals taken over the whole real line. This is called the convolution of the two densities f1 and f2.
Example: The ex-Gaussian distribution Let X and Y be two independent random variables such that:
• X has an exponential distribution with parameter λ.
• Y has a normal (Gaussian) distribution with mean μ and standard deviation σ.
Find the distribution of U = X + Y. This distribution is used in psychology as a model for the response time to perform a task.
Now f1(x) = λe^(−λx) for x ≥ 0 (and 0 otherwise), and f2(y) is the normal density with mean μ and standard deviation σ. The density of U = X + Y is the convolution
g(u) = ∫₀^∞ f1(x) f2(u − x) dx.
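A sketch of the standard completing-the-square evaluation of this integral:

\begin{align*}
g(u) &= \int_0^\infty \lambda e^{-\lambda x}\,
\frac{1}{\sigma\sqrt{2\pi}}
\exp\!\left\{-\frac{(u-x-\mu)^2}{2\sigma^2}\right\} dx
\qquad\text{(substitute } y = u - x\text{)}\\
&= \lambda e^{-\lambda u}\int_{-\infty}^{u}
e^{\lambda y}\,
\frac{1}{\sigma\sqrt{2\pi}}
\exp\!\left\{-\frac{(y-\mu)^2}{2\sigma^2}\right\} dy\\
&= \lambda \exp\!\left\{\lambda(\mu - u) + \tfrac{1}{2}\lambda^2\sigma^2\right\}
\int_{-\infty}^{u}
\frac{1}{\sigma\sqrt{2\pi}}
\exp\!\left\{-\frac{\bigl(y - (\mu + \lambda\sigma^2)\bigr)^2}{2\sigma^2}\right\} dy\\
&= \lambda \exp\!\left\{\lambda(\mu - u) + \tfrac{1}{2}\lambda^2\sigma^2\right\}
P[\,V \le u\,],
\end{align*}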
where V has a Normal distribution with mean μ + λσ² and variance σ². Hence
g(u) = λ exp{λ(μ − u) + ½λ²σ²} Φ((u − μ − λσ²)/σ),
where Φ(z) is the cdf of the standard Normal distribution.
[Figure: plot of the ex-Gaussian density g(u)]