Some Additional Topics
Distributions of functions of Random Variables
Gamma distribution, χ² distribution, Exponential distribution
Theorem Let X and Y denote independent random variables having gamma distributions with parameters (λ, α₁) and (λ, α₂) respectively. Then W = X + Y has a gamma distribution with parameters (λ, α₁ + α₂). Proof:
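The proof is a moment generating function argument; a sketch, using the gamma MGF m(t) = (1 − t/λ)^(−α) for t < λ:
m_W(t) = E[e^(t(X+Y))] = m_X(t) m_Y(t) = (1 − t/λ)^(−α₁) (1 − t/λ)^(−α₂) = (1 − t/λ)^(−(α₁+α₂))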
Recognizing that this is the moment generating function of the gamma distribution with parameters (λ, α₁ + α₂), we conclude that W = X + Y has a gamma distribution with parameters (λ, α₁ + α₂).
Theorem (extension to n RVs) Let X₁, X₂, …, Xₙ denote n independent random variables, each having a gamma distribution with parameters (λ, αᵢ), i = 1, 2, …, n. Then W = X₁ + X₂ + … + Xₙ has a gamma distribution with parameters (λ, α₁ + α₂ + … + αₙ). Proof:
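As before, a sketch via MGFs; by independence the MGF of the sum is the product of the MGFs:
m_W(t) = m_X₁(t) m_X₂(t) ⋯ m_Xₙ(t) = (1 − t/λ)^(−α₁) (1 − t/λ)^(−α₂) ⋯ (1 − t/λ)^(−αₙ) = (1 − t/λ)^(−(α₁+α₂+⋯+αₙ))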
Therefore, recognizing that this is the moment generating function of the gamma distribution with parameters (λ, α₁ + α₂ + … + αₙ), we conclude that W = X₁ + X₂ + … + Xₙ has a gamma distribution with parameters (λ, α₁ + α₂ + … + αₙ).
Theorem Suppose that X is a random variable having a gamma distribution with parameters (λ, α). Then, for a constant a > 0, W = aX has a gamma distribution with parameters (λ/a, α). Proof:
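A sketch, again via the MGF:
m_W(t) = E[e^(taX)] = m_X(at) = (1 − at/λ)^(−α) = (1 − t/(λ/a))^(−α),
which is the moment generating function of the gamma distribution with parameters (λ/a, α).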
Special Cases
• Let X and Y be independent random variables having an exponential distribution with parameter λ. Then X + Y has a gamma distribution with α = 2 and λ.
• Let X₁, X₂, …, Xₙ be independent random variables having an exponential distribution with parameter λ. Then S = X₁ + X₂ + … + Xₙ has a gamma distribution with α = n and λ.
• Let X₁, X₂, …, Xₙ be independent random variables having an exponential distribution with parameter λ. Then the sample mean x̄ = (X₁ + X₂ + … + Xₙ)/n has a gamma distribution with α = n and nλ (see the note below).
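The third case follows from the second together with the scaling theorem: S = X₁ + X₂ + … + Xₙ has a gamma distribution with parameters (λ, n), and x̄ = (1/n)S, so taking a = 1/n in the scaling theorem, x̄ has a gamma distribution with parameters (λ/(1/n), n) = (nλ, n).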
[Figure: sampling distribution of the mean when the population distribution is exponential – another illustration of the central limit theorem]
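A picture like this can be reproduced by simulation; a minimal sketch in Python (the names lam, n, and reps are ours):

import random

# Draw reps samples of size n from an exponential population with λ = 1/4
# (mean 4, variance 16) and record each sample mean. By the central limit
# theorem the sample means are approximately normal with mean 4 and
# variance 16/n, even though the population itself is highly skewed.
lam, n, reps = 0.25, 30, 10_000
means = [sum(random.expovariate(lam) for _ in range(n)) / n for _ in range(reps)]
mean_of_means = sum(means) / reps
var_of_means = sum((m - mean_of_means) ** 2 for m in means) / reps
print(mean_of_means, var_of_means)  # ≈ 4 and ≈ 16/30 ≈ 0.53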
Special Cases – continued
• Let X and Y be independent random variables having χ² distributions with ν₁ and ν₂ degrees of freedom respectively. Then X + Y has a χ² distribution with ν₁ + ν₂ degrees of freedom.
• Let X₁, X₂, …, Xₙ be independent random variables having χ² distributions with ν₁, ν₂, …, νₙ degrees of freedom respectively. Then X₁ + X₂ + … + Xₙ has a χ² distribution with ν₁ + … + νₙ degrees of freedom.
Both of these properties follow from the fact that a χ² random variable with ν degrees of freedom is a gamma random variable with λ = ½ and α = ν/2.
Recall If Z has a Standard Normal distribution, then Z² has a χ² distribution with 1 degree of freedom. Thus if Z₁, Z₂, …, Zₙ are independent random variables each having a Standard Normal distribution, then Z₁² + Z₂² + … + Zₙ² has a χ² distribution with n degrees of freedom.
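The first statement can be checked directly from the MGF of Z²:
E[e^(tZ²)] = (1/√(2π)) ∫ e^(tz²) e^(−z²/2) dz = (1 − 2t)^(−1/2) for t < ½,
which is the MGF of the χ² distribution with 1 degree of freedom (the gamma distribution with λ = ½ and α = ½); the second statement then follows from the additivity of independent χ² random variables noted above.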
Theorem Suppose that U₁ and U₂ are independent random variables and that U = U₁ + U₂. Suppose that U₁ and U have χ² distributions with ν₁ and ν degrees of freedom respectively (ν₁ < ν). Then U₂ has a χ² distribution with ν₂ = ν − ν₁ degrees of freedom. Proof:
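A sketch via MGFs: by independence, m_U(t) = m_U₁(t) m_U₂(t), so
m_U₂(t) = m_U(t)/m_U₁(t) = (1 − 2t)^(−ν/2) / (1 − 2t)^(−ν₁/2) = (1 − 2t)^(−(ν−ν₁)/2),
which is the moment generating function of the χ² distribution with ν − ν₁ degrees of freedom.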
The joint probability function: p(x, y) = P[X = x, Y = y]
Marginal distributions; conditional distributions
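In the discrete case these are given by the standard formulas:
p_X(x) = Σ_y p(x, y) and p_Y(y) = Σ_x p(x, y) (marginals)
p_X|Y(x|y) = p(x, y)/p_Y(y) and p_Y|X(y|x) = p(x, y)/p_X(x) (conditionals)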
The product rule for discrete distributions; independence
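In symbols:
Product rule: p(x, y) = p_X(x) p_Y|X(y|x) = p_Y(y) p_X|Y(x|y)
Independence: X and Y are independent if and only if p(x, y) = p_X(x) p_Y(y) for all x, y.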
Definition: Two random variables X and Y are said to have joint probability density function f(x, y) if
P[(X, Y) ∈ A] = ∫∫_A f(x, y) dx dy for any region A of the xy-plane.
Marginal distributions; conditional distributions
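In the continuous case these take the standard forms:
f_X(x) = ∫ f(x, y) dy and f_Y(y) = ∫ f(x, y) dx (marginals)
f_X|Y(x|y) = f(x, y)/f_Y(y) and f_Y|X(y|x) = f(x, y)/f_X(x) (conditionals)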
The product rule for continuous distributions; independence
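In symbols:
Product rule: f(x, y) = f_X(x) f_Y|X(y|x) = f_Y(y) f_X|Y(x|y)
Independence: X and Y are independent if and only if f(x, y) = f_X(x) f_Y(y) for all x, y.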
Example
• Suppose that to perform a task we first have to recognize the task, then perform the task.
• Suppose that the time to recognize the task, X, has an exponential distribution with λ = ¼ (i.e., mean μ = 1/λ = 4).
• Once the task is recognized, the time to perform the task, Y, is uniform from X/2 to 2X.
• Find the joint density of X and Y.
• Find the conditional density of X given Y = y.
Now f_X(x) = ¼ e^(−x/4) for x > 0, and f_Y|X(y|x) = 1/(2x − x/2) = 2/(3x) for x/2 ≤ y ≤ 2x. Thus, by the product rule, the joint density is
f(x, y) = f_X(x) f_Y|X(y|x) = e^(−x/4)/(6x) for x > 0, x/2 ≤ y ≤ 2x.
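For the conditional density of X given Y = y, note that x/2 ≤ y ≤ 2x is equivalent to y/2 ≤ x ≤ 2y. Hence
f_Y(y) = ∫ from y/2 to 2y of e^(−x/4)/(6x) dx (which has no elementary closed form), and
f_X|Y(x|y) = f(x, y)/f_Y(y) = [e^(−x/4)/(6x)] / f_Y(y) for y/2 ≤ x ≤ 2y.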
Conditional Expectation Let U = g(X, Y) denote any function of X and Y. Then
E[g(X, Y) | X = x] = ∫ g(x, y) f_Y|X(y|x) dy
is called the conditional expectation of U = g(X, Y) given X = x.
Conditional Expectation and Variance More specifically,
E[Y | X = x] = ∫ y f_Y|X(y|x) dy
is called the conditional expectation of Y given X = x, and
Var[Y | X = x] = E[(Y − E[Y | X = x])² | X = x] = E[Y² | X = x] − (E[Y | X = x])²
is called the conditional variance of Y given X = x.
An Important Rule
E[U] = E_X[E[U | X]] and Var[U] = E_X[Var[U | X]] + Var_X[E[U | X]],
where E_X and Var_X denote mean and variance with respect to the marginal distribution of X, f_X(x).
Proof: Let U = g(X, Y) denote any function of X and Y. Then
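E_X[E[U | X]] = ∫ (∫ g(x, y) f_Y|X(y|x) dy) f_X(x) dx = ∫∫ g(x, y) f(x, y) dy dx = E[U],
since f(x, y) = f_X(x) f_Y|X(y|x) by the product rule. The variance identity follows by applying this to U² and expanding Var[U] = E[U²] − (E[U])².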
Example
• Suppose that to perform a task we first have to recognize the task, then perform the task.
• Suppose that the time to recognize the task, X, has an exponential distribution with λ = ¼ (i.e., mean μ = 1/λ = 4).
• Once the task is recognized, the time to perform the task, Y, is uniform from X/2 to 2X.
• Find E[XY].
• Find Var[XY]. (A sketch of the computation follows.)
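A sketch of the computation, using the rule E[U] = E_X[E[U | X]] with U = XY. Since Y | X = x is uniform on (x/2, 2x), E[Y | X = x] = 5x/4, Var[Y | X = x] = (3x/2)²/12 = 3x²/16, and hence E[Y² | X = x] = 3x²/16 + 25x²/16 = 7x²/4. Using the exponential moments E[X^k] = k!/λ^k with λ = ¼:
E[XY] = E_X[X E[Y | X]] = E_X[(5/4)X²] = (5/4)(2/λ²) = (5/4)(32) = 40
E[(XY)²] = E_X[X² E[Y² | X]] = E_X[(7/4)X⁴] = (7/4)(4!/λ⁴) = (7/4)(6144) = 10752
Var[XY] = E[(XY)²] − (E[XY])² = 10752 − 1600 = 9152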
Conditional Expectation: k (>2) random variables
Definition Let X₁, X₂, …, X_q, X_{q+1}, …, X_k denote k continuous random variables with joint probability density function f(x₁, x₂, …, x_q, x_{q+1}, …, x_k). Then the conditional joint probability density function of X₁, X₂, …, X_q given X_{q+1} = x_{q+1}, …, X_k = x_k is
f(x₁, …, x_q | x_{q+1}, …, x_k) = f(x₁, …, x_k) / f(x_{q+1}, …, x_k),
where f(x_{q+1}, …, x_k) denotes the marginal density of X_{q+1}, …, X_k.
Definition Let U = h(X₁, X₂, …, X_q, X_{q+1}, …, X_k). Then the conditional expectation of U given X_{q+1} = x_{q+1}, …, X_k = x_k is
E[U | x_{q+1}, …, x_k] = ∫ ⋯ ∫ h(x₁, …, x_k) f(x₁, …, x_q | x_{q+1}, …, x_k) dx₁ ⋯ dx_q.
Note this will be a function of x_{q+1}, …, x_k.
Example Let X, Y, Z denote three jointly distributed random variables with joint density function f(x, y, z). Determine the conditional expectation of U = X² + Y + Z given X = x, Y = y.
The marginal distribution of X, Y is f₁₂(x, y) = ∫ f(x, y, z) dz. Thus the conditional distribution of Z given X = x, Y = y is
f(z | x, y) = f(x, y, z) / f₁₂(x, y).
The conditional expectation of U = X² + Y + Z given X = x, Y = y is then
E[U | X = x, Y = y] = ∫ (x² + y + z) f(z | x, y) dz = x² + y + E[Z | X = x, Y = y],
which is evaluated by substituting the conditional density f(z | x, y) found above.
The rule for Conditional Expectation Let (x₁, x₂, …, x_q, y₁, y₂, …, y_m) = (x, y) denote q + m random variables. Then
E[g(x, y)] = E_y[E[g(x, y) | y]] and Var[g(x, y)] = E_y[Var[g(x, y) | y]] + Var_y[E[g(x, y) | y]].
Suppose a gambler is playing a game in which he wins $1 with probability p and loses $1 with probability q.
• Note the game is fair if p = q = ½.
• Suppose also that he starts with an initial fortune of $i and plays the game until he reaches a fortune of $n or he loses all his money (his fortune reaches $0).
• What is the probability that he achieves his goal? What is the probability that he loses his fortune?
Let P_i = the probability that he achieves his goal, and let Q_i = 1 − P_i = the probability that he loses his fortune. Let X = the amount that he has won after finishing the game. If the game is fair, then
E[X] = (n − i)P_i + (−i)Q_i = (n − i)P_i + (−i)(1 − P_i) = 0,
or (n − i)P_i = i(1 − P_i), and (n − i + i)P_i = i; hence P_i = i/n and Q_i = 1 − i/n.
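The answer P_i = i/n for the fair game is easy to check by simulation; a minimal sketch in Python (the function name ruin_prob and the parameter trials are ours):

import random

def ruin_prob(i, n, p=0.5, trials=100_000):
    # Estimate P_i: the probability that the gambler's fortune reaches n
    # before 0, starting from i, winning $1 with probability p and losing
    # $1 with probability 1 - p on each play.
    wins = 0
    for _ in range(trials):
        fortune = i
        while 0 < fortune < n:
            fortune += 1 if random.random() < p else -1
        wins += fortune == n
    return wins / trials

# Fair game: the derivation above gives P_i = i/n = 3/10 here.
print(ruin_prob(i=3, n=10))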