
Distributions of Functions of Random Variables

This chapter covers the distributions of functions of random variables, including transformations of one, two, and several random variables. It also introduces the moment-generating function technique.



  1. Probability and Statistical Inference (9th Edition), Chapter 5 (Part 1/2): Distributions of Functions of Random Variables. November 18, 2015

  2. Outline: 5.1 Functions of One Random Variable; 5.2 Transformations of Two Random Variables; 5.3 Several Random Variables; 5.4 The Moment-Generating Function Technique

  3. Functions of One Random Variable • Let X be a continuous random variable with pdf f(x). If we consider a function of X, say, Y = u(X), then Y is also a random variable with its own distribution • The cdf of Y is G(y) = P(Y ≤ y) = P(u(X) ≤ y) • The pdf of Y is g(y) = G′(y) (where the prime denotes the derivative with respect to y)

  4. Functions of One Random Variable • Change-of-variable technique • Let X be a continuous random variable with pdf f(x) and support c1 < x < c2. We begin by taking Y = u(X) to be a continuous increasing function of X with inverse function X = v(Y), so that the support of X maps onto the support of Y, d1 = u(c1) < y < d2 = u(c2). Then, the cdf of Y is $G(y) = P(u(X) \le y) = P(X \le v(y)) = \int_{c_1}^{v(y)} f(x)\,dx$. Thus, $G(y) = F(v(y))$ for $d_1 < y < d_2$, where F is the cdf of X

  5. Functions of One Random Variable • The derivative, g(y) = G′(y), of such an expression is given by $g(y) = F'(v(y))\,v'(y) = f(v(y))\,v'(y)$, where $v'(y) > 0$ since v is increasing • Suppose now the function Y = u(X) and its inverse X = v(Y) are continuous decreasing functions. Then $G(y) = P(u(X) \le y) = P(X \ge v(y)) = 1 - F(v(y))$. Thus, $g(y) = -f(v(y))\,v'(y)$, which is positive because $v'(y) < 0$

  6. Functions of One Random Variable • Thus, for both the increasing and decreasing cases, we can write the pdf of Y as $g(y) = f(v(y))\,\lvert v'(y)\rvert$ for $d_1 < y < d_2$
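To make the formula concrete, here is a minimal Monte Carlo sketch (my own example, not from the slides): it takes X ~ Exponential(1), applies the increasing transformation Y = X², and compares a histogram of simulated Y values against g(y) = f(v(y))|v′(y)| with v(y) = √y.

```python
import numpy as np

rng = np.random.default_rng(0)

# X ~ Exponential(1) has pdf f(x) = exp(-x) on (0, inf).
x = rng.exponential(scale=1.0, size=100_000)

# Increasing transformation Y = u(X) = X^2, with inverse v(y) = sqrt(y).
y = x**2

# Change-of-variable pdf: g(y) = f(v(y)) * |v'(y)| = exp(-sqrt(y)) / (2 sqrt(y)).
def g(y):
    return np.exp(-np.sqrt(y)) / (2.0 * np.sqrt(y))

# Compare an empirical histogram with g on a few bins.
hist, edges = np.histogram(y, bins=np.linspace(0.1, 9.0, 10), density=True)
mids = (edges[:-1] + edges[1:]) / 2
for m, h in zip(mids, hist):
    print(f"y={m:4.2f}  empirical={h:.4f}  g(y)={g(m):.4f}")
```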

  7. Functions of One Random Variable • Example 1

  8. Functions of One Random Variable • Theorem 1: Suppose that a random variable X has a continuous distribution for which the cdf is F. Then, the random variable Y = F(X) has a uniform distribution on (0,1) • Sketch: for 0 < y < 1, $P(Y \le y) = P(F(X) \le y) = P(X \le F^{-1}(y)) = F(F^{-1}(y)) = y$, which is the U(0,1) cdf • Hence random variables from any given continuous distribution can be converted to random variables having a uniform distribution, and vice versa
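A quick numerical illustration of Theorem 1 (a sketch, not part of the slides): push N(0,1) samples through their own cdf and check that the result looks uniform by comparing sample moments with the U(0,1) values 1/2 and 1/12.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
x = rng.standard_normal(200_000)   # X ~ N(0,1)
y = norm.cdf(x)                    # Y = F(X); Theorem 1 says Y ~ U(0,1)

print(y.mean(), y.var())           # expect ~0.5 and ~1/12 = 0.0833...
```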

  9. Functions of One Random Variable • [Figure: the pdf and cdf F of N(0,1); mapping X ~ N(0,1) through Y = F(X) yields Y ~ U(0,1)]

  10. Functions of One Random Variable • Theorem 1 (converse statement): If U is a uniform random variable on (0,1), then the random variable X = F⁻¹(U) has cdf F (where F is a continuous cdf and F⁻¹ is its inverse function) • Proof: P(X ≤ x) = P(F⁻¹(U) ≤ x) = P(U ≤ F(x)) = F(x)

  11. Functions of One Random Variable • Theorem 1 (converse statement) can be used to generate random variables of any distribution • To generate values of X distributed according to the cdf F (see the sketch below): 1. Generate a random number u from U, a uniform random variable on (0,1) 2. Compute the value x such that F(x) = u 3. Take x to be the random number distributed according to the cdf F
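A minimal sketch of this recipe in code (the exponential distribution is my choice of target, not the slides'): for F(x) = 1 − e^(−λx), step 2 solves in closed form to x = −ln(1 − u)/λ.

```python
import numpy as np

rng = np.random.default_rng(2)
lam = 2.0                      # rate of the target Exponential(lam)

u = rng.uniform(size=100_000)  # step 1: u from U(0,1)
x = -np.log(1.0 - u) / lam     # step 2: solve F(x) = 1 - exp(-lam*x) = u

# Step 3: x is now Exponential(lam); mean and variance are 1/lam and 1/lam^2.
print(x.mean(), x.var())       # expect ~0.5 and ~0.25
```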

  12. Functions of One Random Variable • Example 2 (the transformation Y = u(X) is not one-to-one): Let Y = X², where X is Cauchy with pdf $f(x) = \frac{1}{\pi(1+x^2)}$. Then, for y > 0, $G(y) = P(X^2 \le y) = P(-\sqrt{y} \le X \le \sqrt{y}) = F(\sqrt{y}) - F(-\sqrt{y})$, where F is the cdf of X. Thus, $g(y) = \frac{f(\sqrt{y}) + f(-\sqrt{y})}{2\sqrt{y}} = \frac{1}{\pi\sqrt{y}\,(1+y)}$. In this two-to-one transformation, there is a need to sum two terms, each of which is similar to the one-to-one case
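As a sanity check on the two-to-one formula (my own sketch), simulation agrees with the density g(y) = 1/(π√y(1+y)) derived above:

```python
import numpy as np

rng = np.random.default_rng(3)
y = rng.standard_cauchy(1_000_000) ** 2   # Y = X^2 with X standard Cauchy

# Two-to-one change-of-variable density on (0, inf).
def g(y):
    return 1.0 / (np.pi * np.sqrt(y) * (1.0 + y))

# Compare P(a < Y < b) from simulation with the integral of g via quadrature.
a, b = 0.5, 2.0
emp = np.mean((y > a) & (y < b))
grid = np.linspace(a, b, 10_001)
print(emp, np.trapz(g(grid), grid))       # the two numbers should be close
```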

  13. Functions of One Random Variable • Consider the discrete case • Let X be a discrete random variable with pmf f(x) = P(X = x). Let Y = u(X) be a one-to-one transformation with inverse X = v(Y). Then, the pmf of Y is $g(y) = P(Y = y) = P(X = v(y)) = f(v(y))$ • Note that, in the discrete case, the derivative factor |v′(y)| is not needed

  14. Functions of One Random Variable • Example 3: Let X be a uniform random variable on {1, 2, …, n}. Then Y = X + a is a uniform random variable on {a+1, a+2, …, a+n}, since v(y) = y − a and g(y) = f(y − a) = 1/n

  15. Transformations of Two Random Variables • If X1 and X2 are two continuous random variables with joint pdf f(x1, x2), and if Y1 = u1(X1, X2), Y2 = u2(X1, X2) has the single-valued inverse X1 = v1(Y1, Y2), X2 = v2(Y1, Y2), then the joint pdf of Y1 and Y2 is $g(y_1, y_2) = f\big(v_1(y_1, y_2),\, v_2(y_1, y_2)\big)\,\lvert J\rvert$, where |J| denotes absolute value and J is the Jacobian determinant $J = \begin{vmatrix} \partial x_1/\partial y_1 & \partial x_1/\partial y_2 \\ \partial x_2/\partial y_1 & \partial x_2/\partial y_2 \end{vmatrix}$

  16. Transformations of Two Random Variables • Example 1: Let X1 and X2 be independent random variables, each with pdf f(x). Hence, their joint pdf is f(x1)f(x2). Let Y1 = X1 − X2, Y2 = X1 + X2. Thus, x1 = (y1 + y2)/2, x2 = (y2 − y1)/2, and the Jacobian is $J = \begin{vmatrix} 1/2 & 1/2 \\ -1/2 & 1/2 \end{vmatrix} = \tfrac{1}{2}$

  17. Transformations of Two Random Variables • Then, the joint pdf of Y1 and Y2 is $g(y_1, y_2) = f\!\left(\tfrac{y_1 + y_2}{2}\right) f\!\left(\tfrac{y_2 - y_1}{2}\right) \cdot \tfrac{1}{2}$ on the region where both arguments lie in the support of f
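The Jacobian above can also be checked symbolically; a small sketch with sympy (my addition):

```python
import sympy as sp

y1, y2 = sp.symbols("y1 y2")
x1 = (y1 + y2) / 2        # inverse transformation v1(y1, y2)
x2 = (y2 - y1) / 2        # inverse transformation v2(y1, y2)

# Jacobian determinant of (x1, x2) with respect to (y1, y2).
J = sp.Matrix([x1, x2]).jacobian([y1, y2]).det()
print(J)                  # 1/2, so |J| = 1/2 as used above
```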

  18. Transformations of Two Random Variables • Example 2 (Box-Muller Transformation): Let X1 and X2 be i.i.d. U(0,1). Let $Z_1 = \sqrt{-2\ln X_1}\,\cos(2\pi X_2)$ and $Z_2 = \sqrt{-2\ln X_1}\,\sin(2\pi X_2)$. Thus, $X_1 = e^{-Q/2}$ and $X_2 = \frac{1}{2\pi}\arctan\!\left(\frac{Z_2}{Z_1}\right)$, where Q = Z1² + Z2², and the Jacobian satisfies $\lvert J\rvert = \frac{1}{2\pi}\,e^{-(z_1^2 + z_2^2)/2}$

  19. Transformations of Two Random Variables • Since the joint pdf of X1 and X2 is f(x1, x2) = 1 for 0 < x1 < 1, 0 < x2 < 1, it follows that the joint pdf of Z1 and Z2 is $g(z_1, z_2) = 1 \cdot \lvert J\rvert = \frac{1}{2\pi}\,e^{-(z_1^2 + z_2^2)/2}$ • This is the joint pdf of two i.i.d. N(0,1) random variables • Hence, we can generate two i.i.d. N(0,1) random variables from two i.i.d. U(0,1) random variables using this method
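A direct implementation of the Box-Muller transformation (a minimal sketch, assuming only numpy):

```python
import numpy as np

rng = np.random.default_rng(4)

def box_muller(n, rng):
    """Turn 2n i.i.d. U(0,1) draws into 2n i.i.d. N(0,1) draws."""
    x1 = rng.uniform(size=n)
    x2 = rng.uniform(size=n)
    r = np.sqrt(-2.0 * np.log(x1))        # radius term from X1
    z1 = r * np.cos(2.0 * np.pi * x2)     # Z1 = sqrt(-2 ln X1) cos(2 pi X2)
    z2 = r * np.sin(2.0 * np.pi * x2)     # Z2 = sqrt(-2 ln X1) sin(2 pi X2)
    return np.concatenate([z1, z2])

z = box_muller(100_000, rng)
print(z.mean(), z.std())                  # expect ~0 and ~1
```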

  20. Random Samples • Assume that we conduct an experiment n times independently. Let Xk be the random variable corresponding to the outcome of the k-th run of the experiment. Then X1, X2, …, Xn form a random sample of size n

  21. Random Samples • For example, suppose we toss a die n times and let Xk be the random variable corresponding to the outcome of the k-th toss. Then X1, X2, …, Xn form a random sample of size n

  22. Theorems about Independent Random Variables • Let X1, X2, …, Xn be n independent discrete random variables with respective pmfs f1, f2, …, fn, and let h be a function of n variables. Then, the expected value of the random variable Z = h(X1, X2, …, Xn) is equal to $E[Z] = \sum_{x_1} \cdots \sum_{x_n} h(x_1, \ldots, x_n)\, f_1(x_1) \cdots f_n(x_n)$

  23. Theorems about Independent Random Variables • Likewise, if X1, X2, …, Xn are independent continuous random variables with respective pdfs f1, f2, …, fn, then $E[Z] = \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} h(x_1, \ldots, x_n)\, f_1(x_1) \cdots f_n(x_n)\, dx_1 \cdots dx_n$

  24. Theorems about Independent Random Variables • Theorem: If X1, X2,…, Xn are independent random variables and, for i = 1, 2,…, n, E[hi(Xi)] exists, then E[h1(X1) h2(X2) … hn(Xn)] = E[h1(X1)] E[h2(X2)]… E[hn(Xn)]

  25. Theorems about Independent Random Variables • Proof for the discrete case: $E\!\left[\prod_{i=1}^{n} h_i(X_i)\right] = \sum_{x_1} \cdots \sum_{x_n} h_1(x_1) \cdots h_n(x_n)\, f_1(x_1) \cdots f_n(x_n) = \prod_{i=1}^{n} \sum_{x_i} h_i(x_i)\, f_i(x_i) = \prod_{i=1}^{n} E[h_i(X_i)]$, where the factorization of the multiple sum uses independence • The proof for the continuous case can be derived similarly
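A quick Monte Carlo check of this theorem (my own sketch, with h1(x) = x² and h2(x) = e^(−x) chosen arbitrarily):

```python
import numpy as np

rng = np.random.default_rng(5)
x1 = rng.uniform(size=500_000)               # X1 ~ U(0,1)
x2 = rng.exponential(size=500_000)           # X2 ~ Exponential(1), independent

lhs = np.mean(x1**2 * np.exp(-x2))           # E[h1(X1) h2(X2)]
rhs = np.mean(x1**2) * np.mean(np.exp(-x2))  # E[h1(X1)] E[h2(X2)]
print(lhs, rhs)                              # agree up to Monte Carlo error
```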

  26. Theorems about Independent Random Variables • Theorem: Assume that X1, X2, …, Xn are n independent random variables with respective means $\mu_1, \ldots, \mu_n$ and variances $\sigma_1^2, \ldots, \sigma_n^2$. Then, the mean and variance of the random variable $Y = \sum_{i=1}^{n} a_i X_i$, where a1, a2, …, an are real constants, are $\mu_Y = \sum_{i=1}^{n} a_i \mu_i$ and $\sigma_Y^2 = \sum_{i=1}^{n} a_i^2 \sigma_i^2$

  27. Theorems about Independent Random Variables • Proof: $\mu_Y = E\!\left[\sum_i a_i X_i\right] = \sum_i a_i E[X_i] = \sum_i a_i \mu_i$. For the variance, $\sigma_Y^2 = E[(Y - \mu_Y)^2] = E\!\left[\Big(\sum_i a_i (X_i - \mu_i)\Big)^2\right] = \sum_i \sum_j a_i a_j\, E[(X_i - \mu_i)(X_j - \mu_j)]$

  28. Theorems about Independent Random Variables • Since Xi and Xj are independent for i ≠ j, $E[(X_i - \mu_i)(X_j - \mu_j)] = E[X_i - \mu_i]\, E[X_j - \mu_j] = 0$ • Therefore, only the i = j terms survive, and $\sigma_Y^2 = \sum_{i=1}^{n} a_i^2\, E[(X_i - \mu_i)^2] = \sum_{i=1}^{n} a_i^2 \sigma_i^2$
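The two identities can be verified numerically; a sketch with two independent normals (the constants and parameters are my arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(6)
a1, a2 = 3.0, -2.0
x1 = rng.normal(loc=1.0, scale=2.0, size=1_000_000)   # mu1 = 1, sigma1 = 2
x2 = rng.normal(loc=4.0, scale=0.5, size=1_000_000)   # mu2 = 4, sigma2 = 0.5

y = a1 * x1 + a2 * x2
print(y.mean(), a1 * 1.0 + a2 * 4.0)                  # a1*mu1 + a2*mu2 = -5
print(y.var(),  a1**2 * 4.0 + a2**2 * 0.25)           # a1^2 s1^2 + a2^2 s2^2 = 37
```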

  29. Moment-Generating Function Technique • Let X be a random variable. The moment-generating function (mgf) of X is defined as $M_X(t) = E[e^{tX}]$, for all t at which the expectation is finite • It is called the mgf because all of the moments of X can be obtained by successively differentiating MX(t)

  30. Moment-Generating Function Technique • For example, $M_X'(t) = E[X e^{tX}]$. Thus, $M_X'(0) = E[X]$ • Similarly, $M_X''(t) = E[X^2 e^{tX}]$. Thus, $M_X''(0) = E[X^2]$

  31. Moment-Generating Function Technique • In general, the nth derivative of MX(t) evaluated at t = 0 equals E[Xⁿ], i.e., $M_X^{(n)}(0) = E[X^n]$, where $M_X^{(n)}$ denotes the nth derivative of $M_X$
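Differentiating an mgf symbolically reproduces the moments; a small sketch using the N(0,1) mgf $M(t) = e^{t^2/2}$ (derived in Example 1 two slides below):

```python
import sympy as sp

t = sp.symbols("t")
M = sp.exp(t**2 / 2)     # mgf of N(0,1)

# The n-th derivative at t = 0 gives E[X^n].
for n in range(1, 5):
    print(n, sp.diff(M, t, n).subs(t, 0))   # 0, 1, 0, 3: moments of N(0,1)
```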

  32. Moment-Generating Function Technique • The moment-generating function uniquely determines the distribution. That is, when the mgf exists in an open interval around t = 0, there is a one-to-one correspondence between the mgf and the distribution function (pmf/pdf) of a random variable

  33. Moment-Generating Function Technique • Example 1 (mgf of N(0,1)): $M_X(t) = \int_{-\infty}^{\infty} e^{tx}\, \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}\, dx = e^{t^2/2} \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-(x-t)^2/2}\, dx = e^{t^2/2}$, where the last equality follows from the fact that the expression in the integral is the pdf of a normal random variable with mean t and variance 1, which integrates to one

  34. Moment-Generating Function Technique • Exercise (mgf of N(μ, σ²)): show that $M_X(t) = e^{\mu t + \sigma^2 t^2/2}$
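One standard route to this exercise (a sketch of the derivation, reusing the N(0,1) result from Example 1): write X = μ + σZ with Z ~ N(0,1). Then
$M_X(t) = E\!\left[e^{t(\mu + \sigma Z)}\right] = e^{\mu t}\, E\!\left[e^{(\sigma t) Z}\right] = e^{\mu t}\, M_Z(\sigma t) = e^{\mu t}\, e^{\sigma^2 t^2/2} = e^{\mu t + \sigma^2 t^2/2}$.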

  35. Moment-Generating Function Technique • Theorem: If X1, X2, …, Xn are independent random variables with respective mgfs Mi(t), i = 1, 2, …, n, then the mgf of $Y = X_1 + X_2 + \cdots + X_n$ is $M_Y(t) = \prod_{i=1}^{n} M_i(t)$

  36. Moment-Generating Function Technique • Proof: $M_Y(t) = E\!\left[e^{t(X_1 + \cdots + X_n)}\right] = E\!\left[\prod_{i=1}^{n} e^{tX_i}\right] = \prod_{i=1}^{n} E[e^{tX_i}] = \prod_{i=1}^{n} M_i(t)$, where the third equality uses the independence of X1, …, Xn

  37. Moment-Generating Function Technique • Corollary: If X1, X2, …, Xn form a random sample from a distribution with mgf M(t), then the mgf of $Y = \sum_{i=1}^{n} X_i$ is $M_Y(t) = [M(t)]^n$

  38. Moment-Generating Function Technique • The mgf of the sum of independent random variables is just the product of the individual mgfs

  39. Moment-Generating Function Technique • Example 2: Recall that if Z1, Z2, …, Zn are independent N(0,1), then W = Z1² + Z2² + … + Zn² has a chi-square distribution with n degrees of freedom, denoted by χ²(n). Let X1, X2, …, Xn be independent chi-square random variables with r1, r2, …, rn degrees of freedom, respectively. Show that Y = X1 + X2 + … + Xn is χ²(r1 + r2 + … + rn)

  40. Moment-Generating Function Technique • Use the moment-generating function technique: since the mgf of χ²(ri) is $(1 - 2t)^{-r_i/2}$ for t < 1/2, $M_Y(t) = \prod_{i=1}^{n} (1 - 2t)^{-r_i/2} = (1 - 2t)^{-(r_1 + \cdots + r_n)/2}$, which is the mgf of a χ²(r1 + … + rn) random variable. Thus, Y is χ²(r1 + … + rn)
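A simulation sketch of this result (the degrees of freedom 2, 3, 5 are my arbitrary choice):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
r = [2, 3, 5]                                   # degrees of freedom r1, r2, r3

# Sum of independent chi-square(r_i) samples.
y = sum(rng.chisquare(df, size=200_000) for df in r)

# Compare with chi-square(r1 + r2 + r3) via a Kolmogorov-Smirnov test.
print(stats.kstest(y, "chi2", args=(sum(r),)))  # large p-value: same distribution
```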
