The Moments of Y • We have referred to $E(Y)$ and $E(Y^2)$ as the first and second moments of Y, respectively. In general, $E(Y^k)$ is the kth moment of Y. • Consider the polynomial in which the moments of Y are incorporated into the coefficients: $1 + tE(Y) + \frac{t^2}{2!}E(Y^2) + \frac{t^3}{3!}E(Y^3) + \cdots$
Moment Generating Function • If the sum converges for all t in some interval $|t| < b$, the polynomial is called the moment-generating function, $m(t)$, for the random variable Y. • And we may note that for each k, the coefficient of $t^k/k!$ is the kth moment: $m(t) = \sum_{k=0}^{\infty} \frac{t^k}{k!}\,E(Y^k)$.
Moment Generating Function • Hence, the moment-generating function is given by $m(t) = E(e^{tY})$, since $E(e^{tY}) = E\!\left(\sum_{k=0}^{\infty} \frac{(tY)^k}{k!}\right) = \sum_{k=0}^{\infty} \frac{t^k}{k!}\,E(Y^k)$. We may rearrange the expectation and the sum because the series is finite for $|t| < b$.
Moment Generating Function • That is, $m(t) = E(e^{tY})$ is the polynomial whose coefficients involve the moments of Y.
The kth moment • To retrieve the kth moment from the MGF, evaluate the kth derivative at t = 0: $\frac{d^k m}{dt^k} = \sum_{j \ge k} \frac{t^{j-k}}{(j-k)!}\,E(Y^j)$. • And so, letting t = 0, every term still carrying a power of t vanishes, leaving $m^{(k)}(0) = E(Y^k)$.
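As an illustration beyond the slides, the following sympy sketch applies this recipe to the Poisson MGF (chosen here purely as an example; any valid MGF would do):

```python
import sympy as sp

t, lam = sp.symbols('t lambda', positive=True)

# MGF of a Poisson(lambda) random variable: m(t) = exp(lambda*(e^t - 1))
m = sp.exp(lam * (sp.exp(t) - 1))

# kth moment = kth derivative of the MGF, evaluated at t = 0
for k in range(1, 4):
    moment = sp.expand(sp.diff(m, t, k).subs(t, 0))
    print(f"E(Y^{k}) =", moment)

# E(Y^1) = lambda
# E(Y^2) = lambda**2 + lambda
# E(Y^3) = lambda**3 + 3*lambda**2 + lambda
```

Note that $E(Y^2) - [E(Y)]^2 = \lambda$, recovering the familiar fact that a Poisson variable's variance equals its mean.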
Geometric MGF • For the geometric distribution, $m(t) = E(e^{tY}) = \sum_{y=1}^{\infty} e^{ty}\,q^{y-1}p = pe^t \sum_{y=1}^{\infty} (qe^t)^{y-1} = \frac{pe^t}{1 - qe^t}$, valid for $qe^t < 1$, where $q = 1 - p$.
Common MGFs • The MGFs for some of the discrete distributions we’ve seen include:
Binomial(n, p): $m(t) = (pe^t + q)^n$
Geometric(p): $m(t) = \dfrac{pe^t}{1 - qe^t}$
Negative binomial(r, p): $m(t) = \left(\dfrac{pe^t}{1 - qe^t}\right)^{r}$
Poisson(λ): $m(t) = e^{\lambda(e^t - 1)}$
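As a sanity check (an addition, not part of the slides), the script below numerically compares closed-form MGFs from this table against their defining series $\sum_y e^{ty} p(y)$ at an arbitrary point t = 0.3; the parameter values are chosen just for illustration:

```python
import math

t = 0.3  # arbitrary evaluation point

# Binomial(n=10, p=0.4): m(t) = (p*e^t + q)^n
n, p = 10, 0.4
closed = (p * math.exp(t) + (1 - p)) ** n
series = sum(math.comb(n, y) * p**y * (1 - p)**(n - y) * math.exp(t * y)
             for y in range(n + 1))
print(closed, series)   # the two agree

# Geometric(p=0.4): m(t) = p*e^t / (1 - q*e^t), valid while q*e^t < 1
closed = p * math.exp(t) / (1 - (1 - p) * math.exp(t))
series = sum((1 - p)**(y - 1) * p * math.exp(t * y) for y in range(1, 200))
print(closed, series)   # truncated series matches the closed form

# Poisson(lam=2): m(t) = exp(lam*(e^t - 1))
lam = 2.0
closed = math.exp(lam * (math.exp(t) - 1))
series = sum(math.exp(-lam) * lam**y / math.factorial(y) * math.exp(t * y)
             for y in range(60))
print(closed, series)
```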
Recognize the distribution • Identify the distribution having the given moment generating function. • Give the mean and variance for this distribution. • We could use the derivatives, but is that necessary? Matching the algebraic form of the MGF to a known family is quicker.
Geometric MGF • Consider the MGF $m(t) = \dfrac{e^t}{3 - 2e^t}$. • Use derivatives to determine the first and second moments: $m'(t) = \dfrac{3e^t}{(3 - 2e^t)^2}$. And so, $E(Y) = m'(0) = 3$.
Geometric MGF • Since $m''(t) = \dfrac{3e^t(3 + 2e^t)}{(3 - 2e^t)^3}$, we have $E(Y^2) = m''(0) = 15$. And so, $V(Y) = E(Y^2) - [E(Y)]^2 = 15 - 3^2 = 6$. A symbolic check of these derivatives follows.
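A minimal sympy sketch (an addition for verification) confirming the two derivative evaluations above:

```python
import sympy as sp

t = sp.symbols('t')

# MGF under consideration: m(t) = e^t / (3 - 2e^t)
m = sp.exp(t) / (3 - 2 * sp.exp(t))

EY  = sp.simplify(sp.diff(m, t, 1).subs(t, 0))   # first moment
EY2 = sp.simplify(sp.diff(m, t, 2).subs(t, 0))   # second moment
print(EY, EY2, sp.simplify(EY2 - EY**2))         # 3, 15, 6
```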
Geometric MGF • Since $m(t) = \dfrac{e^t}{3 - 2e^t} = \dfrac{(1/3)e^t}{1 - (2/3)e^t}$ is the MGF of a geometric random variable with p = 1/3, our prior results tell us $E(Y) = 1/p = 3$ and $V(Y) = (1 - p)/p^2 = 6$, which agree with our current results.
All the moments • Although the mean and variance help to describe a distribution, they alone do not uniquely determine it. • All the moments are necessary to uniquely describe a probability distribution. • That is, if two random variables have equal MGFs (i.e., $m_Y(t) = m_Z(t)$ for $|t| < b$), then they have the same probability distribution.
m(aY+b)? • For the random variable Y with MGF $m(t)$, consider $W = aY + b$. Then $m_W(t) = E(e^{tW}) = E(e^{t(aY+b)}) = e^{tb}\,E(e^{(at)Y}) = e^{tb}\,m(at)$. • Construct the MGF for the random variable $W = 2Y + 3$, where Y is a geometric random variable with p = 4/5: $m_W(t) = e^{3t}\,m(2t) = e^{3t}\cdot\dfrac{(4/5)e^{2t}}{1 - (1/5)e^{2t}} = \dfrac{4e^{5t}}{5 - e^{2t}}$. (A symbolic check follows below.)
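As a quick check (not from the original slides), this sympy sketch confirms the MGF of W = 2Y + 3 and, anticipating the next slide, evaluates $m_W'(0)$:

```python
import sympy as sp

t = sp.symbols('t')
a, b, p = 2, 3, sp.Rational(4, 5)

# Geometric MGF with p = 4/5: m(s) = p*e^s / (1 - q*e^s)
m = lambda s: p * sp.exp(s) / (1 - (1 - p) * sp.exp(s))

# MGF of W = aY + b is e^(bt) * m(at)
mW = sp.exp(b * t) * m(a * t)
print(sp.simplify(mW))                          # 4*exp(5*t)/(5 - exp(2*t))

# E(W) from the MGF: derivative at t = 0 should equal a*E(Y) + b
print(sp.simplify(sp.diff(mW, t).subs(t, 0)))   # 11/2 = 2*(5/4) + 3
```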
E(aY+b) • Now, based on the MGF, we could again consider $E(W) = E(aY + b)$. Differentiating, $m_W'(t) = b\,e^{tb}\,m(at) + a\,e^{tb}\,m'(at)$. And so, letting t = 0 (and recalling $m(0) = 1$): $E(W) = m_W'(0) = b + a\,m'(0) = aE(Y) + b$, as expected.
Tchebysheff’s Theorem • For “bell-shaped” distributions, the empirical rule gave us a 68-95-99.7% rule for the probability that a value falls within 1, 2, or 3 standard deviations of the mean, respectively. • When the distribution is not so bell-shaped, Tchebysheff tells us the probability of being within k standard deviations of the mean is at least $1 - 1/k^2$, for k > 0: $P(|Y - \mu| < k\sigma) \ge 1 - \dfrac{1}{k^2}$. Remember, it’s just a lower bound.
A Skewed Distribution • Consider a binomial experiment with n = 10 and p = 0.1. Its probability histogram is skewed to the right, with mean $\mu = np = 1$ and standard deviation $\sigma = \sqrt{npq} = \sqrt{0.9} \approx 0.949$.
A Skewed Distribution • Verify Tchebysheff’s lower bound for k = 2: the interval $\mu \pm 2\sigma = 1 \pm 1.897$ is $(-0.897,\, 2.897)$, so $P(|Y - \mu| < 2\sigma) = p(0) + p(1) + p(2) \approx 0.3487 + 0.3874 + 0.1937 = 0.9298$, which is indeed at least $1 - 1/2^2 = 0.75$. (A numerical check follows below.)
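A short script (an addition for verification) reproducing this computation exactly:

```python
from math import comb, sqrt

n, p = 10, 0.1
mu = n * p                      # 1.0
sigma = sqrt(n * p * (1 - p))   # ~0.949

k = 2
lo, hi = mu - k * sigma, mu + k * sigma   # (-0.897, 2.897)

# Exact probability that Y falls strictly within k standard deviations
prob = sum(comb(n, y) * p**y * (1 - p)**(n - y)
           for y in range(n + 1) if lo < y < hi)
print(round(prob, 4))           # 0.9298, comfortably above 1 - 1/k^2 = 0.75
```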