Section 8 – Joint, Marginal, and Conditional Distributions
Expectation of a function of Jointly Distributed RV's • Recall: for a single RV, E[g(X)] = Σ g(x) f(x) in the discrete case, or ∫ g(x) f(x) dx in the continuous case • Now: for jointly distributed RVs, E[g(X,Y)] = ΣΣ g(x,y) f(x,y), or ∫∫ g(x,y) f(x,y) dy dx
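Below is a minimal Python sketch of the "Now" formula in the discrete case. The joint PMF and the functions passed in are hypothetical, chosen only to show that E[g(X,Y)] is a probability-weighted sum of g(x,y) over every (x, y) pair.

```python
# Hypothetical discrete joint PMF stored as {(x, y): probability}.
joint_pmf = {
    (0, 0): 0.2, (0, 1): 0.3,
    (1, 0): 0.1, (1, 1): 0.4,
}

def expectation(g, pmf):
    """E[g(X, Y)] = sum over all (x, y) of g(x, y) * f(x, y)."""
    return sum(g(x, y) * p for (x, y), p in pmf.items())

print(expectation(lambda x, y: x * y, joint_pmf))         # E[XY]
print(expectation(lambda x, y: (x + y) ** 2, joint_pmf))  # E[(X + Y)^2]
```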
Marginal Distribution of X and Y • Before this chapter, we were dealing with one random variable • These RVs had f(x); this was the marginal distribution of X • fx(x) = probability that that value of X occurs • This is what we've already been doing! • Ex: In the coin-dice example, • X was a coin toss (0, 1) • Y was one die (1, 2, …, 6) for heads (x = 1) • Y was two dice (2, 3, …, 12) for tails (x = 0) • fx(0) = .5, fx(1) = .5 • fy(1) = fx,y(1,1), since y = 1 is impossible when x = 0 • fy(2) = fx,y(0,2) + fx,y(1,2) = (.5)(1/36) + (.5)(1/6) • Sum (over all x) any events where y = 2
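Here is a sketch of the coin-dice example above, assuming exactly the setup on the slide (heads: one die, tails: two dice); the dictionary layout and function names are mine, not from the notes.

```python
from fractions import Fraction

# X = coin (1 = heads, 0 = tails); heads rolls one die, tails rolls two dice.
joint = {}
for y in range(1, 7):                       # heads: Y = 1..6
    joint[(1, y)] = Fraction(1, 2) * Fraction(1, 6)
for d1 in range(1, 7):                      # tails: Y = sum of two dice
    for d2 in range(1, 7):
        key = (0, d1 + d2)
        joint[key] = joint.get(key, Fraction(0)) + Fraction(1, 2) * Fraction(1, 36)

def marginal_y(y):
    """fy(y) = sum over all x of fx,y(x, y)."""
    return sum(p for (x, yy), p in joint.items() if yy == y)

print(marginal_y(1))   # 1/12 (only possible when x = 1)
print(marginal_y(2))   # (.5)(1/36) + (.5)(1/6) = 7/72
```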
Marginal Distribution Formulas • fx(x) = Σy fx,y(x,y) in the discrete case, or fx(x) = ∫ fx,y(x,y) dy in the continuous case (and similarly for fy(y)) • Caution: When the probability space is non-rectangular, make sure to set the limits of integration correctly • Ex) 8-9
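To illustrate the caution about non-rectangular regions, here is a sketch with a hypothetical density f(x, y) = 2 on the triangle 0 < x < y < 1 (it assumes SciPy is available). The point is that the inner limits of integration depend on x.

```python
from scipy.integrate import quad

# Hypothetical joint density with triangular (non-rectangular) support:
# f(x, y) = 2 for 0 < x < y < 1, and 0 elsewhere.
def f(x, y):
    return 2.0 if 0 < x < y < 1 else 0.0

def marginal_x(x):
    # fx(x) = integral of f(x, y) dy; the support forces y to run from x to 1,
    # not from 0 to 1 -- this is the "set the limits correctly" caution.
    value, _ = quad(lambda y: f(x, y), x, 1)
    return value

print(marginal_x(0.25))   # analytically 2 * (1 - 0.25) = 1.5
```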
Expectation & Variance of Conditional • Expectation • Find the conditional PDF from the previous formula: f(y|x) = fx,y(x,y) / fx(x) • Apply the expectation formula like usual • Variance (trickier!) • Find the conditional PDF from the same formula • Find the conditional mean: E[Y|X=x] • Find the conditional second moment: E[Y^2|X=x] • Use the variance formula like usual with these components: Var[Y|X=x] = E[Y^2|X=x] - (E[Y|X=x])^2 • Recurring trap: getting the variance through the E[X^2] - (E[X])^2 formula • This trips people up in conditional distributions • It also trips people up when adding variances (covered in a previous lecture)
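A sketch of the recipe above for a small hypothetical discrete joint PMF; the function names are illustrative only.

```python
# Hypothetical discrete joint PMF {(x, y): probability}.
joint_pmf = {
    (0, 1): 0.10, (0, 2): 0.20,
    (1, 1): 0.30, (1, 2): 0.25, (1, 3): 0.15,
}

def conditional_pmf(x):
    """f(y|x) = fx,y(x, y) / fx(x)."""
    fx = sum(p for (xx, _), p in joint_pmf.items() if xx == x)
    return {y: p / fx for (xx, y), p in joint_pmf.items() if xx == x}

def conditional_mean_var(x):
    cond = conditional_pmf(x)
    mean = sum(y * p for y, p in cond.items())          # E[Y|X=x]
    second = sum(y ** 2 * p for y, p in cond.items())   # E[Y^2|X=x]
    return mean, second - mean ** 2                     # Var = E[Y^2] - (E[Y])^2

print(conditional_mean_var(1))
```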
Two Formulas for f(x,y) • For any X, Y: fx,y(x,y) = fx(x) * f(y|x) • Special case for when X, Y are independent: fx,y(x,y) = fx(x) * fy(y)
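A sketch of the independence special case: for a discrete joint PMF you can check whether f(x,y) factors into fx(x) * fy(y). Both example PMFs below are hypothetical.

```python
import itertools

def marginals(pmf):
    """Return the marginal PMFs fx and fy of a joint PMF {(x, y): p}."""
    fx, fy = {}, {}
    for (x, y), p in pmf.items():
        fx[x] = fx.get(x, 0.0) + p
        fy[y] = fy.get(y, 0.0) + p
    return fx, fy

def looks_independent(pmf, tol=1e-12):
    """True if f(x, y) = fx(x) * fy(y) for every (x, y) pair."""
    fx, fy = marginals(pmf)
    return all(abs(pmf.get((x, y), 0.0) - fx[x] * fy[y]) < tol
               for x, y in itertools.product(fx, fy))

independent = {(0, 0): 0.12, (0, 1): 0.28, (1, 0): 0.18, (1, 1): 0.42}
dependent   = {(0, 0): 0.30, (0, 1): 0.10, (1, 0): 0.10, (1, 1): 0.50}

print(looks_independent(independent))  # True
print(looks_independent(dependent))    # False
```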
Covariance between X & Y • Cov(X,Y) = E[(X - E[X])(Y - E[Y])] = E[XY] - E[X]*E[Y] • Covariance = 0 for independent X, Y • Positive when large X tends to occur with large Y • Negative when large X tends to occur with small Y (and vice versa) • The formula is similar to our familiar variance formula
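A sketch computing Cov(X,Y) = E[XY] - E[X]*E[Y] for a hypothetical discrete joint PMF.

```python
# Hypothetical joint PMF where large X tends to occur with large Y.
joint_pmf = {
    (0, 0): 0.30, (0, 1): 0.10,
    (1, 0): 0.10, (1, 1): 0.50,
}

e_xy = sum(x * y * p for (x, y), p in joint_pmf.items())  # E[XY]
e_x  = sum(x * p for (x, y), p in joint_pmf.items())      # E[X]
e_y  = sum(y * p for (x, y), p in joint_pmf.items())      # E[Y]

cov = e_xy - e_x * e_y
print(cov)   # about 0.14 (positive covariance)
```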
Moment Generating Function of a Joint Distribution • M(t1,t2) = E[e^(t1*X + t2*Y)]; both the direct E[e^(tX)]-style expectation and the M'(0)-style derivative approaches work • Partial derivatives at t1 = t2 = 0 give E(X), E(Y), and E(XY), and from these you can get Cov(X,Y)
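A sketch using SymPy (my assumption, not part of the notes) to build the joint MGF of a hypothetical discrete distribution and read off E(X), E(Y), E(XY), and hence Cov(X,Y) from partial derivatives at zero.

```python
import sympy as sp

t1, t2 = sp.symbols('t1 t2')

# Hypothetical discrete joint PMF; M(t1, t2) = E[e^(t1*X + t2*Y)].
joint_pmf = {(0, 0): sp.Rational(3, 10), (0, 1): sp.Rational(1, 10),
             (1, 0): sp.Rational(1, 10), (1, 1): sp.Rational(1, 2)}

M = sum(p * sp.exp(t1 * x + t2 * y) for (x, y), p in joint_pmf.items())

E_X  = sp.diff(M, t1).subs({t1: 0, t2: 0})       # dM/dt1 at (0, 0)
E_Y  = sp.diff(M, t2).subs({t1: 0, t2: 0})       # dM/dt2 at (0, 0)
E_XY = sp.diff(M, t1, t2).subs({t1: 0, t2: 0})   # d^2M/dt1dt2 at (0, 0)

print(E_X, E_Y, E_XY, E_XY - E_X * E_Y)          # last value is Cov(X, Y)
```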
Bivariate Normal Distribution • The Bivariate Normal Distribution has occurred infrequently on Exam P • More information in Actex (p. 242) • Our time can be used more efficiently; e-mail me with any questions/problems: poa5010@psu.edu
Properties • If X and Y are independent, expectations of products are products of expectations: • E[g(X) * h(Y)] = E[g(X)] * E[h(Y)] • Particularly useful: E[XY] = E[X] * E[Y] • This is why Cov(X,Y) = 0 for independent X and Y • Cov[aX + bY + c , dZ + eW + f] = ad*Cov[X,Z] + ae*Cov[X,W] + bd*Cov[Y,Z] + be*Cov[Y,W] • Lower-case letters are constants, upper-case are RVs • Var[aX + bY + c] = a^2*Var[X] + b^2*Var[Y] + 2ab*Cov[X,Y] • Trap: the signs of a and b affect the Cov term
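A quick numerical sanity check of the Var[aX + bY + c] formula and the sign of the 2ab*Cov term, using simulated correlated data; NumPy and every name here are my own assumptions, not from the notes.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=1_000_000)
y = 0.5 * x + rng.normal(size=1_000_000)       # y is positively correlated with x

a, b, c = 2.0, -3.0, 5.0                       # note: b is negative
lhs = np.var(a * x + b * y + c)
rhs = (a**2 * np.var(x) + b**2 * np.var(y)
       + 2 * a * b * np.cov(x, y, bias=True)[0, 1])

print(lhs, rhs)   # nearly equal; the 2ab*Cov term carries the sign of a and b
```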
Formulas to Understand Graphically • Make sure that you understand these formulas as graphical concepts • Be able to set up these problems by thinking instead of memorizing
There's some real work to do • From here on out, STAT 414 is really not enough • Go through all of the examples in the chapter and be comfortable setting up double integrals to find probabilities • Understand all of the properties/formulas • There are more properties on p. 243-244 than what I covered • This is some of the most conceptually difficult material on the entire exam, and it is frequently tested • Practice problems! • Bring questions for next time!