Mixed Moments, Correlation & Independence, Independent Sum
Tutorial 7, STAT1301 Fall 2010, 09 NOV 2010, MB103@HKU
By Joseph Dong
Recall: Univariate Moment of Order $k$, and Generalization: Mixed Moments
• The $k$-th moment of a random variable $X$ is defined as $E[X^k]$.
• The $k$-th central moment of a random variable $X$ is defined as $E[(X-EX)^k]$.
• Q: How to generalize these two definitions to the case of a random vector of $n$ dimensions?
• A: We can define the mixed moments of an $n$-dimensional random vector.
• Define the mixed moment of order $(k_1,\ldots,k_n)$ of a random vector $(X_1,\ldots,X_n)$ as $E[X_1^{k_1}\cdots X_n^{k_n}]$.
• Define the mixed central moment of order $(k_1,\ldots,k_n)$ of a random vector $(X_1,\ldots,X_n)$ as $E[(X_1-EX_1)^{k_1}\cdots(X_n-EX_n)^{k_n}]$.
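The univariate definitions above can be sketched in code. This is a minimal illustration, not part of the tutorial: the pmf below is a made-up discrete distribution, and the moments are computed by direct summation.

```python
# Hypothetical pmf of a discrete random variable X (numbers made up).
pmf = {0: 0.25, 1: 0.5, 2: 0.25}

def moment(pmf, k):
    """k-th moment E[X^k], by direct summation over the pmf."""
    return sum(p * x**k for x, p in pmf.items())

def central_moment(pmf, k):
    """k-th central moment E[(X - EX)^k]."""
    ex = moment(pmf, 1)
    return sum(p * (x - ex)**k for x, p in pmf.items())

mean = moment(pmf, 1)           # the 1st moment is the expectation
var = central_moment(pmf, 2)    # the 2nd central moment is the variance
```

The same summation pattern generalizes to mixed moments by summing over a joint pmf instead of a marginal one.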
Bivariate Mixed Moment of $X$ and $Y$
• The mixed moment of order $(j,k)$ of $X$ and $Y$ is defined by $E[X^j Y^k]$.
• The mixed central moment of order $(j,k)$ of $X$ and $Y$ is defined by $E[(X-EX)^j (Y-EY)^k]$.
• The covariance of two random variables is defined as the 2nd-order bivariate mixed central moment, of order $(1,1)$: $\operatorname{Cov}(X,Y)=E[(X-EX)(Y-EY)]$.
• Covariance is a bivariate concept.
• A convenient identity: $\operatorname{Cov}(X,Y)=E[XY]-E[X]\,E[Y]$.
• Properties of $\operatorname{Cov}$:
• Symmetry: $\operatorname{Cov}(X,Y)=\operatorname{Cov}(Y,X)$.
• Positive semi-definiteness: $\operatorname{Cov}(X,X)=\operatorname{Var}(X)\ge 0$.
• Linearity: $\operatorname{Cov}(aX+bY,Z)=a\operatorname{Cov}(X,Z)+b\operatorname{Cov}(Y,Z)$.
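The convenient identity can be checked numerically. A minimal sketch, using a small made-up joint pmf (all numbers hypothetical), computing the covariance both from the definition and from the identity:

```python
# Hypothetical joint pmf of (X, Y) on a small finite state space.
pmf = {(0, 0): 0.2, (0, 1): 0.3, (1, 0): 0.1, (1, 1): 0.4}

def e(f):
    """Expectation of f(X, Y) under the joint pmf, by direct summation."""
    return sum(p * f(x, y) for (x, y), p in pmf.items())

ex, ey = e(lambda x, y: x), e(lambda x, y: y)
exy = e(lambda x, y: x * y)

cov_def = e(lambda x, y: (x - ex) * (y - ey))   # from the definition
cov_identity = exy - ex * ey                    # from the identity
```

Both expressions give the same number, which is exactly what the algebraic identity promises.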
Standardization in Statistics
• Express the position of a number label using its distance from the expectation, measured in multiples of the standard deviation.
• This is as if we re-coordinatize the state space, using the location of the expectation as the origin and the standard deviation as the unit length.
• Standardization of a random variable is a one-one transformation (a centralization plus a rescaling) of the random variable.
• Using angle brackets to denote the resultant random variable of standardization: $\langle X\rangle = (X - EX)/\sqrt{\operatorname{Var}X}$.
• Purpose of standardization: ease of describing positions. For example:
• What is the relative position of the number label 5.3 in the state space of a given normally distributed random variable $X$? Standardizing 5.3 gives 3.25. Since $\langle X\rangle$ follows $N(0,1)$ (show this if you are not convinced), and 3.25 is a very high quantile, 5.3 is located unusually far to the right in the original state space.
Correlation as Standardized Covariance
• Covariance is a bivariate concept; so is correlation.
• Compare:
• Covariance of $X$ & $Y$: $\operatorname{Cov}(X,Y)=E[(X-EX)(Y-EY)]$.
• Quick question: what if $X$ and $Y$ are independent? (Then $E[XY]=E[X]E[Y]$, so the covariance is 0.)
• Correlation ($\rho$) of $X$ & $Y$: $\rho(X,Y)=\operatorname{Cov}(\langle X\rangle,\langle Y\rangle)=\dfrac{\operatorname{Cov}(X,Y)}{\sqrt{\operatorname{Var}X\,\operatorname{Var}Y}}$.
• Very interestingly, the correlation of any pair of r.v.'s is always bounded, $-1\le\rho\le 1$, while their covariance can explode.
• Pf. By the Cauchy–Schwarz inequality, $|\operatorname{Cov}(X,Y)|\le\sqrt{\operatorname{Var}X\,\operatorname{Var}Y}$; dividing through gives $|\rho|\le 1$.
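A quick numerical sketch of correlation as standardized covariance, on a small made-up joint pmf (hypothetical numbers, not from the handout):

```python
import math

# Hypothetical joint pmf of (X, Y).
pmf = {(0, 0): 0.2, (0, 1): 0.3, (1, 0): 0.1, (1, 1): 0.4}

def e(f):
    """Expectation of f(X, Y) under the joint pmf."""
    return sum(p * f(x, y) for (x, y), p in pmf.items())

ex, ey = e(lambda x, y: x), e(lambda x, y: y)
var_x = e(lambda x, y: (x - ex) ** 2)
var_y = e(lambda x, y: (y - ey) ** 2)
cov = e(lambda x, y: (x - ex) * (y - ey))

# Dividing by the standard deviations standardizes the covariance.
rho = cov / math.sqrt(var_x * var_y)   # guaranteed to lie in [-1, 1]
```

Rescaling either variable changes the covariance but leaves `rho` unchanged, which is the point of standardizing.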
Exploring Correlation at demonstrations.wolfram.com
• Download the Mathematica™ Player if you don't have one.
• Search "correlation"
• And explore…
Covariance/Correlation Calculation Exercises
• Handout Problem 1
• Handout Problem 2
• Handout Problem 3
• Find the correlation of two random variables $X$ and $Y$ that are functionally dependent, with the functional relation as given. What about the other relation?
Independent Sum
• "Independent sum" refers to the sum $S = X_1 + \cdots + X_n$ of independent random variables.
• $S$ is a random variable itself: it has a sample space, a state space, and a probability measure (and distribution) on the sample space.
• We are now interested in finding the following moments of $S$:
• Expectation (too easy; independence is not actually needed): just sum the expectations, $E[S]=\sum_i E[X_i]$.
• Variance (a bit of proof work required; uses the independence of all the $X_i$): it turns out that this is also just the sum of the variances, $\operatorname{Var}S=\sum_i\operatorname{Var}X_i$. The proof uses the properties of covariance: under independence the cross terms $\operatorname{Cov}(X_i,X_j)$, $i\ne j$, all vanish.
• MGF (now easy, because we have proved Theorem C of Tutorial 6): $M_S(t)=\prod_i M_{X_i}(t)$. Finding the MGF is equivalent to finding the distribution.
• If we consider a pair of independent sums, we are also interested in finding their covariance (this is easy too), using the properties of covariance.
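The expectation and variance claims can be verified exactly on a concrete independent sum. A sketch using two independent fair dice (my own example, not from the handout), where the pmf of $S = X + Y$ is built from the product of the marginals:

```python
from itertools import product

# pmf of one fair die.
die = {k: 1 / 6 for k in range(1, 7)}

def mean(pmf):
    return sum(p * x for x, p in pmf.items())

def var(pmf):
    m = mean(pmf)
    return sum(p * (x - m) ** 2 for x, p in pmf.items())

# Under independence the joint pmf is the product of the marginals,
# so the pmf of S = X + Y follows by summing over pairs.
sum_pmf = {}
for (x, px), (y, py) in product(die.items(), die.items()):
    sum_pmf[x + y] = sum_pmf.get(x + y, 0) + px * py

# mean(sum_pmf) should equal mean(die) + mean(die),
# and var(sum_pmf) should equal var(die) + var(die).
```

Dropping independence (e.g. setting $Y = X$) would break the variance claim but not the expectation claim, matching the slide.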
Independent Sum: Finding Its Distribution
• The previous slide gives one method for finding the distribution of $S$: through its moment generating function, since under the independence condition the MGF is very convenient to derive. But there is one problem: what if you do not recognize the resulting form of the MGF? (Assuming we are blind to the divinely clever integral-transform methods.)
• We can also find the distribution of $S$ by working with the probability measure directly, e.g. by convolving the distributions of the summands.
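For discrete random variables, working with the measure directly means the convolution sum $P(S=s)=\sum_x P(X=x)\,P(Y=s-x)$. A minimal sketch with two hypothetical independent Bernoulli(1/2) variables, chosen so the answer is recognizable:

```python
# Hypothetical pmfs of two independent summands.
px = {0: 0.5, 1: 0.5}   # X ~ Bernoulli(1/2)
py = {0: 0.5, 1: 0.5}   # Y ~ Bernoulli(1/2), independent of X

def convolve(px, py):
    """pmf of S = X + Y for independent X, Y: P(S=s) = sum_x px(x) py(s-x)."""
    ps = {}
    for x, p in px.items():
        for y, q in py.items():
            ps[x + y] = ps.get(x + y, 0) + p * q
    return ps

ps = convolve(px, py)   # the Binomial(2, 1/2) pmf on {0, 1, 2}
```

Here the convolution lands on a distribution we recognize directly, with no need to invert an MGF.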