MF-852 Financial Econometrics Lecture 8 Introduction to Multiple Regression Roy J. Epstein Fall 2003
Topics
• Formulation and Estimation of a Multiple Regression
• Interpretation of the Regression Coefficients
• Omitted Variables
• Collinearity
• Advanced Hypothesis Testing
Multiple Regression
• Used when 2 or more independent variables explain the dependent variable:
Yi = β0 + β1X1i + β2X2i + … + βkXki + ei
or, in matrix form, Yi = Xiβ + ei
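A minimal sketch of estimating such a regression by ordinary least squares with numpy; the sample size, coefficient values, and variable names are purely illustrative, not taken from the lecture.

```python
import numpy as np

# Simulated data: Y = b0 + b1*X1 + b2*X2 + e, with made-up coefficients
rng = np.random.default_rng(0)
n = 200
X1 = rng.normal(size=n)
X2 = rng.normal(size=n)
e = rng.normal(size=n)
Y = 1.0 + 2.0 * X1 - 0.5 * X2 + e

# Stack a column of ones with the regressors: Y = Xb + e in matrix form
X = np.column_stack([np.ones(n), X1, X2])

# OLS estimate b_hat = (X'X)^(-1) X'Y, computed via least squares
b_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
print("Estimated coefficients (intercept, b1, b2):", b_hat)
```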
The Error Term
• Same assumptions as before:
E(ei) = 0
var(ei) = σ²
cov(X, e) = 0
cov(ei, ej) = 0 for i ≠ j
The Estimated Coefficients
• Measure the marginal effect of an independent variable, controlling for the other effects.
• I.e., the effect of each X "all else equal."
• Can be sensitive to what other variables are included in the regression.
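One way to see the "all else equal" interpretation is the partialling-out (Frisch–Waugh) result: the multiple-regression coefficient on X1 equals the coefficient from regressing Y on X1 after both have been purged of X2. A small numpy sketch on simulated data (all numbers are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
X2 = rng.normal(size=n)
X1 = 0.6 * X2 + rng.normal(size=n)          # X1 and X2 are correlated
Y = 1.0 + 2.0 * X1 + 3.0 * X2 + rng.normal(size=n)

ones = np.ones(n)

# Full regression of Y on a constant, X1 and X2
X_full = np.column_stack([ones, X1, X2])
b_full, *_ = np.linalg.lstsq(X_full, Y, rcond=None)

# "Partialling out": regress Y and X1 on the other regressors (constant, X2),
# then regress the Y-residuals on the X1-residuals
Z = np.column_stack([ones, X2])
resid = lambda v: v - Z @ np.linalg.lstsq(Z, v, rcond=None)[0]
b1_partial = np.linalg.lstsq(resid(X1).reshape(-1, 1), resid(Y), rcond=None)[0][0]

print("b1 from full regression:   ", b_full[1])
print("b1 via partialling out X2: ", b1_partial)   # same number
```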
Omitted Variables
• Suppose the true model is:
Yi = β0 + β1X1i + β2X2i + ei
• But you leave out X2 (by ignorance or lack of data). Does it matter?
Analysis of Omitted Variables
• The error term now includes e and X2:
Yi = β0 + β1X1i + ui = β0 + β1X1i + [β2X2i + ei]
• Two cases:
• X2 correlated with X1: the estimate of β1 is biased, because it picks up the effect of X2 and attributes it to X1.
• X2 uncorrelated with X1: no bias.
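A quick simulation of the biased case, assuming numpy is available (true coefficients chosen arbitrarily): when X1 and X2 are correlated, the "short" regression's estimate of β1 averages well above its true value of 2.

```python
import numpy as np

rng = np.random.default_rng(2)
n, reps = 200, 2000
b1_short = np.empty(reps)

for r in range(reps):
    X2 = rng.normal(size=n)
    X1 = 0.8 * X2 + rng.normal(size=n)                   # X1 correlated with X2
    Y = 1.0 + 2.0 * X1 + 3.0 * X2 + rng.normal(size=n)   # true b1 = 2

    # "Short" regression that omits X2
    Xs = np.column_stack([np.ones(n), X1])
    b1_short[r] = np.linalg.lstsq(Xs, Y, rcond=None)[0][1]

print("average estimate of b1 with X2 omitted:", b1_short.mean())  # well above 2
```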
Collinearity
• Let Yi = β0 + β1X1i + β2X2i + ei
• Suppose X1 and X2 are highly correlated.
• What difference does it make?
• Hard to estimate β1 and β2.
• No bias, but large standard errors.
Collinearity—Diagnosis
• Neither X1 nor X2 has a significant t statistic, BUT
• X1 is significant when X2 is left out of the regression, and vice versa.
• Test joint significance with an F test.
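A sketch of this diagnosis pattern using numpy and scipy, with made-up data: X2 is constructed to be nearly identical to X1, so the individual t statistics are small while the joint F test is highly significant.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n = 60
X1 = rng.normal(size=n)
X2 = X1 + 0.05 * rng.normal(size=n)          # X2 is nearly a copy of X1
Y = 1.0 + 1.0 * X1 + 1.0 * X2 + rng.normal(size=n)

X = np.column_stack([np.ones(n), X1, X2])
b = np.linalg.solve(X.T @ X, X.T @ Y)
resid = Y - X @ b
k = X.shape[1]
s2 = resid @ resid / (n - k)
se = np.sqrt(s2 * np.diag(np.linalg.inv(X.T @ X)))
print("t statistics (intercept, b1, b2):", b / se)   # b1, b2 individually "insignificant"

# F test of the joint hypothesis b1 = b2 = 0 (restricted model: intercept only)
rss_full = resid @ resid
rss_restr = ((Y - Y.mean()) ** 2).sum()
q = 2
F = ((rss_restr - rss_full) / q) / (rss_full / (n - k))
print("F statistic:", F, "p-value:", stats.f.sf(F, q, n - k))   # jointly significant
```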
Exact Collinearity
• Let Yi = β0 + β1X1i + β2X2i + ei
• Suppose X2 is an exact linear function of X1
• E.g., X2 = a + bX1
• Then the model cannot be estimated at all!
• Can also occur with 3 or more X's.
Exact Collinearity—Example
• Regression to explain calories as a function of the fat content of foods
• X1 is fat in ounces per portion
• X2 is fat in the same food in grams
• Then X2i = 28.35 X1i
• Can't estimate Yi = β0 + β1X1i + β2X2i + ei
• Intuition: no independent information in X2.
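A short numpy check of the ounces/grams example (the calorie figures are invented for illustration): the design matrix has only two independent columns, so X'X cannot be inverted and β1 and β2 are not separately identified.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 50
X1 = rng.uniform(0.5, 3.0, size=n)       # fat per portion, in ounces
X2 = 28.35 * X1                          # the same fat, measured in grams
Y = 100 + 250 * X1 + rng.normal(scale=10, size=n)   # calories (made-up relationship)

X = np.column_stack([np.ones(n), X1, X2])

# Only 2 of the 3 columns are linearly independent, so the normal
# equations have no unique solution.
print("columns in X:", X.shape[1])
print("rank of X:   ", np.linalg.matrix_rank(X))            # 2: one column is redundant
print("condition number of X'X:", np.linalg.cond(X.T @ X))  # astronomically large
```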
Tests of Restrictions
• Suppose H0: β2 = 2β1 in
Yi = β0 + β1X1i + β2X2i + ei
• Test H0 with a reformulated model that embeds the restriction. Define δ = β2 − 2β1; then
Yi = β0 + β1(X1i + 2X2i) + δX2i + ei
• Under H0, δ = 0
• Can test with the usual t statistic
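A sketch of the reformulated test in numpy/scipy, with data simulated so that the restriction actually holds (coefficient values are illustrative): the coefficient on the extra X2 term estimates δ = β2 − 2β1, and its t statistic should be insignificant.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n = 200
X1 = rng.normal(size=n)
X2 = rng.normal(size=n)
Y = 1.0 + 1.5 * X1 + 3.0 * X2 + rng.normal(size=n)   # here b2 = 2*b1, so H0 is true

# Reparameterized model: Y = b0 + b1*(X1 + 2*X2) + d*X2 + e, with d = b2 - 2*b1
W = X1 + 2.0 * X2
X = np.column_stack([np.ones(n), W, X2])

b = np.linalg.solve(X.T @ X, X.T @ Y)
resid = Y - X @ b
s2 = resid @ resid / (n - X.shape[1])
se = np.sqrt(s2 * np.diag(np.linalg.inv(X.T @ X)))

t_delta = b[2] / se[2]                                 # t statistic for H0: d = 0
p_value = 2 * stats.t.sf(abs(t_delta), df=n - X.shape[1])
print("estimate of d = b2 - 2*b1:", b[2])
print("t statistic:", t_delta, "p-value:", p_value)    # H0 should not be rejected
```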
Test your Understanding!
• What is the difference between exact collinearity, e.g.,
X2i = 2X1i
• and a coefficient restriction, e.g.,
H0: β2 = 2β1 ?
• Relate the concepts to the model.