Multiple Linear Regression (MLR)
Assumptions • The model is linear in the parameters • The error terms are statistically independent • The independent variables are linearly independent • All populations have equal variances
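As a sketch of what a model satisfying all four assumptions looks like (the data and coefficients below are illustrative, not from the slides), a multiple linear regression can be fit by ordinary least squares:

```python
import numpy as np

# Illustrative data: y depends linearly on two independent variables.
rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)            # not a linear combination of x1
e = rng.normal(scale=0.5, size=n)  # independent, equal-variance errors
y = 2.0 + 1.5 * x1 - 0.7 * x2 + e

# Design matrix with an intercept column; fit by least squares.
X = np.column_stack([np.ones(n), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)  # roughly [2.0, 1.5, -0.7]
```

Each assumption is visible in how the data were built: the model is linear in the coefficients, the errors are drawn independently with one common variance, and neither regressor is a linear combination of the other.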
Not linear in parameters • Violation of assumptions
Linear in parameter but nonlinear in variables • Not a violation of assumptions
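Two illustrative equations (a sketch, not from the slides) make the distinction concrete:

```latex
% Linear in the parameters, though nonlinear in x: OLS still applies.
y = \beta_0 + \beta_1 x + \beta_2 x^2 + \varepsilon
% Not linear in the parameters: no rewriting makes the betas enter linearly.
y = \beta_0 e^{\beta_1 x} + \varepsilon
```

In the first equation $x^2$ is just another regressor, so the model is a linear function of $\beta_0, \beta_1, \beta_2$; in the second, $\beta_1$ sits inside the exponential.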
Violation of the assumption that the model is linear in parameters • This is called a mis-specification error. • This means the model has been written improperly. • There is such a thing as non-linear regression for fitting such models
The error terms are statistically independent • If the error terms are statistically independent, then the value of the error term at time t will not be correlated with the error terms at any other time period. • The ACF of statistically independent error terms will look like the ACF of white noise.
The error terms are statistically independent • The violation of this assumption is called serial correlation. • Serial correlation can be detected using the Durbin-Watson test or the ACF of the residuals. • Look at a plot of the residuals
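The Durbin-Watson statistic can be computed directly from the residuals. A sketch with made-up residual series (the formula is the standard one; everything else is illustrative):

```python
import numpy as np

def durbin_watson(residuals):
    """Durbin-Watson statistic: near 2 suggests no first-order serial
    correlation; near 0 suggests strong positive serial correlation."""
    e = np.asarray(residuals, dtype=float)
    return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)

rng = np.random.default_rng(1)
white = rng.normal(size=500)   # statistically independent errors
ar1 = np.empty(500)            # positively autocorrelated errors
ar1[0] = white[0]
for t in range(1, 500):
    ar1[t] = 0.8 * ar1[t - 1] + white[t]

print(durbin_watson(white))  # close to 2
print(durbin_watson(ar1))    # well below 2
```

For an AR(1) error with coefficient φ, the statistic is approximately 2(1 − φ), which is why values far below 2 flag positive serial correlation.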
Causes of serial correlation • Omitting a relevant variable from a regression equation. • A mis-specification error.
Consequences of serial correlation • The estimates of the standard errors of the regression coefficients will be wrong • So the t-tests and p-values will be wrong as well
Consequences for forecasting • Can be very severe: the regression ignores the predictable pattern left in the errors, so forecasts are less accurate and prediction intervals are misleading
Fixes for serial correlation • Find the missing relevant variable. • Write the regression equation correctly to avoid mis-specification. • Include a lagged dependent variable as a regressor
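A sketch of the lagged-dependent-variable fix (the data-generating process and coefficients are assumptions for illustration): when y carries momentum from its own past, regressing y[t] on x[t] alone leaves serially correlated residuals, while adding y[t-1] as a regressor absorbs that dynamic.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 300
x = rng.normal(size=n)
y = np.empty(n)
y[0] = 0.0
for t in range(1, n):
    # y depends on its own past value as well as on x.
    y[t] = 0.6 * y[t - 1] + 1.0 * x[t] + rng.normal(scale=0.3)

# Add the lagged dependent variable y[t-1] as an extra regressor.
X = np.column_stack([np.ones(n - 1), x[1:], y[:-1]])
beta, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
print(beta)  # roughly [0.0, 1.0, 0.6]
```

The coefficient on the lagged term recovers the persistence in y, so the remaining residuals behave like independent noise.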
The independent variables are linearly independent • The independent variables are not linearly independent if you can write one of them as some linear combination of the others.
Detection of multicollinearity • Plot the independent variables against each other. • Compute the correlations between the independent variables. • Look for logical inconsistencies in the regression statistics.
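The second detection method above can be sketched in code, along with variance inflation factors, a standard diagnostic not mentioned on the slide (the data are made up: x3 is deliberately built as a near-linear combination of x1 and x2):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
x3 = 2.0 * x1 - x2 + rng.normal(scale=0.05, size=n)  # nearly a linear combination

X = np.column_stack([x1, x2, x3])
# Pairwise correlations between the independent variables.
print(np.corrcoef(X, rowvar=False).round(2))

def vif(X, j):
    """Variance inflation factor: regress column j on the others;
    VIF = 1 / (1 - R^2). Values far above ~10 flag multicollinearity."""
    others = np.delete(X, j, axis=1)
    A = np.column_stack([np.ones(len(X)), others])
    fitted = A @ np.linalg.lstsq(A, X[:, j], rcond=None)[0]
    r2 = 1 - np.sum((X[:, j] - fitted) ** 2) / np.sum((X[:, j] - X[:, j].mean()) ** 2)
    return 1.0 / (1.0 - r2)

print([round(vif(X, j), 1) for j in range(3)])  # all three are badly inflated
```

Note that because the near-dependence involves all three variables, every VIF is inflated, not just x3's; this is why pairwise correlations alone can miss multicollinearity that VIFs catch.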
Fix for multicollinearity • Find this and you will become very famous amongst econometricians. • Can’t really omit one of the offending variables. • At times it doesn’t really matter for forecasting. (St. Louis Model)
Heteroscedasticity • Violation of the assumption that the populations that the samples come from all have the same variance.
Consequences of heteroscedasticity • Same as with serial correlation as far as the estimates of the standard errors of the coefficients are concerned • Makes the forecasts increasingly uncertain
Fix for heteroscedasticity • In this course we will take logs of the dependent variable and perhaps the logs of all variables • More sophisticated methods (such as weighted least squares) exist but are difficult to use and require a good deal of work
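A sketch of why the log transform helps, under the assumption (illustrative, not from the slides) that the errors are multiplicative, so the spread of y grows with its level:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 400
x = np.linspace(1, 10, n)
# Multiplicative errors: the spread of y grows with the level of y.
y = np.exp(0.5 + 0.3 * x) * rng.lognormal(sigma=0.2, size=n)

def residual_spread(dep):
    """Fit a straight line in x and compare the residual spread in the
    high-x half of the sample to the low-x half (ratio near 1 = equal)."""
    A = np.column_stack([np.ones(n), x])
    resid = dep - A @ np.linalg.lstsq(A, dep, rcond=None)[0]
    return resid[n // 2:].std() / resid[:n // 2].std()

print(residual_spread(y))          # well above 1: heteroscedastic
print(residual_spread(np.log(y)))  # near 1: variance stabilized
```

Taking logs turns the multiplicative error into an additive one with constant variance, which is exactly the equal-variances assumption the level regression violated.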