ANOVA for Regression ANOVA tests whether the regression model has any explanatory power. In the case of simple regression analysis, the ANOVA F test and the t test for the slope b1 are equivalent.
ANOVA for Regression MSE = SSE/(n - 2) for simple regression; MSR = SSR/p, where p = number of independent variables; F = MSR/MSE
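A minimal sketch of these calculations, assuming SSR, SSE, and the sample size n are already available; the numbers below are hypothetical, not from the slides' data.

```python
# Hypothetical sums of squares and sample size for a simple regression (p = 1).
SSR = 78.0   # regression sum of squares (hypothetical)
SSE = 12.0   # error sum of squares (hypothetical)
n = 30       # sample size (hypothetical)
p = 1        # number of independent variables in simple regression

MSR = SSR / p          # mean square due to regression
MSE = SSE / (n - 2)    # mean square error for simple regression
F = MSR / MSE          # ANOVA F statistic
print(F)
```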
ANOVA Hypothesis Test H0: β1 = 0 Ha: β1 ≠ 0 Reject H0 if F > Fα, or if p-value < α
ANOVA and Regression Fα = 4.21 given α = .05, df numerator = 1, df denominator = 27
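A critical value like this can be read from an F table or computed from the F distribution; a short sketch using scipy, with the α and degrees of freedom from the slide:

```python
from scipy import stats

# Upper-tail critical value of the F distribution for alpha = .05
# with 1 numerator and 27 denominator degrees of freedom.
F_crit = stats.f.ppf(1 - 0.05, dfn=1, dfd=27)
print(round(F_crit, 2))  # approximately 4.21
```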
Issues with Hypothesis Test Results • Correlation does NOT prove causation • The test does not prove we used the correct functional form
Confidence Interval for Estimated Mean Value of y xp = a particular or given value of x; yp = the value of the dependent variable at xp; E(yp) = the expected value of yp, i.e. E(y | x = xp)
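A sketch of the usual simple-regression confidence interval for the mean value of y at xp, which uses the standard error s·sqrt(1/n + (xp - x̄)²/Σ(xi - x̄)²); the data and xp below are hypothetical stand-ins, not the car age and price example.

```python
import numpy as np
from scipy import stats

# Hypothetical data; replace with the actual x, y observations.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([13.0, 11.5, 10.0, 8.2, 7.1])
x_p = 3.5       # given value of x (hypothetical)
alpha = 0.05

n = len(x)
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
s = np.sqrt(np.sum((y - (b0 + b1 * x)) ** 2) / (n - 2))   # standard error of estimate

y_p = b0 + b1 * x_p                                       # point estimate of E(y | x = x_p)
se_mean = s * np.sqrt(1 / n + (x_p - x.mean()) ** 2 / np.sum((x - x.mean()) ** 2))
t_crit = stats.t.ppf(1 - alpha / 2, df=n - 2)
print(y_p - t_crit * se_mean, y_p + t_crit * se_mean)     # confidence interval for the mean
```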
Computing b0 and b1, Example Using the car age and price data:
Confidence Interval of Conditional Mean Given 1 - α = .95 and df = 3:
Confidence Interval for Predicted Values of y A confidence interval for a predicted value of y must account for both the random error in the estimated regression line and the random deviation of individual values around that line.
Confidence Interval for Predicted Value of y Given 1 - α = .95 and df = 3:
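A sketch of the corresponding prediction interval for an individual value of y, which adds an extra "1 +" under the square root of the standard error compared with the interval for the mean; again the data and xp are hypothetical.

```python
import numpy as np
from scipy import stats

# Hypothetical data; replace with the actual x, y observations.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([13.0, 11.5, 10.0, 8.2, 7.1])
x_p = 3.5
alpha = 0.05

n = len(x)
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
s = np.sqrt(np.sum((y - (b0 + b1 * x)) ** 2) / (n - 2))

y_p = b0 + b1 * x_p
# Prediction interval standard error: note the leading 1 + for individual deviation.
se_ind = s * np.sqrt(1 + 1 / n + (x_p - x.mean()) ** 2 / np.sum((x - x.mean()) ** 2))
t_crit = stats.t.ppf(1 - alpha / 2, df=n - 2)
print(y_p - t_crit * se_ind, y_p + t_crit * se_ind)   # prediction interval for one new y
```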
Residual Plots Against x • Residual – the difference between the observed value and the predicted value • Look for: • Evidence of a nonconstant variance • Nonlinear relationship
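A minimal sketch of a residual plot against x for a fitted simple regression; the data are hypothetical and matplotlib is assumed to be available.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical data standing in for the fitted example.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([13.0, 11.5, 10.0, 8.2, 7.1])

b1, b0 = np.polyfit(x, y, 1)        # slope and intercept of the least-squares line
residuals = y - (b0 + b1 * x)       # observed minus predicted values

plt.scatter(x, residuals)
plt.axhline(0, linestyle="--")
plt.xlabel("x")
plt.ylabel("residual")
plt.title("Residuals vs. x: look for nonconstant variance or curvature")
plt.show()
```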
Regression and Outliers Outliers can have a disproportionate effect on the estimated regression line.
Regression and Outliers • One solution is to estimate the model with and without the outlier. • Questions to ask: • Is the value an error? • Does the value reflect some unique circumstance? • Does the data point provide unique information about values outside the range of the other observations?
Chapter 15 Multiple Regression
Regression Multiple Regression Model y = β0 + β1x1 + β2x2 + … + βpxp + ε Multiple Regression Equation E(y) = β0 + β1x1 + β2x2 + … + βpxp Estimated Multiple Regression Equation ŷ = b0 + b1x1 + b2x2 + … + bpxp
Multiple Regression, Example Predicted MPG for a car weighing 4,000 lbs, built in 1980, with 6 cylinders: -14.4 - .00652(4000) + .76(80) - .0741(6) = -14.4 - 26.08 + 60.8 - .4446 = 19.88
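The same arithmetic as a short sketch; the coefficients are taken from the slide's estimated equation, and the variable names are only illustrative labels.

```python
# Coefficients from the slide's estimated equation.
b0, b_weight, b_year, b_cyl = -14.4, -0.00652, 0.76, -0.0741

# Car weighing 4,000 lbs, model year 1980 (coded as 80), 6 cylinders.
weight, year, cylinders = 4000, 80, 6

mpg_hat = b0 + b_weight * weight + b_year * year + b_cyl * cylinders
print(round(mpg_hat, 2))  # about 19.88
```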
Multiple Regression Model SST = SSR + SSE
Multiple Coefficient of Determination The share of the variation explained by the estimated model. R2 = SSR/SST
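A minimal sketch, assuming SSR and SST are already known (hypothetical values below); note from the previous slide that SST = SSR + SSE.

```python
# Hypothetical sums of squares.
SSR = 78.0   # regression sum of squares
SST = 90.0   # total sum of squares

R2 = SSR / SST
print(R2)    # share of the variation explained by the estimated model
```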
F Test for Overall Significance H0: β1 = β2 = . . . = βp = 0 Ha: One or more of the parameters is not equal to zero Reject H0 if F > Fα, or if p-value < α F = MSR/MSE
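A sketch of the overall-significance F test, assuming SSR, SSE, n, and p are known (hypothetical values below); for multiple regression MSE uses n - p - 1 degrees of freedom, and the p-value comes from the upper tail of the F distribution.

```python
from scipy import stats

# Hypothetical quantities from a fitted multiple regression.
SSR, SSE = 78.0, 12.0
n, p = 30, 3                            # observations, independent variables

MSR = SSR / p
MSE = SSE / (n - p - 1)
F = MSR / MSE
p_value = stats.f.sf(F, p, n - p - 1)   # upper-tail probability of the F distribution
print(F, p_value)                       # reject H0 if F > F_alpha, i.e. p_value < alpha
```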
t Test for Coefficients H0: β1 = 0 Ha: β1 ≠ 0 Reject H0 if t < -tα/2 or t > tα/2, or if p-value < α t = b1/sb1, where the t distribution has n - p - 1 df
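A sketch of the t test for a single coefficient, assuming the estimate b1 and its standard error s_b1 are available from a fitted model (the values below are hypothetical).

```python
from scipy import stats

# Hypothetical coefficient estimate and its standard error.
b1, s_b1 = -0.00652, 0.0021
n, p = 30, 3                                     # observations, independent variables

t = b1 / s_b1
p_value = 2 * stats.t.sf(abs(t), df=n - p - 1)   # two-tailed p-value
print(t, p_value)                                # reject H0 if p_value < alpha
```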
Multicollinearity • When two or more independent variables are highly correlated. • When multicollinearity is severe, the estimated values of the coefficients will be unreliable. • Two rules of thumb for detecting multicollinearity (see the sketch below): • The absolute value of the correlation coefficient between two independent variables exceeds 0.7 • An independent variable is more highly correlated with another independent variable than with the dependent variable
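A sketch of the 0.7 rule of thumb applied to a correlation matrix; the DataFrame and column names are hypothetical, not the course data set.

```python
import pandas as pd

# Hypothetical independent variables from a car data set.
df = pd.DataFrame({
    "weight":    [2100, 2800, 3400, 3900, 4300],
    "cylinders": [4, 4, 6, 8, 8],
    "year":      [78, 79, 80, 81, 82],
})

corr = df.corr()
# Flag any pair of independent variables whose absolute correlation exceeds 0.7.
for i, a in enumerate(corr.columns):
    for b in corr.columns[i + 1:]:
        if abs(corr.loc[a, b]) > 0.7:
            print(f"possible multicollinearity: {a} and {b} (r = {corr.loc[a, b]:.2f})")
```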