LECTURE 5 • MULTIPLE REGRESSION TOPICS • SQUARED MULTIPLE CORRELATION • B AND BETA WEIGHTS • HIERARCHICAL REGRESSION MODELS • SETS OF INDEPENDENT VARIABLES • SIGNIFICANCE TESTING SETS • POWER • ERROR RATES
SQUARED MULTIPLE CORRELATION • Measure of variance accounted for by predictors • Always increases (or stays same) with additional predictors • Always >= 0 in OLS • More stable than individual predictors (compensatory effect across samples)
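As a concrete illustration of R² as the proportion of SSy accounted for by the regression, here is a minimal numpy sketch on made-up data (the variable names, coefficients, and sample size are purely illustrative, not from the lecture):

    import numpy as np

    rng = np.random.default_rng(0)
    n = 100
    X = rng.normal(size=(n, 2))                    # two made-up predictors
    y = 0.5 * X[:, 0] + 0.6 * X[:, 1] + rng.normal(size=n)

    X1 = np.column_stack([np.ones(n), X])          # add an intercept column
    b = np.linalg.lstsq(X1, y, rcond=None)[0]      # OLS coefficients

    yhat = X1 @ b
    SSreg = np.sum((yhat - y.mean()) ** 2)         # regression sum of squares
    SSe = np.sum((y - yhat) ** 2)                  # error (residual) sum of squares
    SSy = np.sum((y - y.mean()) ** 2)              # total sum of squares

    R2 = SSreg / SSy                               # squared multiple correlation
    print(R2, 1 - SSe / SSy)                       # identical in OLS with an intercept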
Multiple regression analysis • The test of the overall hypothesis that y is unrelated to all predictors, equivalent to • H0: ρ²y·123… = 0 • H1: ρ²y·123… > 0 • is tested by • F = [ R²y·123… / p ] / [ (1 − R²y·123…) / (n − p − 1) ] • F = [ SSreg / p ] / [ SSe / (n − p − 1) ]
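A minimal sketch of that overall F test expressed in terms of R², p, and n, with scipy supplying the upper-tail p-value; the values plugged in at the end (R² = .85, n = 50, p = 2) are hypothetical:

    from scipy import stats

    def overall_F(R2, n, p):
        """F test of H0: population squared multiple correlation = 0."""
        df_reg, df_err = p, n - p - 1
        F = (R2 / df_reg) / ((1 - R2) / df_err)
        return F, stats.f.sf(F, df_reg, df_err)    # upper-tail p-value

    print(overall_F(R2=0.85, n=50, p=2))           # hypothetical R², n, and p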
[Fig. 8.4: Venn diagram for multiple regression with two predictors and one outcome measure; regions show SSy, SSreg, SSe, ssx1, and ssx2]
[Fig. 8.5: Type I and Type III contributions; Venn diagram showing the Type I and Type III portions of SSy attributable to x1 and x2, with SSe the unexplained portion]
B and Beta Weights • B weights • have a valid t test (b divided by its standard error is t-distributed under multivariate normality) • Give the change in y per unit change in predictor x • “raw” or “unstandardized” coefficients
B and Beta Weights • Beta weights • are NOT t-distributed: no exact significance test for them • Give the change in y in standard deviation units per standard deviation change in predictor x • “standardized” coefficients • More easily interpreted
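To make the b-versus-beta distinction concrete, here is a minimal sketch on made-up data showing that a beta weight is just the raw b weight rescaled by sd(x)/sd(y); the simulated predictors and effect sizes are assumptions chosen only for illustration:

    import numpy as np

    rng = np.random.default_rng(1)
    n = 200
    X = rng.normal(scale=[1.0, 3.0], size=(n, 2))     # predictors on different scales (made up)
    y = 2.0 * X[:, 0] + 0.4 * X[:, 1] + rng.normal(size=n)

    X1 = np.column_stack([np.ones(n), X])
    b = np.linalg.lstsq(X1, y, rcond=None)[0][1:]     # raw (unstandardized) B weights, intercept dropped

    beta = b * X.std(axis=0, ddof=1) / y.std(ddof=1)  # beta = b * sd(x) / sd(y)
    print(b, beta)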
PATH DIAGRAM FOR REGRESSION – Beta weight form • [Diagram: X1 → Y with β = .5, X2 → Y with β = .6, r = .4 between X1 and X2, residual path e → Y = .387] • R² = (r²y1 + r²y2 − 2 ry1 ry2 r12) / (1 − r²12) = (.74² + .8² − 2(.74)(.8)(.4)) / (1 − .4²) = .85
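A quick numeric check of the figure's values, using the standard two-predictor formulas for the betas and R² in terms of the correlations ry1 = .74, ry2 = .8, and r12 = .4 (a sketch added here, not part of the original slides):

    r_y1, r_y2, r_12 = 0.74, 0.80, 0.40            # validities and predictor intercorrelation

    beta1 = (r_y1 - r_y2 * r_12) / (1 - r_12**2)   # = .5
    beta2 = (r_y2 - r_y1 * r_12) / (1 - r_12**2)   # = .6
    R2 = (r_y1**2 + r_y2**2 - 2 * r_y1 * r_y2 * r_12) / (1 - r_12**2)   # = .85
    e_path = (1 - R2) ** 0.5                       # residual path = .387
    print(beta1, beta2, R2, e_path)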
[Path diagram: DEPRESSION regressed on LOC. CON., SELF-EST, and SELF-REL; standardized path coefficients shown on the arrows; R² = .60]
PATH DIAGRAM FOR REGRESSIONS – Beta weight form • [Diagram: X1 and X2 (r = .35) predicting Y1 (R²y = .2, error path e1) and Y2 (R²y = .6, error path e2); beta weights shown on the paths]
HIERARCHICAL REGRESSION • Predictors entered in SETS • First set: causally prior variables, existing conditions, or a theoretically/empirically established structure • Next set added to see whether the model changes: • Mediation effect • Independent contribution to R-square
HIERARCHICAL REGRESSION • Sample-focused procedures: • Forward regression • Backward regression • Stepwise regression • Criteria may include: R-square change in sample, error reduction
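As a sketch of the sample-focused flavor, the following illustrates forward entry driven purely by sample R² change; the stopping rule (min_gain) and the simulated data are assumptions chosen for illustration, not the lecture's own procedure:

    import numpy as np

    def r_squared(y, cols):
        """Sample R² of an OLS fit of y on the given predictor columns (plus intercept)."""
        X1 = np.column_stack([np.ones(len(y))] + cols)
        yhat = X1 @ np.linalg.lstsq(X1, y, rcond=None)[0]
        return 1 - np.sum((y - yhat) ** 2) / np.sum((y - y.mean()) ** 2)

    def forward_selection(y, X, min_gain=0.01):
        """Greedy forward entry: at each step add the predictor giving the largest sample R² gain."""
        selected, remaining, best_R2 = [], list(range(X.shape[1])), 0.0
        while remaining:
            gains = {j: r_squared(y, [X[:, k] for k in selected + [j]]) for j in remaining}
            j_best = max(gains, key=gains.get)
            if gains[j_best] - best_R2 < min_gain:   # stop when the sample gain is negligible
                break
            selected.append(j_best)
            remaining.remove(j_best)
            best_R2 = gains[j_best]
        return selected, best_R2

    rng = np.random.default_rng(2)                   # made-up data: only predictors 0 and 1 matter
    X = rng.normal(size=(150, 4))
    y = 0.7 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=150)
    print(forward_selection(y, X))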
STATISTICAL TESTING – Single additional predictor • R-square change: F test for the increase in SS due to the single added predictor B, relative to MSerror for the complete model: F(1, dfe) = (SSA+B − SSA) / MSeAB • Equivalently, t = byB / se(byB) for the added predictor • [Venn diagram: set A and predictor B overlapping Y, with SSe the unexplained portion]
STATISTICAL TESTING – Sets of predictors • R-square change: F test for the increase in SS due to the added set of p predictors, relative to MSerror for the complete model: F(p, dfe) = ((SSA+B − SSA) / p) / MSeAB • [Venn diagram: set A and set B (a set of p predictors) overlapping Y, with SSe the unexplained portion]
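A minimal sketch of this set-based R²-change F test, written in the equivalent R² form F = ((R²A+B − R²A)/p) / ((1 − R²A+B)/dfe); with a one-column set B it reduces to the single-predictor case above. The data and set sizes below are hypothetical:

    import numpy as np
    from scipy import stats

    def r_squared(y, X):
        """Sample R² of an OLS fit of y on the columns of X (plus intercept)."""
        X1 = np.column_stack([np.ones(len(y)), X])
        yhat = X1 @ np.linalg.lstsq(X1, y, rcond=None)[0]
        return 1 - np.sum((y - yhat) ** 2) / np.sum((y - y.mean()) ** 2)

    def r2_change_F(y, X_A, X_B):
        """F test of the R² increase when the set B (p columns) is added to the set A."""
        n, p = len(y), X_B.shape[1]
        X_AB = np.column_stack([X_A, X_B])
        R2_A, R2_AB = r_squared(y, X_A), r_squared(y, X_AB)
        df_err = n - X_AB.shape[1] - 1               # error df for the complete (A+B) model
        F = ((R2_AB - R2_A) / p) / ((1 - R2_AB) / df_err)
        return F, stats.f.sf(F, p, df_err)

    rng = np.random.default_rng(3)                   # hypothetical sets: A has 2 IVs, B adds 2 more
    X_A, X_B = rng.normal(size=(120, 2)), rng.normal(size=(120, 2))
    y = 0.6 * X_A[:, 0] + 0.4 * X_B[:, 0] + rng.normal(size=120)
    print(r2_change_F(y, X_A, X_B))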
Experimentwise Error Rate • Bonferroni bound: αtotal ≤ α1 + α2 + α3 + … • Allocate error differentially according to theory: • Predicted variables should get a liberal error rate for deletion (e.g., .05 to retain in the model) • Unpredicted additional variables should get a conservative error rate for entry (e.g., .01 to add to the model)
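A tiny sketch of the Bonferroni bound with differential allocation; the per-test alphas and set names here are hypothetical:

    # Bonferroni: the experimentwise error rate is at most the sum of the per-test alphas
    alphas = {
        "predicted_IVs":   [0.05, 0.05],             # liberal criterion for theoretically predicted variables
        "unpredicted_IVs": [0.01, 0.01, 0.01],       # conservative criterion for additional variables
    }
    per_test = [a for group in alphas.values() for a in group]
    print(sum(per_test))   # 0.13: upper bound on the chance of any false rejection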