Chapter 12 Prediction/Regression Part 3: Nov. 21, 2013
Multiple Regression
• Bivariate prediction – 1 predictor, 1 criterion
• Multiple regression – use multiple predictors
• The regression model/equations are the same; we just use a separate regression coefficient (b) for each predictor
• Ex) multiple regression formula with three predictor variables: Ŷ = a + b1X1 + b2X2 + b3X3 (see the sketch after this slide)
• a is still the regression constant (where the regression line crosses the y axis)
• b1 is the regression coefficient for X1
• b2 is the regression coefficient for X2, etc.
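A minimal sketch of fitting this three-predictor model in Python with statsmodels; the data values below are made up purely for illustration and are not from the chapter.

import numpy as np
import statsmodels.api as sm

# Made-up data: 50 cases, three predictor variables X1, X2, X3
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = 2.0 + 0.5*X[:, 0] - 0.3*X[:, 1] + 0.8*X[:, 2] + rng.normal(scale=0.5, size=50)

# Fit Y-hat = a + b1*X1 + b2*X2 + b3*X3
fit = sm.OLS(y, sm.add_constant(X)).fit()
a = fit.params[0]              # regression constant a
b1, b2, b3 = fit.params[1:]    # unstandardized regression coefficients b1, b2, b3
print(a, b1, b2, b3)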
Standardized regression coefficients
• With bivariate regression, we discussed finding the slope of the regression line, b
• b = unstandardized regression coefficient
• based on the original scale of measurement
• But we are sometimes interested in comparing our regression results to other researchers’ results
• …they may have used the same variables but different measures
• Standardized regression coefficients (β or beta) let us compare (more generalizable)
Using standardized coefficients (betas)
• There is a formula for changing b into β in the chapter, but you won’t be asked to use it
• So the regression equation (model) would look like this if we use standardized regression coefficients (β): Ẑy = β1ZX1 + β2ZX2 + β3ZX3 (a sketch of computing betas follows)
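One common way to obtain the betas without the conversion formula is to z-score the criterion and the predictors and rerun the regression; a minimal sketch with made-up data:

import numpy as np
import statsmodels.api as sm

# Made-up data: 50 cases, three predictors
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))
y = 1.0 + 0.6*X[:, 0] + 0.2*X[:, 1] + rng.normal(size=50)

def zscore(a):
    # convert each variable to z-scores (mean 0, SD 1)
    return (a - a.mean(axis=0)) / a.std(axis=0, ddof=1)

# Regress z-scored y on z-scored predictors: the coefficients are the betas
fit = sm.OLS(zscore(y), sm.add_constant(zscore(X))).fit()
betas = fit.params[1:]         # standardized coefficients β1, β2, β3
print(betas)                   # comparable across studies that used different measures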
Overlap among predictors
• It is common for there to be correlation among the predictor variables
• β = unique contribution of each variable
• β1 = unique contribution of X1 in predicting Y, excluding overlap w/ other predictors
• R² gives the % variance in Y explained by all of the predictors together
• There is a significance test for R² to determine whether the entire regression model explains significant variance in Y
• If yes → then examine the individual predictors’ β
• There is a significance test for each of these. Is each predictor important, or only some of them? (see the sketch after this list)
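A minimal sketch of this “check the whole model first, then the individual predictors” workflow in Python/statsmodels, again with made-up data:

import numpy as np
import statsmodels.api as sm

# Made-up data: only X1 genuinely predicts y; X2 and X3 are noise
rng = np.random.default_rng(2)
X = rng.normal(size=(60, 3))
y = 1.0 + 0.7*X[:, 0] + rng.normal(size=60)

fit = sm.OLS(y, sm.add_constant(X)).fit()
print("R-squared:", fit.rsquared)     # % variance in y explained by all predictors together
print("Model p:", fit.f_pvalue)       # significance test for R-squared (the whole model)
if fit.f_pvalue < .05:                # only if the whole model is significant...
    print(fit.pvalues[1:])            # ...examine the significance test for each predictor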
Interpreting beta
• In general, interpret it like a correlation between predictor & criterion:
• if β is positive, higher scores on the predictor (x) are related to higher scores on the criterion (y)
• if β is negative, higher scores on x go with lower scores on y
Hypothesis tests for regression
• We are usually interested in multiple issues
• Is the β significantly different from 0? (is there any relationship between x & y?)
• In multiple regression, we may be interested in which predictor is the best (has the strongest relationship to the criterion) – see the sketch below
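A brief sketch of both questions with made-up data; comparing the standardized betas is one common way to judge which predictor is strongest, assuming the variable names X1 and X2 are purely illustrative:

import numpy as np
import statsmodels.api as sm

# Made-up data: X1 has a strong effect on y, X2 a weak one
rng = np.random.default_rng(3)
X = rng.normal(size=(60, 2))
y = 0.8*X[:, 0] + 0.2*X[:, 1] + rng.normal(size=60)

def zscore(a):
    return (a - a.mean(axis=0)) / a.std(axis=0, ddof=1)

# z-score everything so the coefficients are standardized betas
fit = sm.OLS(zscore(y), sm.add_constant(zscore(X))).fit()
for name, beta, p in zip(["X1", "X2"], fit.params[1:], fit.pvalues[1:]):
    print(name, "beta =", round(beta, 2), "p =", round(p, 4))   # is each beta different from 0?
# The predictor with the largest |beta| (and a significant p) has the strongest
# relationship with the criterion.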
Prediction in Research Articles
• Multiple regression results are commonly reported
• Note the example table in the book: it reports r’s and βs for each predictor, and reports R² in a note at the bottom
Reporting multiple regression
• From the previous table…
• The multiple regression equation was significant, R² = .13, p < .05. Depression (β = .30, p < .001) and age (β = .20, p < .001) both significantly predicted the intragroup effect, but number of sessions and duration of the disorder were not significant predictors.
• This indicates that older adults and those with higher levels of depression had higher (better) intragroup effects.
SPSS Regression Example
• Analyze → Regression → Linear
• Remember, “Independent Variable” is your predictor (x), “Dependent Variable” is your criterion (y)
• Class handout of output – what to look for:
• “Model Summary” section – shows R²
• ANOVA section – 1st line gives the ‘sig’ value; if < .05, significant
• This tests the significance of the R² (is the whole regression equation significant or not? If yes → it does predict y)
• Coefficients section – 1st line gives the ‘constant’ = a
• The other lines give the coefficients for each predictor: unstandardized (b) and ‘standardized coefficients’ (beta)
• For each predictor, there is also a significance test (if ‘sig’ is < .05, that predictor’s coefficient is significantly different from 0 and it does predict y)
• If it is significant, you’d want to interpret the beta (like a correlation) – see the sketch below
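For comparison, a minimal sketch that pulls out the same pieces you would read off the SPSS output (Model Summary, ANOVA, Coefficients), using Python/statsmodels; the data and the variable names (depression, age, outcome) are made up for illustration:

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Made-up data set with two predictors and one criterion
rng = np.random.default_rng(4)
df = pd.DataFrame({"depression": rng.normal(size=40), "age": rng.normal(size=40)})
df["outcome"] = 0.4*df["depression"] + 0.3*df["age"] + rng.normal(size=40)

fit = smf.ols("outcome ~ depression + age", data=df).fit()
print(fit.rsquared)    # "Model Summary": R-squared
print(fit.f_pvalue)    # "ANOVA": sig. value for the whole regression equation
print(fit.params)      # "Coefficients": Intercept = a, plus unstandardized b for each predictor
print(fit.pvalues)     # sig. test for each predictor (significant if < .05)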