1. Chapter 8: Correlational (Passive) Research Strategy
Nature of Correlational Research
Simple and Partial Correlational Analysis
Multiple Regression Analysis (MRA)
Some other Corr Techniques
Testing Mediational Hypotheses
Factor Analysis
Summary
2. Nature of Correlational Research
Assumptions of Linearity and Additivity
Linearity
Additivity
Assumes no interactions
Factors affecting Correlational Coefficient
Reliability of the measure
Restriction of range (p 226 fig 8-2)
Outliers (p 226, fig 8-2)
? Using your data set, insert an outlier that pushes the bivariate correlation beyond p < .001. What value was necessary to achieve it? (See the sketch at the end of this slide.)
Subgroup differences (p. 227, fig 8-3)
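A minimal sketch of the outlier exercise above, using simulated data rather than the course data set (the values are illustrative; the outlier needed for your own data will differ):

```python
# Sketch: how a single extreme case can push a near-zero correlation
# past p < .001 (simulated data; not the course data set).
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
x = rng.normal(50, 10, 30)          # two unrelated variables
y = rng.normal(50, 10, 30)

r, p = pearsonr(x, y)
print(f"before outlier: r = {r:.2f}, p = {p:.3f}")

# Insert one case far above the mean on both variables
x_out = np.append(x, 150)
y_out = np.append(y, 150)
r, p = pearsonr(x_out, y_out)
print(f"after outlier:  r = {r:.2f}, p = {p:.5f}")   # typically well below .001
```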
3. Nature of Correlations (cont'd)
Multifaceted constructs
Cf. Abramson et al.'s attributional style vs. the Ohio State Leadership model
Keeping them separate
When theoretically distinct (constructs predict interaction)
Depression and attributional style
Three conditions (internal, stable, global) predict depression
When information would be lost (facets would be obscured in an overall score)
The four anti-fat facets have different relationships to other constructs
Not simply for convenience
? Describe a multifaceted construct that plays a role in your theoretical framework
Combining them
When interested in the latent variable
4. Multifaceted Constructs: Recommendations
1. use reliable measures
2. check the distribution
Compare sample to existing norms
3. plot scores for subgroups and combined groups
4. compute subgroup means and corr
Make sure they don’t adversely affect combined corr
5. Have a good reason to combine facets
5. Simple and Partial Correlational Analysis
Correlation coefficient (you know about this)
Differences in correlation coefficients
Fisher’s z transformation
Equality of r’s
Cohen & Cohen (1983)
Can relationships be different if r’s are same?
Yes, test slopes (unstandardized) if SDs differ
Check for moderators in the regression analysis
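A minimal sketch of Fisher's z and the equality-of-r's test for two independent samples (the r's and n's below are made up for illustration):

```python
# Sketch: Fisher's z transformation and a z test for the equality of
# two correlations from independent samples (illustrative r's and n's).
import numpy as np
from scipy.stats import norm

def fisher_z(r):
    """Fisher's r-to-z transformation: 0.5 * ln((1 + r) / (1 - r))."""
    return np.arctanh(r)

def compare_independent_rs(r1, n1, r2, n2):
    """Two-tailed z test of H0: rho1 = rho2 for independent samples."""
    se = np.sqrt(1 / (n1 - 3) + 1 / (n2 - 3))
    z = (fisher_z(r1) - fisher_z(r2)) / se
    return z, 2 * norm.sf(abs(z))

z, p = compare_independent_rs(r1=.45, n1=100, r2=.25, n2=120)
print(f"z = {z:.2f}, p = {p:.3f}")
```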
6. Partial Correlation
Controlling for a third variable
Feather (1985), p. 235: a study with
Depression
Self-esteem
Masculinity
What better explains depression? Masc or SE?
Self-esteem (masculinity and self-esteem were confounded)
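A minimal sketch of the partial-correlation logic in the Feather (1985) example; the variable names follow the slide, but the data and effect sizes are simulated:

```python
# Sketch: partial correlation of depression with self-esteem, controlling
# for masculinity. Data are simulated; only the logic mirrors Feather (1985).
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(2)
n = 200
masculinity = rng.normal(size=n)
self_esteem = 0.5 * masculinity + rng.normal(size=n)   # confounded with masc
depression = -0.6 * self_esteem + rng.normal(size=n)

def residuals(y, x):
    """Residuals of y after removing the linear effect of x."""
    return y - np.polyval(np.polyfit(x, y, 1), x)

# Partial r = correlation of the two sets of residuals after the
# control variable has been partialled out of each.
r_partial, _ = pearsonr(residuals(depression, masculinity),
                        residuals(self_esteem, masculinity))
print(f"r(depression, self-esteem | masculinity) = {r_partial:.2f}")
```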
7. Multiple Regression (MRA)
Difference between MC and MR
MC to establish relationships
Based on sample where Ps measured on all vars (IVs and DVs)
MR used to predict DV from IVs
When Ps are measured on only IVs
For example
Predicting success in a grad program
Predicting likelihood of suicide
Ypred = a + b1X1 + b2X2 …+ bkXk
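A minimal sketch of the prediction equation above, fitted with statsmodels; the grad-program example comes from the slide, but the predictor names (gre, gpa), data, and weights are hypothetical:

```python
# Sketch: multiple regression used for prediction, Ypred = a + b1X1 + b2X2.
# "gre" and "gpa" are hypothetical predictors; the data are simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 120
gre = rng.normal(310, 10, n)
gpa = rng.normal(3.3, 0.4, n)
success = 0.02 * gre + 0.8 * gpa + rng.normal(0, 0.5, n)   # criterion (DV)

X = sm.add_constant(np.column_stack([gre, gpa]))
model = sm.OLS(success, X).fit()
print(model.params)        # a, b1, b2 (unstandardized b weights)

# Predict the DV for a new applicant measured only on the IVs
new_case = sm.add_constant(np.array([[320.0, 3.7]]), has_constant="add")
print(model.predict(new_case))
```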
8. MRA Forms
Simultaneous (use) AKA: standard
All predictors considered at once regardless of value of each predictor
Hierarchical (use) AKA: sequential (Table 8-5, p. 238)
User decides order of consideration
Which predictors should be controlled for
For theory testing or practical needs
Stepwise AKA: statistical (may be problematic)
9. Information from MRA
Multiple correlation coefficient R
R2 degree of association
% variance accounted for by all predictors
Coefficient
b weight = raw (unstandardized) scores
β (beta) weight = standardized score
Allows direct comparison of weights
Change in R2 (In hierarchical MRA)
To show how much incremental variance each predictor adds
Be careful…order of entry is important
? What is the difference between multiple correlation and multiple regression?
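A minimal sketch of a hierarchical (sequential) MRA and the information listed above: R2, the change in R2 between blocks, and standardized beta weights (variable names and data are illustrative, not from Table 8-5):

```python
# Sketch: hierarchical MRA. Block 1 enters a control variable; block 2
# adds the predictor of interest; Delta R2 shows its incremental variance.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import zscore

rng = np.random.default_rng(4)
n = 150
df = pd.DataFrame({"age": rng.normal(40, 10, n), "stress": rng.normal(size=n)})
df["depression"] = 0.02 * df["age"] + 0.5 * df["stress"] + rng.normal(size=n)

dfz = df.apply(zscore)   # standardize so coefficients are beta weights

step1 = smf.ols("depression ~ age", data=dfz).fit()             # block 1
step2 = smf.ols("depression ~ age + stress", data=dfz).fit()    # block 2
print(f"R2 block 1 = {step1.rsquared:.3f}")
print(f"R2 block 2 = {step2.rsquared:.3f}")
print(f"Delta R2   = {step2.rsquared - step1.rsquared:.3f}")
print(step2.params)      # standardized (beta) weights, directly comparable
```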
10. Multicollinearity
Two or more predictors are highly related (r > .8)
Effects of multicollinearity:
1. inflates the standard errors of the regression coefficients
2. large standard errors lead to nonsignificant predictors
Causes
1. multiple measures of same construct
- use latent variable approach
2. sampling error
(accidentally over-sampling high or low Ps on a variable)
11. Multicollinearity
Detecting multicollinearity
Look at correlation matrix for r’s > .8
Run a series of MRs (each predictor regressed on the others) to detect large Rs (R2 = .90 corresponds to VIF = 10)
Check for VIF >10
Dealing with it
Avoid redundant vars
Use vars with least intercorrelation
Factor analyze to combine vars
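A minimal sketch of the detection steps above: two deliberately redundant predictors and their VIFs, computed with statsmodels' variance_inflation_factor on simulated data:

```python
# Sketch: detecting multicollinearity with VIF (> 10 signals trouble).
# x1 and x2 are built to be nearly redundant; data are simulated.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(5)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.1, size=n)   # nearly a duplicate of x1
x3 = rng.normal(size=n)
X = sm.add_constant(pd.DataFrame({"x1": x1, "x2": x2, "x3": x3}))

for i, name in enumerate(X.columns):       # ignore the constant's row
    print(name, round(variance_inflation_factor(X.values, i), 1))
```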
12. MRA Instead of ANOVA
Moderated MR (similar to ANCOVA)
To test interaction
Compute an interaction term (IV1 * IV2) in SPSS
Enter the interaction term AFTER main effects in MR (blocks)
Use instead of ANOVA
When one or more IVs are continuous
When IVs are correlated (ANOVA assumes IVs are uncorrelated)
Transforming continuous to dichotomous vars
Using a median split…not usually a good idea!
Reduces power (loses precision)
Gives false “effect” when two median splits are used
Just say “no”…to median split
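A minimal sketch of moderated MR with two continuous, correlated IVs kept continuous (no median splits): the interaction term is entered after the main effects and evaluated by its coefficient and the change in R2 (data and variable names are illustrative):

```python
# Sketch: moderated MR. The IV1 x IV2 product term is added in a second
# block, after the main effects. Data and variable names are illustrative.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
n = 200
df = pd.DataFrame({"iv1": rng.normal(size=n)})
df["iv2"] = 0.3 * df["iv1"] + rng.normal(size=n)      # correlated IVs
df["dv"] = df["iv1"] + 0.5 * df["iv2"] + 0.4 * df["iv1"] * df["iv2"] \
           + rng.normal(size=n)

main = smf.ols("dv ~ iv1 + iv2", data=df).fit()       # block 1: main effects
moder = smf.ols("dv ~ iv1 * iv2", data=df).fit()      # block 2: adds iv1:iv2
print(f"Delta R2 for the interaction = {moder.rsquared - main.rsquared:.3f}")
print(moder.params["iv1:iv2"])                        # moderation coefficient
```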
13. Other Correlational Techniques
Logistic regression
Set of continuous IVs to predict categorical criterion (DV)
Gives estimate of probability of group membership
? Give an example of how you could use logistic regression in your project. (See the sketch at the end of this slide.)
Multiway frequency analysis
Pattern of relationships among a set of nominal vars (χ²)
Loglinear analysis extends chi-square to more than 2 vars
Logit analysis (when vars are considered IVs and one is a DV)
ANOVA for categorical vars
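A minimal sketch of the logistic-regression bullets above: continuous IVs predicting 0/1 group membership and returning estimated probabilities (simulated data; statsmodels' Logit is one of several ways to fit this):

```python
# Sketch: logistic regression with two continuous IVs predicting a
# categorical (0/1) group membership. Data are simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 300
X = rng.normal(size=(n, 2))                            # continuous predictors
true_logit = 1.5 * X[:, 0] - 1.0 * X[:, 1]
y = rng.binomial(1, 1 / (1 + np.exp(-true_logit)))     # group membership

model = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
print(model.params)                                    # log-odds coefficients
print(model.predict(sm.add_constant(X))[:5])           # estimated P(group = 1)
```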
14. Testing Mediational Hypotheses (p. 246)
IV -> M -> DV
See Condon & Crano (1988)
? Give an example of a mediating variable that could play a role in your project
Similarity -> <Other like us?> -> Attraction
Simple mediation (3 Vars)
Complex models
Path analysis (SEM) fig 8-7, p. 248
Latent vars analysis
Covariance structure analysis (LISREL)
Prospective research (fig 8-8, p. 249)
Cross-lagged correlational analysis
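A minimal sketch of simple (3-variable) mediation in the spirit of the similarity example above, via three regressions on simulated data ("perceived_liking" is a hypothetical mediator label; a full analysis would also test the indirect effect, e.g., by bootstrapping):

```python
# Sketch: simple mediation, IV -> M -> DV, estimated with three regressions.
# Data are simulated; "perceived_liking" is a hypothetical mediator name.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(8)
n = 250
df = pd.DataFrame({"similarity": rng.normal(size=n)})
df["perceived_liking"] = 0.6 * df["similarity"] + rng.normal(size=n)   # M
df["attraction"] = 0.5 * df["perceived_liking"] + rng.normal(size=n)   # DV

c = smf.ols("attraction ~ similarity", data=df).fit()                     # total effect
a = smf.ols("perceived_liking ~ similarity", data=df).fit()               # IV -> M
b = smf.ols("attraction ~ similarity + perceived_liking", data=df).fit()  # M -> DV | IV

print(f"total effect c      = {c.params['similarity']:.2f}")
print(f"direct effect c'    = {b.params['similarity']:.2f}")
print(f"indirect effect a*b = {a.params['similarity'] * b.params['perceived_liking']:.2f}")
```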
15. Limits on Interpretation (Path Analysis)
Completeness of the model
Are all vars considered?
Any curvilinear or non-additive relationships?
Alternative Models (p 252 fig 8-10)
What other competing theories?
16. Factor Analysis
A statistical means of finding constructs within a set of variables
Identifies sets of items that are most related to one another
Latent variables or constructs (e.g. attitudes toward computers)
Factors:
1. anxiety toward them
2. perceived positive effects on society
3. perceived negative effects on society
4. personal usefulness of them
17. Factor Analysis (EFA)
Uses (exploratory)
Data reduction
Scale development
Considerations
Number of Ps needed (a lot): 200-300
Quality of data
Methods of factor extraction and rotation
Determining num of factors
Interpreting the factors
Retaining factor scores
CFA (confirmatory FA)
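A minimal sketch of EFA-style data reduction on a small simulated item set, using scikit-learn's FactorAnalysis with varimax rotation (rotation requires scikit-learn ≥ 0.24; the items and two-factor structure are invented, not the computer-attitudes scale):

```python
# Sketch: exploratory factor analysis on four simulated items built to
# load on two latent factors ("anxiety" and "usefulness" are invented).
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(9)
n = 300
anxiety = rng.normal(size=n)
usefulness = rng.normal(size=n)
items = np.column_stack([
    anxiety + rng.normal(scale=0.5, size=n),       # item 1
    anxiety + rng.normal(scale=0.5, size=n),       # item 2
    usefulness + rng.normal(scale=0.5, size=n),    # item 3
    usefulness + rng.normal(scale=0.5, size=n),    # item 4
])

efa = FactorAnalysis(n_components=2, rotation="varimax").fit(items)
print(np.round(efa.components_.T, 2))   # item-by-factor loading matrix
print(efa.transform(items)[:3])         # factor scores for the first 3 Ps
```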
18. Correlational Analyses (Summary)
Nature of Correlational Research
Simple and Partial Correlational Analysis
Multiple Regression Analysis (MRA)
Some other Corr Techniques
Testing Mediational Hypotheses
Factor Analysis