Introduction to Analysis of Variance 1-Way Fixed Effects 46-511: Statistics for Graduate Study I
Analysis of Variance: Learning Objectives • What is analysis of variance & when is it used? • Awareness of multiple designs under ANOVA & what we will cover • Basic logic of between-subjects ANOVA • Assumptions of ANOVA • Main theme for today: Partitioning Variance
Corrections to Book • Contrasts & CIs on pp. 72–73 • Contrast 1 (example page 86): the last minus sign should be an equal sign, i.e., • 1²/10 + (−.5)²/8 + (−.5)²/11 = .154
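The corrected value can be checked directly. A minimal sketch, using the contrast coefficients (1, −.5, −.5) and the group sizes (10, 8, 11) implied by the corrected formula:

```python
# Verify the corrected contrast term: sum of c_j^2 / n_j over the groups.
coefs = [1.0, -0.5, -0.5]   # contrast coefficients from the slide
ns = [10, 8, 11]            # group sizes from the slide

value = sum(c ** 2 / n for c, n in zip(coefs, ns))
print(round(value, 3))  # matches the slide's .154
```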
Analysis of Variance (ANOVA) • What is analysis of variance & when is it used? • There are several variations of ANOVA designs • We will look at 6 or 7 this semester • Main distinction in ANOVA: • Fixed Effects • Random Effects • Main "workhorse" for data from experimental designs • Can be applied to quasi-experimental designs as well
ANOVA: Fixed Effects Designs • Main Distinctions • Between Subjects • One Way • Two Way • N-Way • Within Subjects • One Way • Two-way/N-Way • Split-Plot/Mixed • N-Between by N-Within • Other Variations Include • ANCOVA • Hierarchical Designs • Latin Squares Designs
One-Way Fixed Effects ANOVA Eight people randomly assigned to each of three groups (n = 8 per group). Total of 24 people in this experiment (N = 24). Each group receives a different treatment (k = 3) – in this example, teaching method
This means… Variation among participants treated the same is assumed to be experimental error. Variation between treatment levels is assumed to reflect treatment effects.
Now all we need is a measure of variation… • Sum of Squares: SS = Σ(X − X̄)² • Mean Square: MS = SS / df • Standard Deviation: s = √MS
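These three measures build on each other: the mean square is the sum of squares per degree of freedom, and the standard deviation is its square root. A minimal sketch with made-up scores:

```python
# Compute SS, MS, and SD for a small hypothetical sample.
xs = [3, 5, 7, 9]
n = len(xs)
mean = sum(xs) / n                       # X-bar = 6.0

ss = sum((x - mean) ** 2 for x in xs)    # Sum of Squares = 20.0
ms = ss / (n - 1)                        # Mean Square (sample variance)
sd = ms ** 0.5                           # Standard Deviation

print(ss, ms, sd)
```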
From this we will create a ratio: F = MS between / MS within or, equivalently, F = MS treatment / MS error. In every ANOVA design, we will try to figure out how to partition variance into variance due to effects and error variance.
Partitioning Sums of Squares • Total • Within/Error • Between/Treatment/Method
Degrees of Freedom Defined: • Total: N − 1 • Between: k − 1 • Within: k(n − 1) = N − k
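The degrees of freedom partition just as the sums of squares do: between and within df add up to the total. A quick check with the design from this lecture (k = 3, n = 8, N = 24):

```python
# Degrees of freedom for the one-way design in this lecture.
k, n = 3, 8          # groups, people per group
N = k * n            # 24 people total

df_total = N - 1
df_between = k - 1
df_within = k * (n - 1)   # same as N - k

# The partition: between + within = total
print(df_between, df_within, df_total)
```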
Obtaining F: Converting SS to MS • MS between = SS between / (k − 1) • MS within = SS within / (N − k) • F = MS between / MS within • What is the expected value of F if the null hypothesis is true?
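The whole partition-and-ratio procedure can be carried out by hand. A sketch with made-up scores for three hypothetical teaching methods (the group names and data are invented for illustration); note the check that SS between and SS within add up to SS total:

```python
# One-way ANOVA by hand: partition SS, convert to MS, form F.
groups = {
    "lecture": [3, 5, 4, 6],   # hypothetical scores
    "online":  [7, 6, 8, 7],
    "hybrid":  [5, 6, 5, 4],
}
scores = [x for g in groups.values() for x in g]
N, k = len(scores), len(groups)
grand_mean = sum(scores) / N

ss_total = sum((x - grand_mean) ** 2 for x in scores)
ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                 for g in groups.values())
ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g)
                for g in groups.values())

ms_between = ss_between / (k - 1)      # variance due to treatment
ms_within = ss_within / (N - k)        # error variance
F = ms_between / ms_within

# The partition must hold: SS_total = SS_between + SS_within
print(ss_between, ss_within, ss_total, F)
```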
Partitioning Variance: A Graphical Representation • Structural Model: Xij = μ + τj + εij • Graphical representation (subject 1 in treatment 1), on a 1–10 scale: X11 = 3.0, treatment 1 mean = 4.75, grand mean = 5.71 • Thus, subject 1's score of 3 = 5.71 + (4.75 − 5.71) + (3.0 − 4.75)
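The slide's numbers can be plugged straight into the structural model to confirm that the treatment effect and the error term reassemble the observed score:

```python
# Recompute the slide's decomposition for subject 1 in treatment 1.
grand_mean = 5.71    # mu-hat (from the slide)
group_mean = 4.75    # mean of treatment 1 (from the slide)
score = 3.0          # X_11 (from the slide)

effect = group_mean - grand_mean   # tau-hat_1 = -0.96
error = score - group_mean         # epsilon-hat_11 = -1.75

# X_ij = mu + tau_j + epsilon_ij
print(grand_mean + effect + error)
```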
Hypotheses and Assumptions • Hypothesis being tested: H0: μ1 = μ2 = … = μk (H1: not all μj are equal) • Assumptions • 1. Normal distribution • 2. Homogeneity of variance • 3. Independence of observations
Robustness: Common Advice • To Normality • To Homogeneity of Variance • To Independence of Observations
Assumption: Normal Distribution • Simplifying assumption • Robustness? • Situations where we are less concerned • Situations where we should be concerned • Central Limit Theorem • Is this assumption downplayed?
[Figure] Imagine a population distribution such as this; what the Central Limit Theorem tells us. Taken from Wilcox, R. R. (2002). Understanding the practical advantages of modern ANOVA methods. Journal of Clinical Child and Adolescent Psychology, 31, 399–412.
[Figure] What we actually get: the problem persists up to n > 160. From Wilcox (2002).
Normality revisited • Outliers can cause problems • Failure to meet normality & similarity of distributions • Skewness • Kurtosis • Effects on error term • Effects on confidence intervals • Effects on test statistic
Homogeneity of Variance • Common advice: • Robust when sample sizes are equal (or roughly equal) • Variances are not excessively different • Caveats • The above holds mainly when normality also holds • May become problematic as k increases or normality fails • Tests • Levene's • Fmax • Caveats • Sensitive to non-normality • May lack power to detect differences
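Hartley's Fmax is the simpler of the two tests mentioned above: the ratio of the largest to the smallest group variance. A minimal sketch with made-up groups (the third group has deliberately inflated spread):

```python
# Hartley's Fmax statistic: max group variance over min group variance.
import statistics

groups = [
    [3, 5, 4, 6],    # hypothetical data, modest spread
    [7, 6, 8, 7],    # smallest spread
    [2, 9, 1, 10],   # deliberately large spread
]
variances = [statistics.variance(g) for g in groups]
fmax = max(variances) / min(variances)

print(fmax)  # a large ratio flags heterogeneous variances
```

The statistic would then be compared against Fmax critical-value tables; like the slide warns, it is sensitive to non-normality.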
Independence of Observations • See Table 2.1 in Stevens (p. 61). • Test for independence: intraclass correlation (ICC) • Solutions/Alternatives
Implications of Assumptions • Often think of F and t as testing for mean differences • Sensitive to other parameters of distributions • When we fail to meet assumptions, we may not know why (but assume it is mean differences) • Normality & homogeneity assumptions imply… • Treatment doesn't affect these two things • More complete list of assumptions • Independence of errors • Identical distributions (within group) • Identical distributions (between groups) • Homogeneity of variance • Normal distribution • Random sampling
Solutions/Alternatives • Homogeneity & Normality Problems • Transformations • Reducing alpha • Nonparametric techniques (e.g., Kruskal-Wallis) • Trimmed means • Bootstrapping • Independence of observations • Aggregation • Selecting representative cases • Separate analyses • Reducing alpha • Multilevel modeling • Apply a transmogrifying correction
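Of the robust alternatives listed above, the trimmed mean is the easiest to illustrate: drop a fixed proportion of the lowest and highest scores before averaging, so outliers cannot drag the estimate. A sketch with hypothetical data containing one outlier:

```python
# Trimmed mean: average after discarding a proportion from each tail.
def trimmed_mean(xs, prop=0.2):
    xs = sorted(xs)
    g = int(len(xs) * prop)          # scores trimmed from each tail
    return sum(xs[g:len(xs) - g]) / (len(xs) - 2 * g)

data = [2, 3, 3, 4, 4, 5, 5, 6, 6, 40]   # hypothetical; 40 is an outlier

print(sum(data) / len(data))   # ordinary mean, pulled up by the outlier
print(trimmed_mean(data))      # 20% trimmed mean, resistant to it
```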