Chapter Thirteen Hypothesis Testing for Two or More Means: The One-Way Analysis of Variance
More Statistical Notation • Analysis of variance is abbreviated as ANOVA • An independent variable is called a factor • Each condition of the independent variable is also called a level or a treatment, and differences produced by the independent variable are called a treatment effect • The symbol for the number of levels in a factor is k
One-Way ANOVA A one-way ANOVA is performed when only one independent variable is tested in the experiment
Between Subjects • When an independent variable is studied using independent samples in all conditions, it is called a between-subjects factor • A between-subjects factor involves using the formulas for a between-subjects ANOVA
Within Subjects Factor • When a factor is studied using related (dependent) samples in all levels, it is called a within-subjects factor • This involves a set of formulas called a within-subjects ANOVA
Analysis of Variance • The analysis of variance is the parametric procedure for determining whether significant differences occur in an experiment containing two or more sample means • In an experiment involving only two conditions of the independent variable, you may use either a t-test or the ANOVA
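As a quick check of this equivalence, here is a minimal sketch using SciPy's `ttest_ind` and `f_oneway` on two small invented samples; with only two conditions, Fobt equals t squared and the two tests give the same p-value:

```python
# Minimal sketch: with two conditions, the t-test and the one-way ANOVA
# lead to the same decision. The data below are invented for illustration.
from scipy import stats

group1 = [3, 5, 4, 6, 5]
group2 = [8, 7, 9, 6, 8]

t, p_t = stats.ttest_ind(group1, group2)   # independent-samples t-test
F, p_F = stats.f_oneway(group1, group2)    # one-way ANOVA

# For k = 2 levels, F = t squared and the p-values match.
print(abs(t**2 - F) < 1e-9)   # True
print(abs(p_t - p_F) < 1e-9)  # True
```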
Experiment-Wise Error • The overall probability of making a Type I error somewhere in an experiment is called the experiment-wise error rate • When we use a t-test to compare only two means in an experiment, the experiment-wise error rate equals α
Comparing Means • When there are more than two means in an experiment, performing multiple t-tests results in an experiment-wise error rate that is much larger than the α we have selected • Using the ANOVA allows us to compare the means from all levels of the factor and keep the experiment-wise error rate equal to α
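The growth of the experiment-wise error rate can be sketched with the usual approximation that the c pairwise t-tests are independent (they are not exactly independent, but the approximation shows the trend):

```python
# Sketch: experiment-wise Type I error rate for c comparisons, assuming
# the comparisons are independent (an approximation).
def experiment_wise_error(c, alpha=0.05):
    """P(at least one Type I error) across c independent tests."""
    return 1 - (1 - alpha) ** c

# One t-test keeps the rate at alpha; three pairwise tests inflate it.
print(round(experiment_wise_error(1), 3))  # 0.05
print(round(experiment_wise_error(3), 3))  # 0.143
```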
Assumptions of the ANOVA • Each condition contains a random sample of interval or ratio scores • The population represented in each condition forms a normal distribution • The variances of all populations represented are homogeneous
The F-Test • The statistic for the ANOVA is F • A significant Fobt indicates only that somewhere among the means at least two of them differ significantly • It does not indicate which specific means differ significantly • When the F-test is significant, we perform post hoc comparisons
Post Hoc Comparisons • Post hoc comparisons are like t-tests • We compare all possible pairs of level means from a factor, one pair at a time
Sources of Variance • There are two potential sources of variance • Scores may differ from each other even when participants are in the same condition. This is called variance within groups • Scores may differ from each other because they are from different conditions. This is called the variance between groups
Mean Squares • The mean square within groups is an estimate of the variability in scores as measured by differences within the conditions of an experiment • The mean square between groups is an estimate of the differences in scores that occur between the levels in a factor
The F-Distribution The F-distribution is the sampling distribution showing the various values of F that occur when H0 is true and all conditions represent one population
Degrees of Freedom • The critical value of F (Fcrit) depends on • The degrees of freedom (both the dfbn = k - 1 and the dfwn = N - k) • The α selected • The F-test is always a two-tailed test, even though the region of rejection lies entirely in the upper tail of the F-distribution
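Instead of a printed F-table, Fcrit can be looked up from the F-distribution itself; a sketch with SciPy's `scipy.stats.f`, using df values chosen to match the worked example later in the chapter:

```python
# Sketch: looking up Fcrit from the F-distribution instead of a table.
from scipy import stats

alpha = 0.05
df_bn, df_wn = 2, 15          # k - 1 and N - k

# The region of rejection is the upper tail, so use 1 - alpha.
F_crit = stats.f.ppf(1 - alpha, df_bn, df_wn)
print(round(F_crit, 2))  # 3.68
```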
Sum of Squares • The computations for the ANOVA require the use of several sums of squared deviations • Each of these terms is called the sum of squares and is symbolized by SS
Summary Table of a One-Way ANOVA
Source     Sum of Squares   df      Mean Square   F
Between    SSbn             dfbn    MSbn          Fobt
Within     SSwn             dfwn    MSwn
Total      SStot            dftot
Computing Fobt • Compute the total sum of squares (SStot)
Computing Fobt • Compute the sum of squares between groups (SSbn)
Computing Fobt • Compute the sum of squares within groups (SSwn) • SSwn = SStot - SSbn
Computing Fobt • Compute the degrees of freedom • The degrees of freedom between groups equals k - 1 • The degrees of freedom within groups equals N - k • The degrees of freedom total equals N - 1
Computing Fobt • Compute the mean squares: MSbn = SSbn/dfbn and MSwn = SSwn/dfwn
Computing Fobt • Compute Fobt: Fobt = MSbn/MSwn
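The computational steps above can be sketched end-to-end in plain Python; the three equal-n samples below are invented for illustration:

```python
# Sketch of the full one-way ANOVA computation on three hypothetical
# equal-n samples. Uses only the Python standard library.
from statistics import mean

groups = [[2, 3, 4], [4, 5, 6], [8, 9, 10]]
all_scores = [x for g in groups for x in g]
N, k = len(all_scores), len(groups)
grand_mean = mean(all_scores)

# Sums of squares
ss_tot = sum((x - grand_mean) ** 2 for x in all_scores)
ss_bn = sum(len(g) * (mean(g) - grand_mean) ** 2 for g in groups)
ss_wn = ss_tot - ss_bn

# Degrees of freedom and mean squares
df_bn, df_wn = k - 1, N - k
ms_bn, ms_wn = ss_bn / df_bn, ss_wn / df_wn

# Fobt = MSbn / MSwn
F_obt = ms_bn / ms_wn
print(df_bn, df_wn)          # 2 6
print(round(F_obt, 3))       # 28.0
```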
Fisher’s Protected t-Test • When the ns in the levels of the factor are not equal, use Fisher’s protected t-test
Tukey’s HSD Test • When the ns in all levels of the factor are equal, use the Tukey HSD multiple comparisons test where qk is found using the appropriate table
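A sketch of Tukey's HSD on three small invented equal-n samples, using `scipy.stats.tukey_hsd` (available in SciPy 1.8 and later), which handles the qk lookup internally:

```python
# Sketch: Tukey's HSD post hoc comparisons on hypothetical data.
from scipy.stats import tukey_hsd

sample1 = [2, 3, 4, 3]
sample2 = [4, 5, 6, 5]
sample3 = [8, 9, 10, 9]

result = tukey_hsd(sample1, sample2, sample3)

# result.pvalue[i][j] holds the p-value for comparing samples i and j.
print(result.pvalue[0][2] < 0.05)  # True: samples 1 and 3 differ
```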
Confidence Interval • The computational formula for the confidence interval for a single μ is
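In the notation of this chapter, a sketch of the standard textbook form, where X̄ is the mean of the level, n is the number of scores in that level, and tcrit is the two-tailed value at df = dfwn:

```latex
\left(\sqrt{\frac{MS_{wn}}{n}}\right)\!\left(-t_{\mathrm{crit}}\right) + \bar{X}
\;\le\; \mu \;\le\;
\left(\sqrt{\frac{MS_{wn}}{n}}\right)\!\left(+t_{\mathrm{crit}}\right) + \bar{X}
```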
Graphing the Results in ANOVA A graph showing means from three conditions of an independent variable.
Proportion of Variance Accounted For • Eta squared indicates the proportion of variance in the dependent variable that is accounted for by changing the levels of a factor
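Eta squared is simply SSbn divided by SStot; a one-line sketch using hypothetical sums of squares from a summary table:

```python
# Sketch: proportion of variance accounted for (eta squared).
def eta_squared(ss_bn, ss_tot):
    """Eta squared = SSbn / SStot."""
    return ss_bn / ss_tot

# Hypothetical values: SSbn = 56, SStot = 62
print(round(eta_squared(56.0, 62.0), 3))  # 0.903
```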
Omega Squared • In some instances, the effect size is reported using the measure omega squared (ω²) • Omega squared is an estimate of the proportion of the variance in the population that would be accounted for by the relationship
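A sketch of the standard formula for omega squared, using the symbols from the ANOVA summary table:

```latex
\omega^2 = \frac{SS_{bn} - (df_{bn})(MS_{wn})}{SS_{tot} + MS_{wn}}
```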
Example • Using the following data set, conduct a one-way ANOVA. Use α = 0.05
Example • dfbn = k - 1 = 3 - 1 = 2 • dfwn = N - k = 18 - 3 = 15 • dftot = N - 1 = 18 - 1 = 17
Example • Fcrit for 2 and 15 degrees of freedom and α = 0.05 is 3.68 • Since Fobt = 4.951 is larger than Fcrit = 3.68, the ANOVA is significant • A post hoc test must now be performed
Example • The mean of sample 3 is significantly different from the mean of sample 2