Analysis of Variance: 1 Overview, 2 One-way ANOVA
Where ANOVA Comes From: the method was developed by Ronald Aylmer Fisher (1890-1962).
Overview: An introduction to a procedure for testing the hypothesis that three or more population means are equal. For example: H0: µ1 = µ2 = µ3 = . . . = µj; H1: At least one mean is different.
Overview - Definition: Analysis of Variance (ANOVA) is a method of testing the equality of three or more population means by analyzing sample variances.
ANOVA methods require the F-distribution: 1. The F-distribution is not symmetric; it is skewed to the right. 2. The values of F can be 0 or positive; they cannot be negative. 3. There is a different F-distribution for each pair of degrees of freedom for the numerator and denominator.
F-distribution: not symmetric (skewed to the right); nonnegative values only.
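A minimal SciPy sketch of these properties; the degrees of freedom (2, 12) are chosen purely for illustration:

```python
import numpy as np
from scipy.stats import f

dfn, dfd = 2, 12                    # illustrative numerator and denominator degrees of freedom
x = np.linspace(0, 6, 7)

print(f.pdf(-1.0, dfn, dfd))        # 0.0 - the density puts no mass on negative values
print(f.pdf(x, dfn, dfd).round(3))  # a long right tail: the curve is skewed to the right
print(f.ppf(0.95, 2, 12), f.ppf(0.95, 5, 20))  # each (dfn, dfd) pair has its own distribution
```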
One-Way ANOVA Assumptions: 1. The populations have normal distributions. 2. The populations have the same variance σ² (or standard deviation σ). 3. The samples are simple random samples. 4. The samples are independent of each other.
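These assumptions are often checked before running the test. A minimal sketch, assuming SciPy is available, using the Shapiro-Wilk test for normality and Levene's test for equal variances (these particular checks are a common convention, not a requirement of the method); the three samples are the colour groups used in the worked example later in the deck:

```python
from scipy.stats import shapiro, levene

blue  = [68, 70, 63, 76, 73]
white = [75, 80, 77, 85, 78]
green = [79, 72, 80, 73, 76]

# Shapiro-Wilk: H0 = the sample was drawn from a normal population.
for name, sample in [("blue", blue), ("white", white), ("green", green)]:
    stat, p = shapiro(sample)
    print(f"Shapiro-Wilk {name}: p = {p:.3f}")

# Levene: H0 = all population variances are equal.
stat, p = levene(blue, white, green)
print(f"Levene: p = {p:.3f}")
```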
Definition: Treatment (or factor): a property or characteristic that allows us to distinguish the different populations from one another.
ANOVA Fundamental Concept: Estimate the common value of σ² using: 1. The variance between samples (also called variation due to treatment), an estimate of the common population variance σ² that is based on the variability among the sample means. 2. The variance within samples (also called variation due to error), an estimate of the common population variance σ² that is based on the sample variances.
ANOVA Fundamental Concept - Test Statistic for One-Way ANOVA: F = variance between samples / variance within samples. An excessively large F test statistic is evidence against equal population means.
Critical Value of F: Right-tailed test. With j samples and n total observations: numerator df = j - 1, denominator df = n - j.
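A sketch of how this right-tailed critical value can be looked up with SciPy instead of a printed F table (α = 0.05 and the sample sizes are illustrative):

```python
from scipy.stats import f

j, n = 3, 15          # number of samples and total number of observations (illustrative)
alpha = 0.05

critical_F = f.ppf(1 - alpha, dfn=j - 1, dfd=n - j)   # right-tailed critical value
print(round(critical_F, 3))                           # 3.885 for F(2, 12) at alpha = 0.05
```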
Key Components of ANOVA Method: SS(total), or total sum of squares, is a measure of the total variation (around the grand mean Ȳ) in all the sample data combined: SS(total) = Σ(Yij - Ȳ)²
Key Components of ANOVA Method: SS(treatment) is a measure of the variation between the samples. In one-way ANOVA, SS(treatment) is sometimes referred to as SS(factor). Because it is a measure of variability between the sample means, it is also referred to as SS(between groups) or SS(between samples). SS(treatment) = n1(Ȳ1 - Ȳ)² + n2(Ȳ2 - Ȳ)² + . . . + nk(Ȳk - Ȳ)² = Σ nj(Ȳj - Ȳ)²
Key Components of ANOVA Method: SS(within), or SS(error), is a sum of squares representing the variability that is assumed to be common to all the populations being considered: SS(within) = Σ(Yij - Ȳj)²
Key Components of ANOVA Method: SS(total) = SS(treatment) + SS(error), or equivalently SS(total) = SS(between) + SS(within).
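A minimal numerical check of this identity, computing the three sums of squares directly from the definitions above (the samples are the colour groups from the worked example later in the deck):

```python
import numpy as np

groups = [np.array([68, 70, 63, 76, 73]),    # blue
          np.array([75, 80, 77, 85, 78]),    # white
          np.array([79, 72, 80, 73, 76])]    # green

all_obs = np.concatenate(groups)
grand_mean = all_obs.mean()

ss_total   = ((all_obs - grand_mean) ** 2).sum()
ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
ss_within  = sum(((g - g.mean()) ** 2).sum() for g in groups)

print(ss_total, ss_between + ss_within)   # 416.0 416.0 - SS(total) = SS(between) + SS(within)
```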
Mean Squares (MS): The sums of squares SS(between) (also called SS(treatment)) and SS(within) (also called SS(error)) are divided by their corresponding numbers of degrees of freedom. MS(between), the mean square for treatment, is obtained as follows: MS(between) = SS(between) / (j - 1)
Mean Squares (MS): MS(within), the mean square for error, is obtained as follows: MS(within) = SS(within) / (n - j). Likewise, MS(total) = SS(total) / (n - 1).
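Putting the sums of squares and their degrees of freedom together, here is a small sketch of a helper that produces the one-way ANOVA quantities; the function name anova_oneway is hypothetical:

```python
import numpy as np

def anova_oneway(groups):
    """Return SS(between), SS(within), MS(between), MS(within) and F for a one-way ANOVA."""
    groups = [np.asarray(g, dtype=float) for g in groups]
    all_obs = np.concatenate(groups)
    grand_mean = all_obs.mean()
    j, n = len(groups), len(all_obs)          # number of samples, total number of observations

    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_within  = sum(((g - g.mean()) ** 2).sum() for g in groups)

    ms_between = ss_between / (j - 1)         # MS(between) = SS(between) / (j - 1)
    ms_within  = ss_within / (n - j)          # MS(within)  = SS(within)  / (n - j)
    return ss_between, ss_within, ms_between, ms_within, ms_between / ms_within
```

Applied to the three colour samples of the example that follows, this returns 210, 206, 105, about 17.17 and F ≈ 6.12.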
Example: Three colour treatments with 5 observations each: blue: 68, 70, 63, 76, 73 (mean 70); white: 75, 80, 77, 85, 78 (mean 79); green: 79, 72, 80, 73, 76 (mean 76). Grand Mean = (70 + 79 + 76)/3 = 75
Each observation differs from the Grand (total sample) Mean by some amount • There are two sources of variance from the mean: 1) that due to the treatment (COLOUR), the independent variable; 2) that which is unexplained by our treatment • H0: µ1 = µ2 = µ3; H1: At least one mean is different
Total Sum of Squares: Σ(Each Observation - Grand Mean)² = (68-75)² + (70-75)² + (63-75)² + (76-75)² + (73-75)² + (75-75)² + (80-75)² + (77-75)² + (85-75)² + (78-75)² + (79-75)² + (72-75)² + (80-75)² + (73-75)² + (76-75)² = 416
Treatment Sum of Squares (explained variance): Σ(Average treatment outcome - Grand Mean)² · (number of observations in treatment) = (70-75)²·5 + (79-75)²·5 + (76-75)²·5 = 210
Unexplained Sum of Squares (Within SS, or unexplained variance): Σ(Each observation - treatment mean)² = (68-70)² + (70-70)² + (63-70)² + (76-70)² + (73-70)² + (75-79)² + (80-79)² + (77-79)² + (85-79)² + (78-79)² + (79-76)² + (72-76)² + (80-76)² + (73-76)² + (76-76)² = 206
Calculating F: F = MS(between) / MS(within) = (210/2) / (206/12) = 105 / 17.17 ≈ 6.12. Degrees of freedom: j-1, n-j = (3-1), (15-3) = 2, 12. Tabled value for F(2; 12) = 3.885 (α = 0.05). Fcomp must equal or exceed 3.885 for the means to be significantly different.
Conclusion • Computed value F = 6.12 • Tabled value for F(2; 12) = 3.885 (α = 0.05) • H0 should be rejected: the population means are NOT all equal. • If the null hypothesis is rejected and you are interested in finding which means differ, you should run Fisher's Least Significant Difference (LSD) test.
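The same numbers can be reproduced with SciPy's built-in one-way ANOVA (a sketch using the sample values from the example above):

```python
from scipy.stats import f_oneway

blue  = [68, 70, 63, 76, 73]    # mean 70
white = [75, 80, 77, 85, 78]    # mean 79
green = [79, 72, 80, 73, 76]    # mean 76

F, p = f_oneway(blue, white, green)
print(round(F, 2), round(p, 4))   # F = 6.12 and p < 0.05, so H0 is rejected at the 5% level
```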
Least Significant Difference: LSD is a method for making pairwise comparisons between the means of three or more independent groups. Two treatment means Ȳi and Ȳj are declared to be significantly different (using LSD) if |Ȳi - Ȳj| ≥ t(α/2; error df) · √(2 · MS(within) / n), where n denotes the number of observations involved in the computation of a treatment mean.
Least Significant Difference • Error df = n - j = 15 - 3 = 12 • Tabled value for t(0.025; 12) = 2.179 • The difference between two treatment means must equal or exceed LSD = 2.179 · √(2 · 17.17 / 5) ≈ 5.71 for the means to be significantly different.
Least Significant Difference • |blue - white| = |70 - 79| = 9 * • |blue - green| = |70 - 76| = 6 * • |white - green| = |79 - 76| = 3 (* the difference exceeds LSD = 5.71, so these pairs of means differ significantly at α = 0.05)
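A sketch of the same LSD comparisons in code (MS(within), the error degrees of freedom and the group size come from the example above; α = 0.05 is assumed):

```python
from itertools import combinations
from math import sqrt
from scipy.stats import t

means = {"blue": 70, "white": 79, "green": 76}
ms_within, df_error, n_per_group = 206 / 12, 12, 5

# LSD threshold: t(alpha/2; error df) * sqrt(2 * MS(within) / n)
lsd = t.ppf(1 - 0.05 / 2, df_error) * sqrt(2 * ms_within / n_per_group)
print(round(lsd, 2))   # 5.71

for a, b in combinations(means, 2):
    diff = abs(means[a] - means[b])
    verdict = "significant" if diff >= lsd else "not significant"
    print(f"|{a} - {b}| = {diff}: {verdict}")
```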