ANOVA & sib analysis
• basics of ANOVA - revision
• application to sib analysis
• intraclass correlation coefficient
Analysis of variance (ANOVA) is a way of comparing the ratio of systematic variance to unsystematic variance in a study.
• ANOVA as regression
• research question: does exposure to the content of Falconer & Mackay (1996) increase knowledge of quantitative genetics?
• outcome_ij = model + error_ij
[figure: knowledge score plotted per person, grouped by condition]
Dummy coding: outcome_ij = model + error_ij

knowledge_ij = b0 + b1·dummy1_j + b2·dummy2_j + ε_ij
i = 1, …, N; N = number of people per condition = 5
j = 1, …, M; M = number of conditions = 3

knowledge_i1 = b0 + b1·dummy1_1 + b2·dummy2_1 + ε_i1 = b0 + b1·0 + b2·0 + ε_i1 = b0 + ε_i1
knowledge_i2 = b0 + b1·dummy1_2 + b2·dummy2_2 + ε_i2 = b0 + b1·1 + b2·0 + ε_i2 = b0 + b1 + ε_i2
knowledge_i3 = b0 + b1·dummy1_3 + b2·dummy2_3 + ε_i3 = b0 + b1·0 + b2·1 + ε_i3 = b0 + b2 + ε_i3

Therefore:
→ μ_condition1 = b0, so b0 is the mean of condition 1 (N)
→ μ_condition2 = b0 + b1 = μ_condition1 + b1, so b1 = μ_condition2 - μ_condition1 is the difference between the means of condition 1 (N) and condition 2 (L)
→ μ_condition3 = b0 + b2 = μ_condition1 + b2, so b2 = μ_condition3 - μ_condition1 is the difference between the means of condition 1 (N) and condition 3 (LB)
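The dummy-coding argument above can be checked numerically: fitting the two-dummy regression by ordinary least squares recovers b0 as the mean of condition 1 and b1, b2 as mean differences. A minimal sketch in Python, using made-up scores (the data values and condition labels N/L/LB are illustrative, not from the study):

```python
import numpy as np

# Illustrative knowledge scores, N = 5 people per condition
# (values are invented for demonstration).
scores = {
    "N":  [3, 4, 2, 5, 4],   # condition 1: no exposure
    "L":  [5, 6, 7, 5, 6],   # condition 2
    "LB": [7, 8, 9, 8, 7],   # condition 3
}
y = np.array(scores["N"] + scores["L"] + scores["LB"], dtype=float)

# Design matrix: intercept, dummy1 (=1 for condition 2),
# dummy2 (=1 for condition 3).
n = 5
X = np.column_stack([
    np.ones(3 * n),
    np.repeat([0, 1, 0], n),   # dummy1
    np.repeat([0, 0, 1], n),   # dummy2
])

# Ordinary least squares fit of knowledge = b0 + b1*dummy1 + b2*dummy2
b0, b1, b2 = np.linalg.lstsq(X, y, rcond=None)[0]

# b0 is the mean of condition 1; b1 and b2 are mean differences.
assert np.isclose(b0, np.mean(scores["N"]))
assert np.isclose(b1, np.mean(scores["L"]) - np.mean(scores["N"]))
assert np.isclose(b2, np.mean(scores["LB"]) - np.mean(scores["N"]))
```

Because the model has one free parameter per group, the fit reproduces the three group means exactly, which is what makes ANOVA expressible as a regression.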
[figure: group means μ_N, μ_L, μ_LB around the grand mean μ, annotated with b0 = μ_N, b1 = μ_L - μ_N, b2 = μ_LB - μ_N]
Sums of squares
SST = Σ(score_ij - μ)²
SSB = Σ N_j(μ_j - μ)²
SSW = Σ(score_ij - μ_j)²
SST = SSB + SSW

Degrees of freedom
df_T = MN - 1
df_B = M - 1
df_W = M(N - 1)
(N = number of people per condition, M = number of conditions)

Mean squares
MS_T = SST/df_T
MS_B = SSB/df_B
MS_W = SSW/df_W

F-ratio
F = MS_B/MS_W = MS_model/MS_error
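The whole one-way ANOVA table above is a few lines of arithmetic. A sketch with invented data (M = 3 conditions, N = 5 per condition) that also verifies the partition SST = SSB + SSW:

```python
import numpy as np

# Illustrative scores for M = 3 conditions, N = 5 people each
# (values are invented for demonstration).
groups = [
    np.array([3., 4., 2., 5., 4.]),
    np.array([5., 6., 7., 5., 6.]),
    np.array([7., 8., 9., 8., 7.]),
]
M, N = len(groups), len(groups[0])

all_scores = np.concatenate(groups)
grand_mean = all_scores.mean()

# Sums of squares
SST = np.sum((all_scores - grand_mean) ** 2)
SSB = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
SSW = sum(np.sum((g - g.mean()) ** 2) for g in groups)
assert np.isclose(SST, SSB + SSW)   # the partition holds

# Degrees of freedom, mean squares, F-ratio
df_B, df_W = M - 1, M * (N - 1)
MSB, MSW = SSB / df_B, SSW / df_W
F = MSB / MSW
```

A large F means the between-condition (model) variance dominates the within-condition (error) variance; its p-value comes from the F distribution with (df_B, df_W) degrees of freedom.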
Sib analysis
• a number of males (sires), each mated to a number of females (dams)
• mating and selection of sires and dams → random
• thus: a population of full sibs (same father, same mother; same cell in the table) and half sibs (same father, different mother; same row in the table)
• data: measurements of all offspring
Sib analysis
• example with 3 sires
[figure: offspring scores grouped by dam within sire, marking score_offspring1,dam1,sire1, the dam mean μ_dam1,sire1, and the sire mean μ_sire1]
Sib analysis
• ANOVA: partitioning the phenotypic variance (VP)
• between-sire component - the component attributable to differences between the progeny of different males
Sib analysis μsire3 μsire2 μsire1