Statistics review
Basic concepts:
• Variability measures
• Distributions
• Hypotheses
• Types of error
Common analyses:
• T-tests
• One-way ANOVA
• Randomized block ANOVA
• Two-way ANOVA
The t-test
Asks: do two samples (A and B) come from different populations?
[Figure: DATA from samples A and B; either YES, two different populations, or NO, a single population (Ho)]
The t-test
Depends on whether the difference between samples is much greater than the difference within each sample.
[Figure: samples A and B barely overlapping; between-sample difference >> within-sample difference]

The t-test
Depends on whether the difference between samples is much greater than the difference within each sample.
[Figure: samples A and B overlapping broadly; between-sample difference < within-sample difference]
The t-test
T-statistic = difference between means / standard error within each sample:
t = \dfrac{\bar{x}_1 - \bar{x}_2}{\sqrt{s_1^2/n_1 + s_2^2/n_2}}
The t-test
How many degrees of freedom? df = (n_1 - 1) + (n_2 - 1)
Why does this seem familiar?
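Not part of the original slides: a minimal Python sketch that builds the t-statistic and df directly from the formulas above and checks the result against SciPy (the sample values are invented for illustration).

```python
import numpy as np
from scipy import stats

# Hypothetical samples A and B (illustrative values only)
a = np.array([4.1, 3.8, 5.0, 4.6, 4.3])
b = np.array([5.2, 5.9, 5.5, 6.1, 5.7])

# t-statistic straight from the slide's formula
diff = a.mean() - b.mean()
se = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
t_manual = diff / se
df = (len(a) - 1) + (len(b) - 1)

# The same test via SciPy; with equal sample sizes the two agree exactly
t_scipy, p_value = stats.ttest_ind(a, b, equal_var=True)
print(t_manual, t_scipy, df, p_value)
```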
T-tables
Careful! This table is built for one-tailed tests. It is the only common stats table where doing a two-tailed test (A does not equal B) requires you to divide the alpha by 2.
T-tables Two samples, each n=3, with t-statistic of 2.50: significantly different?
T-tables
Two samples, each n=3, with t-statistic of 2.50: significantly different? No! (df = (3−1) + (3−1) = 4, and the two-tailed critical value at alpha = 0.05 is 2.776, which 2.50 does not exceed.)
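A quick way to confirm that critical value without the printed table; this SciPy sketch follows the slide's example (it is not part of the original deck).

```python
from scipy import stats

df = (3 - 1) + (3 - 1)        # two samples of n = 3
alpha = 0.05

# Two-tailed test: put alpha/2 in each tail of the t-distribution
t_crit = stats.t.ppf(1 - alpha / 2, df)
print(t_crit)                  # ≈ 2.776, so a t-statistic of 2.50 is not significant
```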
If you have two samples with similar n and S.E., why do you know instantly that they are not significantly different if their error bars overlap?
If you have two samples with similar n and S.E., why do you know instantly that they are not significantly different if their error bars overlap?
• Overlapping ±1 S.E. error bars mean the difference in means < 2 x S.E., i.e. the t-statistic < 2 (with similar S.E.s, in fact t < √2 ≈ 1.4)
• and, for any df, t must be > 1.96 to be significant!
Careful! It doesn't work the other way around: non-overlapping error bars do not guarantee a significant difference.
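A small numeric check of that logic (the summary statistics are invented for illustration, not from the slides):

```python
import numpy as np

# Hypothetical summary statistics: error bars of 10 ± 1 and 11.8 ± 1 just overlap
mean_a, mean_b = 10.0, 11.8
se_a, se_b = 1.0, 1.0

diff = abs(mean_a - mean_b)              # 1.8 < 2 x S.E.
t = diff / np.sqrt(se_a**2 + se_b**2)    # combined standard error in the denominator
print(t)                                 # ≈ 1.27, well below the 1.96 needed for significance
```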
One-way ANOVA
A general form of the t-test that can handle more than 2 samples.
Ho: All samples come from the same population…
Ha: At least one sample comes from a different population
One-way ANOVA
[Figure: DATA from samples A, B, and C; under Ho all three come from one population; under Ha at least one comes from a different population]
One-way ANOVA
Just like the t-test, it compares differences between samples to differences within samples:
t-test statistic (t) = difference between means / standard error within each sample
ANOVA statistic (F) = MS between groups / MS within groups
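Not from the original slides: a minimal SciPy sketch of the one-way F-test for three groups (the measurements are invented placeholders).

```python
import numpy as np
from scipy import stats

# Hypothetical growth measurements for three groups (illustrative values only)
a = np.array([2.1, 2.5, 1.9, 2.3, 2.2])
b = np.array([3.0, 2.8, 3.4, 3.1, 2.9])
c = np.array([2.0, 2.4, 2.2, 1.8, 2.1])

# F = MS between groups / MS within groups
f_stat, p_value = stats.f_oneway(a, b, c)
print(f_stat, p_value)
```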
Mean squares:
MS = sum of squares / df
Everyone gets a lot of cake (high MS) when there is lots of cake (high SS) and few forks (low df):
MS = sum of squares / df
Mean squares:
MS = sum of squares / df
Analogous to variance
Variance: the sum of squared differences divided by the degrees of freedom
s^2 = \dfrac{\sum (x_i - \bar{x})^2}{n - 1}
(numerator = sum of squares; denominator = df)
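A short sketch (not in the original slides) showing that MS = SS / df for a single sample is exactly the sample variance; the data are invented.

```python
import numpy as np

# Hypothetical observations (illustrative values only)
x = np.array([4.0, 5.5, 6.0, 5.0, 4.5])

ss = np.sum((x - x.mean()) ** 2)   # sum of squares
df = len(x) - 1                    # degrees of freedom
ms = ss / df                       # mean square

print(ms, x.var(ddof=1))           # identical: the MS of a single sample is its variance
```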
ANOVA tables
SST = SSX + SSE (total SS = treatment SS + error SS)
MSX = SSX / df(X)
MSE = SSE / df(E)
Do three species of palms differ in growth rate? We have 5 observations per species. Complete the table!
Hint: For the total df, remember that we calculate total SS as if there are no groups (total variance)…
Note: the treatment df is always k − 1.
Is it significant? At alpha = 0.05, the critical value F(2,12) = 3.89.
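To double-check that critical F value without a printed table, here is a SciPy sketch following the palm example (not part of the original deck):

```python
from scipy import stats

k, n = 3, 5                     # 3 species, 5 observations per species
df_treatment = k - 1            # 2
df_error = k * (n - 1)          # 12

alpha = 0.05
f_crit = stats.f.ppf(1 - alpha, df_treatment, df_error)
print(f_crit)                   # ≈ 3.89, matching the slide's F(2,12)
```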
2. Randomized block
[Figure: blocks A, B, and C laid out across poor, medium, and good patches]
Pro: Can remove between-block SS from the error SS… may increase the power of the test
[Figure: SS partition without blocking (treatment + error) vs. with blocking (treatment + block + error)]
Con: Blocks use up error degrees of freedom
[Figure: the same SS partition; the block term takes degrees of freedom away from the error term]
Do the benefits outweigh the costs? Does the MS error go down?
F = \dfrac{\text{treatment SS} / \text{treatment df}}{\text{error SS} / \text{error df}} = \dfrac{MS_{treatment}}{MS_{error}}
[Figure: SS partition with and without the block term]
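A minimal sketch of a randomized block analysis in Python using statsmodels (not from the original slides; the data values are invented, with 3 treatments measured once in each of 3 blocks):

```python
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Hypothetical data: 3 treatments, each measured once in blocks A, B, and C
df = pd.DataFrame({
    "block":     ["A", "A", "A", "B", "B", "B", "C", "C", "C"],
    "treatment": ["t1", "t2", "t3"] * 3,
    "growth":    [2.1, 3.0, 2.4, 2.6, 3.5, 2.9, 1.8, 2.7, 2.2],
})

# Adding the block term pulls between-block SS out of the error SS,
# at the cost of block degrees of freedom
model = smf.ols("growth ~ C(treatment) + C(block)", data=df).fit()
print(anova_lm(model, typ=2))
```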
Two-way ANOVA
Just like one-way ANOVA, except it subdivides the treatment SS into:
• Treatment 1
• Treatment 2
• Interaction 1 & 2
Two-way ANOVA
Suppose we wanted to know if moss grows thicker on the north or south side of trees, and we look at 10 aspen and 10 fir trees:
• Aspect (2 levels, so 1 df)
• Tree species (2 levels, so 1 df)
• Aspect × species interaction (1 df × 1 df = 1 df)
• Error? k(n − 1) = 4 × (10 − 1) = 36, where k = 4 treatment combinations and n = 10 observations per combination
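A sketch of this design in Python with statsmodels (not part of the original slides; the moss measurements are randomly generated placeholders), showing that the df bookkeeping works out to 1, 1, 1, and 36:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(1)

# Hypothetical moss thickness: 2 aspects x 2 species, 10 observations per combination
df = pd.DataFrame({
    "aspect":  np.repeat(["north", "south"], 20),
    "species": np.tile(np.repeat(["aspen", "fir"], 10), 2),
    "moss":    rng.normal(loc=5, scale=1, size=40),
})

# Main effects plus the aspect x species interaction
model = smf.ols("moss ~ C(aspect) * C(species)", data=df).fit()
print(anova_lm(model, typ=2))   # df column: 1, 1, 1 for the terms, 36 for the residual
```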
Interactions
A combination of treatments gives a non-additive effect.
[Figure: additive effect; bar chart for alder and fir on the south and north sides]
Interactions
A combination of treatments gives a non-additive effect: anything not parallel!
[Figure: interaction plots; non-parallel south vs. north lines indicate an interaction]
Careful! If you log-transformed your variables, the absence of an interaction on the log scale corresponds to a multiplicative effect on the original scale: log(a) + log(b) = log(ab)
[Figure: the same data plotted as y and as log(y); lines parallel on one scale need not be parallel on the other]
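A tiny numeric illustration of that point (the cell means are invented): effects that multiply on the raw scale add on the log scale, so the apparent interaction disappears after the transformation.

```python
import numpy as np

# Hypothetical cell means built multiplicatively: baseline 2.0, "south" doubles it, "fir" halves it
means = {("north", "alder"): 2.0, ("north", "fir"): 1.0,
         ("south", "alder"): 4.0, ("south", "fir"): 2.0}

# Raw scale: the south-north difference depends on species (2.0 vs 1.0) -> looks like an interaction
raw_alder = means[("south", "alder")] - means[("north", "alder")]
raw_fir   = means[("south", "fir")]   - means[("north", "fir")]

# Log scale: both differences equal log(2) -> no interaction
log_alder = np.log(means[("south", "alder")]) - np.log(means[("north", "alder")])
log_fir   = np.log(means[("south", "fir")])   - np.log(means[("north", "fir")])

print(raw_alder, raw_fir)   # 2.0 vs 1.0
print(log_alder, log_fir)   # both ≈ 0.693
```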