Review
• What does covariance tell you?
• What is it a function of?
• What coefficient tells you the strength of the relationship?
• What is confidence a function of?
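For reference, one standard way to write covariance (added here; x̄, ȳ are the sample means and s_x, s_y the sample standard deviations):

\mathrm{cov}(x, y) = \frac{\sum_i (x_i - \bar{x})(y_i - \bar{y})}{n - 1}

Covariance is a function of how x and y move together around their means (and of their units); dividing by s_x s_y standardizes it into the correlation coefficient r, the coefficient that tells you the strength of the relationship.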
Review of central limit theorem
• What is the central limit theorem?
• What is a normal distribution?
• What inference does the central limit theorem help us with?
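One standard statement of the theorem, added for reference (μ and σ are the population mean and standard deviation): for independent samples of size n, the sample mean is approximately normally distributed when n is large,

\bar{x} \;\sim\; N\!\left(\mu,\ \sigma^2 / n\right) \quad \text{(approximately, for large } n\text{)}

which is what lets us attach confidence statements to estimates even when the underlying variable is not itself normal.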
Two formulas
yi = a + b·xi + ei
yhat = a + b·x
• What is yhat?
• What is yi?
• What is xi?
• What is ei?
• What is b?
• What is a?
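A minimal Stata sketch of where each piece comes from, using Stata's built-in auto dataset (price and weight are illustrative stand-ins, not the deck's GSS variables):

sysuse auto, clear          // built-in example data
regress price weight        // fits price = a + b*weight + e
predict yhat, xb            // yhat = a + b*weight, the predicted value for each case
predict e, residuals        // e = price - yhat, the residual (error) for each case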
Interpreting results
• What is the difference between b and Beta?
• What is the standard error?
• How do you compute t?
• What is the significance level?
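Two reference formulas for the computational questions (added here; s_x and s_y are the standard deviations of x and y):

\beta = b \cdot \frac{s_x}{s_y} \qquad\qquad t = \frac{b}{\mathrm{s.e.}(b)}

b is the slope in the raw units of y per unit of x; Beta is the slope after both variables are put on a z-score scale.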
Residual review
• What is a residual?
• What is the mean of residuals?
• What assumption do we make about residuals?
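A quick Stata check of the residual questions, sketched with the built-in auto data (price and weight are illustrative, not the deck's variables):

sysuse auto, clear
regress price weight
predict resid, residuals     // resid = observed y minus predicted y
summarize resid              // its mean is zero, up to rounding error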
Z score review
• What is a z score?
• How is it computed?
• What is the beta coefficient?
• How is it different from the b in terms of interpreting the effect?
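A sketch of the z-score computation in Stata, again with the built-in auto data (illustrative variables):

sysuse auto, clear
summarize weight                                  // stores r(mean) and r(sd)
generate z_weight = (weight - r(mean)) / r(sd)    // z = (value - mean) / standard deviation
regress price weight, beta                        // the Beta column is the slope in standard-deviation units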
T statistic review
• What is the formula for the t statistic?
• If t = 2, how confident are we?
• (What are we confident about?)
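A worked check, using the coefficient and standard error from the regression output near the end of this deck:

t = \frac{b}{\mathrm{s.e.}(b)} = \frac{-0.0380391}{0.0209348} \approx -1.82

which matches the t column in that output; a |t| of about 2 corresponds to roughly 95% confidence that the slope differs from zero.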
The intercept?
• If the intercept is 3, the dependent variable ranges from 1–4, and the independent variable ranges from 1–4, what other information do we need to know the value of the DV when the IV is at its lowest value?
• The slope is 2.
• What is the value of the DV when the IV is at its lowest value?
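The arithmetic written out (added here): with intercept a = 3 and slope b = 2, the lowest IV value is 1, so the predicted value is

\hat{y} = a + b \cdot x = 3 + 2 \cdot 1 = 5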
If you multiply the dependent variable by 100, what numbers change? How do they change?*
• What numbers do not change?*
*Potential answers: B, Beta, standard error, t, significance
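One way to see the answer for yourself, sketched in Stata with the built-in auto data (price and weight are illustrative, not the deck's variables):

sysuse auto, clear
regress price weight, beta        // original scale of the dependent variable
generate price100 = 100 * price   // rescale the DV by 100
regress price100 weight, beta     // compare B, Beta, the standard error, t, and significance across the two runs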
Where does our estimate of the error come from? • The residuals. If the points are far from the slope, then we are less confident. • If the points are close to the slope, then we are more confident.
Can we be wrong about rejecting a null hypothesis? There are two kinds of errors:
• (Type 1) a true null hypothesis can be incorrectly rejected
• (Type 2) a false null hypothesis can fail to be rejected
Type 2 error is more serious
• When you fail to reject the hypothesis, you do not prove the hypothesis is wrong (remember, we don’t ever prove anything).
• It could be measurement error or all kinds of other statistical problems that led to the failure to reject the null hypothesis.
Null Hypothesis Rejected • If you reject it, then you have tried to prove your theory wrong and you could not. • Don’t forget that you haven’t proven anything (we never prove anything) • You still have other ways of trying to prove it wrong
What is the question that we ask in statistical analysis? • How much better have we done than the mean in predicting values of y from x?
How do we know we have done better than the mean?
• The distance between the slope and the mean is large.
• What is the confidence of “doing better than the mean” likely determined by?
• The ratio of explained to unexplained variance.
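As an added check, this ratio is what the F statistic reports: the explained variance (per model degree of freedom) divided by the unexplained variance (per residual degree of freedom). Using the output near the end of this deck:

F = \frac{\text{Model MS}}{\text{Residual MS}} = \frac{1.31753739}{0.399061502} \approx 3.30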
Wouldn’t it be great to have a coefficient that told us the ratio of explained to unexplained variance?
Total variance = Explained variance + Unexplained variance
R square
• R square = Explained variance / (Unexplained variance + Explained variance)
• Unexplained variance + Explained variance = what? (total variance)
• For each observation, you calculate the squared distance from the mean to the slope (the predicted value); summing these gives the explained variance.
• Then divide by the total sum of squares, which is the total variance.
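Written out as a formula (added here; ŷ_i is the predicted value and ȳ the mean), and checked against the output at the end of the deck:

R^2 = \frac{\sum_i (\hat{y}_i - \bar{y})^2}{\sum_i (y_i - \bar{y})^2} = \frac{\text{Model SS}}{\text{Total SS}} = \frac{1.31753739}{4.90909091} \approx 0.2684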
Pearson r and R square
• Pearson r squared is the same as R square (in the bivariate case – one independent variable): (Pearson r)² = R square
• R square is standardized and symmetric
• Symmetric means that it doesn’t matter which is the independent variable and which is the dependent variable
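An added check using the output below: in the bivariate case the standardized coefficient Beta equals Pearson r, so

(\text{Pearson } r)^2 = (-0.518061)^2 \approx 0.2684 = R^2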
regr happy prestg80, beta

      Source |       SS       df       MS              Number of obs =      11
-------------+------------------------------           F(  1,     9) =    3.30
       Model |  1.31753739     1  1.31753739           Prob > F      =  0.1026
    Residual |  3.59155351     9  .399061502           R-squared     =  0.2684
-------------+------------------------------           Adj R-squared =  0.1871
       Total |  4.90909091    10  .490909091           Root MSE      =  .63171

------------------------------------------------------------------------------
       happy |      Coef.   Std. Err.      t    P>|t|                     Beta
-------------+----------------------------------------------------------------
    prestg80 |  -.0380391   .0209348    -1.82   0.103                -.518061
       _cons |   3.330371   .8050567     4.14   0.003                        .
------------------------------------------------------------------------------
Formula for r and beta*
*Beta is the same as r in the bivariate case
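For reference, a standard way to write r (added here; x̄, ȳ are the means and s_x, s_y the standard deviations of x and y):

r = \frac{\mathrm{cov}(x, y)}{s_x\, s_y} = \frac{\sum_i (x_i - \bar{x})(y_i - \bar{y})}{\sqrt{\sum_i (x_i - \bar{x})^2 \, \sum_i (y_i - \bar{y})^2}}

Combined with the earlier formula β = b · (s_x / s_y) and the fact that b = r · (s_y / s_x) in the bivariate case, this is why Beta and r coincide there.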