
Understanding Three-Level Models In Inference | Analysis of Variance Components and Correlations

Dive into three-level models, explore variance components, ICC, VPC, and correlations. Learn to compare models and the impacts of adding predictors. Utilize R, JMP, and lmer for comprehensive analysis.

goolsby

Presentation Transcript


  1. Stat 414 – Day 17: Three-Level Models; Inference

  2. Last Time – 3-level models
  • Yijk = grand mean + random effect for school + random effect for class + random effect for student
  • 3 variance components
  • Proportion of total variation at each level (VPC)
  • Intraclass correlation coefficients (ICC)
    – Level 3/total: correlation of students in same school
    – (Level 2 + Level 3)/total: correlation of students in same class (in same school)
    – Level 3/(Level 2 + Level 3): correlation of classes within same school
  • What is the correlation of a student with herself?
  • What is the correlation of 2 students from different schools?
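The VPC and ICC calculations above can be sketched in R. The variance-component values below are illustrative placeholders, not results from the achieve.txt fit:

```r
# Sketch: VPC and ICC from three variance components
# (the three variances below are hypothetical, for illustration only)
sigma2_school  <- 0.31   # Level 3 variance
sigma2_class   <- 0.27   # Level 2 variance
sigma2_student <- 4.85   # Level 1 variance
total <- sigma2_school + sigma2_class + sigma2_student

# Proportion of total variation at each level (VPC)
vpc <- c(school = sigma2_school, class = sigma2_class,
         student = sigma2_student) / total

# Intraclass correlations
icc_same_school     <- sigma2_school / total                   # students in same school
icc_same_class      <- (sigma2_school + sigma2_class) / total  # students in same class
icc_class_in_school <- sigma2_school / (sigma2_school + sigma2_class)  # classes in same school

# A student with herself correlates 1; two students from
# different schools share no random effects, so correlate 0.
```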

  3. Last Time – 3-level models

  4. Class covariances
  • Divide by total variance to get correlations
  [Slide shows a 6 × 6 covariance matrix; only the row/column labels 1–6 survive in the transcript]

  5. Day 16 Example 3 (achieve.txt)
  • JMP output: SDs 0.5584, 0.5222, 2.2016

  6. Example 3 (achieve.txt)
  • JMP vs. R – lme (fit by REML): AIC = 46154, BIC = 46182.97, logLik = -23073
  • SDs 0.5584, 0.5222, 2.2016

  7. Example 3 (achieve.txt)
  • JMP vs. R – lme (method = "ML"): AIC = 46150.04, BIC = 46179, logLik = -23071.02
  • SDs 0.5584, 0.5222, 2.2016

  8. Example 3 (achieve.txt)
  • JMP vs. R – lmer
  • SDs 0.5584, 0.5222, 2.2016

  9. Example 3 (achieve.txt)
  • JMP vs. R – lmer (REML = FALSE)
  • SDs 0.5584, 0.5222, 2.2016
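The lmer fits compared across these slides might look like the following sketch. The outcome name `geread` and the grouping factors `school` and `class` are assumptions about achieve.txt, not confirmed by the transcript:

```r
# Sketch: three-level random-intercepts model with lme4
# (variable names geread, school, class are assumed, not from the slides)
library(lme4)
achieve <- read.table("achieve.txt", header = TRUE)

# REML fit (lmer's default), as on slide 8
fit_reml <- lmer(geread ~ 1 + (1 | school) + (1 | school:class),
                 data = achieve)
VarCorr(fit_reml)   # SDs of the three variance components

# ML fit (REML = FALSE), as on slide 9; use for AIC/BIC comparisons
fit_ml <- update(fit_reml, REML = FALSE)
AIC(fit_ml); BIC(fit_ml); logLik(fit_ml)
```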

  10. How to compare models?
  • Model with no random effects?
  • Model with only Level 2 random effects?
  • Model with only Level 3 random effects?

  11. Possible approaches
  • R (nlme): intervals(model1); JMP: Wald tests
    – These can be risky with small sample sizes, since they assume normality but variance estimates follow skewed distributions
  • Likelihood ratio tests
    – p-value is actually a bit conservative
  • R lmerTest: note that the likelihoods change
  • ML (log-likelihood) vs. REML (residual log-likelihood)
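The interval-based approach can be sketched in R. The model and the variable names (`geread`, `school`, `class`) are assumptions about achieve.txt:

```r
# Sketch: interval estimates for variance components (assumed names)
library(lme4)
achieve <- read.table("achieve.txt", header = TRUE)
fit <- lmer(geread ~ 1 + (1 | school) + (1 | school:class), data = achieve)

# Profile-likelihood intervals for the SDs; these avoid the symmetry
# (normality) assumption that makes Wald intervals risky for variances
confint(fit, oldNames = FALSE)

# With nlme instead, intervals(model1) gives approximate CIs for an lme fit
```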

  12. Opinions vary
  • Use ML when dropping fixed-effects terms
  • You can compare nested models that differ only in the random terms using either the REML likelihood or the ordinary likelihood. If you want to compare models that differ in fixed-effects terms (or both), use the ordinary likelihood.

  13. Example
  • With lme (lmer will do either): 46758.087 – 46142.04 = 616.05
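A deviance difference like the one on this slide comes from a likelihood-ratio test; a sketch, again assuming the names `geread`, `school`, and `class` for achieve.txt:

```r
# Sketch: likelihood-ratio test for a random effect
# (variable names are assumptions about achieve.txt)
library(lme4)
achieve <- read.table("achieve.txt", header = TRUE)

fit_null <- lmer(geread ~ 1 + (1 | school),
                 data = achieve, REML = FALSE)
fit_full <- lmer(geread ~ 1 + (1 | school) + (1 | school:class),
                 data = achieve, REML = FALSE)

# anova() reports the chi-square statistic: the difference in deviances,
# i.e. 2 * (logLik_full - logLik_null), as computed on the slide.
# The p-value is conservative because the null places the variance
# on the boundary of the parameter space (variance = 0).
anova(fit_null, fit_full)
```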

  14. Adding predictors
  • Let's add student's vocabulary (genvocab), number of students in the class (clenroll), and number of students in the school (cenroll)
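Adding those three predictors, one per level, might look like the following sketch; the outcome `geread` and the grouping factors are assumptions about achieve.txt:

```r
# Sketch: adding the three predictors named on the slide
# (outcome geread and grouping factors school, class are assumed)
library(lme4)
achieve <- read.table("achieve.txt", header = TRUE)

fit_pred <- lmer(geread ~ genvocab + clenroll + cenroll +
                   (1 | school) + (1 | school:class),
                 data = achieve)
summary(fit_pred)   # check how the variance components shrink
```

Comparing the variance components of this fit against the unconditional model shows how much of each level's variation the predictors explain.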
