
Multi State Alternate Assessment Collaborative: Measuring Growth in Alternate Assessments


Presentation Transcript


  1. Multi State Alternate Assessment Collaborative: Measuring Growth in Alternate Assessments Gary W. Phillips American Institutes for Research CCSSO 2014 National Conference on Student Assessment (NCSA) New Orleans June 25-27, 2014

  2. Alternate Assessment State Collaborative

  3. Three State Collaboratives • National Center and State Collaborative (NCSC), hosted by the National Center on Educational Outcomes (NCEO) at the University of Minnesota, involving 24 states • Dynamic Learning Maps (DLM), hosted by the Center for Educational Testing and Evaluation (CETE) at the University of Kansas, involving 13 states • Multistate Alternate Assessment Collaborative, hosted by the American Institutes for Research (AIR) in Washington, DC, involving 6 states • Subjects: Reading, Writing (combined for ELA), Mathematics, Science, and Social Studies

  4. Multistate Alternate Assessment Collaborative

  5. Advantages of an Adaptive Alternate Assessment

  6. Advantages of Adaptive Alternate Assessment • Standardized administration, which allows test scores to be comparable from year to year • Test difficulty is adapted to student ability (see the sketch below) • Administered and scored by teachers • High reliability of the scores • Aligned to the Common Core State Standards (in most states) • Cheaper than portfolio assessments • Less administration time (about one hour per content area)
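
How test difficulty adapts to the student can be made concrete. Under the Rasch model, the most informative next item is the one whose difficulty is closest to the current ability estimate. The sketch below illustrates that selection rule only; it is not the collaborative's operational algorithm, and the item bank and function names are hypothetical.

```python
import numpy as np

def next_item(theta_hat, item_bank, administered):
    """Choose the unadministered item whose Rasch difficulty is closest
    to the current ability estimate. Under the Rasch model this is the
    maximum-information item, so test difficulty tracks the student.

    item_bank: dict mapping item_id -> Rasch difficulty (b parameter).
    """
    candidates = {i: b for i, b in item_bank.items() if i not in administered}
    return min(candidates, key=lambda i: abs(candidates[i] - theta_hat))

# Hypothetical bank: with theta_hat = 0.3, the engine picks the item
# whose difficulty is nearest 0.3 among those not yet given
bank = {"itm01": -1.2, "itm02": 0.1, "itm03": 0.9, "itm04": 2.0}
print(next_item(0.3, bank, administered={"itm02"}))  # -> "itm03"
```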

  7. Advantages of Adaptive Alternate Assessment • Meets the same AERA/APA/NCME technical requirements as assessments of the general population • Growth models • iPad pilot

  8. Administration

  9. Administration • Individually administered to students by the test administrator • Takes about one hour per subject • A second rater independently scores the test for a sample of students, which is used to establish inter-rater reliability (see the sketch below)
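
Since a second rater independently scores a sample of students, inter-rater reliability can be summarized with a chance-corrected agreement index. The deck does not say which index the collaborative reports; the sketch below shows one common choice, Cohen's kappa, with hypothetical rater data.

```python
import numpy as np

def cohens_kappa(rater1, rater2):
    """Cohen's kappa: chance-corrected agreement between two raters'
    scores on the same students (1.0 = perfect, 0.0 = chance-level)."""
    r1, r2 = np.asarray(rater1), np.asarray(rater2)
    categories = np.union1d(r1, r2)
    p_obs = (r1 == r2).mean()                        # observed agreement
    p_exp = sum((r1 == c).mean() * (r2 == c).mean()  # expected by chance
                for c in categories)
    return (p_obs - p_exp) / (1 - p_exp)

# Two raters scoring ten students on a 0-2 point item (hypothetical)
print(cohens_kappa([0, 1, 2, 2, 1, 0, 1, 2, 0, 1],
                   [0, 1, 2, 1, 1, 0, 1, 2, 0, 2]))  # ~0.70
```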

  10. Calibration and Vertical Scale?

  11. Typical Linking Design & Vertical Scale
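
The linking design itself is shown as a figure. One standard way to chain adjacent-grade tests onto a single vertical scale using common (linking) items under the Rasch model is mean/mean linking; the sketch below is a generic illustration with hypothetical values, not necessarily the collaborative's exact design.

```python
import numpy as np

def mean_mean_constant(b_common_base, b_common_new):
    """Rasch mean/mean linking: the constant to add to the new form's
    parameter estimates to place them on the base (e.g., lower-grade)
    scale, using items common to both calibrations."""
    return float(np.mean(b_common_base) - np.mean(b_common_new))

# Difficulties of the same linking items estimated in two separate
# grade-level calibrations (hypothetical values)
b_grade4 = [-0.5, 0.2, 1.1]    # from the grade 4 (base) calibration
b_grade5 = [-1.3, -0.6, 0.4]   # same items in the grade 5 calibration
shift = mean_mean_constant(b_grade4, b_grade5)
print(shift)  # grade 5 estimates + shift land on the grade 4 scale
```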

  12. How do we Estimate Student Ability?

  13. Student Ability Estimation • Partial Credit Rasch Model • Pattern scoring (see the sketch below)
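
To make the two bullets concrete: under the Partial Credit Model, pattern scoring maximizes the likelihood of the student's full response pattern over theta, and the score equation is sum_i (x_i - E[X_i | theta]) = 0. The sketch below solves it by bisection; the item step difficulties are hypothetical, and this is an illustration rather than the collaborative's scoring engine.

```python
import numpy as np

def pcm_probs(theta, deltas):
    """Category probabilities for one item under the Partial Credit
    Model. deltas holds the step difficulties delta_1..delta_m; the
    zero category contributes exp(0) to the normalizer."""
    z = np.concatenate(([0.0], np.cumsum(theta - np.asarray(deltas))))
    ez = np.exp(z - z.max())          # subtract max for numerical stability
    return ez / ez.sum()

def pattern_score_mle(responses, item_deltas, lo=-6.0, hi=6.0, tol=1e-6):
    """MLE of theta from the full response pattern (pattern scoring).
    The PCM score function sum_i (x_i - E[X_i | theta]) is monotone
    decreasing in theta, so bisection finds the root. (All-zero or
    all-maximum patterns have no finite MLE and need special handling.)"""
    def score(theta):
        s = 0.0
        for x, deltas in zip(responses, item_deltas):
            p = pcm_probs(theta, deltas)
            s += x - np.dot(np.arange(len(p)), p)   # x_i - E[X_i | theta]
        return s
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if score(mid) > 0 else (lo, mid)
    return 0.5 * (lo + hi)

# Hypothetical step difficulties for three items (2, 3, and 3 categories)
item_deltas = [[0.5], [-1.0, 0.8], [-0.2, 1.5]]
print(pattern_score_mle([1, 2, 1], item_deltas))
```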

  14. How are the Scores Reported?

  15. Vertical Scale

  16. Vertical Scale [figure]

  17. Vertical Scale [figure]
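
Reported scores on a vertical scale are typically a linear transformation of theta. A minimal sketch, assuming a hypothetical slope and intercept (the collaborative's actual scaling constants are not given in the deck):

```python
def to_scale_score(theta, slope=25.0, intercept=500.0):
    """Map a theta estimate to a reporting (vertical) scale score via a
    linear transformation; the constants here are illustrative only."""
    return round(slope * theta + intercept)

print(to_scale_score(0.8))  # -> 520 on the hypothetical reporting scale
```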

  18. Growth

  19.–22. [Figure slides: growth charts; no transcribed text beyond the slide footer]
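
The growth slides above are charts; the underlying idea is that a vertical scale puts adjacent grades on a common metric, so year-over-year gains in scale scores are directly interpretable. A minimal sketch, assuming matched student scale scores from two consecutive years (all values hypothetical):

```python
import numpy as np

def gains(scores_y1, scores_y2):
    """Year-over-year gains on the vertical scale for matched students,
    plus the mean gain for the group."""
    g = np.asarray(scores_y2) - np.asarray(scores_y1)
    return g, float(g.mean())

# Hypothetical matched scale scores for five students
g, mean_gain = gains([480, 502, 515, 467, 490],
                     [495, 510, 531, 480, 505])
print(g, mean_gain)  # individual gains and average cohort growth (13.4)
```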

  23. How do we Establish Performance Standards?

  24. Standard Setting • Bookmark procedure (or ID matching) • Workshop panel of broadly representative stakeholders • Ordered Item Booklet (OIB) covering the Extended Common Core State Standards • Response probability used to order the items (see the sketch below) • Two rounds • Impact data • Achievement Level Descriptors (ALDs) • Benchmarking (if requested by the Department of Education) • Vertical articulation • Multiple proficiency levels
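
A bookmark placement maps to a theta cut score through the response probability used to order the OIB. Under the Rasch model, an RP of 0.67 puts the cut at the bookmarked item's difficulty plus logit(0.67); the sketch below shows that arithmetic (the item difficulty is hypothetical, and the deck does not state which RP value the collaborative uses).

```python
import math

def bookmark_cut_theta(b_bookmarked, rp=0.67):
    """Theta at which a student answers the bookmarked Rasch item
    correctly with probability rp: theta = b + ln(rp / (1 - rp))."""
    return b_bookmarked + math.log(rp / (1 - rp))

print(bookmark_cut_theta(0.4))  # cut ~ 0.4 + 0.708 = 1.108
```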

  25. Standard Setting: Ordered Item Booklet [Diagram: the OIB's 22 items arranged from easiest (item 1) to most difficult (item 22)]

  26. Standard Setting: Bookmark Procedure [Diagram: the same ordered item booklet, easiest to most difficult item]

  27. Standard Setting: I.D. Matching Procedure [Diagram: ordered item booklet from easiest to most difficult item, with a threshold region separating items consistent with the PLD for "Approaches Proficiency" from items consistent with the PLD for "Meets Proficiency"]

  28. Standard Setting: Articulation [figure]

  29. Standard Setting: Articulation [figure]

  30. Standard Setting: Impact Data • Uses Lord's distributional projection methodology (a simplified sketch follows) • See Lord, F. M. (1980). Applications of Item Response Theory to Practical Testing Problems. Hillsdale, NJ: Lawrence Erlbaum Associates, Chapter 4.
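
As a much-simplified stand-in for Lord's method, the sketch below projects impact data (percent of students at or above each cut) from a normal approximation to the ability distribution. Lord's Chapter 4 approach works with the full IRT-estimated distribution rather than this shortcut, and the cut scores shown are hypothetical.

```python
from scipy.stats import norm

def projected_impact(mu, sigma, cuts):
    """Percent of students projected at/above each theta cut, assuming
    ability ~ N(mu, sigma). A crude stand-in for Lord's (1980, ch. 4)
    distributional projection methodology."""
    return {name: round(100 * norm.sf(cut, mu, sigma), 1)
            for name, cut in cuts.items()}

# Hypothetical cuts on the theta scale
cuts = {"Approaches": -0.8, "Meets": 0.2, "Exceeds": 1.3}
print(projected_impact(mu=0.0, sigma=1.0, cuts=cuts))
# -> {'Approaches': 78.8, 'Meets': 42.1, 'Exceeds': 9.7}
```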

  31. Standard Setting: Impact Data [figure]

  32. Standard Setting: Benchmarking Against Previous Standard [figure]

  33. Standard Setting: Benchmarking Against General Education Standard [figure]
