

  1. Describing NAEP Background Question Data: A Cautionary Tale Paul Stemmer stemmerp@mi.gov NAEP State Coordinator Michigan Department of Education National Conference on Student Assessment, Minneapolis, Minnesota June 2012

  2. NAEP Grade 4 Reading Trends Comparison

  3. NAEP Grade 4 Reading Trends Comparison (Enlarged)

  4. What is different about these comparisons?

  5. What is different about these comparisons? (enlarged) [Chart: NAEP grade 4 reading average scale scores (Avg SS) across assessment years for National Public, MA, MI, and OH; most recent Avg SS: Nat. Pub. 220, MA 237, MI 219, OH 224; * marks flagged values]
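The asterisks on NAEP trend charts flag statistically significant differences. As background not drawn from the deck itself: NCES compares two independent average scale scores with a t-statistic built from the estimates and their standard errors. The sketch below illustrates that kind of calculation; the standard errors are invented for the example, so treat it as a template rather than official results.

```python
# Minimal sketch: test whether two independent NAEP average scale scores
# differ significantly (the kind of comparison the chart's asterisks flag).
# Standard errors here are made up for illustration; real values come from
# sources such as the NAEP Data Explorer.
from math import sqrt
from statistics import NormalDist

def scores_differ(est1: float, se1: float, est2: float, se2: float,
                  alpha: float = 0.05) -> tuple[float, float, bool]:
    """Two-tailed test of est1 - est2 using pooled standard errors."""
    t = (est1 - est2) / sqrt(se1 ** 2 + se2 ** 2)
    p = 2 * (1 - NormalDist().cdf(abs(t)))
    return t, p, p < alpha

# Hypothetical: MA 2011 average 237 (SE 1.0) vs. national public 220 (SE 0.3)
t, p, sig = scores_differ(237, 1.0, 220, 0.3)
print(f"t = {t:.2f}, p = {p:.4f}, significant at alpha=0.05: {sig}")
```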

  6. Life is Good!

  7. NAEP Trends in Grade 4 Reading Average Scaled Scores

  8. Wait a Minute, What Happened?

  9. Wait a Minute, What Happened? (enlarged) [Chart: Avg SS trend lines, one series at 217, 219, 222 and a second at 191, 190; * marks flagged values]

  10. What explains this? What about other TUDAs?

  11. What about OHIO and Cleveland?

  12. What about MA and Boston?

  13. Why the discrepancies?
  • Two possibilities:
    • A. The data are in error
      • Differences in how respondents interpret the question (definitions?)
      • Need to improve the question
      • Social desirability
    • B. The data are "true," which is perhaps more interesting
      • Internal school variables
      • External school factors

  14. Checking Social Desirability: 2009-to-2011 Comparison
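Slide 14's check is essentially a before/after comparison of background-question responses: if social desirability were inflating answers uniformly, response percentages should drift together everywhere rather than only where something real changed. Here is a minimal pandas sketch of that comparison, with all jurisdictions and percentages invented as placeholders (these are not NAEP results):

```python
# Hypothetical 2009-to-2011 comparison of a yes/no NAEP background question.
# The jurisdictions and percentages are placeholders for illustration only.
import pandas as pd

responses = pd.DataFrame({
    "jurisdiction": ["Michigan", "Michigan", "Detroit", "Detroit"],
    "year": [2009, 2011, 2009, 2011],
    "pct_yes": [62.0, 64.0, 58.0, 71.0],  # invented values
})

# Change in "yes" percentage from 2009 to 2011, by jurisdiction
change = (
    responses.pivot(index="jurisdiction", columns="year", values="pct_yes")
    .assign(change=lambda df: df[2011] - df[2009])
)
print(change)
```

A roughly uniform shift across jurisdictions would be consistent with a question-level artifact such as social desirability; a shift concentrated in one jurisdiction points toward the "data are true" branch of slide 13.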

  15. Why the discrepancies?
  • If the data are "true," perhaps more interesting: what are the possible intervening factors?
    • Internal school variables
      • Professional development
      • Rigor
      • Instructional methods/impact
      • Professional know-how vs. execution
      • Administration, management, and follow-through
      • Resources and resource utilization
    • External school factors
      • Chronic absenteeism
      • Concentration of poverty
      • Health factors
      • Cultural differences

  16. Conclusions?
  • Descriptive data can be helpful and informative
    • We have to keep reminding ourselves it is not causative/predictive
  • Are we jumping to solutions without fully understanding how best to solve these problems of large urban school districts?
  • Do we infer too much from snapshot assessment results about the quality of teaching and administration?
    • Does arguing otherwise sound like excuses?
  • Aren't we all guilty of trying to connect the dots?
    • Trying to simplify the story
    • Making inferences beyond what the data are telling us

  17. Finally
  • "Prediction is very hard, especially about the future." - Yogi Berra

  18. For More Information:
  Contact: Paul Stemmer, Ph.D., NAEP State Coordinator
  Michigan Department of Education
  Office of Educational Assessment and Accountability
  PO Box 30008
  Lansing, MI 48909
  (517) 241-2360
  stemmerp@mi.gov
