1. From Accountability to Achievement: “The Data Dive”
2012 Fall Special Education Directors’ Meeting
Cindy Millikin, PhD, Director of Results Driven Accountability
Dan D. Jorgensen, Evaluation & Research Coordinator

2. Purpose
• Data Pipeline Update
• Overview of Data Collections, Data Tools & Reports
• Future Trainings and Resources
• Data Audit Form & Activity
• Open Dialogue

3. Data Pipeline
• Series of interchanges
• Authoritative source
• From “collections” to “snapshots”
• Data Store
• ESSU Data Management System and the IEP Interchange
• Validations – Level 1 edits
• Duplicate SASIDs (a validation sketch follows below)
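To make the duplicate-SASID edit concrete, here is a minimal sketch of a Level 1-style check, assuming records arrive as a CSV with a sasid column. The file name and field names are illustrative, not the Data Pipeline's actual interchange layout:

    import csv
    from collections import Counter

    def find_duplicate_sasids(path):
        # Count how often each SASID appears in the submission file.
        # Assumes a CSV with a 'sasid' column; the real interchange
        # layout and field names may differ.
        with open(path, newline="") as f:
            counts = Counter(row["sasid"] for row in csv.DictReader(f))
        return {sasid: n for sasid, n in counts.items() if n > 1}

    # Report duplicates so they can be resolved before a snapshot is taken.
    for sasid, n in find_duplicate_sasids("iep_interchange.csv").items():
        print(f"SASID {sasid} appears {n} times")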

4. Data Submissions
• Historically, data were collected to support accountability decisions. Today the data continue to support this work, but with an increased emphasis on achievement outcomes.
• “Core” data collections/snapshots include:
  • Special Education End of Year
  • Special Education Discipline
  • Special Education Child Count (December 1 Count)
  • Special Education Personnel Collection (HR Collection – December 1 Staff)
• Other data sources:
  • Parent Survey (Indicator 8)
  • Indicator 13 file reviews
  • Student Outcome Survey (Indicator 14)
  • TCAP/CO Alt results and AMO decisions (Indicator 3)
  • Results Matter (Indicator 7)
  • Dispute Resolution (Indicators 16-19)
  • Monitoring (Indicator 15)
  • Data Submissions (Indicator 20)

5. 2012 TCAP Results by Disability
• Data from TCAP 2012:
  • Reading
  • Writing
  • Math
  • Trends
• Does not include results for students at eligible facilities or students assessed with the CO Alt.

6. Data Reports
Special Education Data Reports: http://www.cde.state.co.us/cdespedfin/SPED_DataReports.htm
• Available reports:
  • Students Served by Disability
  • Students Served by Grade Level
  • Students Served by Race/Ethnicity
  • Students Served by Gender
  • Students Served by Setting
  • Students Served by Age Group
  • Students Compared to October Count
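Each of these reports is essentially a count of student records grouped by a single field. As a minimal sketch, assuming a student-level CSV with illustrative column names such as disability or grade (the actual report files and field names may differ):

    import csv
    from collections import Counter

    def students_served_by(path, field):
        # Count student records grouped by one attribute,
        # e.g. disability, grade, gender, or setting.
        with open(path, newline="") as f:
            return Counter(row[field] for row in csv.DictReader(f))

    # Example: a "Students Served by Disability" style breakdown.
    counts = students_served_by("child_count.csv", "disability")
    for category, n in sorted(counts.items()):
        print(f"{category}: {n}")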

7. Other Sources of Data
• RtI Implementation Rubrics:
  • District level
  • School level
  • Classroom level
  • http://www.cde.state.co.us/RtI/
• Discipline Referrals/Suspensions:
  • PBIS SWIS System
  • http://www.cde.state.co.us/pbis/
• SchoolView:
  • TCAP Status/Growth
  • SPF/DPF Frameworks
  • http://www.schoolview.org/
• District/School Reports:
  • Membership
  • Graduation/Dropout Rates
  • http://www.cde.state.co.us/index_stats.htm

8. Sample Questions to Ask
• What is the disaggregated status of our populations in each subject: Reading? Writing? Math?
• Achievement: what proportion score Proficient and Advanced? What does growth look like?
• What are the trends for these populations over the last 3-5 years?
• Where are students showing accelerated growth?
• How do these results compare to those of students without disabilities? Where are the most significant gaps? (One way to calculate such a gap is sketched below.)
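As one illustrative way to quantify the gap question, the sketch below computes the percent Proficient/Advanced for students with disabilities (SWD) versus all other students. The column names ('swd', a proficiency level per subject) are assumptions for the example, not an actual CDE file layout:

    import csv

    def proficiency_gap(path, subject):
        # Percent Proficient/Advanced for students with disabilities (SWD)
        # versus all other students. Assumes each row carries an 'swd'
        # flag ("Y"/"N") and a proficiency level per subject; real
        # assessment files will differ.
        groups = {"Y": [0, 0], "N": [0, 0]}  # [proficient-or-above, total]
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                tally = groups[row["swd"]]
                tally[0] += row[subject] in ("Proficient", "Advanced")
                tally[1] += 1
        swd = 100 * groups["Y"][0] / groups["Y"][1]
        non_swd = 100 * groups["N"][0] / groups["N"][1]
        return swd, non_swd, non_swd - swd

    swd, non_swd, gap = proficiency_gap("tcap_2012.csv", "reading")
    print(f"SWD {swd:.1f}% vs. non-SWD {non_swd:.1f}%: gap of {gap:.1f} points")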

9. Fidelity & Achievement: RtI
At sites operating at the highest level of implementation (i.e., “optimizing”) across the components of the RtI rubric, literacy scores were substantially higher than at schools with lower fidelity of implementation. The rubric serves as an effective tool to gauge successful RtI implementation and to improve program design at the classroom, school, and district levels.

10. Fidelity & Achievement: PBIS
• For elementary and middle schools, a gain of roughly three to five points on the PBIS fidelity measure (the BoQ) predicts a 1% increase in math and reading proficiency scores on the CSAP (see the arithmetic sketched below).
• The relationship between fidelity scores and math achievement is stronger than the one observed for reading.
• The PBIS fidelity-of-implementation tools (i.e., the SET and BoQ) serve as effective instruments for gauging successful PBIS implementation and improving program design at the classroom, school, and district levels.
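As back-of-the-envelope arithmetic, that finding implies a slope of roughly 0.2 to 0.33 proficiency points per BoQ point. The sketch below shows how such a slope could be estimated from paired school-level data with ordinary least squares; the numbers are placeholder values for illustration, not actual Colorado results:

    def ols_slope(xs, ys):
        # Least-squares slope of y on x: cov(x, y) / var(x).
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        var = sum((x - mx) ** 2 for x in xs)
        return cov / var

    # Placeholder (BoQ score, % proficient) pairs for illustration only.
    boq = [62, 70, 75, 81, 88, 93]
    proficient = [54.0, 55.5, 57.0, 58.0, 60.5, 61.5]

    slope = ols_slope(boq, proficient)
    print(f"~{slope:.2f} proficiency points per BoQ point")
    print(f"A 4-point BoQ gain predicts ~{4 * slope:.1f} point increase")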

11. Additional Resources
Upcoming trainings & resources:
• MTSS Problem-Solving Training
• PBIS Leadership Academy
  • Topics: role of leadership, disproportionality and discipline (Indicator 4), alternatives to suspension, and legislative updates.
  • http://www.cde.state.co.us/scripts/calendarpbs/eventdetail.asp?event=230
• Unified Improvement Plan Training
  • Addresses the different steps in the UIP planning process.
  • http://www.cde.state.co.us/uip/UIP_TrainingAndSupport_Register.asp
• ESSU web page: http://www.cde.state.co.us/cdespedfin/Index_SEFD.htm

12. Data Audit Tool
• The Audit Tool is a simple form that clarifies the data available to sites and ties those data back to the UIP goal. (A rough sketch of one completed entry follows below.)
• The form identifies:
  • Data Source
  • Purpose (fidelity or outcome)
  • Frequency of Collection
  • Frequency of Review
  • Who has access?
  • Which team reviews?
  • Relation to UIP Goal
  • Intended Outcomes
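As a rough sketch of how one completed entry of such a form might be represented, with field names mirroring the bullets above (this is an illustration, not CDE's actual tool):

    from dataclasses import dataclass

    @dataclass
    class DataAuditEntry:
        # One row of the audit form, mirroring the fields above.
        data_source: str           # e.g. "TCAP reading scores"
        purpose: str               # "fidelity" or "outcome"
        collection_frequency: str  # e.g. "annual"
        review_frequency: str      # e.g. "quarterly"
        who_has_access: str
        reviewing_team: str
        uip_goal: str
        intended_outcomes: str

    # Hypothetical example entry for illustration.
    entry = DataAuditEntry(
        data_source="TCAP reading scores",
        purpose="outcome",
        collection_frequency="annual",
        review_frequency="quarterly",
        who_has_access="Data team, special education director",
        reviewing_team="Building leadership team",
        uip_goal="Close the reading proficiency gap for SWD",
        intended_outcomes="Accelerated reading growth for SWD",
    )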

13. Open Dialogue
Please record your thoughts and be prepared to share them with the larger group.
• How may the Exceptional Student Services Unit better support the needs of your AU/site/district(s)?
• What data, analyses, and/or reports would best support your work?
• What additional supports might CDE provide to facilitate improved academic achievement for students with disabilities?
