
Enhancing State Support: Effective Evaluation Strategies

Learn how to improve state-level supports and stakeholder engagement through evaluation. Understand the implementation-outcome connection, data evaluation, and reporting. Discover examples for continuous improvement and effective reporting to stakeholders.



Presentation Transcript


  1. A3 – Improving State Level Supports and Stakeholder Engagement through Effective Evaluation. Kim Gulbrandson, Justyn Poulos – Wisconsin RtI Center. Key Words: Applied Evaluation, Assessment

  2. Objectives: Understand the connection between implementation and student outcomes. Identify considerations for deciding which implementation and outcome data to evaluate and report. Leave with examples of how states can connect implementation to outcomes and report those connections to stakeholders. Gather examples of how to use the same evaluation data for state-level continuous improvement.

  3. What Are You Trying to Accomplish? Wisconsin RtI Center: Our mission is to build the capacity of Wisconsin schools to develop and sustain an equitable multi-level system of supports to ensure the success of all students.

  4. The Fidelity–Outcome Relationship: "Higher fidelity is correlated with better outcomes across a wide range of programs and practices." (Fixsen & Blase) https://nirn.fpg.unc.edu/sites/nirn.fpg.unc.edu/files/resources/NIRN-NASHMPD15thConference-02-2005.pdf
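
A minimal sketch of how a team could check this fidelity-outcome relationship in its own data, assuming per-school fidelity scores and a student outcome metric are available. The column names and values below are illustrative, not the Center's data:

```python
import pandas as pd

# Each row: one school's implementation-fidelity score and a student outcome
# metric (e.g., percent proficient) for the same year. Values are made up.
schools = pd.DataFrame({
    "fidelity_score": [0.55, 0.62, 0.71, 0.80, 0.88, 0.93],
    "outcome_metric": [48.0, 51.5, 55.0, 60.2, 63.1, 66.4],
})

# Pearson correlation between fidelity and outcomes; a positive value is
# consistent with "higher fidelity, better outcomes".
r = schools["fidelity_score"].corr(schools["outcome_metric"])
print(f"fidelity-outcome correlation: r = {r:.2f}")
```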

  5. Logic Model – What Data We Collect and Use and Why

  6. Align Data Collection and Evaluation Across the Project: The same data and logic are used for both internal continuous improvement and external evaluation.

  7. Evaluation Plan – Calendar: Internal Continuous Improvement (monthly); Evaluation Brief (April); Share Outcomes with DPI (May-June); Share Outcomes with SLT (August); Annual Report (October); Share Annual Report with SLT (November); Evaluation Brief (Dec-Jan). SLT = State Leadership Team; DPI = Department of Public Instruction.
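
As a sketch of what "mapping out" this calendar might look like in practice, the schedule can be kept as a simple time-to-activity mapping that a team could script reminders from. This is an assumed illustration, not the Center's actual tooling; the entries mirror the slide:

```python
# Evaluation calendar keyed by when each activity happens (from the slide).
evaluation_calendar = {
    "Monthly":  "Internal continuous improvement",
    "April":    "Evaluation brief",
    "May-June": "Share outcomes with DPI",
    "August":   "Share outcomes with SLT",
    "October":  "Annual report",
    "November": "Share annual report with SLT",
    "Dec-Jan":  "Evaluation brief",
}

for when, activity in evaluation_calendar.items():
    print(f"{when}: {activity}")
```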

  8. Evaluation Briefs Align to Logic. Audience: Schools and Districts

  9. Annual Report Aligns to Logic: Professional Learning – Assessing – Fidelity – Sustaining – Student Outcomes. Audience: https://www.wisconsinrticenter.org/assets/files/resources/1513016537_2016-17%20Annual%20Report.pdf

  10. Since 2009… Professional Learning: Of the 2,214 public schools in Wisconsin, 1,803 (81%) have participated in professional learning offered by the Center, and 1,547 (70%) have completed a full training in behavior, reading, and/or mathematics. Of trained schools, 1,427 (92%) have self-assessed to measure their implementation. Of assessing schools, 1,072 (75%) reached fidelity or full implementation at any one level.
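
The participation funnel above can be reproduced with simple arithmetic. Note that the 92% and 75% figures are relative to the prior stage (trained schools and assessing schools, respectively), not to all 2,214 schools; a short sketch:

```python
# Each tuple: stage label, count, and the base the percentage is taken
# against (None for the starting population). Counts are from the slide.
funnel = [
    ("public schools in Wisconsin", 2214, None),
    ("participated in professional learning", 1803, 2214),
    ("completed a full training", 1547, 2214),
    ("self-assessed implementation (of trained)", 1427, 1547),
    ("reached fidelity at any one level (of assessing)", 1072, 1427),
]

for label, count, base in funnel:
    pct = f" ({count / base:.0%})" if base else ""
    print(f"{label}: {count}{pct}")
```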

  11. PBIS – Trained to Fidelity

  12. Math – Trained to Fidelity

  13. Reading – Trained to Fidelity

  14. Overall Student Outcomes from DPI Report – Follows Same Logic

  15. Overall Student Outcomes

  16. Student Outcomes – Students with IEPs

  17. Student Outcomes – Students with IEPs

  18. Student Outcomes – English Learners

  19. Student Outcomes – Students of Color

  20. Student Outcomes – Students of Color

  21. Lesson Learned When Looking at Outcomes – Factor in Length of Implementation
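
One way to act on this lesson, assuming each school's outcome metric and years of implementation are available (the field names and values here are hypothetical), is to disaggregate outcomes by implementation length before comparing schools:

```python
import pandas as pd

# Hypothetical per-school records: an outcome metric plus how long the
# school has been implementing.
schools = pd.DataFrame({
    "school": ["A", "B", "C", "D", "E", "F"],
    "years_implementing": [1, 1, 2, 3, 4, 4],
    "outcome_metric": [50.1, 52.0, 55.3, 58.7, 61.0, 62.4],
})

# Report outcomes by implementation length rather than as one pooled
# average, so early-stage schools are not judged against long-implementing
# ones.
by_length = schools.groupby("years_implementing")["outcome_metric"].mean()
print(by_length)
```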

  22. Make the Data Relatable for Stakeholders • ConnectEd

  23. How We Engage Stakeholders in our Logic and Evaluation

  24. The Data Has a Dual Purpose: 1. Informs our support to schools/districts (our continuous improvement) – INTERNAL. 2. Provides our data for evaluation, which impacts funding and political support for the Center – EXTERNAL.

  25. Use Same Logic for Internal Continuous Improvement

  26. Lessons Learned - Goal Establish buy-in during goal development and start with realistic goals.

  27. Lessons Learned in Continuous Improvement – Assessing: Use smaller, more rapid indicators of progress for assessing rather than relying on an annual review alone.

  28. Lesson Learned: Operationalize the differences between how schools and the state problem-solve from this data.

  29. Shared Practice PDSA (Plan-Do-Study-Act) Example – to Increase Assessing
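
Since the slide shows only the title, here is a generic sketch of how a PDSA cycle aimed at increasing the share of trained schools that self-assess could be recorded; the field values are illustrative, not the Center's actual example:

```python
from dataclasses import dataclass

@dataclass
class PDSACycle:
    plan: str   # the change to test and the predicted effect
    do: str     # what was actually done, and on what scale
    study: str  # what the data showed versus the prediction
    act: str    # adopt, adapt, or abandon the change

cycle = PDSACycle(
    plan="Send assessment-window reminders to trained schools; "
         "predict a higher self-assessment rate",
    do="Piloted reminders with one cohort during the spring window",
    study="Compared the cohort's self-assessment rate with prior years",
    act="Adapt the reminder timing and expand to all cohorts",
)
print(cycle)
```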

  30. Key Takeaways: Link internal and external evaluation through your logic. Have an evaluation plan and map it out (calendar). Start with doable and reasonable goals.
