
Michigan Department of Education Office of Education Improvement and Innovation

Understand the implementation process of the Michigan Continuous School Improvement (MI-CSI) framework to improve student achievement and meet educational requirements. Learn to gather and analyze data, set goals, and evaluate impact with practical strategies.


Presentation Transcript


  1. Michigan Department of Education Office of Education Improvement and Innovation DO Implement Plan Monitor Plan Evaluate Plan One Voice – One Plan Michigan Continuous School Improvement (MI-CSI)

  2. Continuous School Improvement Process
Gather: Get Ready, Collect School Data, Build School Profile
Study: Analyze Data, Set Goals, Set Measurable Objectives, Research Best Practice
Plan: Develop School Improvement Plan
Do: Implement Plan, Monitor Plan, Evaluate Plan
Student Achievement (center of the cycle)

  3. Guided Conversations: DO
WHY: Why is it important to strategically implement, monitor, and evaluate the School Improvement Plan?
HOW: How will we communicate the plan to all stakeholders so that they clearly understand and own their roles in implementation? How do we build ownership of the strategy and the plan? How will we ensure implementation with fidelity? How is the implementation of your plan monitored and evaluated? How is the impact of your plan monitored and evaluated?
WHAT: What will your school look like when this plan is implemented with fidelity? What is the expected impact on student achievement?

  4. Do Vocabulary: Implementation, Monitor, Evaluation, Fidelity, Impact

  5. Do: Plan Requirements
• ESEA and PA 25 require annual evaluation of the following:
• Implementation of the plan
• Student achievement results by subgroup, using data from state assessments and other indicators
• Modifications to the plan as needed
• ISDs/RESAs are required by PA 25 to provide technical assistance to schools and districts in developing annual evaluations.

  6. MAKING CONNECTIONS - THE BIG PICTURE
Comprehensive Needs Assessment: Where are we now? Where do we want to go and how are we going to get there?
leads to School Improvement Plan
leads to Annual Education Report: How did we do?

  7. DO Implement Plan Monitor Plan Evaluate Plan Plan Develop School Improvement Plan

  8. Continuous School Improvement Process
Gather: Get Ready, Collect Data, Build Profile - School Data Profile (SDP), School Process Rubrics (SPR)
Study: Analyze Data - School Data Analysis, School Process Analysis; Set Goals, Set Measurable Objectives, Research Best Practice
Plan: Develop School Improvement Plan
Do: Implement Plan, Monitor Plan, Evaluate Plan
Comprehensive Needs Assessment leads to School Improvement Plan

  9. Components of Effective Implementation: Leadership, Competency, Organization - Vision, Mission, Beliefs - Student Achievement

  10. Plan, Monitor, and Evaluate a Strategy/Initiative/Program
DEVELOP ACTION PLAN - to be done prior to implementation
Question 1: What is the readiness for implementing the strategy/initiative/program?
Question 2: Do participants have the knowledge and skills to implement the strategy/initiative/program?
Question 3: Is there opportunity for high quality implementation?

  11. Questions to Reconsider Prior to Implementation
What is the readiness for implementing the strategies and/or activities?
• To what extent can stakeholders articulate, and do they believe, the research behind the decision to implement the strategy/activity? (Competency) KC I.2.B.2
• To what extent are stakeholder (staff, parent, student) concerns about the strategy/activity identified and addressed? (Leadership) KC I.1.B.3, KC IV.1.B.3
• To what extent are staff able to integrate this strategy/activity with other existing initiatives? (Competency)

  12. Questions to Reconsider Prior to Implementation
Do participants have the knowledge and skills to implement the strategy/activity?
• To what extent do participants share a vision of how practice will change as a result of the strategy/activity? (Leadership) KC II.2.B.1
• To what extent do administrators demonstrate the knowledge and skills to assess the effectiveness of the strategy/activity? (Leadership) KC II.1.A.1
• To what extent are opportunities sufficient for staff to learn the knowledge/skills/non-negotiable elements identified as essential to the strategy/activity? (Organization/Competency) KC III.2.C.1, KC III.2.C.2; AdvancED Indicator 3.11
• To what extent are staff able to apply the acquired knowledge and skills? (Competency) KC III.2.B.2

  13. Questions to Reconsider Prior to Implementation
Is there opportunity for implementation?
• To what extent is administrative support sufficient to get the results you intend? (Organization) KC II.1.A.1-7, KC II.1.B.1-5; AdvancED Indicator 3.4
• To what extent are the financial resources and allocated time sufficient to get the results you intend? (Organization) KC II.3.A.2-4; AdvancED Indicator 4.1
• To what extent is staff collaborating to support the program? (Organization) KC III.2.A.1; AdvancED Indicator 3.5
• To what extent are structures in place to collect and review formative implementation data? (Organization) KC II.1.A.2, KC V.1.A.4; AdvancED Indicators 3.2, 5.1

  14. Ultimately... To what extent are stakeholders committed with both hearts and minds to the strategy/activity/non-negotiable elements? KC I.2.B.2, KC II.2.B.3

  15. Implementation is a Process! Monitor Implementation and Evaluate Implementation (Adult Focused); Monitor Impact and Evaluate Impact (Student Focused)

  16. Monitoring and Evaluating Implementation (Adult Focused) and Impact (Student Focused)
MONITOR ADULT IMPLEMENTATION AND IMPACT ON STUDENT ACHIEVEMENT (Formative) - IS IT WORKING?
• Implementation (Adult Focused): Are strategies and activities being implemented as intended with fidelity? Are we collecting & using student and adult data to modify & adjust ongoing implementation?
• Impact (Student Focused): Is what we are doing working? Are we showing evidence of student growth? What interim adjustments are suggested by implementation data? How might these adjustments affect the integrity of the results?
EVALUATE ADULT IMPLEMENTATION AND IMPACT ON STUDENT ACHIEVEMENT (Summative) - DID IT WORK?
• Implementation (Adult Focused): Was the program implemented as intended? Did we implement the plan/strategies correctly & consistently? Did we give it enough time? Resources?
• Impact (Student Focused): Did our strategies result in increased student achievement? What unintended consequences (good and bad) have occurred? Should the strategy/activity be continued? Discontinued? Modified?

  17. DO Implement Plan Monitor Plan Evaluate Plan Plan Develop School Improvement Plan

  18. Continuous School Improvement Process
Gather: Get Ready, Collect Data, Build Profile - School Data Profile (SDP), School Process Rubrics (SPR)
Study: Analyze Data - School Data Analysis, School Process Analysis; Set Goals, Set Measurable Objectives, Research Best Practice
Plan: Develop School Improvement Plan
Do: Implement Plan, Monitor Plan, Evaluate Plan
Comprehensive Needs Assessment leads to School Improvement Plan

  19. Implementation is a Process! Monitor Implementation and Evaluate Implementation (Adult Focused); Monitor Impact and Evaluate Impact (Student Focused)

  20. Leadership and Learning Center 2010

  21. Plan, Monitor, and Evaluate a Strategy/Initiative/Program
MONITOR - to be used on an ongoing basis during implementation
Question 4: Is the strategy/initiative/program implemented as intended?
Question 5: What is the impact on students?

  22. Monitoring Implementation (Adult Focused) and Impact (Student Focused)
MONITOR ADULT IMPLEMENTATION AND IMPACT ON STUDENT ACHIEVEMENT (Formative) - IS IT WORKING?
• Implementation (Adult Focused): Are strategies and activities being implemented as intended with fidelity? Are we collecting & using student and adult data to modify & adjust ongoing implementation?
• Impact (Student Focused): Is what we are doing working? Are we showing evidence of student growth? What interim adjustments are suggested by implementation data? How might these adjustments affect the integrity of the results?

  23. Activities

  24. Monitoring Implementation and Impact
• Are you meeting on a regular basis to monitor implementation of your plan?
• What did the data say when you monitored implementation?
• What evidence have you collected to determine whether adults are implementing with fidelity?
• What evidence have you collected to determine the impact of implementation?
• What adjustments are suggested by implementation and impact data? How might these adjustments affect the integrity of results?
• How will you communicate progress with stakeholders?

  25. Checking for Understanding What are your questions about monitoring implementation and impact?

  26. DO Implement Plan Monitor Plan Evaluate Plan Plan Develop School Improvement Plan

  27. Continuous School Improvement Process
Gather: Get Ready, Collect Data, Build Profile - School Data Profile (SDP), School Process Rubrics (SPR)
Study: Analyze Data - School Data Analysis, School Process Analysis; Set Goals, Set Measurable Objectives, Research Best Practice
Plan: Develop School Improvement Plan
Do: Implement Plan, Monitor Plan, Evaluate Plan
Comprehensive Needs Assessment leads to School Improvement Plan

  28. Implementation is a Process! Monitor Implementation and Evaluate Implementation (Adult Focused); Monitor Impact and Evaluate Impact (Student Focused)

  29. Plan, Monitor, and Evaluate a Strategy/Initiative/Program
EVALUATE – to be used at the end of an implementation cycle
Question 5: What was the strategy/initiative/program’s impact on students?
If objectives were met:
Conclusion: Should the strategy/initiative/program be continued or institutionalized?
If objectives were not met:
Question 1: What was the readiness for implementing the strategy/initiative/program?
Question 2: Did participants have the knowledge and skills to implement the plan?
Question 3: Was there opportunity for high quality implementation?
Question 4: Was the strategy/initiative/program implemented as intended?
Conclusion: Should the strategy/initiative/program be adjusted or discontinued?

  30. Monitoring and Evaluating Implementation (Adult Focused) and Impact (Student Focused)
MONITOR ADULT IMPLEMENTATION AND IMPACT ON STUDENT ACHIEVEMENT (Formative) - IS IT WORKING?
• Implementation (Adult Focused): Are strategies and activities being implemented as intended with fidelity? Are we collecting & using student and adult data to modify & adjust ongoing implementation?
• Impact (Student Focused): Is what we are doing working? Are we showing evidence of student growth? What interim adjustments are suggested by implementation data? How might these adjustments affect the integrity of the results?
EVALUATE ADULT IMPLEMENTATION AND IMPACT ON STUDENT ACHIEVEMENT (Summative) - DID IT WORK?
• Implementation (Adult Focused): Was the program implemented as intended? Did we implement the plan/strategies correctly & consistently? Did we give it enough time? Resources?
• Impact (Student Focused): Did our strategies result in increased student achievement? What unintended consequences (good and bad) have occurred? Should the strategy/activity be continued? Discontinued? Modified?

  31. Evaluating Implementation and Impact
IMPLEMENTATION
• What evidence do you have that the strategy/activities were implemented with fidelity? KC II.2.B.4; AdvancED Indicator 3.6
• What evidence do you have that implementation adhered to strategies, timelines, and responsibilities? KC II.2.B.4; AdvancED Indicator 3.6
IMPACT
• What impact has the strategy/activity had on students, and what is your evidence? KC I.3.B.3; AdvancED Indicators 3.2, 5.2
• What impact has the strategy/activity had on your subgroups, and what is the evidence? KC I.3.B.3; AdvancED Indicators 3.2, 5.2

  32. Evaluate Plan
• The process is cyclical, and evaluation data should inform the next cycle of planning.
• The ultimate goal is for improvement strategies to take hold and become so internalized that they are part of the school culture.

  33. Evaluate Plan - Conclusion • Given your evidence, what adjustments are needed to your School Improvement Plan, if any? • What do we continue to do? • What do we stop doing? • What do we need to “tweak”?

  34. Evaluate Plan - Digging Deeper • To what extent was this the right strategy/activity to address your need? • What is needed to maintain momentum and accelerate achievement gains? • Are the benefits of the strategy/activity sufficient to justify the resources it requires? • How might these results inform the School Improvement Plan?

  35. It is also critical that the School Improvement Team structure opportunities to celebrate successes, no matter how small. Celebrating successes reinforces valued performance and reminds the school community that, however challenging, school improvement results in improved academic performance. One Voice – One Plan

  36. Checking for Understanding What are your questions about evaluating implementation and impact?

  37. One Voice – One Plan However noble, sophisticated, or enlightened proposals for change and improvement might be, they come to nothing if teachers don’t adopt them in their own classrooms and if they don’t translate them into effective classroom practices.

  38. Continuous School Improvement Process
Gather: Get Ready, Collect School Data, Build School Profile
Study: Analyze Data, Set Goals, Set Measurable Objectives, Research Best Practice
Plan: Develop School Improvement Plan
Do: Implement Plan, Monitor Plan, Evaluate Plan
Student Achievement (center of the cycle)

  39. Do Vocabulary: Implementation, Monitor, Evaluation, Fidelity, Impact

  40. Guided Conversations: DO
WHY: Why is it important to strategically implement, monitor, and evaluate the School Improvement Plan?
HOW: How will we communicate the plan to all stakeholders so that they clearly understand and own their roles in implementation? How do we build ownership of the strategy and the plan? How will we ensure implementation with fidelity? How is the implementation of your plan monitored and evaluated? How is the impact of your plan monitored and evaluated?
WHAT: What will your school look like when this plan is implemented with fidelity? What is the expected impact on student achievement?

  41. Questions/Comments? Please contact:
• Renie Araoz (Araozr@michigan.gov)
• Diane Fleming (FlemingD6@michigan.gov)
• Diane Joslin-Gould (Joslin-Gouldd@michigan.gov)
Or visit the MDE - School Improvement website.

  42. These training materials and resources were developed in collaboration with the following individuals and organizations. We deeply appreciate their time and support.
Renie Araoz - MDE/AdvancED Michigan; Deb Asano - Marquette-Alger RESA; Lisa Bannon - Wexford-Missaukee ISD; Ben Boerkoel - Kent ISD; Judy Bonne - Wayne RESA; Leah Breen - MDE; Elizabeth Brophy - Calhoun ISD; Betty Burke-Coduti - Marquette-Alger ISD (Retired); Henry Cade - MDE; Mark Coscarella - MDE; Patti Dobias - MAISA; Sharon Dodson - Kalamazoo RESA; Deb Dunbar - Bay-Arenac ISD; Scott Felkey - Oakland Schools; Diane Fleming - MDE; Linda Forward - MDE; Gayle Greene - Macomb ISD (Retired); Lisa Guzzardo Asaro - Macomb ISD; Donna Hamilton - MDE; Carrie Haubenstricker - Tuscola ISD; Robert Higgins - MDE; Fiona Hinds - AdvancED Michigan; Diane Joslin-Gould - MDE; Linda Kent - MDE; Scott Koziol - Michigan Center Public Schools; Teresita Long - MDE; Margaret Madigan - MDE; Yvonne Mayfield - MDE; Kathleen Miller - Shiawassee RESD; Al Monetta - AdvancED Michigan (Retired); Cheryl Oczepek - MDE; Carolyn Rakotz - Wayne RESA; Dodie Raycraft - St Joseph County ISD; Karen Ruple - MDE; Jennifer Sabsook - Charlevoix-Emmet ISD; Kathy Sergeant - AdvancED Michigan; Consultants: Beth Steenwyck, Betty Underwood, Jan Urban-Lurain
