State Model Evaluation System Updates: Insights and Implementation Strategies

This session will focus on understanding the revised rubric, learning from pilot districts, and accessing resources for effective implementation. Explore concerns and benefits of the new rubric, and discuss scoring revisions and final effectiveness ratings.


Presentation Transcript


  1. Changes to the State Model Evaluation System • Regional Meetings, March 2018

  2. Session Outcomes • Deepen understanding of the revised rubric and scoring • Apply lessons learned from pilot districts to your district’s roll-out of the revised rubric • Learn about and access available resources  • Network with your colleagues in other districts • Connect with your regional specialist for evaluation system implementation support

  3. Getting to Know You • I want to know more about… • I’ve heard that… • I have questions about…

  4. Purpose for Current Rubric Revision • Concerns around rubric • Length • Feasibility for facilitating feedback and growth conversations • Concerns around professional practices: • Some ambiguous, vague language • Redundancy within and across elements/standards • Concerns around lack of consistency and inter-rater agreement

  5. Three Years of Data Collection Revealed… • Patterns of “checking off the box” within certain elements (especially elements with lists of professional practices) • Patterns of predictability: if evaluators checked off one practice, it was almost a given that other practices would be checked off in rote fashion • Examples of potentially inflated element ratings • Lack of distinction between ratings of specific professional practices

  6. Technical Working Group Process • Reviewed evaluation systems across many states to identify clear, concise language that could remedy ambiguities and content that could enhance the measurability of elements and standards • Reviewed each standard, element, and professional practice to add, edit, and revise language, with the explicit goal of eliminating redundancies and letting go of practices that did not appear to contribute to differentiation between levels of practice

  7. Benefits of the New Rubric • Teachers at all grade levels use the same rubric across the district • Increased rigor and relevance • Clear distinction and scaffolding between the five levels of professional practice • Fewer professional practices allow for focused feedback conversations and professional growth • Language is clear, concise, comprehensive, and accurate • Scoring more accurately reflects teachers’ performance

  8. Rubric Revisions

  9. Category Labels Change

  10. Digging into the Revised Rubric • Goal: Come up with a one- to three-word phrase that describes each of the 27 elements. • Work in small groups • Using the revised rubric, look at each standard and element individually • Determine a one- to two-word label that describes each element within that standard • Share out the labels

  11. Reflection • Now that you’ve had time to explore the revised rubric, talk with your team or table: • What are your general impressions of the changes that have occurred in the State Model Evaluation System rubric? • What do you see as the biggest change in the new rubric? • What are you most concerned about? • What will the challenges be?

  12. Planning • How might an activity like this be helpful for your educators as they work through these changes? • What might you change in the activity to meet your needs? • What might you keep the same?

  13. Scoring Review and Values • Feedback from the field indicated that the scoring of the State Model Evaluation System was not aligned to authentic evaluation ratings and did not reflect districts’ values. Examples of these values include: • Ratings at the element and standard level should authentically roll up into an accurate Overall Professional Practice rating. • There should be a high bar for Accomplished and Exemplary on the professional practice side. • There should be a high bar to earn an overall effectiveness rating of Highly Effective. • The revised State Model Evaluation System reflects more rigorous scoring that is aligned to these values.
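
A minimal sketch of the roll-up logic this slide describes, assuming a five-level practice scale scored 0–4 and simple averaging. "Accomplished" and "Exemplary" come from the slide; the lower level names, the averaging scheme, and the cut points are illustrative assumptions, not the State Model's actual scoring rules:

```python
# Illustrative roll-up of element ratings into standard scores and an
# overall professional practice rating. The 0-4 scale, the averaging,
# and the cut points are assumptions for illustration only.

LEVELS = ["Basic", "Partially Proficient", "Proficient",
          "Accomplished", "Exemplary"]

def standard_score(element_ratings):
    """Average the element ratings (0-4) within one standard."""
    return sum(element_ratings) / len(element_ratings)

def overall_pp_rating(standards):
    """Roll standard scores up into an overall professional practice
    rating, keeping a deliberately high bar for the top two levels."""
    overall = sum(standard_score(r) for r in standards.values()) / len(standards)
    # Hypothetical cut points: a near-ceiling average is required for
    # Accomplished/Exemplary, reflecting the "high bar" value above.
    cuts = [(3.75, 4), (3.0, 3), (2.0, 2), (1.0, 1), (0.0, 0)]
    return next(LEVELS[level] for floor, level in cuts if overall >= floor)

# Example: element ratings grouped by standard.
ratings = {
    "Standard I":   [3, 3, 4, 3],
    "Standard II":  [2, 3, 3],
    "Standard III": [3, 4, 3, 3, 4],
}
print(overall_pp_rating(ratings))  # -> Accomplished
```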

  14. Professional Practice Scoring Revisions

  15. Professional Practice Scoring Revisions

  16. Final Effectiveness Score and Rating • The following table shows the changes in cut scores for the final effectiveness score and rating. • Professional Practices = 540 points • Measures of Student Learning = 540 points • Total Points = 1080 points
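
A minimal sketch of how the two halves might combine into a final effectiveness rating, using only the point totals shown on this slide (540 PP + 540 MSL = 1080). The cut scores in the code are hypothetical placeholders; the actual revised cut scores live in the table referenced above and in CDE's scoring tool:

```python
# Combine professional practice (PP) and measures of student learning
# (MSL) points into a final effectiveness rating. The point maxima come
# from the slide; the cut scores below are hypothetical placeholders.

PP_MAX = 540
MSL_MAX = 540
TOTAL_MAX = PP_MAX + MSL_MAX  # 1080 possible points

# Placeholder cut scores (minimum total points -> rating), ordered
# highest first. CDE publishes the real revised cut scores.
CUT_SCORES = [
    (864, "Highly Effective"),   # hypothetical high bar (80% of 1080)
    (540, "Effective"),
    (270, "Partially Effective"),
    (0,   "Ineffective"),
]

def final_rating(pp_points, msl_points):
    """Map combined PP + MSL points to a final effectiveness rating."""
    if not (0 <= pp_points <= PP_MAX and 0 <= msl_points <= MSL_MAX):
        raise ValueError("points out of range")
    total = pp_points + msl_points
    return total, next(r for floor, r in CUT_SCORES if total >= floor)

print(final_rating(460, 430))  # -> (890, 'Highly Effective')
```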

  17. Review of the Scoring Changes at the Overall Level (PP + MSL) • Old System vs. New System

  18. Calculating Final Effectiveness Ratings • https://www.cde.state.co.us/educatoreffectiveness/rubricrevision • Path to this tool: CDE/teaching and learning/state model evaluation system/state model evaluation system pilot

  19. Lessons Learned • Thank you to our early adopters! • Our biggest learning was… • Our biggest challenge was… • Advice I would give to districts starting to implement the Revised State Model System… • I used my regional specialist to…

  20. Now What? • Suggested Timelines • Resources: • https://www.cde.state.co.us/educatoreffectiveness/rubricrevision • https://www.cde.state.co.us/educatoreffectiveness/revisedrubric-pilot • https://www.cde.state.co.us/educatoreffectiveness/ee-regions • https://www.cde.state.co.us/educatoreffectiveness/pilotsystemschangeoverview

  21. Reflection • Now that you’ve learned about changes in the State Model Evaluation System, talk with your team or table: • What are your general impressions of the changes that have occurred in the State Model Evaluation System? • Where do you see the greatest opportunity for growth using this tool? • What challenges do you foresee that we can help you address? • What are your next steps for your district?

  22. Networking with Colleagues

  23. The Role of the Educator Effectiveness Regional Specialists • Support for continued development of quality evaluation systems • Collaborate on the use of existing data • Enhance the use of evaluation data • Support to enhance, refine, and develop district/BOCES evaluation components • Make connections to share efficient and effective best practices from across the state • Collaborate to improve educator effectiveness and student growth • Support in building authentic MSLs • Support for all workings of COPMS/RANDA

  24. A New Way to Work

  25. Contact Us http://www.cde.state.co.us/educatoreffectiveness/contactus
