Ohio Teacher Evaluation System A Model For Change William A. Bussey Superintendent Mid-East Career and Technology Centers
Ohio TIF Requirement 2012-2013 OTES Implementation • Implementing OTES through the Ohio Appalachian Collaborative (OAC) and Teacher Incentive Funds (TIF) • All teachers will be evaluated using the OTES Model Teacher Performance Standards • All administrators are credentialed evaluators • All teachers will develop and monitor one Student Learning Objective
TODAY’S FOCUS • Review the processes, results, and challenges with both the performance rating and student growth measures • Reflect on the good, the bad, and the ugly • Review any changes as we transition to the "authentic" evaluation tool for teachers this coming school year. I have brought the experts with me! • Barb Funk • Dan Coffman • Scott Sabino • Michelle Patrick
OTES PD Implementation • OTES Team created in the spring of 2012 • Executive Director • Building Directors • New Asst. Director (role: evaluation) • Asst. Director (role: curriculum and OAC/TIF) • Teachers
OTES Leadership Training • Introduced the OTES framework to all leadership teams: District Transformation Team, Strategic Compensation Team, OTES Team, Formative Instructional Practices Team, Assessment/Literacy Tech Team, HSTW Site Coordinators • Train key people within the district • Success starts with a solid foundation • Teacher-led PD would increase staff buy-in
Year-long Professional Development Focus • Ensure teachers understood: • Framework of the new evaluation system • New components • New tools • New process • Process would increase teacher/admin collaboration time
Staff PD • Initial OTES Staff training • Teacher Performance – OTES Model • 7 Teaching Standards • Placed in the performance rubric • Teacher Evidence • Calibration of a standard to a teaching video • Completed self-assessment • Student Growth – Student Learning Objectives
Staff PD cont. • November PD • Mini-sessions on Formative Instructional Practices; Literacy with Common Core Anchor Standards in Reading and Writing; BFK and Value-Added; Teacher Evidence • February PD • State training manual on Student Growth Measures • Quarterly meetings used as checkpoints • Conversations with the evaluator
The Process - Learning It Together • Pre-Pre Observation Conference • Met with teachers (last two weeks of October) to review the process timeline and elements (paperwork and documentation) and to answer questions • Follow-up to In-service Day • Self-Assessment Tool (optional in the process, but discovered to be a MUST DO!) • Professional Growth Plan (see example) • Teacher Performance Goal (Goal 2): influenced by the Self-Assessment Tool and the Ohio Teaching Standards • Self-directed • Statement of the goal and how it will be measured • Student Achievement Goal (Goal 1): influenced by the SLO (use the standard from the pre-assessment used to develop the SLO) • Specific standard and how it will be measured
The Process - Learning It Together • Teacher Performance Evaluation Rubric (TPER) (see example) • Evaluation begins with Proficient ("rock-solid teaching") • Video examples: see NIET Training Modules • Examine key words and phrases embedded in the rubric at each level (Proficient, Accomplished, and Developing) • Pre-Conference • Teachers complete/submit responses using ohiotpes.com • Face-to-face • Observation • 30 minutes • Script the lesson!
The Process - Learning It Together • Post-Conference • Follows each observation • Specific questions relating to the lesson and the instructor • Relate to the TPER • Area for Reinforcement/Refinement • Classroom Walkthroughs (2-5 per teacher) • Shared with the teacher • Opportunity for feedback • Used a paper form and ohiotpes.com • The more often, the better! Set the schedule early and put it on the teachers; it's the teachers' evaluation and their responsibility to provide evidence/documentation relating to the TPER.
The Process - Learning It Together • Round 1: Pre-Conference → Observation → Walkthrough(s) → Post-Conference • Round 2: Pre-Conference → Observation → Walkthrough(s) → Post-Conference • Summative Performance Rating Conference
Considerations for Building Strong, Reliable Measures of Student Growth • Use measures of student growth effectively in a high-quality evaluation system • Make informed decisions on the right measures • Make informed decisions about the appropriate weight of each measure • Increase the reliability and validity of selected measures
Student Learning Objectives • Teachers completed the development, implementation, and scoring process • SLO timeline with specific due dates; calendar with expectations • Teachers created their SLO and chose their growth target • Implemented the SLO • Calculated the results • Three main types of targets used • Whole-group targets • Tiered/grouped targets • Individual targets
TYPES OF TARGETS (a sketch of all three appears below) • Whole-group target: one target for all students in the SLO • e.g., all students will score 75% or better on the post-assessment • Tiered/grouped target: a range of targets for groups of students • e.g., students with pre-assessment scores between 0-25 would be expected to score between 25-50 on the post-assessment • Individual target: each student in the SLO receives a target score • e.g., using a formula such as (100 - pretest) / 2 + pretest = growth target
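To make the target arithmetic concrete, here is a minimal Python sketch of the three target types. The 75% threshold, the band cut-offs beyond the first, the sample roster, and the function names are illustrative assumptions, not part of the OTES model.

```python
# A minimal sketch of the three SLO target types described above.
# Thresholds, bands, and names are illustrative, not part of OTES.

def whole_group_target(post, threshold=75):
    """Whole-group: every student must hit the same post-assessment score."""
    return post >= threshold

def tiered_target(pre, post):
    """Tiered/grouped: a minimum expected post-assessment score per
    pre-assessment band. Only the first band (0-25 expected to reach
    25-50) comes from the slide; the other bands are hypothetical."""
    bands = [(0, 25, 25), (26, 50, 50), (51, 75, 70), (76, 100, 85)]
    for low, high, expected in bands:
        if low <= pre <= high:
            return post >= expected
    return False  # pre-assessment score falls outside all defined bands

def individual_target(pre, post):
    """Individual: close half the gap to 100, per the slide's formula:
    (100 - pretest) / 2 + pretest = growth target."""
    return post >= (100 - pre) / 2 + pre

# Example: score a roster of (pre, post) pairs against individual targets.
roster = [(20, 65), (60, 82), (90, 93)]
met = sum(individual_target(pre, post) for pre, post in roster)
print(f"{met} of {len(roster)} students met their individual growth target")
```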
Class of 2013 Results • Fall 2011 • 469 students were tested • 81% (379) of students earned a Bronze or better • Intervention provided through KeyTrain Online • Spring 2012 • 90 students were tested • 71% (64) of students earned a Bronze or better
NCRC District Results - 2013 • 2011-2012 (Level I): 40% Bronze, 50% Silver, 5% Gold, 5% Not Yet • 2012-2013 (Level II): 27% Bronze, 64% Silver, 7% Gold, 2% Not Yet
Occupational Profiles 70% Met or Exceeded Occupational Profile!
Class of 2014 Results • Fall 2012 • 467 students were tested • 86% of students earned a Bronze or better • Intervention provided through KeyTrain Online • Spring 2013 • 60 students were tested • 55% of students earned a Bronze or better
NCRC District Results - 2014 • 2012-2013 (Level I): 37% Bronze, 54% Silver, 4% Gold, 5% Not Yet • 2013-2014 (Level II)
Key Takeaways: Data Analysis and Assessment Literacy • Data analysis and setting growth targets • Data-driven decisions: what data? • Engage in transparent conversations around student growth • Outliers, class size, averages, ranges • Identify trends, patterns, and expectations for mastery • Gather other available data • Zoom in and identify similarities and differences in students (see the sketch below)
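As one way to ground those conversations in numbers, below is a minimal Python sketch that computes per-class averages, score ranges, and simple outlier flags. The class names, scores, and the two-standard-deviation rule are illustrative assumptions, not district data.

```python
# A minimal sketch of the data look described above: per-class averages,
# score ranges, and simple outlier flags. All values are illustrative.
from statistics import mean, stdev

scores_by_class = {
    "Welding A": [55, 62, 68, 70, 71, 74, 99],
    "Health Tech B": [40, 45, 47, 50, 52, 55, 58],
}

for name, scores in scores_by_class.items():
    avg, spread = mean(scores), stdev(scores)
    # Flag scores more than two standard deviations from the class average.
    outliers = [s for s in scores if abs(s - avg) > 2 * spread]
    print(f"{name}: n={len(scores)}, avg={avg:.1f}, "
          f"range={min(scores)}-{max(scores)}, outliers={outliers}")
```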
Build Staff Assessment Literacy • Priority standards • Appropriate assessments • Quality assessment design • Assessment blueprint reflects instruction • Instructional tasks move students to meet standards
Significance of the SLO Pilot Work • Conversations and collaboration • Greater focus on student learning • Deeper reflection on: • the teaching and learning process • accountability for growth for all students • the role student growth plays in determining educator effectiveness
District Policy - Process • Policy developed using the template provided by ODE • Kept simple to allow for flexibility • Subject to change as we negotiate our contract • Further development of SLO guidelines by the district SLO Evaluation Committee
REFLECTIONS… Let the "real" thinking take place… • What made us breathe a sigh of relief when it was over • What went well/positive elements (yes, some things were quite positive!) • Suggestions/ways to use our internal feedback and insight to feed forward
REFLECTIONS – breathing a sigh of relief because… • Roller Coaster of Emotions • WHEW that took some time! • 5 hours per teacher per observation? *gulp* • What do I give up? • Walkthroughs? • Technology • eTPES downfalls • Other evaluations? • Walkthrough data? • Support for all levels of learners
REFLECTIONS – What Went Well • Roller Coaster of Emotions • Process was overall positive • From self-assessment reflections • Consensus: "It's not so bad!" • Technology-based • Trial year! WOOHOO! • Focused purpose and common dialogue • Holistic • Rubric • Criteria, not a checklist • Collaboration • Administrators with associate school principals • Administrators with administrators • Administrators with teachers • Teachers with teachers • Utopia!
REFLECTIONS – ways to improve • Self-Assessment: everyone • Walkthrough data: a form that collects the data we need? • Experimenting with Google Forms • Use it to see trends • Non-instructional staff evaluations • OSCES, ASCA, and OTES • Input from staff • Opportunities for more alignment • for professionals to align all goals: IPDP, OTES/OPES, Resident Educator • to look for trends and align with PD • to group professionals with aligned goals as they work together to improve their practice • to align ourselves as evaluators: do we truly calibrate? Can we better align with each other by discussing our ratings and why?