Investigating Implementation of Ohio's Teacher and Principal Evaluation Systems
Jill Lindsey, PhD; Suzanne Franco, EdD; Ted Zigler, EdD
Wright State University, Dayton, OH
Research funded by the OERC
Making Research Work for Education
www.oerc.osu.edu
LEA Implementation of OTES & OPES
• Sample LEAs
• Implementation variations
• SGM weights
• Positive feedback
• Unintended consequences
• Next-step questions
Sample of 12 LEAs Representing:
• All RttT regions
• RttT LEAs and one non-RttT LEA
• Range of LEA typology (revised in 2013)
Key Findings
• LEAs not yet fully implementing
• Positive feedback and an "in this together" approach
• Time demands are unrealistic
• Unintended negative consequences
• Many misunderstandings and questions
• Unfairness concerns impacting culture
• Data experts, resources, and examples are needed
OTES Implementation
• One LEA implementing OTES without SGM for all teachers in 2012-13 for performance pay
• One LEA piloting OTES with SGM for all teachers in 2012-13
• Seven LEAs piloting OTES without SGM for a few teachers in 2012-13
• Three LEAs not piloting or implementing OTES in 2012-13
OPES Implementation
• Five LEAs implementing the OPES performance rubric with all principals in 2012-13
• One LEA implementing selected standards from the OPES performance rubric with all principals in 2012-13
• Six LEAs not piloting or implementing OPES in 2012-13
Positive Feedback from LEAs
• Goal-setting process is useful
• Opportunities for discussions about growth are helpful
• OTES/OPES will help focus professional development on target areas
• The rubrics are well received and represent the work teachers and administrators do
• Inclusion of the SGM component for both teachers and principals creates a "we are in this together" approach
• Shared attribution reflects LEA philosophy
Unintended Consequences
• Implementation takes time away from students
• Not enough school days to observe every teacher
• Low VAM scores can be demoralizing
• Requires overemphasis on self and on documentation
• Culture of collaboration is undermined
• Change negatively impacts successful schools
Misunderstandings & Questions
• Value-added
• Vendor assessments
• SLOs (Student Learning Objectives)
• Shared attribution (building VAM, district VAM, building performance index, and district performance index)
Variations in Weighting Decisions
• Five of the twelve LEAs had SGM weights under consideration
• Three of those five LEAs were using shared attribution
• Six of the twelve LEAs were developing SLOs
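The stakes of these weighting decisions can be illustrated with a small hypothetical sketch. This is not the actual ODE lookup table or the OTES formula; the function, the 1-4 scale, and the example weights are all assumptions made purely to show how an SGM weight above 50% can dominate a final rating:

```python
# Hypothetical illustration only: a simple weighted combination of a
# performance-rubric score and an SGM score. The real OTES final rating
# uses an ODE lookup table, not this formula.

def final_rating(performance: float, sgm: float, sgm_weight: float) -> float:
    """Combine a rubric score and an SGM score (both on an assumed 1-4 scale).

    sgm_weight is the SGM's share of the final rating, between 0 and 1.
    """
    return sgm_weight * sgm + (1 - sgm_weight) * performance

# A teacher rated highly on the rubric (4) but with a low SGM (1):
print(final_rating(4, 1, 0.5))  # 2.5 with an even split
print(final_rating(4, 1, 0.6))  # about 2.2 once SGM exceeds 50%
```

Under these assumptions, each increase in the SGM weight pulls the final rating further toward the growth measure, which is the concern raised about a lookup table that weights SGM at more than 50%.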
Unfairness Concerns Negatively Impacting Culture
• Validity of SGM is questioned
• Differences in required documentation
• Problematic that SLOs, VAM, and vendor assessments are used as interchangeable growth measures
• ODE lookup table for final ratings weights SGM at more than 50%
• Labels and color codes send negative messages
• Tremendous stress; need time "to get things right"
Summary of Findings
• LEAs piloting but not fully implementing
• Positive feedback and an "in this together" approach
• Time demands are unrealistic
• Unintended negative consequences
• Many misunderstandings and questions
• Unfairness concerns impacting culture
• Data experts, resources, and examples are needed
Next-Step Questions
• What is the impact of linkage percentages on final ratings?
• What is the impact of SGM weighting on final ratings?
• Do the OTES/OPES scales meaningfully distinguish between levels of performance?
• Given budget constraints and the length of the school year, are the legislated evaluation requirements sustainable?
• Do required teacher/principal evaluations improve student performance?
Contact Information
• Jill.Lindsey@wright.edu
• Suzanne.Franco@wright.edu
• Ted.Zigler@wright.edu