Teacher Evaluation in New York State
. . . and what it means for teachers of non-tested subjects
Johanna J. Siebert, Ph.D.
NAfME Symposium on Assessment
June 24-25, 2012
NYS Initiatives
• Common Core Standards: What do we want our students to know and be able to do?
• APPR (Annual Professional Performance Review)
• Data-Driven Instruction: Ensuring high-quality instruction in every classroom. How will we know?
APPR – A 100-Point System
Annual Professional Performance Review, inspired by Race to the Top legislation
• New APPR a condition of the award
• Some portions of the evaluation process negotiated between the district and its teacher union
• Some portions state-mandated
• Evaluation process results in a teacher "HEDI" score: Highly Effective, Effective, Developing, or Ineffective
• Can lead to an expedited 3020-a process for teacher termination
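Because the slides describe the composite as a 60-point teacher-practice category plus two 20-point student-measure subcomponents (the "first" and "other" 20% discussed later), a minimal sketch of how a district data office might tally the 100-point total follows. The cut scores in the sketch are invented placeholders, not the State-prescribed HEDI bands.

```python
# Minimal sketch of the 100-point APPR composite described in these slides:
#   60 points: observations / other measures of teacher practice
#   20 points: Student Learning Objective (growth) subcomponent
#   20 points: locally selected achievement measure
# The cut scores below are illustrative placeholders, NOT official NYSED bands.

def appr_composite(practice_60: float, slo_growth_20: float, local_20: float) -> dict:
    """Combine the three negotiated subcomponents into a 0-100 composite."""
    assert 0 <= practice_60 <= 60 and 0 <= slo_growth_20 <= 20 and 0 <= local_20 <= 20
    composite = practice_60 + slo_growth_20 + local_20

    # Placeholder HEDI bands for illustration only.
    if composite >= 91:
        rating = "Highly Effective"
    elif composite >= 75:
        rating = "Effective"
    elif composite >= 65:
        rating = "Developing"
    else:
        rating = "Ineffective"
    return {"composite": composite, "rating": rating}

print(appr_composite(practice_60=52, slo_growth_20=15, local_20=14))
# -> {'composite': 81, 'rating': 'Effective'}
```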
District-wide Decision Making
In NY, districts can make individual decisions regarding:
• Specific supervision model to be used
• Priorities and academic need
• Which subjects/teachers will use state-provided ELA/Math scores and which will have SLOs
• In-house processes for SLO assessment, scoring, and implementation
Other states: similar conditions, but the entire state interprets them uniformly.
Step 1
Select a teacher practice rubric from the State-approved list, or apply for a variance:
• Danielson's Framework for Teaching
• Marzano's Causal Teacher Evaluation Model
• NYSTCE Framework for the Observation of Effective Teaching
• NYSUT Teacher Practice Rubric
Collective bargaining considerations apply.
Step 2
Agree on the definition of "classroom observation" and any additional measures in the 60-point category (40 points must come from multiple observations).
Choose one or more of the following other measures of teacher practice:
• A portfolio or evidence binder (student work or teacher artifacts)
• Feedback from students, parents, and/or other teachers using a survey
• Professional growth goals using self-reflection (maximum of 5 points)
We negotiated . . .
Observation = 2 learning walks (15-minute informal walk-throughs with a follow-up conversation) OR a formal class-length observation.
Multiple "observations" are needed (2); this could be:
• 2 class-length observations
• 1 class-length observation + 2 learning walks
• 4 learning walks
And . . .
• A portfolio or evidence binder (student work and/or teacher artifacts)
• Professional growth goals using self-reflection (Professional Learning Plan, PLP)
WCSD selects 9 components from the 4 Domains; teachers select an additional 5 components.
Who evaluates whom?
"The governing body of each school district and BOCES is responsible for ensuring that evaluators have appropriate training—including training on the application and use of the rubrics—before conducting an evaluation. The governing body is also responsible for certifying a lead evaluator as qualified before that lead evaluator conducts or completes a teacher's or principal's evaluation."
NYS Commissioner's Regulations
Lead Evaluators
Responsible for carrying out observations and summative evaluations.
Must be trained and calibrated across each school district in the selected model:
• Knowledge of the model
• Walk-through and observation protocols
• Evidence-based reports
• Use and knowledge of specific rubrics
• Forms and feedback for teachers
• Professional Learning Plans
How do districts calibrate supervision?
Districts design a plan for:
• Training for all evaluators
• Certification for lead evaluators
• Role clarification
• Subcomponent and overall scoring
• Improvement plans
• Knowledge of appeals procedures (e.g., the NYSED model appeals procedure in guidance)
Teachers
Develop a Professional Learning Plan (PLP) with attention to multiple professional areas (4 Domains):
• Preparation
• Classroom Environment
• Instruction
• Reflection and Professional Responsibilities
Student-centered aspects:
• Individual SLOs/student growth
• Student achievement
• Data-driven instruction
Select an observation protocol:
• Traditional observation (class length)
• Walk-through/Learning Walk
Proposed Calendar of Supervision
More frequent interactions between the teacher and the supervising administrator:
• Mid-October – describe and set the PLP and content-area SLO(s)
• November-December – observation and follow-up
• January – midterm check-in on PLP and SLO progress
• February-March – observation and follow-up
• May-June – summative evaluation conference
Summer Work
Districts are choosing specific models and scheduling and implementing administrator training.
Administrators and teachers are at various stages:
• Learning new protocols
• Scheduling workshops
• Goal-setting
• Setting district calendars
To begin in September 2012.
The First 20%
Student Learning Objectives
What IS an SLO?
A student learning objective:
• Is an academic goal for a teacher's students that is set at the start of a course
• Represents the most important learning for the year (or semester, where applicable)
• Is specific and measurable, based upon available prior student learning data
• Is aligned with the Common Core AND state or national standards, as well as any other school and district priorities
• Represents growth from the beginning to the end of the course
Teachers' scores are based upon the degree to which their goals are attained.
More on SLOs
• Common assessments are needed to measure individual growth across grade levels and content areas
• 50% rule, applied to total student load
• Teacher sets individual growth targets per student
• Cross-scoring of summative assessments is needed to ensure equity in HEDI scoring (inter-rater reliability)
Who Needs an SLO, and How Many?
• Any teacher who does not use a state growth measure (ELA/Math assessments, grades 4-8)
• "Non-tested" subjects (70%)
• SLOs must cover 50%+ of the teacher's student load
• Full-credit courses carry more weight than part-credit or semester courses
• A teacher will likely have multiple SLOs
• The teacher tracks and monitors the progress of each student in SLO classes to impact growth
HOW do you design the SLO?
For growth, start with EVIDENCE.
The teacher sets each student's baseline using:
• Historical data (e.g., prior year's grades)
• Pre-assessment performance
The teacher then predicts individual student growth in his/her course:
• Sets individual growth targets for students
• Administers a post-assessment at the end of the course (can be a state assessment)
• Data analysis yields the students' success rate and the teacher's score on this section (see the sketch below)
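To make the growth calculation concrete, here is a minimal sketch of the arithmetic the slide describes: compare each student's post-assessment score to the individual target set from baseline evidence, then compute the percentage of students who met their targets. The sample data, the target-setting rule, and all names are illustrative assumptions, not a prescribed NYSED method.

```python
# Illustrative sketch of the SLO growth arithmetic described above.
# Baselines, targets, and scores are hypothetical; districts define their own
# target-setting rules and assessments.

students = [
    # (name, baseline from pre-assessment/history, post-assessment score)
    ("Student A", 55, 78),
    ("Student B", 70, 86),
    ("Student C", 40, 52),
    ("Student D", 85, 100),
]

def growth_target(baseline: int) -> int:
    """Hypothetical rule: each student should grow by at least 15 points
    (capped at 100). A real SLO sets targets student by student."""
    return min(baseline + 15, 100)

met = sum(1 for _, baseline, post in students if post >= growth_target(baseline))
success_rate = 100 * met / len(students)

print(f"{met}/{len(students)} students met their growth targets "
      f"({success_rate:.0f}%)")
# This percentage is what the district then maps onto the teacher's
# HEDI scoring range for the SLO subcomponent.
```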
New York State Assessments
There are:
• NO state assessments in the arts
• NO common opportunity-to-learn standards
Regional BOCES are sponsoring writing sessions to design SLOs and assessments in the arts.
Local districts design and implement their own.
Some districts . . .
. . . are using ELA and/or Math state test scores IN PLACE OF assessment data in non-tested subjects (the district-based SLO model), due to:
• Lack of common assessments
• Lack of inter-rater reliability
• Lack of content oversight by content specialists
• Lack of an effective data system for monitoring and tabulating results
• The ambitious timeline for implementation
SLOs must include . . .
• NYS Learning Objective per grade selected
• Specific population/grade level
• Learning content
• Interval of instructional time (usually a full year)
• Evidence to be used/collected (three forms): historical, pre-assessment, post-assessment
• Individual students' baselines
continued . . .
• Individual student targets (set by the teacher)
• Teacher goal set
• Teacher scoring range, by HEDI ratings
• Rationale for the SLO and targets
Eventually:
• Final individual student growth scores
• % of students meeting their individual targets
• Student % aligned with a specific scoring band for the HEDI rating (see the sketch below)
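As a companion to the growth sketch above, this is how the final step might look once the scoring range is negotiated: the percentage of students meeting their targets is looked up in the teacher's HEDI scoring bands. The band boundaries here are invented placeholders; actual ranges come from each district's SLO plan.

```python
# Illustrative mapping of "% of students meeting targets" to a HEDI band.
# The band boundaries below are placeholders, not taken from these slides
# or from NYSED guidance.

SCORING_BANDS = [
    # (minimum % of students meeting targets, HEDI rating)
    (90, "Highly Effective"),
    (75, "Effective"),
    (60, "Developing"),
    (0,  "Ineffective"),
]

def hedi_rating(percent_meeting_targets: float) -> str:
    """Return the HEDI band for the SLO subcomponent."""
    for minimum, rating in SCORING_BANDS:
        if percent_meeting_targets >= minimum:
            return rating
    return "Ineffective"

print(hedi_rating(75))   # Effective
print(hedi_rating(58))   # Ineffective
```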
The Other 20%
Local Achievement Assessment
Achievement Measures – the Last 20%
• Must be common across districts for grade level and content areas
• Should represent a summative measure of the course
• Not to be applied to the SLO course(s)
• Teacher sets a target for students
• Can NOT be scored by the teacher of record
What is the difference . . .
. . . between SLO and Achievement measures?
• SLO: involves setting targets for students based upon previous performance data, i.e., measuring students' growth; applied to 50% of the student load.
• Achievement: does not measure "growth" over the length of the course, but the teacher needs to set a group target; applied to one other course.
Still to come
Addition of a Value-Added Growth Model
Inclusion of other data in targeting growth:
• Demographic
• Graduation
• Attendance
Planned for the 2013-2014 school year.
Release of Teacher Evaluation Information
Plans to release individual teacher evaluation ratings (HEDI) to the public:
• Highly Effective
• Effective
• Developing
• Ineffective
"Teacher evaluations can be viewed as the equivalent of a Carfax report, empowering parents to attempt to avoid the 'lemons.'"
B. Jason Brooks, Foundation for Education Reform and Accountability
Timeline
June
• Determine next year's SLO courses and populations
• Design pre- and summative assessments
July and August
• Summer workshops and planning with like-SLO teachers
• Calibrate scorers
• Design post-assessments and local common measures
• Administrators' training in APPR forms and protocols
September
• Meet students; get historical achievement data
• Administer and grade pre-assessments
• Set goal targets for students and self
• Meet with administration to review goals, etc.
October
• Set SLOs and student targets
• Start applying strategies to gain student growth!