Educator Evaluations in Michigan Carla Howe Educational Evaluation Consultant Office of Evaluation, Strategic Research, & Accountability Michigan Department of Education
Evaluation is a Collaborative Process Research tells us that evaluation systems work best when evaluation is not something we do to people, but rather when it is a shared process meant to provide feedback for learning and growth.
Positive Outcomes How can this new legislation and change be a positive for educators and students? • Will provide measurement of progress toward goals. • Allows teachers and administrators to focus on best practice and continuous improvement. • Helps reaffirm the profession by having a system in place to recognize excellence. • Helped secure Michigan’s approval of ESEA Flexibility which waived many of the unattainable targets within No Child Left Behind.
Principal/Assistant Principal Training Grants – 2012-13 SY • $1.75 million allocated for Principal and Asst Principal Training for Conducting Educator Evaluations • Applications for Training Programs were accepted in August • Final Review of Programs is complete • MEGS+ opened in November for grant applications by district; closes December 7 • Awards in January/Funds through School Aid
Overview of Current Plan and Issues • November release date for the aggregate effectiveness labels by school (number of teachers reported as highly effective, effective, minimally effective, ineffective) • Key messages: • This is the FIRST YEAR; 800+ different systems (we have data to show this) • Districts did MASSIVE amounts of work to accomplish this • We do not believe that huge numbers of MI teachers are ineffective • Concerns • Issues with data privacy • Contextualizing the labels
Current Circumstances Our current legislation has allowed for local systems of evaluation, which has given districts flexibility to design systems that work best for them. • Over 800 systems across the state • Varying degrees of implementation across the state Public reporting of effectiveness labels is required by SFSF • Public release in November via mischooldata.org • Teacher labels reported in aggregate by school (number of teachers in each of the four categories) • Principals/Administrators reported at the district level.
Important Context for the 2011-12 Results • First year of implementation of NEW systems based on student growth measures • State provided student growth measures are only available in grades 4-8 for reading and mathematics • Varying components across systems (i.e. between districts) • Varying percentages of growth across systems (i.e. between districts) • Some districts on prior contract (i.e. No new system, but reporting labels was required) • www.michigan.gov/educatorevaluations for supporting documentation
K-12 Educator Evaluation Survey • 792 districts completed the survey about their evaluation systems from April to August • Required to be completed by SFSF • Results provide valuable insight into local systems • The types of frameworks used • The % of student growth as a component (law states “significant”, but it isn’t defined until 2013-14) • Types of growth measures included • Types of decisions informed by the results of evaluations
[Chart: frameworks used, by # of districts; callout: 50% of reporting districts] • 54 districts with a prior contract did not have to incorporate growth or a new system in 2011-12 • Other frameworks reported include: Charlotte Danielson Framework AND a local component, Teacher Advancement Program, My Learning Plan, 5 Dimensions of Teaching and Learning, Local District or ISD framework, McREL, STAGES, Kim Marshall Rubrics PRELIMINARY/DRAFT FINDINGS
[Chart: # of districts] Appropriate given the FIRST year of local evaluation systems
[Chart: # of districts] Other types of decisions include: Assignment to committees or roles beyond the classroom, classroom support and assistance, layoff/recall/transfer, mentoring, staff placement, scheduling, setting improvement goals, merit pay
Other Factors Reported As Part of Evaluations PRELIMINARY/DRAFT FINDINGS
Overview of Statewide Results Understanding educator evaluation labels in MI
Caveat…. • Labels are not EQUAL across districts • However, we know that people will want this type of analysis and we want it done appropriately • Resources to assist with communications: • Policy brief “facts and figures” • MDE PowerPoint
Statewide Results • IMPORTANT NOTES: • Based on the labels as determined by the local evaluation system; rigor of label designation is not consistent across districts • There is differentiation in label reporting now: 22% of teachers are reported as “highly effective”, moving away from a satisfactory/unsatisfactory system • We do not believe that 1% of teachers labeled as “ineffective” is unreasonable in the first year
Impact of growth • Law required districts to implement systems based in “significant part” on student growth • How do the labels look different when the district used growth in greater percentages?
Growth and Evaluation Labels • More differentiation in labels when growth counts at a higher rate • LESS differentiation without growth
Distribution of Labels By Percent of Evaluation Based on Growth
Takeaway (for messaging too) • We believe the distribution of labels (i.e. number of teachers in each category): • Is appropriate in Year 1 of implementation • Reflects differentiation (especially highly effective vs. effective) • BUT we also see that systems using higher proportions of growth are making those differentiations • The statewide evaluation system will move us toward more growth measures at higher rates
Who is more likely to be rated as highly effective or effective? Teachers more likely to appear in highly effective category (versus other three) and in effective category (versus other two): • Those with more time in the same district • Teachers with a professional certificate (as opposed to all others) • Those with a master’s degree or higher • Teachers in districts with growth over 40% in their system
Relationship between effectiveness labels and Priority/Focus/Reward Important to remember: • A school-level designation does not mean that all teachers within that school are in a given level of effectiveness • Example: In a Priority School, there will be effective teachers as well as ineffective teachers
Notes: There are significantly more teachers reported as ineffective and minimally effective in Priority Schools than statewide and than in Focus or Reward schools.
Key Takeaways from the Results • These results are reasonable for the first year; represent a huge effort on the part of districts • There is differentiation in the system; there will be more as growth becomes a higher component; but we still do not believe large numbers of Michigan teachers are “ineffective”
MCEE Interim Progress Report: Timeline for MCEE Recommendations
MCEE’s Interim Progress Report Michigan Council for Educator Effectiveness (MCEE) issued its Interim Report on April 27, 2012. MCEE concluded a pilot study is imperative, saying rushing to develop a system "would be reckless, both fiscally and technically”.
Teacher/Student Data Link: What is it? • Data initiative to link each student + assessment data to the courses he/she took and to the teachers who taught those courses • Still fairly “new”; only in its 2nd year of operation • Will mature in the coming years to be able to provide measures and information over time. • The “TSDL” provided by BAA is not the same as the TSDL collection; rather, it combines info from the TSDL collection with assessment results.
Teacher-Student Linked Assessment File (From BAA Secure Site)
Teacher/Student Data Link: Why is it important? • Required under State Fiscal Stabilization Fund (SFSF) as a deliverable • Required as a compliance factor in the NEW School Accountability Scorecards for 2012-13: • Will be an “all or none” component, combined with the REP effectiveness labels reporting requirement. • At least 95% of your students must be included in your TSDL submission.
Information About the TSDL Assessment File • Functionality of the produced file is extremely limited, so districts choose which “pieces” make sense in their local context. • Generated for each educator of students in tested grades, regardless of subject taught. • BUT “growth”, or Performance Level Change (PLC), exists only for reading and mathematics for MEAP and MI-Access FI in grades 4-8
How does the TSDL Work? • Teachers are linked to courses • Students are linked to courses • For each course taught, a teacher has a list of students who were reported as taking that course. • Spring 2011 and fall 2011 assessment data will be attributed to teachers from the 2010-2011 school year • “Feeder school” for fall assessment data
Linking Assessment Data to Students Once teachers are linked to students, the TSDL file provides: • Performance level change (PLC) for MEAP and MI-Access FI in reading and mathematics for each teacher where available (regardless of subject taught) in grades 4-8. • Performance level in writing, science, social studies, reading and mathematics for each teacher where available (regardless of subject taught) across all tested grades.
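The linking logic above amounts to a join through course IDs: teachers are linked to courses, students to courses, so joining on the course yields each teacher's list of student assessment records. The sketch below illustrates this with hypothetical record layouts; the actual TSDL file schema and field names differ.

```python
# Hypothetical sketch of the TSDL linking logic. Record layouts and IDs
# are illustrative assumptions, not the actual TSDL schema.

teacher_courses = [
    {"teacher_id": "T1", "course_id": "MATH6-01"},
    {"teacher_id": "T2", "course_id": "ELA6-01"},
]

student_assessments = [
    {"student_id": "S1", "course_id": "MATH6-01", "subject": "mathematics", "plc": "+1"},
    {"student_id": "S2", "course_id": "MATH6-01", "subject": "mathematics", "plc": "0"},
    {"student_id": "S3", "course_id": "ELA6-01", "subject": "reading", "plc": "-1"},
]

def link_teachers_to_students(teacher_courses, student_assessments):
    """Return {teacher_id: [assessment records of students in their courses]}."""
    # Index student records by the course they were reported as taking.
    by_course = {}
    for rec in student_assessments:
        by_course.setdefault(rec["course_id"], []).append(rec)
    # For each teacher-course link, collect the matching student records.
    linked = {}
    for tc in teacher_courses:
        linked.setdefault(tc["teacher_id"], []).extend(
            by_course.get(tc["course_id"], [])
        )
    return linked

linked = link_teachers_to_students(teacher_courses, student_assessments)
```

At district scale the same join would typically be done with a database or a dataframe library rather than plain dicts, but the linkage rule is the same.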
Access to TSDL data • TSDL User role must be established in the Secure Site to access the data at the district or school level • Spring Assessments/High school link available through the Secure Site as of January. • Fall Assessments (Elementary and Middle) TSDL through the Secure Site as of March.
Working with the TSDL Assessment File • District/school performs roster verification on the TSDL file • District/school needs to adjust each list based on rules like: • student attendance • subject taught match • grade taught • other local factors
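Roster verification as described above is essentially a filter over each teacher's linked student list. A minimal sketch, where the field names and the attendance threshold are illustrative assumptions rather than MDE rules:

```python
# Hypothetical roster-verification filter. Field names and the 80%
# attendance threshold are illustrative; actual rules are set locally.

def verify_roster(students, teacher_subject, teacher_grade, min_attendance=0.80):
    """Keep only students who meet the local inclusion rules."""
    verified = []
    for s in students:
        if s["attendance_rate"] < min_attendance:
            continue  # rule: insufficient attendance in the course
        if s["subject"] != teacher_subject:
            continue  # rule: assessment subject must match subject taught
        if s["grade"] != teacher_grade:
            continue  # rule: grade-level mismatch
        verified.append(s)
    return verified

roster = [
    {"student_id": "S1", "attendance_rate": 0.95, "subject": "mathematics", "grade": 6},
    {"student_id": "S2", "attendance_rate": 0.40, "subject": "mathematics", "grade": 6},
    {"student_id": "S3", "attendance_rate": 0.90, "subject": "reading", "grade": 6},
]
kept = verify_roster(roster, teacher_subject="mathematics", teacher_grade=6)
```

Additional "other local factors" would be added as further conditions in the same loop.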
Using PLC Data with MDE Tool • This year, the TSDL provides districts with PLC data linked to teachers for integration into local systems, along with an optional tool. • These are general guidelines/suggestions—NOT requirements for reading and math in grades 4-8
Sample Components of an Educator Evaluation • Growth Component Contribution to Evaluation (as shown above): 20%
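A composite rating with a 20% growth component can be computed as a weighted average of component scores. In the sketch below, only the 20% growth weight comes from the slide; the other component names, weights, and scores are hypothetical:

```python
# Illustrative weighted composite for an educator evaluation.
# Only the 20% growth weight is from the slide; the other components,
# their weights, and the example scores are hypothetical.

def composite_score(scores, weights):
    """Weighted average of component scores; weights must sum to 1.0."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1.0"
    return sum(scores[name] * w for name, w in weights.items())

weights = {
    "observation": 0.50,         # hypothetical
    "professional_goals": 0.30,  # hypothetical
    "student_growth": 0.20,      # from the slide: growth contributes 20%
}
# Example component scores on a hypothetical 1-4 scale.
scores = {"observation": 3.2, "professional_goals": 3.5, "student_growth": 2.8}
overall = composite_score(scores, weights)  # 3.2*0.5 + 3.5*0.3 + 2.8*0.2 = 3.21
```

As the law's growth percentage rises in later years, only the weights change; the composite formula stays the same.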
8 Sound Practices for Conducting Evaluations • Develop goals/purpose/guiding principles for the evaluation system. • Establish clear business rules to ensure consistent practice for all. • Include multiple measures for the growth component. • Share the system, the rules, the expectations with teachers early on. • Be open to feedback and possible revisions. • Designate someone to act as principal while you conduct a formal observation. • Plan for the unexpected. • Follow through.
1. Develop goals/purpose/guiding principles for the evaluation system • Ensure that all educators know why the work is being conducted – those creating and implementing it AND those who receive its results • Consider guiding principles for all arms of the evaluation system • Helps to calibrate the work – Does this (change, rule, etc.) align with our goals and purpose? • Helps provide a sound “defense” against criticisms • Facilitates decision making
2. Establish clear business rules to ensure consistent practice for all • Business rules ensure transparency in the system itself • By naming practices, everyone is aware of the process and how it should be unfolding • Holds all stakeholders accountable to the practices within the system • Requires follow through and direct communication
3. Include multiple measures for the growth component • The law requires multiple measures – and for good reason! • National and state assessments • Local assessments • Portfolio evidence • Determine the measures you’d like to include at the beginning of the year. • Put your teachers’ expertise to work in determining grade/subject specific practices/nuances.
4. Share the system, the rules, the expectations with teachers early on • This is good instructional practice that you would expect of your teachers in their classrooms -- model this for them. • Everyone has a clear idea of what to expect, how the work will be done, how the results will be used, and next steps for both evaluator and evaluatee. • No surprises! • Eases anxieties!