January 2015 Update on Educator Evaluation and the Transition to PARCC
Objectives
• Provide an update on implementation of AchieveNJ, including key takeaways and areas for improvement identified by educators
• Discuss implications for evaluation in the transition from NJ ASK to PARCC
Agenda
AchieveNJ: A Careful, Deliberate Path
• 2010: $38 million Race to the Top award for NJ; Educator Effectiveness Task Force formed
• 2011: Task Force releases recommendations; State Advisory Committee and Pilot 1 launched
• 2012: TEACHNJ Act passed; Pilot 2 launched; 1st round of evaluation regulations proposed
• 2013: All districts launch improved evaluations; 2nd round of evaluation regulations proposed; State Advisory Committee and external Rutgers reports issued
• 2014: Interim implementation report released; 3rd round of evaluation regulations proposed
• 2015: Input and continuous improvement
Current Status
• Year 1 (2013-14) Interim AchieveNJ Implementation Report published in November 2014
• Educators now almost halfway through Year 2 of implementation
• Office of Evaluation continues to support districts and leaders with resources and direct coaching, as needed
• Statewide advisory committee composed mainly of NJ educators meets monthly
• 2013-14 median Student Growth Percentile (mSGP) scores recently released
• Year 1 Final Implementation Report (including analysis of statewide findings) to be released in spring 2015
Release of 2013-14 mSGP Scores
All districts received secure access to their 2013-14 teacher and principal/AP/VP median Student Growth Percentile (mSGP) data on January 8, 2015.
• NJDOE has worked with NJ educators in taking a long and thoughtful approach to implementing both evaluations and mSGP.
• mSGP data is an important part, but only one part, of an educator's evaluation. These scores will be used to finalize 2013-14 evaluations and to inform educators' ongoing professional development.
• About 15% of teachers and 60% of principals/APs/VPs received 2013-14 mSGP scores.
• By statute, mSGPs (like all aspects of an individual's evaluation) are confidential and should not be shared publicly.
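An mSGP score is simply the median of the Student Growth Percentiles of the students linked to an educator. A minimal sketch, using hypothetical student SGP values (the real roster linkage and verification happen through NJ SMART):

```python
from statistics import median

# Hypothetical SGPs (each on the 1-99 percentile scale) for the
# students linked to one teacher in a given school year.
student_sgps = [35, 62, 48, 71, 55]

# The educator's mSGP is the median of the linked students' SGPs.
msgp = median(student_sgps)
print(msgp)  # → 55
```

Because the median is used rather than the mean, a single unusually high or low student SGP does not swing the educator's score.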
Timeline of SGP Development in New Jersey Federal Mandate for Stimulus Funds: States Must Calculate “Student Growth”; Link Teachers to Students A thoughtful, multi-year approach to ensure data is accurate and usable District SGP Profile Reports Deployed via NJ SMART 2012-13 Teacher mSGP Reports Provided to All Districts for Learning Purposes and Data Preview Evaluation Pilot Advisory Committee Provides Feedback on Usefulness of SGP Data Student SGPs Provided to All Districts in NJ SMART 2013-14 mSGP Score Verification & Certification Process Completed by Districts 2011-12 Teacher Median SGP (mSGP) Reports Provided to Pilot Districts for Learning Purposes 2015 2010 2012 2013 2011 2014 NJ Adopts SGP Methodology for Calculating Student Growth SGP Training Begins for Districts; SGP Video Released TEACHNJ Act Passed; Growth Measures Required for Evaluation School SGPs Used in School Performance Reports per NJ’s Federal ESEA Waiver 2013-14 Teacher & Principal mSGP Reports Provided to All Districts for Use in Evaluations; Score Verification Process Announced
2013-14 mSGP Data
• The 2013-14 mSGP data counts for 30% of qualifying teachers' and 20 or 30% of qualifying principals' 2013-14 evaluations.
• Evaluation data for a particular employee shall be confidential in accordance with the TEACHNJ Act and N.J.S.A. 18A:6-120.d and 121.d.
• Educator mSGP data should be handled and stored as securely as any other part of a confidential personnel record and should not be released to the public.
• Although a dry run of teacher mSGP data was conducted last year to improve roster verification processes, the Department is offering options for addressing any problems educators identify with their 2013-14 mSGP scores.
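The summative rating is a weighted average of component scores. The sketch below illustrates the arithmetic for a qualifying teacher: the 30% mSGP weight comes from the slide, but the split of the remaining 70% between practice and SGO scores is a hypothetical placeholder, not a statement of official AchieveNJ weights.

```python
# Component weights: mSGP weight (0.30) is from the slide; the
# practice/SGO split is illustrative only.
WEIGHTS = {"msgp": 0.30, "practice": 0.55, "sgo": 0.15}

def summative_rating(msgp, practice, sgo):
    """Combine component scores (each on the 1.0-4.0 scale) into a
    weighted summative rating."""
    scores = {"msgp": msgp, "practice": practice, "sgo": sgo}
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # weights must total 100%
    return round(sum(WEIGHTS[k] * scores[k] for k in WEIGHTS), 2)

print(summative_rating(msgp=3.0, practice=3.4, sgo=3.2))  # → 3.25
```

Under this weighting, an educator's rating moves less with the mSGP component than with observed practice, consistent with the slide's point that mSGP is only one part of the evaluation.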
Evaluation Score Certification Tool
• Districts will have an opportunity to certify that all 2013-14 evaluation data is correct or to make changes where necessary.
• In late January, the Department will release the 2013-14 Evaluation Score Certification Tool, a new electronic application for districts to use in certifying final 2013-14 summative ratings for all educators.
• This interface will allow districts to review data, correct any errors that occurred in the original NJ SMART submission, and certify the accuracy of each staff member's final score.
• Districts will have approximately one month to complete this process after release of the tool.
Agenda
2011-Present: Successes and Challenges
Year 1 Interim Implementation Report
Methodology: Evidence from work with about 300 LEAs, including deeper analysis of 17 partner districts.
Overall Findings
• Districts have done a good job of implementing required elements.
• We want to move from meeting requirements to high-quality execution: from compliance (2013-14) to quality (2014-15) to ownership (2015-16).
Key Finding 1: More Observations and Feedback
The majority of school districts are completing the required number of observations. (*Numbers based on estimates.)
Key Finding 2: Observers Differentiate Between Lessons
Districts are differentiating between the best and worst teaching in their schools, but distributions vary between districts.
[Chart: 10th-90th percentile ranges of individual observation scores for District A and District B; scores shown range from 2.57 to 3.63.]
Key Finding 3: Observers Differentiate within Lessons Many observers are identifying the strengths and weaknesses of individual lessons and sharing that information with teachers.
Key Finding 4: Teachers Set Student Growth Objectives (SGOs)
Teachers are setting the required number of measurable and specific goals. (*Numbers based on estimates.)
Key Finding 5: Use of Data to Set SGOs
Nearly all (98.5%) sample SGOs included some form of baseline data to inform the goals set for students.
Key Finding 6: SGO Alignment and Quality Vary
The alignment of SGOs to New Jersey content standards was inconsistent across districts, as was the quality of assessments used.
[Chart: comparison of SGO ratings in K-8 District 1 and K-8 District 2.]
Key Finding 7: Compliance with DEACs and ScIPs
• 99% of districts across the state report having District Evaluation Advisory Committees (DEACs) and School Improvement Panels (ScIPs) in place.
• DEACs: 60% of partner districts report that they used the group to "analyze implementation successes and challenges to recommend improvements."
• ScIPs: 20% of partner districts said the ScIP was highly functioning and leading implementation.
Agenda
Setting the Context
Increasing Student Achievement: An Aligned Approach
With fewer, clearer, and more rigorous standards (Common Core), aligned assessments providing timely, accurate data (PARCC), and an evaluation system that emphasizes feedback and support (AchieveNJ), we impact teachers and leaders, through effective teaching and instructional leadership, to increase student achievement.
Implementation Timeline: Common Core, State Assessments, and Student Growth Data (school years 2010-11 through 2014-15)
Questions and Follow-Up
Peter Shulman, Assistant Commissioner/Chief Talent Officer, Division of Teacher and Leader Effectiveness
Carl Blanchard, Interim Director, Office of Evaluation
www.nj.gov/education/AchieveNJ
educatorevaluation@doe.state.nj.us
609-777-3788