March 28, 2011 APPR Update
Chapter 103 Review What does the new law require?
New system for teachers (and principals) • 20% State student growth data (increases to 25% upon implementation of value-added growth model) • 20% Locally selected (and agreed-upon) measures (decreasing to 15%) • 60% Multiple measures based on standards TBD
New system for teachers (and principals) Being referred to as HEDI (pronounced Heidi) • Highly effective (possibly >90) • Effective (possibly 80-90) • Developing (possibly 65-79) • Ineffective (possibly 0-64)
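A minimal sketch of how a composite score might map to a HEDI rating, assuming the tentative cut scores above; the slide labels every band "possibly," so these thresholds are illustrative, not final:

```python
def hedi_rating(composite_score: float) -> str:
    """Map a 0-100 composite effectiveness score to a HEDI rating.

    Cut scores are the tentative bands from the slide (>90, 80-90,
    65-79, 0-64); final bands were still to be set in regulation.
    """
    if composite_score > 90:
        return "Highly effective"
    elif composite_score >= 80:
        return "Effective"
    elif composite_score >= 65:
        return "Developing"
    else:
        return "Ineffective"

print(hedi_rating(85))  # Effective
```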
New system for teachers (and principals) • A single composite score of teacher (or principal) effectiveness
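A minimal sketch of how the three components might combine into that single composite, assuming the 20/20/60 weights from the Chapter 103 summary above and assuming each component is already expressed on a common 0-100 scale (how components are scaled has not been specified):

```python
# Weights from Chapter 103: 20% state growth, 20% local measures,
# 60% other measures. The state share rises to 25% (and local falls
# to 15%) once the value-added growth model is implemented.
WEIGHTS = {"state_growth": 0.20, "local": 0.20, "other_measures": 0.60}

def composite_score(subscores: dict[str, float]) -> float:
    """Weighted sum of component subscores, each on a 0-100 scale."""
    return sum(WEIGHTS[k] * subscores[k] for k in WEIGHTS)

print(composite_score({"state_growth": 75, "local": 80, "other_measures": 85}))
# 0.2*75 + 0.2*80 + 0.6*85 = 82.0
```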
New system for teachers (and principals) • Training for all evaluators (through Network Teams, after first week of August) • Use of improvement plans for developing and ineffective ratings • Utilize in other decisions (merit, etc.) • Locally-developed appeals process • Expedited 3020-a process after two ineffective ratings
New system for teachers (and principals) • Applies to all agreements after July 1, 2010 • For agreements prior to July 1, 2010, it depends on the specific language in the agreement • Grades 4-8 math and ELA (and principals): July 2011 • Everyone else: July 2012 • Implementation of the value-added growth model (20% → 25%): 2012-2013
Timetable: Board of Regents Agenda
Timetable • January: 60% discussion • February: Local 20% discussion • March: Value-added 20% discussion and ratings/scores • April: Regents Task Force recommendations (4th) • May: Draft Regulations • June: Emergency Adoption of Regulations
State student growth data 20% increasing to 25%
State student growth data • Value Added/Growth model • Annual achievement is more about the students than the teacher [Chart: 2015 averages were 680 for Teacher A and 670 for Teacher B]
State student growth data • Value Added/Growth model • Adding average prior achievement for the same students shows growth [Chart: Teacher A went from 660 (2014) to 680 (2015), +20 growth; Teacher B went from 645 (2014) to 670 (2015), +25 growth]
State student growth data • Value Added/Growth model • But what growth should students have shown? • What growth did similar students obtain? • What is the difference between the expected growth and the actual growth?
State student growth data • Value Added/Growth model • Comparing growth to the average growth of similar students is the value-added [Chart: Teacher A went from 660 (2014) to 680 (2015), +20 growth; the 2015 avg. for similar students was 665, so value-added is +15. Teacher B went from 645 (2014) to 670 (2015), +25 growth; the 2015 avg. for similar students was 665, so value-added is +5]
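A minimal sketch of the arithmetic behind the chart, assuming (as in the example) that value-added is the teacher's students' current-year average minus the current-year average of similar students, i.e., actual versus expected; the actual SED model was still under discussion:

```python
def growth_and_value_added(prior_avg: float, current_avg: float,
                           similar_students_avg: float) -> tuple[float, float]:
    """Return (growth, value-added) for one teacher's roster.

    growth      = current-year average minus prior-year average
    value-added = current-year average minus the current-year average
                  of similar students (actual vs. expected).
    """
    growth = current_avg - prior_avg
    value_added = current_avg - similar_students_avg
    return growth, value_added

# Numbers from the chart:
print(growth_and_value_added(660, 680, 665))  # Teacher A: (20, 15)
print(growth_and_value_added(645, 670, 665))  # Teacher B: (25, 5)
```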
State student growth data • Calculating similar student growth • Lots of statistical analysis • Student characteristics such as academic history, poverty, special ed. status, ELL status, etc. • Classroom or school characteristics such as class percentages of needs, class size, etc.
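A minimal sketch of the kind of statistical analysis involved, assuming a simple linear regression of current scores on prior scores plus student characteristics, fit on a statewide sample and then applied to one teacher's roster. The data and the model are hypothetical; the real analysis would be far more elaborate and was not yet specified:

```python
import numpy as np

# Hypothetical statewide sample: columns are prior score, poverty
# flag, ELL flag; y holds current-year scores.
state_X = np.array([
    [650, 1, 0],
    [660, 0, 0],
    [645, 1, 1],
    [670, 0, 0],
    [655, 0, 1],
    [640, 1, 1],
], dtype=float)
state_y = np.array([665, 681, 658, 692, 671, 652], dtype=float)

# Fit: expected current = b0 + b1*prior + b2*poverty + b3*ELL.
A = np.column_stack([np.ones(len(state_X)), state_X])
coef, *_ = np.linalg.lstsq(A, state_y, rcond=None)

# Apply to one teacher's (hypothetical) roster: actual vs. expected.
teacher_X = np.array([[650, 1, 0], [660, 0, 0], [645, 1, 1]], dtype=float)
teacher_y = np.array([672, 685, 664], dtype=float)
expected = np.column_stack([np.ones(len(teacher_X)), teacher_X]) @ coef
print((teacher_y - expected).mean())  # teacher-level value-added estimate
```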
State student growth data • Data collection and policy options • Linking students, teachers, and courses • Who is the teacher of record? • Scenario 1: Same Teacher the Entire Year • Scenario 2: Team Teaching • Scenario 3: Teacher for Part of the Year • Scenario 4: Student for Part of the Year • Scenario 5: Student Supplemental Instruction • Additional Scenarios???
State student growth data Non-tested areas
Non-tested areas • Teachers of classes with only one state test administration • K-12 educators • High school (no test) educators • Middle and elementary (no test) educators • Performance courses • Others
Non-tested areas • Use existing assessments in other content areas to create a baseline for science tests and Regents examinations • Use commercially available tests to create a baseline and measure growth
Non-tested areas • Add more state tests, such as: • Science 6-8 • Social studies 6-8 • ELA 9-11 (2011-2012) • PARCC ELA 3-11 (2014-2015) • PARCC math 3-11 (2014-2015)
Non-tested areas • Add more state tests per the December 2009 Regents item, discussed and approved prior to inclusion in SED’s plans: • ELA 9-11 (2011-2012)
Non-tested areas • Add more state tests, subject to funding availability and approval, such as: • Science 6-7 • Social studies 6-8
Non-tested areas • The growth model can also be used for school accountability measures • Collaborate with state-wide professional associations or a multi-state coalition • Empower local-level resources to create and carry out a solution that meets state requirements
Non-tested areas • Use a group metric that measures the school’s (or grade’s) overall impact • In other states where this is implemented, it tends to be tied to performance bonuses
Local assessment measures 20% decreasing to 15%
Local assessment measures • Objectives include: • Provide a broader picture of student achievement by assessing more • Provide a broader picture by assessing differently • Verify performance of state measures
Local assessment measures • Reality check: • Balance state/regional/BOCES consistency while accounting for local context • School-based choice might appeal to teachers • Districts must be able to defend their decisions about the tests
Local assessment measures • Considerations include: • Rigor • Validity and reliability • Cost • Feasibility • May be an achievement or a growth measure
Local assessment measures • Options under consideration: • Districts choose or develop assessments for courses/grades • Commercially available products • Group metric of school or grade performance • Other options that meet the criteria (previous slide)
Other 60% Multiple measures
Other 60% • Begins with the teaching standards: • Knowledge of Students and Student Learning • Knowledge of Content and Instructional Planning • Instructional Practice • Learning Environment • Assessment for Student Learning • Professional Responsibilities and Collaboration • Professional Growth
Other 60% • Begins with the teaching standards: • Some things are observable • Some are not observable, thus requiring some other form of documentation or artifact collection
Other 60% • Teacher practice rubrics: • Describe differences in the four performance levels • Articulate specific, observable differences in student and teacher behavior • Not known whether there will be a single rubric, menu to choose from, or total local option
Other 60% • Other items that might be included: • Teacher attendance • Goal setting • Student surveys • Portfolios/Evidence binders • Other observers
Timetable: Board of Regents Agenda
Timetable • January: 60% discussion • February: Local 20% discussion • March: Value-added 20% discussion and ratings/scores • April: Regents Task Force recommendations • May: Draft Regulations • June: Emergency Adoption of Regulations
Timetable • August: NT training (includes evaluator training) • August: NT turns training over to local evaluators • September: Implementation for covered teachers
NT training for teacher evaluators • Tentative dates set (with multiple options): • August 15, Rodax 8 Large Conference Room • August 22, McEvoy Conference Center • August 29, Rodax 8 Large Conference Room • Ongoing training during year (TBD)
Training for principal evaluators • Tentative dates set (with multiple options): • August 19, Rodax 8 Small Conference Room • August 26, McEvoy Conference Center • Ongoing training during year (TBD)
While we wait • Regional/BOCES collaboration: • Share data • Share APPR Plans • Build common understanding • Work on parts under local jurisdiction • Avoid duplication of work • Have a common voice
While we wait • APPR sub-site: • APPR button under “for school districts” at ocmboces.org or leadership.ocmboces.org • User name: lrldocs • Password: CBA1011
While we wait • Regional/BOCES collaboration: • Development of local 20% protocol • Achievement in non-tested areas • Qualities of effective improvement plans and examples • Appeals process • Frameworks/models • Summative evaluation (examples, best practices, share practices) • Principal evaluation (added back)
Next steps • Share results of this afternoon’s work • Gather again on __________ • Updates • Continue collaboration