Value Added for Teacher Evaluation in the District of Columbia
Robin Chait, Office of the State Superintendent of Education
Anna Gregory, District of Columbia Public Schools
Eric Isenberg, Mathematica Policy Research
Association for Education Finance and Policy 37th Annual Conference
March 16, 2012
Value Added
Teacher value added = students’ actual end-of-year test scores − students’ predicted end-of-year test scores
• A statistical model predicts student achievement
• Accounts for pretests and student characteristics
• Ranks teachers relative to an average teacher
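The "actual minus predicted" idea can be sketched as a simple covariate-adjustment regression: predict each student's posttest from a pretest and a background characteristic, then average the residuals by teacher. This is an illustrative toy on simulated data, not the DCPS model itself, which handles many more complications (co-teaching, multiple years of data, measurement error):

```python
import numpy as np

# Toy value-added sketch on simulated data (all names hypothetical).
rng = np.random.default_rng(0)
n = 300
pretest = rng.normal(50, 10, n)
frl = rng.integers(0, 2, n)               # e.g., a free/reduced-price lunch flag
teacher = rng.integers(0, 3, n)           # three hypothetical teachers
true_effect = np.array([-2.0, 0.0, 2.0])  # built-in teacher effects
posttest = (5 + 0.9 * pretest - 1.5 * frl
            + true_effect[teacher] + rng.normal(0, 3, n))

# OLS prediction from pretest and student characteristics (no teacher terms).
X = np.column_stack([np.ones(n), pretest, frl])
beta, *_ = np.linalg.lstsq(X, posttest, rcond=None)
predicted = X @ beta

# Value added = mean (actual - predicted) per teacher, centered so each
# estimate is relative to the average teacher.
residual = posttest - predicted
va = np.array([residual[teacher == t].mean() for t in range(3)])
va -= va.mean()
print(np.round(va, 2))  # should recover the ordering of true_effect
```

With enough students per teacher, the centered residual means recover the relative ordering of the simulated teacher effects.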
Key Points • Implementation requires sufficient capacity • Communication strategy is vital • Value added is worth the investment
Where We Were in 2007
• 8th grade reading proficiency (2007 NAEP): 12%
• Teachers meeting or exceeding expectations: 95%
Why Value Added for DCPS? • Fairest way to evaluate teachers • Objective, data-based measure • Focused on student achievement
Value Added in DCPS Evaluation System
• IVA: individual value added
• TLF: teaching and learning framework (classroom observations)
• CSC: commitment to school community
• SVA: school value added
• Individual value-added measures: 50 percent of eligible teachers’ IMPACT scores
IMPACT Is High Stakes
• Highly effective: performance pay
• Ineffective (one year): subject to separation
• Minimally effective (consecutive years): subject to separation
• Score scale marked at 100, 175, 250, 350, and 400
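For illustration, a composite score with bands like those above can be sketched as a weighted average of component ratings. Only the 50% IVA weight comes from this deck; the remaining weights, the 1.0–4.0 component scale, and the exact band cutoffs are assumptions read off the slide's score scale, not DCPS's published rules:

```python
# Hypothetical IMPACT-style composite. IVA's 50% weight is from the deck;
# the other weights and the 1.0-4.0 component scale are illustrative.
WEIGHTS = {"IVA": 50, "TLF": 35, "CSC": 10, "SVA": 5}  # percent, sums to 100

def impact_score(components: dict[str, float]) -> float:
    """Weighted average of 1.0-4.0 component ratings, scaled to 100-400."""
    return sum(WEIGHTS[k] * components[k] for k in WEIGHTS)

def rating(score: float) -> str:
    """Map a 100-400 score to a band (cutoffs assumed from the slide scale)."""
    if score < 175:
        return "Ineffective"
    if score < 250:
        return "Minimally Effective"
    if score < 350:
        return "Effective"
    return "Highly Effective"

s = impact_score({"IVA": 3.5, "TLF": 3.0, "CSC": 4.0, "SVA": 3.0})
print(s, rating(s))  # 335.0 Effective
```

Because the weights sum to 100 and components run 1.0–4.0, the composite spans exactly 100–400, matching the scale on the slide.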
Help for DC Public Schools • Mathematica Policy Research • Technical Advisory Board [2012] • Steve Cantrell, Gates Foundation • Laura Hamilton, RAND Corporation • Rick Hanushek, Stanford University • Kati Haycock, Education Trust • David Heistad, Minneapolis Public Schools • Jonah Rockoff, Columbia Business School • Tim Sass, Georgia State University • Jim Wyckoff, University of Virginia
Challenges
• Consider face validity and incentive effects
• Teacher-student link data can be challenging
• All data decisions shared with the district
• Timeline must allow DCPS to transition out poor performers and hire new teachers
No One-Size-Fits-All Value-Added Model
• Choosing student characteristics: including race/ethnicity poses a communications challenge
• Multiple years of data: bias/precision trade-off
• Joint responsibility for co-teaching
• Cannot estimate a model of separate teacher effects
• Can estimate a “teams” model, but should team estimates count?
• Comparing teachers of different grades
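One common way to handle joint responsibility for co-taught students is "dosage" weighting: each student's residual (actual minus predicted score) counts toward a teacher in proportion to that teacher's share of instruction. A toy sketch with hypothetical data, not necessarily how the DCPS model apportions credit:

```python
# Dosage-weighted attribution for co-taught and transferring students.
# All students, teachers, and shares below are hypothetical.
from collections import defaultdict

# (student_residual, {teacher: share_of_instruction})
links = [
    (+4.0, {"T1": 1.0}),              # taught by T1 all year
    (-2.0, {"T1": 0.5, "T2": 0.5}),   # co-taught, split evenly
    (+1.0, {"T2": 0.75, "T1": 0.25}), # midyear transfer
]

num = defaultdict(float)  # dosage-weighted sum of residuals per teacher
den = defaultdict(float)  # total dosage per teacher

for residual, shares in links:
    for t, w in shares.items():
        num[t] += w * residual
        den[t] += w

va = {t: num[t] / den[t] for t in num}
print({t: round(v, 2) for t, v in sorted(va.items())})  # {'T1': 1.86, 'T2': -0.2}
```

Each teacher's estimate is a weighted mean, so a student who splits the year between two teachers contributes to both, in proportion to the split.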
Roster Confirmation • Teacher-student links critical for value added • Administrative data can be challenging • Specialized elementary school teachers • Co-teaching • Pull-out and push-in programs • Midyear student transfers • Teachers surveyed to confirm administrative roster data (Battelle for Kids)
Business Rules: Documenting Data Decisions
• Every data decision defined, discussed, and documented beforehand
• Let OSSE and DCPS review all decisions
• Document the entire process
• Make quick progress when final data arrive
Production: Meeting Timelines, Ensuring Accuracy • October data: formulate business rules • February data • Establish data cleaning programs • Begin trial runs from analysis file to final output • April data: Final student data in trial runs • June (test score) data: produce final results
Race to the Top
• Federal competition among states
• Required student achievement to contribute 50% of the teacher evaluation score
• Decision to use the DCPS value-added model for all eligible DC teachers
• Brought DCPS and charter schools together
• Each charter school LEA has its own evaluation system, used to inform personnel decisions
Common Decision-Making
• Need to make decisions on value added
• Quickly, to meet the production schedule
• Informed by the best available data
• With buy-in from charter schools and DCPS
• Technical Support Committee (TSC)
• Six members: five charter, one DCPS
• Meets periodically
• Seeks consensus decisions
Data Infrastructure
• Most data elements for value added already exist . . .
• . . . but not necessarily collected on the right schedule
• Student background characteristics
• Collected twice a year for AYP purposes
• Value added needs three-times-a-year collection on an earlier schedule
Need Capacity Within District • Do not just hire a contractor • Need dedicated staff to answer questions • Data team • Technical Support Committee
Communication Strategy • Value added hard to understand • Requires a strong statistical background • Final information is hard to connect to familiar test scores • Different from other student achievement measures teachers commonly use • Communication tools • Guidebooks • Information sessions
What Factors Affect a Student’s Achievement?
• Teacher factors: level of expectations, content knowledge, pedagogical expertise, ability to motivate
• Student factors: prior learning, resources at home, disability (if any), English proficiency
• Both shape student achievement as measured by the DC CAS
• Value added isolates the teacher’s impact on student achievement
Initiatives Under Development • Student-level output for DC teachers • Would show pretest, predicted posttest, actual posttest score for each student • May be in graphical format • Intermediate value-added scores • Individual value-added scores based on intermediate tests • Could be given to teachers midyear
Conclusions
• Implementing value added requires . . .
• Availability and accessibility of current data
• Confirmation of teacher-student links
• Careful planning of the production process
• Sufficient capacity within the local and/or state education agency to interact with the value-added contractor
• Teacher buy-in is not a given: a communication strategy is vital
• Properly implemented, value added is worth the investment
• Fairest measure of teacher effectiveness
• Provides data for answering research questions