Student Growth Professional Development for the Principal and Teacher Evaluation Process May 2014
Survey Results • 10 respondents from Advisory Committee • Top topics to cover: • Designing high-quality assessments • SLOs • Team activities related to assessment creation • Top subtopics to cover: • SLOs: discussion, process/creation, fairness • Assessments: what is high quality, key elements, selection • Growth models: identification, implementation
Agenda for the Day • Student Growth Measures • Student Learning Objectives • Team Sharing Time • Assessment Development • Office Hours
Website with Resources http://edanalytics.org/projects/illinois-student-growth
PERA May Update
• Local Assessment Support (ISBE contract awarded)
  • Develop assessment literacy
  • Create local assessments, mostly in traditionally non-tested grades and subjects
  • Run workshops, some on performance-based assessment (e.g., music, career/technical)
  • Pilot these workshops, then run with larger groups and/or turn them into webinars
• Guidebook on student growth (posted)
  • Includes example timelines and decision points, plus examples from other states (build/borrow/buy assessments, set up data systems, set up SLOs, run PD)
  • Includes a strong recommendation to have at least one pilot year
• SLO template (posted)
  • Associated guidance shows that goals can be set by group (each starting point), by student, or both
  • Draft SLOs are posted in elementary math, music, HS English, independent living, nutrition/culinary arts, and consumer math
  • No guidance on how to make results comparable across teachers/schools/subjects
PERA May Update
• Guidance document on teachers of special education students, ELLs, and early childhood (discussed, not finalized or posted)
  • Districts have been advised by Charlotte Danielson not to create new frameworks but to create addenda instead. CPS has created these for special education, ELLs, arts, and physical education; one for early childhood education (ECE) will be released soon.
  • These are needed because administrators do not always have experience in all areas; they illuminate specific considerations for these groups and acknowledge potentially different contexts
• Lowest 20% of districts not yet determined
  • These districts must implement in 2015-16; the list will likely be determined in summer 2014
• State default model
  • Use one Type II and one Type III assessment if a Type II exists in that subject area, the process for developing or selecting it includes teachers, and the assessment is aligned with the curriculum and is reliable and valid
  • Guidance on how to meet these requirements will be forthcoming
  • Otherwise, use two Type III assessments with SLOs, one determined by the evaluator and one by the teacher
• Note that the state model is only required where a joint committee cannot reach agreement on one or more aspects of student growth, and only for the parts where no agreement was reached
  • So if a joint committee agrees to everything but the percentage assigned to student growth, it must use 50%
PERA May Update • PEAC meetings are public • Upcoming meetings (all scheduled for 10 a.m. - 3 p.m. at ISU Alumni Center, 1101 N Main, Normal): • May 16 (may be moved to Springfield) • June 20 • July 18 • August 15 • www.isbe.net/peac
Student Growth Measures • Student Learning Objectives • Value Tables • Simple Growth • Adjusted Growth • Value-Added
Student Learning Objectives (SLOs) • SLOs are goals that teachers and administrators set using baseline data such as: • Students’ performance on a prior-year test • Students’ performance on a diagnostic test (e.g., for literacy or math skills) • Prior-year attendance data or discipline rates • Goals are monitored throughout the year, and success or failure is determined at the end of the year • SLOs can be created by individual teachers, teams of teachers, or the district.
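To make the end-of-year determination concrete, here is a minimal tallying sketch. The field names and the flat 10-point growth target are hypothetical illustrations, not part of the ISBE template:

```python
# Hypothetical sketch: tally SLO attainment from baseline and end-of-year scores.
# Field names and the growth target are illustrative, not from the ISBE template.

students = [
    {"id": "A", "baseline": 174, "final": 200},
    {"id": "B", "baseline": 190, "final": 190},
    {"id": "C", "baseline": 182, "final": 195},
]

GROWTH_TARGET = 10  # illustrative: each student should gain at least 10 scale points

met = [s for s in students if s["final"] - s["baseline"] >= GROWTH_TARGET]
print(f"{len(met)} of {len(students)} students met the growth target "
      f"({len(met) / len(students):.0%})")
```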
Student Learning Objectives
Benefits
• Can be used with any assessment type
• Can be uniquely tailored
Implementation Requirements
• Parameters for educators to follow
• Type I, Type II, or Type III
• Consideration of unintended consequences and biases
Drawbacks
• Not necessarily standardized or comparable across educators
• Potentially very easy to game
• Potentially very hard to draw meaningful conclusions
Value Tables • A value table assigns point values to improvements across categories (e.g., proficient to advanced) • Value tables can be weighted in many different ways, and cut points can be placed at the creator’s discretion.
Example: Grade 4 to 5 Math • Focus on categories 1b and 2a (but possible for any adjacent categories in other subjects and grades) • Grade 4: 1b covers 174 to 190, 2a covers 191 to 206 • Grade 5: 1b covers 183 to 200, 2a covers 201 to 217 • Student A: Moves from 174 to 200 (gain: 26 points, 85 value table points for moving from 1b to 1b) • Student B: Moves from 190 to 190 (gain: 0 points, 85 value table points) • Student C: Moves from 190 to 183 (gain: -7 points, 85 value table points) • Should students A, B and C be weighted the same? • Student D: Moves from 174 to 182 (gain: 8 points, 20 value table points for moving from 1b to 1a) • Should student C count more than student D?
Example: Grade 4 to 5 Math • Focus on categories 1b and 2a (but possible for any adjacent categories in other subjects and grades) • Grade 4: 1b covers 174 to 190, 2a covers 191 to 206 • Grade 5: 1b covers 183 to 200, 2a covers 201 to 217 • Student E: Moves from 191 to 217 (gain: 26 points, 90 value table points for moving from 2a to 2a) • Student F: Moves from 191 to 218 (gain: 27 points, 125 value table points for moving from 2a to 2b) • Student G: Moves from 190 to 218 (gain: 28 points, 150 value table points for moving from 1b to 2b) • Is the difference between students E, F and G substantial enough to justify three widely different weights?
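The arithmetic in these two examples can be expressed as a simple lookup. The sketch below encodes only the category transitions and point values shown above; in a real value table, the full set of categories, cut points, and weights is set by its creator:

```python
# Value-table scoring for the grade 4-to-5 math examples above.
# Only the transitions shown on the slides are encoded; a full table
# would cover every (pre-category, post-category) pair.

GRADE4_CATEGORIES = {"1b": (174, 190), "2a": (191, 206)}
GRADE5_CATEGORIES = {"1b": (183, 200), "2a": (201, 217)}

POINTS = {  # value-table points for moving from pre-category to post-category
    ("1b", "1a"): 20,
    ("1b", "1b"): 85,
    ("2a", "2a"): 90,
    ("2a", "2b"): 125,
    ("1b", "2b"): 150,
}

def value_table_points(pre_cat, post_cat):
    return POINTS[(pre_cat, post_cat)]

# Student A: 174 (grade 4 category 1b) -> 200 (grade 5 category 1b)
print(value_table_points("1b", "1b"))  # 85, despite a 26-point scale-score gain
# Student D: 174 (category 1b) -> 182 (grade 5 category 1a)
print(value_table_points("1b", "1a"))  # 20, despite an 8-point gain
```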
Value Tables
Benefits
• Uses two points in time—better than attainment!
Implementation Requirements
• Maintain data from two consecutive years
• Type I or Type II (considerably more challenging for Type III)
• Creation of categories
• Consideration of unintended consequences and biases
Drawbacks
• Does not separate the teacher’s effect on student growth from student-level factors
• Can lead to focus on “bubble kids”
• Biased if the point values do not reflect the difficulty of making a particular improvement
• Ignores improvements within categories, and ignores measurement error
• Easy to draw incorrect conclusions based on category instability
Simple Growth • Here we are not referring to the general definition of growth as outlined in state legislation, but a specific type of growth model. • Simple growth is average gain in test scores across students.
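As a minimal sketch with hypothetical scores, simple growth is just the mean of post-test minus pre-test differences:

```python
# Simple growth: the average gain in test scores across a classroom's students.
pre_scores = [174, 190, 185, 201]    # hypothetical prior-year scores
post_scores = [183, 195, 200, 210]   # hypothetical current-year scores

gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
simple_growth = sum(gains) / len(gains)
print(f"Average gain: {simple_growth:.1f} scale-score points")
```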
Simple Growth [Figure: chart with a vertical “Gain” axis running from low to high]
Is it fair to compare these classrooms using simple growth? [Figure: classrooms plotted on a “Test Score Range” axis (low to high) against a “Gain” axis (low to high)]
Simple Growth
Benefits
• Uses two points in time—better than attainment!
Implementation Requirements
• Maintain data from two consecutive years
• Type I or Type II (to be safe)
• Subtraction!
• A test where typical gain is the same regardless of starting point
• Consideration of unintended consequences and biases
Drawbacks
• Does not separate the teacher’s effect on student growth from student-level factors
• Ignores test measurement error
• May be harder for some students to make gains—particularly high achievers
• Easy to draw incorrect conclusions based on test scale instability
Adjusted Growth • Adjusted growth measures growth in student scores from one year to the next by grouping students according to their performance on a prior-year test and comparing each student’s post-test gain to the typical gain for that group
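A minimal sketch of the idea, with hypothetical group cut points and historical typical gains: each student's gain is compared with the typical gain of students who started in the same place.

```python
# Adjusted growth sketch: compare each student's gain to the typical
# (historical) gain for students with a similar pre-test score.
# Group cut points and typical gains below are hypothetical.

TYPICAL_GAIN = {"low": 12.0, "middle": 9.0, "high": 5.0}  # from historical data

def pre_test_group(score):
    if score < 180:
        return "low"
    if score < 200:
        return "middle"
    return "high"

students = [(172, 188), (195, 202), (210, 213)]  # (pre, post) score pairs

adjusted = [
    (post - pre) - TYPICAL_GAIN[pre_test_group(pre)]
    for pre, post in students
]
# Positive values mean the student gained more than is typical
# for students with the same starting point.
print([f"{a:+.1f}" for a in adjusted])
```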
Using Real Data to Account for the Relative Difficulty of Making Gain [Figure: classrooms plotted on a “Test Score Range” axis (low to high) against a “Gain” axis (low to high)]
Using Real Data to Account for the Relative Difficulty of Making Gain
Adjusted Growth
Benefits
• Takes students’ starting points into account
Implementation Requirements
• Maintain data from two consecutive years, plus additional historical data to determine typical growth
• Type I or Type II (to be safe)
• More complex methods than subtraction
• Consideration of unintended consequences and biases
Drawbacks
• Does not separate teachers’ effect on student growth from demographic factors
• Ignores test measurement error
Value-Added • A value-added model: • takes a classroom of students’ pre-test scores and demographic characteristics, • compares those students to others like them, and • predicts what their post-test scores would be assuming the students had an average teacher. • If the students’ actual teacher produced more growth than predicted, she will have a high value-added score: this teacher “beat the odds.” • Value-added is the difference between actual student achievement and the average achievement of a comparable group of students (where comparability is determined by prior scores, at a minimum).
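A heavily simplified sketch of the idea (hypothetical data; real value-added models also adjust for demographics and handle measurement error): fit a model predicting post-test from pre-test, then average each teacher's residuals.

```python
# Value-added sketch: predict post-test from pre-test with a simple
# linear model fit on all students, then average each teacher's
# residuals (actual minus predicted). Real models add demographic
# controls and handle measurement error; all data here is hypothetical.
import numpy as np

pre = np.array([174, 190, 185, 201, 178, 195])
post = np.array([190, 196, 199, 210, 181, 205])
teacher = np.array(["X", "X", "X", "Y", "Y", "Y"])

# Fit post = a + b * pre by least squares.
b, a = np.polyfit(pre, post, 1)
predicted = a + b * pre
residuals = post - predicted  # how far each student beat (or missed) the prediction

for t in ("X", "Y"):
    va = residuals[teacher == t].mean()
    print(f"Teacher {t}: value-added estimate {va:+.1f} points")
```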
Value-Added Visual Representation [Figure: starting student achievement (scale score) at Year 1 (pre-test) rising to actual student achievement at Year 2 (post-test), compared with predicted student achievement (based on observationally similar students); the gap between actual and predicted is the value-added]
Value-Added
Benefits
• Comprehensive measure that accurately separates the effects of the educator on student growth from other confounding factors
Implementation Requirements
• Maintain data from two consecutive years
• Large enough sample size (alone or in a consortium)
• Type I or Type II
• Research team or statistical capacity for calculation
• Statistical reference group score and demographic data
• Consideration of unintended consequences and biases
Drawbacks
• Requires the most data and more complex modeling
Capabilities Checklist Please see the Capabilities Checklist handout
Goal Helper Tool This is a goal helper, not a goal creator. The goal helper provides a reality check based on historical data.
Goal Helper • A tool to help you make realistic goals • Four Tabs • Previous Year Raw Data (Blue) • Pre-Test Groups (Orange) • This Year Raw Data (Red) • Goal Output (Purple) • Note: the tool also includes seven (Gray) tabs; the information in these tabs relates to the inner workings of the tool and should not be altered.
Tab 1: Previous Year Raw Data • Column A • Student ID • This should be unique by student • Column B • Last Year Pre-Test Score • Must be a numeric value for this student's Pre-Test • Column C • Last Year Post-Test Score • Must be a numeric value for this student's Post-Test Data here is for all students in the district taking the assessment at this grade level.
Tab 2: Pre-Test Groups • Desired Number of Pre-Test Groups • Yellow Box (ONLY) • Choose between 1 and 8 groupings • Information is provided in the orange box for the Average Gain by Pre-Test Score Range
Tab 3: This Year Raw Data • Column A • Student ID • This should be unique by student • Column B • Pre-Test Score • Must be a numeric value for this student's Pre-Test Data here is for ONLY the students in a particular class or school taking the assessment at this grade level.
Tab 4: Goal Output • Column A • Student ID • Column B • Pre-Test • Column C • Pre-Test Group • Column D • Average Gain by Pre-Test Group • Column E • Post-Test The purple box provides the projected group (class or school at this grade level) average Post-Test score.
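The projection logic behind these tabs can be sketched as follows. This is a simplification of the actual workbook, with hypothetical data and only two pre-test groups for brevity (the tool allows 1 to 8):

```python
# Sketch of the Goal Helper's projection logic (hypothetical data).

last_year = [(170, 184), (176, 186), (192, 198), (200, 207)]  # district (pre, post)
this_year = [168, 181, 195]  # pre-test scores for one class or school

def group(score):  # Tab 2: assign each pre-test score to a group
    return "low" if score < 190 else "high"

# Average gain by pre-test group, from last year's district data (Tabs 1-2).
gains = {}
for pre, post in last_year:
    gains.setdefault(group(pre), []).append(post - pre)
avg_gain = {g: sum(v) / len(v) for g, v in gains.items()}

# Tab 4: projected post-test = pre-test + the group's average gain.
projected = [pre + avg_gain[group(pre)] for pre in this_year]
print(f"Projected group average post-test: {sum(projected) / len(projected):.1f}")
```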
DuPage Goal Helper • To access the goal helper, please have one member of your team contact Linda Kaminski at the DuPage ROE: lkaminski@dupage.k12.il.us
Team Reflection Time Please see Team Reflection Time Guiding Questions
What are SLOs? (A recap) Student Learning Objectives (SLOs) are detailed, measurable goals for student academic growth to be achieved in a specific period of time (typically an academic year). “A detailed process used to organize evidence of student growth over a specified period of time.” – ISBE SLO webinar
Key Characteristics of SLOs*
• Target Population: Which students will you target in the SLO?
• Expected Growth Targets: What is your specific target, or goal, for student growth? Are goals different for different student populations, based on starting points?
• Time Span: What is the timeframe for the SLO? (typically a semester or a year)
• Instructional Strategies: What instructional strategies will you employ to support this SLO? How will you modify strategies based on student progress?
• Baseline Data and Rationale: What are student needs? What baseline data did you review, and what did it demonstrate? What are student starting points?
• Learning Goal: What should students know and be able to do at the end of your course? How does this goal align to content standards?
• Assessments and Scoring: How will you measure the learning goal? How will you collect data, monitor, and score final student outcomes?
* Derived from ISBE SLO Guidebook and Template
SMART Goals
• Specific: Does the SLO statement identify a specific student population and growth target?
• Measurable: How is growth being measured? Does the SLO identify a quantifiable growth target and a specific assessment/evidence source?
• Appropriate: Is the SLO appropriate for the grade level/subject area? Are growth targets appropriate for the student population?
• Realistic: Are goals and growth targets realistic, given the student population and assessment/evidence source?
• Time Limited: Does the SLO indicate the time period within which the goal must be met?
Flexibility of Approaches to the SLO Process [Figure: spectrum from More Structured to More Flexible, placing Wisconsin, New York, Georgia, and Ohio along it]
Assessment Selection • Flexible – Allows educators to select and/or develop their own assessments to measure student growth. • Structured – Requires the use of pre-approved, standardized assessments
Target Setting • Statistical or Model Informed • SLO targets are determined using a statistical model to predict levels of student growth based on prior data • Objective, or standardized • Standardized, or common, way to set growth targets for teachers across classrooms, schools & districts • Subjective • Growth targets are set based on historical data, student needs, and context. • Relies more on professional judgment
Scoring • Statistical or Model Informed • SLO scored using a statistical model that predicts levels of student growth and sets thresholds for final SLO ratings • No scoring rubric needed • Objective, or standardized • Scoring rubric includes prescriptive criteria for assigning a rating category based on student outcomes • Subjective • Scoring rubric includes broad and/or subjective language for assigning a rating category based on student outcomes
Structured: New York • Assessment selection • Requires use of state test where available • Provides list of state-approved assessments for district use • Target setting • Expectations based on state-provided scale • Scoring • Number of students reaching the target goal directly linked to final SLO score
Structured: Georgia • Assessment selection • Teacher-selected, but assessments must meet minimal criteria • Target setting • Teacher-selected, but the state approves the overall SLO goal • Scoring Rubric • Each possible score is associated with the % of students that met the goal
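As a hypothetical illustration of this kind of rubric (the percentage bands and rating labels below are invented, not Georgia's actual criteria):

```python
# Hypothetical scoring rubric: map the percentage of students who met
# the SLO goal to a rating category. Bands and labels are illustrative.

def slo_rating(percent_met):
    if percent_met >= 90:
        return "Exemplary"
    if percent_met >= 75:
        return "Proficient"
    if percent_met >= 50:
        return "Needs Development"
    return "Ineffective"

print(slo_rating(82))  # Proficient
```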