Understanding Value-Added Lesson 2: How Value-Added Works
What is the Value-Added Metric? Academic Growth = Student Learning (2009 to 2010) • Value-Added is the District’s measure of elementary school growth. • Value-Added is a nationally recognized way of measuring growth. • It emphasizes continual student improvement. • It provides information to understand what drives continual improvement.
Measuring Growth, Not Attainment In this school, the percent meeting state standards is 25% in both Year 1 and Year 2. Attainment is unchanged – but are students learning? Analyzing growth provides this information. [Figure: ISAT scale scores, roughly 200 to 300, shown for Year 1 and Year 2]
Accounting for Student Populations • Student academic growth varies by grade, prior performance, and demographics. • The goal of the Value-Added metric is to measure the school’s impact on student learning independent of student demographic factors. • Value-Added accounts for nine such student factors. • Controlling for these factors gives proper credit for growth to low-attainment schools and schools that serve unique populations.
How it Works • Value-Added is not a comparison to similar schools. • We do not look for a comparison group of schools that match each other on all nine student factors; such a group might not exist. • Rather, Value-Added compares the growth of students in each school to the growth of students across the District, controlling for the list of student factors. • To do this, we use a regression methodology developed in collaboration between CPS and academic experts from the University of Wisconsin. A simplified sketch of this step follows.
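For readers who want to see the mechanics, the sketch below shows the kind of district-wide regression described here. It is a minimal illustration only: the DataFrame and its column names (gain, prior_score) are hypothetical, and the actual CPS/University of Wisconsin specification (covered in Lesson 3) includes the full set of student factors.

```python
# Minimal, hypothetical sketch of the district-wide regression step.
# Column names (gain, prior_score) are illustrative; the real model
# controls for all nine student factors, not just prior performance.
import pandas as pd
import statsmodels.formula.api as smf

def fit_district_model(students: pd.DataFrame):
    """Fit average ISAT gain as a function of prior performance, district-wide."""
    # gain = 2010 ISAT score minus 2009 ISAT score, one row per student
    return smf.ols("gain ~ prior_score", data=students).fit()

# model.predict(new_data) then gives the expected gain at any prior score,
# i.e. the downward-sloping average-gain line shown on the next slide.
```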
Regression Lines Regression shows how growth relates to another variable, in this case prior performance on the ISAT. This line shows the average gain on ISAT math between 2009 and 2010 for 4th graders. It is downward-sloping because at higher levels of prior performance, average growth is smaller. Repeat the process for each grade level. [Figure: scale score gain, 2009 to 2010, plotted against 2009 ISAT score; labels: “3rd to 4th Grade ISAT Test Scores Only,” “All ISAT Math Scores for the District”]
Controlling for One Variable Gain of 4th to 5th grade students at a single school, controlling for prior performance, compared to the District average. This student grew faster than other 5th grade students with the same prior ISAT score. This student grew slower. [Figure: scale score gain, 2009 to 2010, plotted against 2009 ISAT score]
Controlling for Multiple Variables Regression allows us to control for multiple factors at one time, in this case prior performance and ELL status. Now we identify which students are English Language Learners. This line shows the average gain for all students from 4th to 5th grade. The orange line shows the average gain of non-ELL students between 4th and 5th grade. The blue line shows the average gain for ELL students between 4th and 5th grade. [Figure: scale score gain, 2009 to 2010, plotted against 2009 ISAT score]
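As a rough sketch of what “controlling for multiple factors at one time” means computationally, the example below adds an ELL indicator to the hypothetical regression from the earlier sketch. The column names are again assumptions, not the District’s actual variable names, and the real model controls for many more factors.

```python
# Hypothetical sketch: control for prior performance and ELL status together.
import pandas as pd
import statsmodels.formula.api as smf

def fit_with_ell(students: pd.DataFrame):
    # C(ell) treats the ELL flag as a categorical control, so the fitted model
    # gives separate average-gain lines for ELL and non-ELL students
    # at each level of prior performance (the two lines on this slide).
    return smf.ols("gain ~ prior_score + C(ell)", data=students).fit()

# Expected gains for an ELL and a non-ELL student with the same pretest score:
# fit_with_ell(students).predict(pd.DataFrame({"prior_score": [200, 200],
#                                              "ell": [1, 0]}))
```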
Controlling for Multiple Variables Regression allows us to control for multiple factors at one time, in this case prior performance and ELL status. Although this student grew slower than other 5th graders with the same pretest score, she grew faster than other ELL students with the same pretest score. [Figure: scale score gain, 2009 to 2010, plotted against 2009 ISAT score]
Controlling for Many Variables at Once Now we can control for other factors besides prior performance for Student A. Based on Student A’s demographics, adjustments are made. Compared to similar students district-wide, Student A has above-average gain. [Figure: scale score gain, 2009 to 2010, plotted against 2009 ISAT score]
Summary of Regression • By measuring the impact of each student factor, the regression model isolates the impact of the school on student growth. • In other words, some growth is explained by external factors. We can measure the average impact of these external factors on growth at the District level and subtract that impact from the school’s absolute growth. • The growth that is left over after removing the impact of these factors is attributed to the school. This is the value added by the school.
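Putting the pieces together, a minimal sketch of the “leftover growth” idea might look like the following. It assumes a fitted district-wide model like the earlier sketches, plus hypothetical school_id and weight columns (the weight reflects the share of the year a student spent at the school, as described under “Some Things to Know” below).

```python
# Hypothetical sketch: attribute the growth left over after controls to schools.
import pandas as pd

def school_value_added(students: pd.DataFrame, model) -> pd.Series:
    # Residual = actual gain minus the gain predicted from student factors.
    students = students.copy()
    students["residual"] = students["gain"] - model.predict(students)
    # Average residuals within each school, weighting each student by the
    # share of the year spent at that school.
    weighted_sum = (students["residual"] * students["weight"]).groupby(
        students["school_id"]).sum()
    weight_total = students["weight"].groupby(students["school_id"]).sum()
    return weighted_sum / weight_total
```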
Oak Tree Analogy • For an illustrative example of regression, view the “Oak Tree Analogy” presentation at: http://research.cps.k12.il.us/cps/accountweb/Research/ValueAdded/ • The Oak Tree presentation illustrates the Value-Added model by using an analogy of two gardeners tending to oak trees.
Some Things to Know • Tested Students • All students making normal grade progression who took the ISAT in both the previous year and the current year are included in the analysis. • Mobile Students • Mobile students count towards the Value-Added score in each school they attended, but are weighted in the analysis by the amount of time they were in the school during the year. • English Language Learners • ELL students in Program Years 0 through 5 are excluded from the analysis. • This includes students who were in PY0-5 during the pretest year, even if they have since exited the ELL program or moved to PY6. • Students with Disabilities • IEP status is differentiated by type of IEP. • For example, the impact of a severe and profound disability is considered separately from the impact of a speech and language disability.
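The inclusion rules above could be expressed roughly as the filter below. The column names (isat_2009, grade_2010, ell_program_year_2009, and so on) are illustrative guesses, not the District’s actual data layout.

```python
# Hypothetical sketch of the inclusion rules described above.
import pandas as pd

def eligible_students(students: pd.DataFrame) -> pd.DataFrame:
    # Must have taken the ISAT in both the previous and the current year.
    has_both_tests = students["isat_2009"].notna() & students["isat_2010"].notna()
    # Must be making normal grade progression (exactly one grade higher).
    normal_progression = students["grade_2010"] == students["grade_2009"] + 1
    # ELL students in Program Years 0-5 during the pretest year are excluded.
    not_excluded_ell = ~students["ell_program_year_2009"].isin(range(6))
    return students[has_both_tests & normal_progression & not_excluded_ell]
```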
Value-Added Scores Value-Added measures the difference between the growth of students at a school and the growth of similar students across the District.
Standardization of Scores • Growth on the ISAT is measured in ISAT scale score points. • However, one ISAT scale score point of growth is more difficult to obtain in some grade levels than in others. • As a result, standardization is used to ensure that all Value-Added scores are on the same scale. [Figure: number line of ISAT scale scores from 200 to 240 showing that Student A “grew” by 35 ISAT scale score points]
Standardization of Scores • Standardization is a common statistical process. In this case, it is used to convert ISAT scale score points to a standard scale. • The unit of measure is the “standard deviation,” which is a measure of distance from the mean. • i.e., how much does School A’s score deviate from the mean? • This places all scores on the same scale, allowing for more precise comparisons between scores at different grade levels.
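A minimal sketch of this standardization step, assuming gains are standardized within each grade level (consistent with the point that a scale-score point means different things in different grades); the exact scaling used in the CPS model may differ.

```python
# Hypothetical sketch: convert raw scale-score gains to standard-deviation
# units within each grade level, so scores are comparable across grades.
import pandas as pd

def standardize_by_grade(gains: pd.Series, grades: pd.Series) -> pd.Series:
    grade_mean = gains.groupby(grades).transform("mean")
    grade_sd = gains.groupby(grades).transform("std")
    # z-score: how many standard deviations a gain sits from its grade's mean
    return (gains - grade_mean) / grade_sd
```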
The Standard Scale • Features of the Standard Scale • The scale ranges from approximately -6 to 6. • Zero (0) is the District average. • About 68% of scores fall between -1 and 1. • About 95% of scores fall between -2 and 2. • About 99% of scores fall between -3 and 3. • Only about 1% of scores are less than -3 or more than 3. [Figure: normal curve with 34% of scores between 0 and 1 on each side, 13.5% between 1 and 2, and 2.5% beyond 2]
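These percentages are simply properties of the normal curve; the quick check below reproduces them.

```python
# Quick check that the shares quoted above follow from the normal curve.
from scipy.stats import norm

for k in (1, 2, 3):
    share = norm.cdf(k) - norm.cdf(-k)
    print(f"scores within +/-{k}: {share:.1%}")   # ~68.3%, ~95.4%, ~99.7%
```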
Reading the Value-Added Reports Percentile: This is the percent of scores that fall below this score. Percentiles range from 0th to 99th Value-Added Score Performance Category: This is based on the percentile. Confidence Interval: This is explained in the next set of slides. Number of Students in the calculation: This is weighted by the amount of time students were in the school between the pretest and posttest.
Confidence Intervals • The Value-Added model controls for factors that CPS can measure, but there are some factors that cannot be measured, such as: • Motivation to learn • Family circumstances • Health • In addition, the Value-Added model is a statistical estimation of the school’s impact on student learning and therefore contains a certain amount of random error. • For these reasons, the Value-Added model includes confidence intervals.
Real World Example: Political Polling • A political polling company surveys a representative random sample of 1,000 community households about whom they plan to vote for on Election Day. The question they pose is: • If the election were held today, for whom would you cast your ballot? • The responses break down as follows: • Candidate Jones would receive 54% of the vote • Candidate Smith would receive 46% of the vote • There is a +/- 3% margin of error
Confidence Intervals in Political Polling With the margin of error of +/- 3%, the range of the percentage of people who plan to vote for each candidate is as follows: • Candidate Jones would receive between 51% and 57% of the vote. • Candidate Smith would receive between 43% and 49% of the vote. [Figure: number line from 43% to 57% showing the two non-overlapping ranges] The confidence intervals do not overlap. Therefore the race is NOT “too close to call.” We can predict with a high degree of confidence that Candidate Jones will win the race.
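A quick arithmetic check of the polling example:

```python
# The two intervals from the polling example, and whether they overlap.
jones = (54 - 3, 54 + 3)   # 51% to 57%
smith = (46 - 3, 46 + 3)   # 43% to 49%
print(jones[0] <= smith[1])   # False: no overlap, so the race is not too close to call
```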
Confidence Intervals in Value-Added • A confidence interval is a range of scores around the Value-Added estimate. • We are 95% confident that the true Value-Added score falls within the confidence interval range. • The confidence interval is “n” dependent, meaning larger samples yield smaller confidence intervals. • This is because in larger samples, a score that is different from the average is less likely to be due to random error alone. Example: The Value-Added estimate is 1.0. The confidence interval is ± 0.3. The confidence interval range is from 0.7 to 1.3.
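To illustrate the “n dependent” point, the sketch below builds a 95% interval from a standard error that shrinks with the square root of the number of tested students. The student-level spread used here is an arbitrary assumption, not a CPS figure; the actual model derives its standard errors from the regression itself.

```python
# Hypothetical sketch: a 95% interval that narrows as the student count grows.
import math

def confidence_interval(estimate: float, student_sd: float, n: int):
    standard_error = student_sd / math.sqrt(n)   # shrinks with sqrt(n)
    half_width = 1.96 * standard_error           # 95% level, normal approximation
    return estimate - half_width, estimate + half_width

print(confidence_interval(1.0, 3.0, 100))   # roughly (0.4, 1.6)
print(confidence_interval(1.0, 3.0, 400))   # roughly (0.7, 1.3), as in the example
```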
Statistical Significance • If the confidence interval does not include zero, we say that the score is statistically significant, meaning we are 95% confident that the score is different from zero. • A color is associated with each score based on its statistical significance (see the report example on the next slide).
How Confidence Intervals are Reported This is how Value-Added scores are displayed in the reports. This school has a Value-Added score of -0.5 in reading (the score is ½ of a standard deviation below the mean) • The confidence interval ranges from -1.9 to 0.8 • Because the confidence interval includes zero, we say that this school is not statistically different from zero at the 95% confidence level. • For that reason, the bubble is yellow.
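The color coding follows directly from whether the confidence interval includes zero. The sketch below captures that logic; only the yellow case is confirmed by the slide above, and the labels for the other cases are assumptions.

```python
# Hypothetical sketch: significance category from the interval endpoints.
def classify(low: float, high: float) -> str:
    if low > 0:
        return "significantly above zero"           # assumed non-yellow color
    if high < 0:
        return "significantly below zero"           # assumed non-yellow color
    return "not statistically different from zero"  # yellow, per the slide above

print(classify(-1.9, 0.8))   # the reading example above -> yellow
```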
Using Value-Added Information • Performance Management • As an assessment of school performance • To identify areas needing additional support or professional development • To identify best practice strategies for improving student growth • School Accountability (i.e., Performance Policy) • Additional Compensation Plans (i.e. Chicago TAP) In all of these applications, Value-Added is used as just one additional piece of information, along with other data.
For More Information • More lessons and other resources for understanding Value-Added are available at: http://research.cps.k12.il.us/cps/accountweb/Research/ValueAdded/ • Lesson 2 (Part 2): Oak Tree Analogy • Lesson 3: Technical Specifications of the Value-Added Regression Model