
Understanding Value-Added



Presentation Transcript


  1. Understanding Value-Added Lesson 2: How Value-Added Works Office of Accountability

  2. Recap: What is Value-Added? Value-Added is the District’s measure of elementary school and teacher growth, and a nationally recognized way of measuring growth. Academic Growth = Student Learning (2012 → 2013). • Emphasizes continual student improvement • Provides information to understand what drives continual improvement

  3. Measuring Growth, Not Attainment [Chart: Year 1 vs. Year 2 scale-score distributions, 200–300.] In this school, the percent meeting state standards is 25% in both Year 1 and Year 2. Attainment is unchanged – but are students learning? Analyzing growth provides this information.
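The contrast on this slide can be sketched numerically. Everything below is a hypothetical illustration (the scores, the assumed 250-point standard, and the function names are made up, not CPS data): attainment stays at 25% across both years, yet every student gains points.

```python
# Hypothetical scores illustrating the slide: the state standard is
# assumed here to be a scale score of 250, and all numbers are made up.
STANDARD = 250

year1 = [200, 210, 220, 255]  # 1 of 4 students meets the standard
year2 = [230, 235, 240, 270]  # still 1 of 4 meets the standard

def pct_meeting(scores, cutoff=STANDARD):
    """Attainment: share of students at or above the standard."""
    return 100 * sum(s >= cutoff for s in scores) / len(scores)

def avg_growth(before, after):
    """Growth: mean scale-score gain per student."""
    return sum(b - a for a, b in zip(before, after)) / len(before)

pct_meeting(year1)        # 25.0 -- attainment unchanged...
pct_meeting(year2)        # 25.0
avg_growth(year1, year2)  # 22.5 -- ...yet every student gained points
```

Attainment alone would call this school flat; the growth measure shows real learning.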

  4. Accounting for Student Populations Student academic growth varies by grade, prior performance, and demographics. The goal of the Value-Added metric is to measure the school or teacher’s impact on student learning independent of student demographic factors. Value-Added accounts for the following student factors: [list of 10 factors shown in the original slide graphic]. Controlling for these factors gives proper credit for growth to low-attainment schools and schools that serve unique populations.

  5. How it Works • Value-Added is not a comparison to similar schools. • We do not look for a comparison group of schools that match each other on all 10 student factors…such a group might not exist. • Rather, Value-Added compares growth of students in each school to growth of students across the District, controlling for the list of student factors. • To do this, we utilize a regression methodology, developed in collaboration between CPS and academic experts from the University of Wisconsin.

  6. What is Regression? By measuring the impact of each student factor, the regression model isolates the impact of the teacher on student growth. In other words, some growth is explained by external factors. We can measure the average impact of these external factors on growth at the District level and subtract that impact from the teacher’s absolute growth. The growth that is left over after removing the impact of these factors is attributed to the teacher. This is the value added by the teacher.
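That subtraction can be sketched in a few lines. This is a minimal illustration with made-up numbers and a single external factor (prior-year score) — not the actual CPS specification, which controls for many factors — but the mechanism is the same: fit the District-wide relationship, then average each teacher's residuals.

```python
# A minimal sketch of regression-based value-added, with made-up
# numbers and one external factor (prior-year score). The real CPS
# model controls for many factors; this only shows the mechanism.

# "District-level" data: growth as a function of prior score.
district_prior  = [200, 210, 220, 230, 240, 250, 260, 270]
district_growth = [ 32,  30,  28,  26,  24,  22,  20,  18]

n = len(district_prior)
mx = sum(district_prior) / n
my = sum(district_growth) / n
# Ordinary least squares for one predictor.
slope = (sum((x - mx) * (y - my)
             for x, y in zip(district_prior, district_growth))
         / sum((x - mx) ** 2 for x in district_prior))
intercept = my - slope * mx  # expected growth given the factor alone

# One teacher's students: observed growth vs. the District expectation
# for students with the same prior scores.
teacher_prior  = [210, 230, 250]
teacher_growth = [ 33,  29,  25]

residuals = [y - (intercept + slope * x)
             for x, y in zip(teacher_prior, teacher_growth)]
value_added = sum(residuals) / len(residuals)  # leftover growth -> teacher
```

In this toy data every one of the teacher's students grows 3 points more than the District expects for a student with the same prior score, so the teacher's value-added is 3.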

  7. For More on Regression… Two other presentations on this topic are available at http://cps.edu/Pages/valueadded.aspx For an illustrative example of regression, view the “Oak Tree Analogy” presentation. This presentation illustrates the Value-Added model by using an analogy of two gardeners tending to oak trees. For technical details, view “Lesson 3: Technical Specifications of the Value-Added Regression Model”

  8. Some Things to Know
  • Tested Students: All students making normal grade progression who took the ISAT or NWEA MAP in both the previous year and the current year are included in the analysis.
  • Mobile Students: Mobile students count toward the Value-Added score of each school they attended, weighted in the analysis by the amount of time they were in the school during the year. At the teacher level, mobile students count toward the Value-Added score of each teacher who provided instruction to the student, weighted by the time they were in the school and the amount of instruction each teacher provided.
  • English Language Learners: For ISAT, ELL students in Program Years 0 through 5 are excluded from the analysis. For NWEA MAP, students with an ACCESS literacy score below 3.5 are excluded.
  • Students with Disabilities: IEP status is differentiated by type of IEP. For example, the impact of a severe and profound disability is considered separately from the impact of a speech and language disability.
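The time-weighting rule for mobile students amounts to a weighted average. A sketch with made-up numbers (the student scores and weights below are assumptions, not CPS data):

```python
# Weighting sketch for mobile students, with made-up numbers: a
# student's growth counts toward each school, weighted by the
# fraction of the year spent there.
def weighted_average(scores_and_weights):
    """Weighted mean; weights are fractions of the year enrolled."""
    total_weight = sum(w for _, w in scores_and_weights)
    return sum(s * w for s, w in scores_and_weights) / total_weight

# Two full-year students (weight 1.0) and one enrolled half the year.
students = [(0.4, 1.0), (-0.2, 1.0), (0.6, 0.5)]
school_score = weighted_average(students)  # (0.4 - 0.2 + 0.3) / 2.5 = 0.2
```

The half-year student contributes to this school's score, but only half as much as a full-year student would.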

  9. Value-Added Scores Value-Added measures the difference between the growth of students for whom a school or teacher provided instruction and the growth of similar students across the District.

  10. Standardization of Scores [Chart: Student A “grew” by 35 scale score points.] Growth is measured in scale score points (for NWEA, these are called “RIT” scores). However, one scale score point of growth is more difficult to obtain in some grade levels than others. As a result, standardization is used to ensure that all Value-Added scores are on the same scale.

  11. Standardization of Scores • Standardization is a common statistical process. In this case, it is used to convert scale score points to a standard scale. • The unit of measure is the “standard deviation” which is a measure of distance from the mean. • i.e., how much does School A’s score deviate from the mean? • This places all scores on the same scale, allowing for more precise comparisons between scores at different grade levels.
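The conversion described above is the standard z-score calculation. A sketch with hypothetical growth numbers for one grade level (the data and function name are assumptions for illustration):

```python
# Standardization sketch with hypothetical gains for one grade level:
# express each raw gain as standard deviations from the grade's mean,
# so grades with different scales become comparable.
import statistics

grade_growth = [8, 10, 12, 14, 16]  # raw scale-score gains (made up)

mean = statistics.mean(grade_growth)
sd = statistics.pstdev(grade_growth)  # population standard deviation

def standardize(x, mean, sd):
    """How many standard deviations x lies from the mean."""
    return (x - mean) / sd

standardize(12, mean, sd)  # 0.0 -- a 12-point gain is exactly average here
standardize(16, mean, sd)  # positive -- above the grade's average
```

A 12-point gain maps to 0 (the mean) for this grade, while the same 12 points could map to a different standardized value in another grade with a different mean and spread.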

  12. The Standard Scale • Features of the Standard Scale • Zero (0) is the District average. • About 68% of scores fall between -1 and 1. • About 95% of scores fall between -2 and 2. • About 99% of scores fall between -3 and 3. • Only about 1% of scores are less than -3 or more than 3. [Bell curve: 34% in each band between 0 and ±1, 13.5% between ±1 and ±2, 2.5% beyond ±2 on each side.]
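Those 68/95/99 shares are properties of the normal (bell) curve, and a quick simulation recovers them. The draws below are synthetic (a seeded random sample, not real scores):

```python
# The 68 / 95 / 99 shares quoted above are properties of the normal
# distribution; a simulation with synthetic draws recovers them.
import random

random.seed(0)  # deterministic for the example
scores = [random.gauss(0, 1) for _ in range(100_000)]

def share_within(scores, k):
    """Fraction of scores within k standard deviations of zero."""
    return sum(-k <= s <= k for s in scores) / len(scores)

share_within(scores, 1)  # close to 0.68
share_within(scores, 2)  # close to 0.95
share_within(scores, 3)  # close to 0.99 (more precisely, about 0.997)
```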

  13. Reading the Value-Added Reports (School-Level Report) Report callouts: • Percentile: the percent of scores that fall below this score; percentiles range from 0th to 99th. • Value-Added Score. • Confidence Interval: explained in the next set of slides. • Number of Students in the calculation.

  14. Confidence Intervals • The Value-Added model controls for factors that CPS can measure, but there are some factors that cannot be measured, such as: • Motivation to learn • Family circumstances • Health • In addition, the Value-Added model is a statistical estimation of the school or teacher’s impact on student learning and therefore contains a certain amount of random error. • For these reasons, the Value-Added model includes confidence intervals.

  15. Real World Example: Political Polling • A political polling company surveys a representative random sample of 1,000 community households about how they plan to vote on Election Day. The question they pose is: • “If the election were held today, for whom would you cast your ballot?” • The responses break down as follows: • Candidate Jones would receive 54% of the vote • Candidate Smith would receive 46% of the vote • There is a ±3% margin of error

  16. Confidence Intervals in Political Polling With the margin of error of ±3%, the range of the percentage of people who plan to vote for each candidate is as follows: Candidate Jones would receive between 51% and 57% of the vote. Candidate Smith would receive between 43% and 49% of the vote. The confidence intervals do not overlap; therefore the race is NOT “too close to call.” We can predict with a high degree of confidence that Candidate Jones will win the race.
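The ±3% figure is the standard 95% margin of error for a proportion near 50% in a sample of 1,000; a sketch of the arithmetic (the function name is an assumption for illustration):

```python
# Where the +/- 3% comes from: for a sample of 1,000 and a proportion
# near 50%, the 95% margin of error is about three percentage points.
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a sample proportion."""
    return z * math.sqrt(p * (1 - p) / n)

moe = margin_of_error(0.54, 1000)  # ~0.031, i.e. about 3 points

# Non-overlapping intervals -> the race is not "too close to call".
jones = (0.54 - moe, 0.54 + moe)   # roughly (0.51, 0.57)
smith = (0.46 - moe, 0.46 + moe)   # roughly (0.43, 0.49)
overlap = smith[1] >= jones[0]     # False: Jones is clearly ahead
```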

  17. Confidence Intervals in Value-Added • A confidence interval is a range of scores around the Value-Added estimate. • We are 95% confident that the true Value-Added score falls within the confidence interval range. • The confidence interval is “n” dependent, meaning larger samples yield smaller confidence intervals. • This is because in larger samples, a score that differs from the average is less likely to be due to random error alone. • Example: The Value-Added estimate is 1.0 and the confidence interval is ±0.3, so the interval ranges from 0.7 to 1.3. • The District average (0) is not in the confidence interval, so we are 95% confident that the school’s effectiveness is different from the average (above average in this example).
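The slide's 1.0 ± 0.3 example and the "n-dependent" point can be sketched together. The standard deviation and sample size below are hypothetical values chosen so the half-width comes out near 0.3 (they are not CPS parameters); the point is that the width scales with 1/√n.

```python
# Sketch of the slide's example (1.0 +/- 0.3). The sd and n are
# hypothetical, chosen so the half-width lands near 0.3; the key
# behavior is that the width scales with 1 / sqrt(n).
import math

def confidence_interval(score, sd, n, z=1.96):
    """95% confidence interval around a Value-Added estimate."""
    half_width = z * sd / math.sqrt(n)
    return (score - half_width, score + half_width)

low, high = confidence_interval(1.0, sd=1.5, n=96)  # ~ (0.7, 1.3)
zero_excluded = not (low <= 0 <= high)              # True: above average

# Quadrupling n halves the interval's width.
low4, high4 = confidence_interval(1.0, sd=1.5, n=384)
```

With four times the students, the interval is half as wide, which is why scores for larger schools carry tighter intervals.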

  18. Statistical Significance If the confidence interval does not include zero, we say that the score is statistically significant, meaning we are 95% confident that the score is different from zero. A color is associated with each score based on its statistical significance [color key shown in the original slide graphic].
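The color rule described above reduces to checking where the interval sits relative to zero. Slide 19 confirms the yellow (not significant) case; "green" and "red" below are assumed labels for the significantly-above and significantly-below cases, since the original color key graphic is not in the transcript.

```python
# A sketch of the color rule. Slide 19 confirms the yellow case;
# "green" and "red" are assumed labels for the other two cases.
def score_color(low, high):
    """Color a Value-Added score by whether its 95% CI excludes zero."""
    if low > 0:
        return "green"   # whole interval above 0: significantly above average
    if high < 0:
        return "red"     # whole interval below 0: significantly below average
    return "yellow"      # interval includes 0: not statistically significant

score_color(0.7, 1.3)    # the slide 17 example: significantly above average
score_color(-1.9, 0.8)   # the slide 19 example: yellow
```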

  19. How Confidence Intervals are Reported This is how Value-Added scores are displayed in the school-level reports. This school has a Value-Added score of -0.5 in reading (the score is ½ of a standard deviation below the mean). • The confidence interval ranges from -1.9 to 0.8. • Because the confidence interval includes zero, we say that this school is not statistically different from zero at the 95% confidence level. • For that reason, the bubble is yellow.

  20. For More Information • More lessons and other resources for understanding Value-Added are available at: • http://cps.edu/Pages/valueadded.aspx • Lesson 2 (Part 2): Oak Tree Analogy • Lesson 3: Technical Specifications of the Value-Added Regression Model
