Measuring Learning: Where does Value Added Data fit in this Process?
How do we measure learning in Cleveland? • Formative data sources: • Student work. • Short-cycle assessments that focus on a narrow range of skills. • Benchmark tests aligned to the state's high-stakes tests, measuring a larger skill set. • All of these are used to actively shape the instructional process: a feedback loop.
Using formative assessment to shape instruction: test along the way to see whether we are "on course" toward the standard.
Using formative assessment to shape instruction: use test results to "target" instructional practices that move students toward the learning goals.
3rd Grade Benchmark Reading Test • A bridge between formative and summative assessment. • Developed by CMSD staff! • Given three times per year, providing feedback for instruction. • Tied directly to targeted, standards-based instructional materials. • Accurately identifies 92% of students who performed below proficient on the OAT (r = .82); a rough sketch of this kind of check follows.
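A minimal sketch, assuming simulated data, of how one might verify a claim like this: pair each student's benchmark score with the later OAT score, compute the correlation, and check what fraction of below-proficient students the benchmark flagged. The cut scores and data below are invented for illustration; they are not CMSD's actual values.

```python
# Hypothetical check of a benchmark test's predictive power for the OAT.
# All data are simulated; cut scores are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Simulated paired scores: fall benchmark and spring OAT.
benchmark = rng.normal(400, 25, size=500)
oat = 0.82 * (benchmark - 400) + 400 + rng.normal(0, 14, size=500)

OAT_PROFICIENT_CUT = 400   # illustrative proficiency cut score
BENCHMARK_FLAG_CUT = 400   # illustrative "at risk" threshold

r = np.corrcoef(benchmark, oat)[0, 1]

below_proficient = oat < OAT_PROFICIENT_CUT
flagged = benchmark < BENCHMARK_FLAG_CUT

# Sensitivity: share of below-proficient students the benchmark flagged.
sensitivity = (flagged & below_proficient).sum() / below_proficient.sum()
print(f"r = {r:.2f}, sensitivity = {sensitivity:.0%}")
```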
How do we measure learning in Cleveland? • Summative measures (accountability): • Proficiency testing (being phased out). • Achievement testing (fully operational in the 2007-2008 school year). • Graduation testing. • One can argue that these, too, can be used to actively shape instructional processes. • However…
What will "Value Added" analyses add to these? • A measure of change relative to students' prior levels of achievement. • Traditional criterion-referenced summative assessment measures whether or not a student has reached a threshold level of learning, irrespective of his or her starting point (a small numeric illustration follows).
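A small illustration of that point, with invented scores: a pure criterion-referenced judgment treats two students identically when both land below the threshold, even if one covered far more ground than the other.

```python
# Two invented students: same final score, very different growth.
CRITERION = 400  # illustrative proficiency threshold

students = {
    # name: (prior-year score, current score)
    "A": (395, 398),   # started near the finish line, gained 3 points
    "B": (340, 398),   # started far behind, gained 58 points
}

for name, (prior, current) in students.items():
    status = "proficient" if current >= CRITERION else "below proficient"
    print(f"Student {name}: {status}, gain = {current - prior}")
# Both students print "below proficient": the criterion view cannot
# see that B covered far more ground than A.
```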
What will "Value Added" analyses add to these? [Diagram: the finish line = the criterion.]
What will “Value Added” analyses add to these? Who has more learning “ground” to cover in order to cross the criterion finish line?
What will "Value Added" analyses add to these? • Value Added analyses take into account where different learners "start" by looking at past performance on accountability testing. • Past testing data are used in a complex statistical modeling method (a mixed model, also known as multilevel modeling or hierarchical linear modeling) to develop predicted scores for each student based on similarly scoring students (a simplified sketch follows). • Demographic variables are not used in the model.
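A greatly simplified sketch of the idea, not the actual EVAAS model (which is proprietary and considerably more elaborate): fit a mixed model that predicts the current score from prior achievement, with a random effect for the grouping unit, then read off model-based predicted scores. All names and data below are invented.

```python
# Toy value-added-style prediction with a random intercept per school.
# Simulated data; the real model is multivariate and far more complex.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 600
df = pd.DataFrame({
    "prior_score": rng.normal(400, 30, n),
    "school": rng.integers(0, 20, n).astype(str),
})
school_effect = {s: rng.normal(0, 5) for s in df["school"].unique()}
df["score"] = (0.8 * df["prior_score"] + 80
               + df["school"].map(school_effect)
               + rng.normal(0, 12, n))

# Prior achievement is the predictor; note that no demographic
# variables enter the model, matching the bullet above.
model = smf.mixedlm("score ~ prior_score", df, groups=df["school"])
result = model.fit()
df["predicted"] = result.fittedvalues   # model-based predicted scores
print(result.summary())
```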
What will "Value Added" analyses add to these? [Diagram: markers indicate each student's Value Added predicted score; the finish line = the criterion.]
What will "Value Added" analyses add to these? This student is still below proficient on the accountability measure, but he has advanced beyond his predicted level of performance. This gain beyond his predicted score would be attributed to his teacher and/or school: Value Added (see the numeric sketch below).
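Continuing the sketch above with invented numbers: the "value added" signal for a student is simply the gap between the observed score and the model's predicted score, and that gap can be positive even when the observed score remains below the proficiency cut.

```python
# Invented numbers: value added as the gap between observed and
# predicted scores. The proficiency cut (400) is illustrative.
predicted = 382.0   # model's predicted score for this student
observed = 391.0    # actual score: still below the 400 cut

gain_beyond_prediction = observed - predicted   # +9.0 -> "value added"
print(f"below proficient: {observed < 400}; "
      f"gain beyond prediction: {gain_beyond_prediction:+.1f}")
```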
What will "Value Added" analyses add to these? • As a measure of relative change, Value Added promises a means to credit teachers whose children show gains beyond what would be expected given their past performance; however… • Sanders' value-added method was developed in Tennessee, with a testing system significantly different from Ohio's.
Questions to be addressed when placing Value Added analyses in context: • As we all know, predicted scores are only as good as the information used to create them… • Value Added was developed in a system that used norm-referenced tests that were vertically scaled. What about Ohio? • Will OPT data be used in the early years of Ohio's Value Added accountability system, given the repetition of OPT items? • Will the new achievement tests have a sufficient track record for this purpose in 2007-2008? • Is it reasonable to assume that individual differences can be accounted for by past test scores? • Do the tests have sufficient "stretch" at the extremes to support inferences about extreme scores? • How, if at all, does the model account for student mobility?
More questions to be addressed when placing Value Added analyses in context: • All predicted scores have error… • Value Added analyses take this into account when interpreting gains for students, teachers, and schools. • This is a two-edged sword: on one hand, it minimizes the effect of any extreme scores; on the other, two teachers can move students, on average, the same "distance" yet receive different Value Added "scores" because of the standard errors associated with the predicted scores of their classrooms (a sketch follows). • This area will likely require close contextualization of the data in terms of the absolute gains in the class.
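A minimal sketch of that two-edged sword, with invented residual gains: two classrooms share the same mean gain beyond prediction, but the classroom with more scattered gains has a larger standard error, so it may not be credited under a rule that requires the mean to exceed, say, two standard errors. Real EVAAS reporting uses model-based shrinkage estimates, not this simple rule.

```python
# Same mean gain, different verdicts, driven entirely by standard error.
# Residual gains (observed - predicted) are invented for illustration.
import numpy as np

rng = np.random.default_rng(2)

class_a = rng.normal(3.0, 4.0, size=25)    # tightly clustered gains
class_b = rng.normal(3.0, 16.0, size=25)   # widely scattered gains
class_a += 3.0 - class_a.mean()            # force identical means
class_b += 3.0 - class_b.mean()

for name, gains in [("A", class_a), ("B", class_b)]:
    se = gains.std(ddof=1) / np.sqrt(len(gains))
    # Illustrative rule: credit "value added" only if mean > 2 SEs above 0.
    credited = gains.mean() > 2 * se
    print(f"Class {name}: mean gain {gains.mean():.1f}, SE {se:.1f}, "
          f"credited: {credited}")
```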
Conclusions • As a measure of relative change, Value Added has been promoted as a means of measuring the effect of individual teachers and schools on students' rate of learning. • It would be of particular value as a means of formative assessment; however, it has been legislated as part of Ohio's accountability system and will take effect as such in 2007-2008. • Therefore, CMSD will incorporate Value Added information into our assessment of school and District function, but these data will be interpreted in the context of other measures of learning, and • CMSD will actively work to replicate and validate the results of the Value Added analysis.