This study explores the cognitive barriers faced by eligible students on 10th grade reading assessments and examines whether item manipulations improve their performance. Items were manipulated for linguistic complexity, formatting, and vocabulary; cognitive interviews, item difficulty modeling, and distractor analysis were used to identify barriers and effective strategies.
Reducing Cognitive Load in 2% Assessments: What Works (Or Doesn’t Work) for Eligible Students? Caroline E. Parker, Sue Bechard, Joanna Gorin NCSA, Detroit, Michigan June 22, 2010
Study Summary • Using three different analysis methods (cognitive interviews, item difficulty modeling, distractor analysis), identified cognitive barriers in existing 10th grade reading items. • Manipulated 34 items from four released passages based on those results. • Pilot study examined impact of manipulations. • Developed student profiles from cognitive interviews to understand student characteristics.
I. Identified Cognitive Barriers • Working memory capacity • Limited executive functioning (ordering, organizing) • Inability to identify important information from text • Challenging vocabulary • Inability to draw inferences • Inappropriate use of prior/outside knowledge
II. Item Manipulation Process: Following an item from start to finish
Example: Original Item The author’s “difficulty” (line 1) was caused primarily by the • long distances that had to be traveled • unanticipated changes in the project • refusal to question some widespread assumptions • cultural limitations that hindered communication • challenge of mastering a new musical form
Manipulations applied: open to closed stem • vocabulary • move distractor and change ‘travel’ to ‘walking’
Revised: In line 1, what causes the author to have “difficulty” describing why he traveled to West Africa? • His need to master a new musical form • The unexpected changes in the project • His refusal to question some common beliefs • The cultural differences and language barriers • His exhaustion from walking long distances
Original: The author’s “difficulty” (line 1) was caused primarily by the • long distances that had to be traveled • unanticipated changes in the project • refusal to question some widespread assumptions • cultural limitations that hindered communication • challenge of mastering a new musical form
III. Impact of Manipulations (pilot study) Do item manipulations minimize the identified cognitive barriers for students with disabilities? For this presentation, we focus on the pilot study, though the cognitive interview study also analyzed the item manipulations.
p-values for ANOVAs – Effects of Manipulation and Design Features on Item Difficulty (table of results not reproduced)
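As a rough, hypothetical sketch of the kind of analysis summarized in that table, the Python snippet below fits a two-way ANOVA of item difficulty on Manipulation and Design Feature with statsmodels. The data and column names are invented, and for simplicity both factors are treated as between-item effects rather than the mixed within-/between-subjects design listed on the next slide.

```python
# Illustrative two-way ANOVA of item difficulty (proportion correct per item)
# on Manipulation and Design Feature. Hypothetical data and column names;
# the actual study used Manipulation as a within-subjects factor.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Each row: one item under one condition, with its observed difficulty.
items = pd.DataFrame({
    "manipulation":   ["original", "revised"] * 4,
    "design_feature": ["linguistic", "linguistic", "format", "format"] * 2,
    "difficulty":     [0.42, 0.55, 0.38, 0.44, 0.47, 0.60, 0.35, 0.41],
})

# Fit a linear model with both factors and their interaction,
# then produce the ANOVA table (Type II sums of squares).
model = ols("difficulty ~ C(manipulation) * C(design_feature)", data=items).fit()
anova_table = sm.stats.anova_lm(model, typ=2)
print(anova_table)  # F statistics and p-values for each effect
```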
All Analyses Conducted
• Analysis of Item Difficulty
  • Individual item difficulties: chi-square tests of independence across Manipulation
  • Mean item difficulty: two-way ANOVA with Manipulation (within-subjects factor) and Design Feature (between-subjects factor)
• Analysis of Student Scores
  • Two-way ANOVAs (separate for each passage): Manipulation and Previous Reading Performance Level
  • Two-way ANOVAs (separate for each passage): Manipulation and Order of Passage
  • Performance of students previously at or below chance: distractor analysis for items at or below chance
• Item Order Effects
• Analysis of Missing Data
• Point Biserials
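For readers unfamiliar with the item-level statistics named above, the hedged sketch below shows how individual item difficulties (p-values), a chi-square test of independence across Manipulation, and a point-biserial correlation might be computed with scipy on invented 0/1 response data. It is not the study's actual analysis code, and all variable names are hypothetical.

```python
# Hypothetical item-level analyses: item difficulty (p-value),
# chi-square test of independence across Manipulation, and the
# point-biserial correlation between an item and the total score.
import numpy as np
from scipy.stats import chi2_contingency, pointbiserialr

# 0/1 scores for one item under the original and revised versions
# (invented data for illustration).
original_scores = np.array([1, 0, 0, 1, 0, 1, 0, 0, 1, 0])
revised_scores  = np.array([1, 1, 0, 1, 1, 1, 0, 1, 1, 0])

# Item difficulty = proportion correct (classical p-value).
print(f"p-value original: {original_scores.mean():.2f}, revised: {revised_scores.mean():.2f}")

# Chi-square test of independence: is correct/incorrect independent
# of which version of the item a student saw?
contingency = np.array([
    [original_scores.sum(), len(original_scores) - original_scores.sum()],
    [revised_scores.sum(),  len(revised_scores) - revised_scores.sum()],
])
chi2, p, dof, _ = chi2_contingency(contingency)
print(f"chi-square = {chi2:.2f}, p = {p:.3f}")

# Point-biserial: correlation between the dichotomous item score and
# each student's total test score (a rough discrimination index).
total_scores = np.array([18, 22, 9, 25, 20, 24, 11, 19, 23, 10])
r_pb, p_rpb = pointbiserialr(revised_scores, total_scores)
print(f"point-biserial r = {r_pb:.2f} (p = {p_rpb:.3f})")
```

A distractor analysis would extend this by tabulating how often each incorrect option was chosen, particularly for items on which performance was at or below chance.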
Conclusions from All Pilot Study Analyses • Items from the Blues passage were more difficult under all conditions. • Linguistic manipulations (with or without Format modification) had the most positive effect on student scores/item difficulty. • Closing the stem of items was associated with positive change in student scores/item difficulty. • The effects of item manipulations did not interact with prior reading ability (i.e., Proficiency Levels). • Students perform better/items are easier when passages come earlier in the test.
Pilot Study Conclusions (2) • Based on the Cognitive Modeling, it appears that many of the properties of items that make them difficult for the 2% population are the same as for the general population. • Manipulating items in expected linguistic ways will change the difficulty level of the items. • We don’t know enough about what specific manipulations need to be made, though the most conclusive evidence seems to be for the closing of the item stems. • Modifying existing tests may be too constrained to allow for the development of the type of test needed for this population.
Cognitive Interview Conclusions for Sample Item • Restructuring the passage can address short-term memory issues • Changing complex vocabulary words seems to increase accessibility and separates students who just identify key words for matching from those who understand the passage • In Round 2, students demonstrated greater understanding of the item in their cognitive interview responses
IV. Student Profiles From the target population of the study (all students with disabilities who did not reach proficiency on the high school reading assessment), how can we develop criteria for identifying students eligible for the AA-MAS?
Richard: Opportunity to learn? • Mismatch between being almost proficient on the state reading assessment and being placed in life skills classes • Speech disability limited communication during the interview • Successful on the cognitive interview (56%) and on the pilot, but did not demonstrate metacognitive skills • When prompted, he said he chose answers because of “the way it sounds to me”
Tom: Mixed performance, shy, easily distracted • Largest difference between state assessment score and cognitive interviews (and pilot) • Described passions that demonstrate literacy skills but not in accepted contexts (graphic novels) • Opportunity to learn? In special education and some very basic non-special education English
Emma: Fairly consistent low performance • Not successful in any of the test-taking contexts (slight improvement in the pilot) • Found it challenging to describe her thinking process in the cognitive interview • Placed in special education classes for English • We didn’t find evidence that an ART 2% AA-MAS would help her demonstrate more knowledge/skills/abilities than the regular assessment
Student Profile Conclusions • The 57 students varied in assessment scores, learning settings, and ability to engage in cognitive interviews • 75% of Round 2 students varied in their scores across assessment types (state vs. pilot vs. cognitive interview) • Students in special education settings may not be receiving grade-level instruction • Some students showed no change in demonstration of knowledge/skills/abilities in the cognitive interview setting • Some students found the think-aloud strategy helpful as a learning strategy • Some students responded positively to human contact during test-taking • More successful students could: • Demonstrate metacognitive understanding • Use inference appropriately • Understand vocabulary • Filter out irrelevant prior knowledge • Identify important information
Overall Study Conclusion/Policy Implications • There are students who are not being measured well in the current assessment system • It is very difficult to isolate who those students are • Assessment development MUST acknowledge the difficulty of assessing some students • Access to the curriculum is still an issue • Paper-and-pencil changes aren’t enough, but tests can still be improved