Assessment Literacy and Performance-Based Assessments Jennifer Borgioli Learner-Centered Initiatives, Ltd.
Organizational Focus Assessment to produce learning… and not just measure learning.
“Less than 20% of teacher preparation programs contain higher level or advanced courses in psychometrics (assessment design) or instructional data analysis.” Inside Higher Education, April 2009
Do you honestly want to know what X exactly is? Is your life going to be improved by momentarily knowing what X is? No. Absolutely not. This whole problem is a conspiracy against hardworking American students. Let me tell you, solving for X right now is not going to stop the recession. In fact, it’s not going to do anything. And another thing. When have you ever had to know what X is in your long, esteemed professional career? Exactly. This is a futile attempt for “educators” in this district to boast of their students’ success rate. I am going to go the rest of my life not knowing what X is. Because what is X when you really think about it? A letter, the spot, two lines crossing each other. I don’t think anyone will ever really know what X truly is because the essence of X is beyond our brain potential. In conclusion, Harry S. Truman’s middle name was just the letter S, not an actual name. Now that is a letter that’s actually being utilized. See, you learned something, and it was not because of this logarithm. The End.
Implications Minimize interruptions. Make them worthy.
“The higher the stakes of an assessment’s results, the higher the expectation for the documentation supporting the assessment design and the decisions made based on the assessment results.”
Assessment Definition: The strategic collection of evidence of student learning (Martin-Kniep, 2005). Analogy: assessment is to test as dog is to pit bull. Assessment is both a thing and a process.
Traditional Assessment vs. Performance-Based Assessment
Performance-Based Assessments (PBAs) A performance task is an assessment that requires students to demonstrate achievement by producing an extended written or spoken answer, by engaging in group or individual activities, or by creating a specific product. (Nitko, 2001)
Performance vis-à-vis traditional: Liskin-Gasparro (1997) and Mueller (2008)
How do we ensure alignment and validity in assessment? Degrees of Alignment
If you want to assess your students’ ability to perform, design, apply, interpret . . . then assess them with a performance or product task that requires them to perform, design, apply, or interpret.
I cannot claim my assessment is valid if I do not have some type of articulated test map
How many? 3–5: 3–5 standards in a PBA (reflected in rows in the rubric); 3–5 items per standard on a traditional test.
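To make the test-map guideline concrete, here is a minimal sketch (not from the original deck; the standard codes and item numbers are invented for illustration) that tags each item on a traditional test with a standard and flags any standard falling outside the 3–5 item range.

```python
# Hypothetical test map: each item number is tagged with the standard it assesses.
from collections import Counter

test_map = {
    1: "RL.8.1",  2: "RL.8.1",  3: "RL.8.1",
    4: "RL.8.2",  5: "RL.8.2",  6: "RL.8.2",  7: "RL.8.2",
    8: "W.8.4",   9: "W.8.4",  10: "W.8.4",
}  # item number -> standard code (illustrative labels only)

items_per_standard = Counter(test_map.values())
for standard, count in sorted(items_per_standard.items()):
    flag = "" if 3 <= count <= 5 else "  <-- outside the 3-5 guideline"
    print(f"{standard}: {count} item(s){flag}")
```

Even a simple map like this is what makes the later claims about alignment and validity answerable.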
I cannot claim my assessment is reliable if I do not have statistics to support my claim
Reliability Indication of how consistently an assessment measures its intended target and the extent to which scores are relatively free of error. Low reliability means that scores cannot be trusted for decision making. Necessary but not sufficient condition to ensure validity.
Three general ways to collect evidence of reliability. Stability: How consistent are the results of an assessment when given at two time-separated occasions? Alternate Form: How consistent are the results of an assessment when given in two different forms? Internal Consistency: How consistently do the test’s items function?
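As one illustration of the internal-consistency evidence above, the sketch below computes Cronbach’s alpha, a common internal-consistency statistic, from a students-by-items score matrix. This is a minimal example, not part of the original deck; the toy scores are invented.

```python
# Minimal Cronbach's alpha sketch: rows are students, columns are items.
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)"""
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of students' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Toy data: 5 students x 4 items scored 0/1
scores = np.array([
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [0, 1, 1, 1],
    [1, 0, 1, 1],
    [0, 0, 0, 0],
])
print(round(cronbach_alpha(scores), 2))
```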
Three Types of Measurement Error: subject effects, test effects, environmental effects
Others: fatigue, sleep deprivation, illness, disability
[Chart: Testing Fatigue and Test Familiarity Bias, each plotted against Score]
Examples: not enough space for a response, confusing items, typos, misleading (or lacking) directions, scorer inconsistencies
10. Format the item vertically instead of horizontally. From A Review of Multiple-Choice Item-Writing Guidelines for Classroom Assessment by Haladyna, Downing, and Rodriguez
21. Place choices in logical or numerical order. Students should not have to hunt to find an answer. Answers should be provided in a logical, predictable pattern.
Final Eyes isn’t about editing; rather, it asks, “Is this what you want the students to see/read?”
From Haladyna: 26. Avoid All-of-the-above. 28. Avoid giving clues to the right answer, such as specific determiners including always, never, completely, and absolutely
Develop Test Maps and Item Analysis Procedures
• The higher the stakes of an assessment, the more we need to play by the rules.
• If it’s a mid-term or final exam, there should be a test map.
• Consider also: item analysis (see the sketch below) and using choice E (primarily for pre-assessments).
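The item-analysis bullet above could look something like this in practice. It is a sketch under the assumption that items are scored 0/1, using item difficulty (proportion correct) and an item-rest correlation as a discrimination index; the score matrix is invented.

```python
# Basic item analysis for a 0/1-scored item matrix: rows = students, cols = items.
import numpy as np

def item_analysis(scores: np.ndarray) -> None:
    totals = scores.sum(axis=1)
    for i in range(scores.shape[1]):
        difficulty = scores[:, i].mean()                        # proportion of students answering correctly
        rest = totals - scores[:, i]                            # total score excluding this item
        discrimination = np.corrcoef(scores[:, i], rest)[0, 1]  # item-rest correlation
        print(f"Item {i + 1}: difficulty={difficulty:.2f}, discrimination={discrimination:.2f}")

# Toy data: 6 students x 4 items
scores = np.array([
    [1, 1, 1, 1],
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 0],
    [0, 0, 0, 0],
])
item_analysis(scores)
```

Items with very low difficulty or near-zero (or negative) discrimination are natural candidates for revision during a Final Eyes review.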
Engage in peer review: “Final Eyes”
• Is each item aligned to a standard?*
• Is each item rigorous?
• Is each item fair?
• Does each item have one, unambiguous correct key?*
• Are all answer choices plausible and text-based?
• Are all tasks meaningful, and do they build on student comprehension?
*Very hard to answer without a test map