Assessment Literacy Series -Module 4- Scoring Keys and Rubrics
Objectives Participants will: • Develop scoring keys for all multiple-choice items outlined within the blueprint. • Develop scoring rubrics for constructed-response items/tasks that reflect a performance continuum.
Helpful Tools Participants may wish to reference the following: Guides • Handout #6 – Scoring Key Example • Handout #7 – Rubric Examples Templates • Template #5 – Scoring Key & Rubrics • Performance Task Framework
Outline of Module 4 Module 4: Scoring Keys and Rubrics • Scoring Key for MC: Process Steps • Rubrics for Constructed Response: Sample Answers, Scoring Criteria
Scoring Keys Scoring Keys typically contain elements such as the following: • Performance measure name or unique identifier • Grade/Course • Administration timeframe (e.g., fall, mid-term, final examination, spring, etc.) • Item tag and type • Maximum points possible • Correct answer (MC) or rubric with sample answers or anchor papers (SCR & ECR)
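The elements above map naturally to one record per item. As a minimal sketch (illustrative only; the field names are assumptions, not part of the module templates), a scoring key entry could be represented like this:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ScoringKeyEntry:
    """One row of a scoring key; all field names are illustrative."""
    measure_id: str                        # performance measure name or unique identifier
    grade_course: str                      # e.g., "Grade 7 Mathematics"
    admin_window: str                      # e.g., "fall", "mid-term", "spring"
    item_number: int
    item_tag: str                          # blueprint tag the item measures
    item_type: str                         # "MC", "SCR", or "ECR"
    max_points: int
    correct_answer: Optional[str] = None   # MC items only
    rubric_ref: Optional[str] = None       # SCR/ECR: pointer to rubric and sample answers
```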
Scoring Keys Scoring keys for MC items: • Answers within the key must represent a single, correct response. • Answers should be validated once the key is developed, prior to form review, to catch human error. • Items changed during the review stage must be revalidated to ensure the scoring key remains correct.
Process Steps [Template #4] • Enter the assessment information at the top of the Scoring Key. • Record the single, correct answer during item development. For SCR and ECR items/tasks, reference the scoring rubric in the answer column and place it in the correct rubric table on the Rubric Template. • Record the item number, item tag, item type, and point value. • Record the MC answers in the answer column. For each CR item, include the general scoring rubric and a sample response for each point value. • Repeat Steps 1-4 until all items/tasks on the blueprint are reflected within the Scoring Key.
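Reusing the illustrative ScoringKeyEntry sketch above, Steps 2-4 amount to filling in one entry per item until the blueprint is covered (all values below are hypothetical):

```python
scoring_key = [
    ScoringKeyEntry(measure_id="MATH7-PM1", grade_course="Grade 7 Mathematics",
                    admin_window="fall", item_number=1, item_tag="7.G.1",
                    item_type="MC", max_points=1, correct_answer="C"),
    ScoringKeyEntry(measure_id="MATH7-PM1", grade_course="Grade 7 Mathematics",
                    admin_window="fall", item_number=2, item_tag="7.G.2",
                    item_type="SCR", max_points=2, rubric_ref="Rubric Table 1"),
]
```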
QA Checklist • All items/tasks articulated on the blueprint are represented within the Scoring Key. • MC items have been validated to ensure that only one correct answer exists among the options provided. • MC answers do not create a discernible pattern. • MC answers are “balanced” among the possible options. • Scoring Key answers are revalidated after the final operational form reviews are complete.
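The pattern and balance checks lend themselves to a quick automated pass. The sketch below is illustrative only; the run-length threshold and option labels are assumptions:

```python
from collections import Counter

def qa_check_mc_key(answers, options=("A", "B", "C", "D"), max_run=3):
    """Warn about discernible patterns and unbalanced options in an MC key."""
    warnings = []
    # Pattern check: flag runs of the same keyed answer longer than max_run.
    run_char, run_len = None, 0
    for i, a in enumerate(answers, start=1):
        run_len = run_len + 1 if a == run_char else 1
        run_char = a
        if run_len > max_run:
            warnings.append(f"Items {i - run_len + 1}-{i}: '{a}' keyed {run_len} times in a row")
    # Balance check: each option should appear roughly len(answers)/len(options) times.
    counts, expected = Counter(answers), len(answers) / len(options)
    for opt in options:
        if abs(counts[opt] - expected) > expected / 2:
            warnings.append(f"Option '{opt}' keyed {counts[opt]} times (expected about {expected:.0f})")
    return warnings

print(qa_check_mc_key(["A", "A", "A", "A", "B", "C", "D", "B"]))
```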
Holistic vs. Analytic Rubric Scoring Holistic Scoring • Provides a single score based on an overall determination of the student’s performance • Assesses a student’s response as a whole for overall quality • Most difficult approach to calibrate across different raters Analytic Scoring • Identifies and assesses specific aspects (dimensions) of a response • Assigns a score to each dimension • Combines the subscores logically into the overall assigned score
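To make the analytic approach concrete, the sketch below combines per-dimension subscores into an overall score with a weighted sum; the dimensions and weights are hypothetical, and a simple unweighted total is an equally valid choice:

```python
def analytic_score(subscores, weights):
    """Combine per-dimension rubric subscores into one overall score."""
    assert set(subscores) == set(weights), "every dimension needs a weight"
    return sum(subscores[d] * weights[d] for d in subscores)

# Hypothetical dimensions, each scored 0-4 on an analytic rubric:
subscores = {"content": 3, "organization": 4, "conventions": 2}
weights = {"content": 0.5, "organization": 0.3, "conventions": 0.2}
print(analytic_score(subscores, weights))  # 3.1
```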
Rubric Scoring Considerations • Describe whether spelling and/or grammar will impact the final score. • Avoid using words like “many”, “some”, and “few” without adding numeric descriptors to quantify these terms. • Avoid using words that are subjective, such as “creativity” or “effort”. • Avoid subjective adjectives such as “excellent” or “inadequate”.
Rubrics for ECR Tasks • Create content-based descriptions of the expected answer for each level of performance on the rubric. • Provide an example of a fully complete/correct response along with examples of partially correct responses. • Reference the item expectations in the rubric. • Make the rubric as clear and concise as possible so that different scorers would assign exact or adjacent scores to the same performance/work.
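The exact/adjacent standard can be checked empirically once two scorers have rated the same set of responses. A minimal sketch, with illustrative data:

```python
def agreement_rates(rater1, rater2):
    """Return (exact, exact-or-adjacent) agreement rates for two raters."""
    assert len(rater1) == len(rater2) and rater1, "need paired scores"
    n = len(rater1)
    exact = sum(a == b for a, b in zip(rater1, rater2)) / n
    adjacent = sum(abs(a - b) <= 1 for a, b in zip(rater1, rater2)) / n
    return exact, adjacent

exact, adj = agreement_rates([4, 3, 2, 4, 1], [4, 2, 2, 3, 3])
print(f"exact: {exact:.0%}, exact/adjacent: {adj:.0%}")  # exact: 40%, exact/adjacent: 80%
```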
Process Steps [Template #4] • Create the item/task description for the student. • Using a “generic” rubric, begin by replacing the generic language with the specific criteria a response must meet to earn the maximum number of points. • Next, determine how much a response can deviate from “fully correct” and still earn the next (lower) point value. [Continue until the full range of possible scores is described.] • Using the “sample” rubric, create an example of a correct or possible answer for each level in the rubric. • In review, ensure the item/task description for the student, the scoring rubric, and the sample rubric are aligned.
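The finished scoring and sample rubrics can be read as a single mapping from point value to criteria and sample response, which makes the alignment check in the last step straightforward. Everything below is hypothetical content for a 0-2 point SCR item:

```python
# Hypothetical 0-2 point SCR rubric pairing criteria with a sample answer per level.
rubric = {
    2: {"criteria": "Correct answer with complete supporting work shown",
        "sample": "Area = 6 x 4 = 24 square units, with a labeled diagram"},
    1: {"criteria": "Correct answer with incomplete work, or a single minor computational error",
        "sample": "Area = 24 (no work shown)"},
    0: {"criteria": "Incorrect answer with no evidence of a valid strategy",
        "sample": "Area = 10"},
}

for points in sorted(rubric, reverse=True):
    print(f"{points} point(s): {rubric[points]['criteria']}")
```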
QA Checklist • CR items/tasks have scoring rubrics that reflect a performance continuum. • CR items/tasks include sample responses for each level of performance. • CR scoring rubrics are clear and concise. • CR scoring rubrics include all dimensions (aspects) of the tasks presented to the students. • CR scoring rubrics avoid non-cognitive attributes (motivation, timeliness, etc.) and content-irrelevant attributes.
Summary & Next Steps Summary Module 4: Scoring Keys & Rubrics • Developed a scoring key and rubrics for all items. Next Steps Module 5: Operational Forms & Administrative Guidelines • Given the items/tasks developed, create an operational form with applicable administrative guidelines.