
Fairness, Accuracy, & Consistency in Assessment


Presentation Transcript


  1. Fairness, Accuracy, & Consistency in Assessment
     NCATE’s recommendations to reduce bias and ensure fairness to students

  2. Fairness
     “Assess what’s been taught”
     Candidates should be aware of the knowledge, skills, and dispositions that are measured in the assessments.
     • Was it taught?
       • Curriculum map – shows where students learn and practice what is assessed
     • Do students understand the expectations?
       • Syllabi containing instructions and the timing of assessments
       • Rubrics/scoring guides shared with students

  3. Fairness
     According to these guidelines, are your program’s assessments FAIR?
     • Is your curriculum map up to date?
     • Does the curriculum map link to all professional standards and to GSE standards and dispositions?
     • Does the curriculum map indicate any gaps?
     • Do syllabi indicate the timing of assessments?
     • Are rubrics/scoring guides shared with students when the work is assigned?

  4. Accuracy
     “Assessments measure what they say they measure”
     Assessments should be aligned with the standards and proficiencies they are designed to measure.
     • Are assessments aligned with standards?
       • Content match
       • Complexity match
       • Appropriate degree of difficulty
     • Is there corroborating evidence?
     • Is there field input on the assessment?

  5. Accuracy
     According to these guidelines, are your program’s assessments ACCURATE?
     • Do your program’s assessment types (test, observation) match what is being assessed?
       • Dispositions = observations
       • Skills = performance assessment
     • Which of your assessments “validate” (relate to) other assessments?
       • Example: a work sample that relates to a lesson plan assignment
     • Have you had input from working professionals?

  6. Consistency
     “Assessments produce dependable, trustworthy results”
     Assessment results should be reliably consistent regardless of time and rater.
     • Are scoring tools sufficiently descriptive?
       • Language that differentiates between components and between performance levels
       • Clear descriptors that promote accurate scoring
       • Students can tell from the scoring tool why they were rated at a certain level
     • Are raters trained?
       • Agreement on what “a 3 in lesson planning” looks like
       • Raters understand the consequences of final scores
       • Programs have a plan to support and address students with insufficient performance, which alleviates rater pressure

  7. Consistency
     According to these guidelines, are your program’s assessments CONSISTENT?
     • Does your program consistently use descriptive rubrics for major assignments and performance observations?
     • Does your program have regular rater training?
     • Has your program engaged in rubric calibration and moderation activities?
     • Are raters aware of how their scores affect the student?
     • Are raters aware of how their scores contribute to program review?
     • What is the plan to support struggling students?

  8. Avoiding Bias
     “Removing contextual and cultural bias from assessments”
     Both the assessment itself and the assessment context should be analyzed for factors that could affect performance.
     • Are assessments administered clearly and in the proper environment?
       • Location/equipment
       • Clear instructions/questions
     • Have assessments been reviewed for bias?
       • Racial/ethnic/cultural stereotypes
       • Disability resource center review
       • Assignments that favor one group over another

  9. Avoiding Bias
     According to these guidelines, are your program’s assessments BIAS-FREE?
     • Have all key assessments been reviewed for clarity of expectations?
     • Have all your assignments been reviewed for accessibility?
     • Have all your assignments been scrutinized for cultural bias and stereotypes?
     • Has your program analyzed student outcomes by sub-group to determine whether consistent scoring bias exists? (A minimal sketch of such a comparison follows this slide.)
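A minimal sketch, in Python, of the sub-group comparison suggested in the last bullet above. The file name and column names (subgroup, rubric_score) are assumptions for illustration only, not part of the NCATE or GSE materials, and a real review would also weigh sample sizes and use appropriate statistical tests.

```python
# Hypothetical sketch: compare rubric scores across sub-groups to flag
# possible scoring bias. The CSV file and column names are assumed.
import pandas as pd

# One row per scored student (hypothetical export from the assessment system)
scores = pd.read_csv("key_assessment_scores.csv")

summary = (
    scores.groupby("subgroup")["rubric_score"]
          .agg(["count", "mean", "std"])
          .round(2)
)

# A gap between a sub-group mean and the overall mean does not prove bias
# by itself, but it shows where to look more closely (rater patterns,
# assignment design, accessibility).
summary["gap_from_overall"] = (summary["mean"] - scores["rubric_score"].mean()).round(2)
print(summary)
```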

  10. GSE Rubric Guidelines
      Developed by the Assessment Committee, 2010
      • Four levels of competency: Unsatisfactory, Emerging, Proficient, Exemplary
      • Levels appear in ascending order from left to right
      • If numbers are used: 1–4 from left to right

  11. Rubric Moderation
      • A process for strengthening consistency
      • Develops inter-rater reliability through shared examination and discussion of student work
      • Involves all or many raters
      • Process:
        • Recruit raters for a two-hour session
        • Provide 4 samples of student work
        • Provide the rubric and have raters score each sample
        • Discuss as a group why the raters chose the scores they did
        • Debrief about what was learned and what remains unanswered

  12. Rubric Calibration
      • A process for strengthening consistency
      • Develops inter-rater reliability by setting expectations of what the scores mean regarding student work (a simple agreement check is sketched after this slide)
      • Involves all or many raters
      • Process:
        • Recruit raters for a two-hour session
        • Provide 4 pre-scored samples of work (anchor papers), including at least one each at a low, medium, and high level of performance
        • Discuss rubric areas and expectations for each level and component before scoring begins
        • Provide the rubric and have raters score each sample with that discussion in mind
        • Have raters compare the scores they assigned to the anchor papers
        • Debrief about what was learned and what remains unanswered
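Both moderation and calibration aim at inter-rater reliability, which can be checked with a simple agreement calculation after a session. Below is a minimal Python sketch assuming a 1–4 rubric scale and two raters scoring the same four anchor papers; the scores are made up, and exact-agreement percentage is only one possible measure (Cohen's kappa or adjacent agreement are common alternatives).

```python
# Hypothetical sketch: exact and adjacent agreement between two raters who
# scored the same four anchor papers on a 1-4 rubric. Scores are illustrative.
rater_a = [2, 3, 3, 4]
rater_b = [2, 3, 2, 4]

pairs = list(zip(rater_a, rater_b))
exact = sum(a == b for a, b in pairs) / len(pairs)              # identical scores
adjacent = sum(abs(a - b) <= 1 for a, b in pairs) / len(pairs)  # within one level

print(f"Exact agreement:    {exact:.0%}")
print(f"Adjacent agreement: {adjacent:.0%}")

# Low exact agreement on anchor papers is a signal to revisit the rubric
# descriptors or hold another calibration discussion before live scoring.
```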

  13. Next Steps
      • How is your program doing in providing fair, accurate, consistent, bias-free assessments to students?
      • What work needs to be done in your program to ensure quality assessments are used?
      • What do you need to be able to accomplish this?
