
Performance Assessment, Rubrics, & Rating Scales


Presentation Transcript


  1. Performance Assessment, Rubrics, & Rating Scales • Trends • Definitions • Advantages & Disadvantages • Elements for Planning • Technical Concerns

  2. Types of Performance Assessments

  3. Performance Assessment Who is currently using performance assessments in their courses or programs? What are some examples of these assessment tasks?

  4. Primary Characteristics • Constructed response • Reviewed against criteria/continuum (individual or program) • Design is driven by the assessment question/decision

  5. Why on the Rise? • Accountability pressures are increasing • Educational reform has been underway • Growing dissatisfaction with traditional multiple-choice (MC) tests

  6. Exercise 1 • Locate the sample rubrics in your packet. • Working with a partner, review the different rubrics. • Describe what you like and what you find difficult about each (BE KIND).

  7. Advantages As Reported By Faculty • Clarification of goals & objectives • Narrows the gap between instruction & assessment • May enrich insights about students’ skills & abilities • Useful for assessing complex learning

  8. Advantages for Students • Opportunity for detailed feedback • Motivation for learning enhanced • Process information differently

  9. Disadvantages • Requires coordination: goals, administration, scoring, summary & reports

  10. Disadvantages • Archival/retrieval: work must remain accessible and be maintained

  11. Disadvantages • Costs: designing, scoring (training/monitoring raters), archiving

  12. Steps in Developing Performance Assessments 1. Clarify purpose/reason for assessment 2. Clarify performance 3. Design tasks 4. Design rating plan 5. Pilot/revise

  13. Steps in Developing Rubrics 1. Identify purpose/reason for rating scale 2. Define clearly what is to be rated 3. Decide which type you will use: a. Holistic or Analytic b. Generic or Task-Specific 4. Draft the rating scale and have it reviewed

  14. Recommendations

  15. Steps in Developing Rubrics (continued) 5. Pilot your assessment tasks and review 6. Apply your rating scales 7. Determine the reliability of the ratings 8. Evaluate results and revise as needed.

  16. Descriptive Rating Scales • Each rating scale point has a phrase, sentence, or even paragraph describing what is being rated. • Generally recommended over graded-category rating scales.
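
To make the idea concrete, the sketch below represents one criterion of a descriptive scale as a simple Python mapping; the criterion name and descriptor wording are hypothetical and are not taken from the workshop packet.

# Hypothetical descriptive rating scale for a single criterion ("organization").
# Each scale point carries a descriptor, not just a numeric label.
organization_scale = {
    4: "Ideas follow a clear, logical sequence; transitions guide the reader throughout.",
    3: "The sequence is mostly clear; a few transitions are missing or abrupt.",
    2: "Some sections are ordered logically, but the overall structure is hard to follow.",
    1: "No discernible organization; ideas appear in an arbitrary order.",
}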

  17. Portfolio Scoring Workshop

  18. Subject Matter Expertise Experts like Dr. Edward White join faculty in their work to refine scoring rubrics and monitor the process.

  19. Exercise 2 • Locate the University of South Florida example. • Identify the various rating strategies involved in using this form. • Identify strengths and weaknesses of this form.

  20. Common Strategy Used • Instructor assigns an individual grade for an assignment within a course. • Assignments are forwarded to the program-level assessment team. • The team randomly selects a set of assignments and rates them with a separate, program-level rating scheme.
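
A minimal sketch of the sampling step follows, assuming the team pulls a fixed-size random sample of the forwarded assignments; the file names, sample size, and seed are illustrative assumptions, not details from the slide.

import random

# Hypothetical pool of course artifacts forwarded to the program-level team.
submissions = [f"assignment_{i:03d}.pdf" for i in range(1, 121)]

random.seed(2024)                           # fixed seed keeps the draw reproducible and auditable
sample = random.sample(submissions, k=20)   # the team rescores only this subset with its own rubric
print(sample[:5])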

  21. Exercise 3 • Locate the Rose-Hulman criteria. • Select one of the criteria. • In 1-2 sentences, describe an assessment task/scenario for that criterion. • Develop rating scales for the criterion. • List traits • Describe distinctions along the continuum of ratings

  22. Example of Consistent & Inconsistent Ratings

  23. Calculating Rater Agreement (3 Raters for 2 Papers)
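
The worked numbers for this slide are in the packet; as a rough sketch of the calculation, the following computes pairwise exact and adjacent agreement for three raters scoring two papers, using made-up scores.

from itertools import combinations

# Made-up scores: Paper 1 illustrates consistent ratings, Paper 2 inconsistent ones.
ratings = {
    "Paper 1": [4, 4, 4],
    "Paper 2": [2, 4, 5],
}

def pairwise_agreement(scores, tolerance=0):
    """Proportion of rater pairs whose scores differ by at most `tolerance` points."""
    pairs = list(combinations(scores, 2))
    return sum(abs(a - b) <= tolerance for a, b in pairs) / len(pairs)

for paper, scores in ratings.items():
    exact = pairwise_agreement(scores)
    adjacent = pairwise_agreement(scores, tolerance=1)
    print(f"{paper}: exact agreement {exact:.2f}, adjacent (within 1 point) {adjacent:.2f}")

With these scores, Paper 1 shows perfect agreement while Paper 2 drops to zero exact agreement and one-third adjacent agreement, mirroring the consistent/inconsistent contrast on the previous slide.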

  24. Rater Selection and Training • Identify raters carefully. • Train raters on the purpose of the assessment and on using the rubrics appropriately. • Study rating patterns and do not retain raters who are inconsistent.

  25. Some Rating Problems • Leniency/Severity • Response set • Central tendency • Idiosyncrasy • Lack of interest

  26. Exercise 4 • Locate the Generalizability Study tables (1-4). • In reviewing Table 1, describe the plan for rating the performance. • What kinds of rating problems do you see? • In Table 2, what seems to be the biggest rating problem? • In Table 3, what seems to have more impact: additional items or additional raters?

  27. Generalizability Study (GENOVA) • G Study: identifies the sources of error (facets) in the overall design and estimates the error variance for each facet of the measurement design • D Study: estimates the reliability of ratings under the current design and projects the outcomes of alternative designs
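
As an illustration of how a D study projects reliability, here is a minimal sketch of the phi (dependability) coefficient for a simple persons-by-raters design; the variance components are assumed values, not the GENOVA estimates behind the results on the next slide.

# Phi (index of dependability) for a persons x raters design:
#   phi = var_persons / (var_persons + (var_raters + var_residual) / n_raters)
def phi(var_persons, var_raters, var_residual, n_raters):
    error = (var_raters + var_residual) / n_raters
    return var_persons / (var_persons + error)

# G study: variance components for persons, raters, and the residual
# (assumed here; GENOVA estimates these from the actual ratings).
var_p, var_r, var_res = 0.40, 0.15, 0.60

# D study: project phi for alternative numbers of raters.
for n in (1, 2, 4, 8):
    print(f"{n} rater(s): phi = {phi(var_p, var_r, var_res, n):.2f}")

With these assumed components, phi rises from about .35 with one rater to about .81 with eight, which is the kind of design trade-off a D study is meant to expose.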

  28. Summary • Interpretation: raters are using the rubric in non-systematic ways • Reliability (phi) values range from .21 to .67 for the teams, well below the desired .75 level

  29. What Research Says About Current Practice

  30. Summary • Use on the rise • Costly • Psychometrically challenging

  31. Thank you for your attention. • Deborah Moore, Assessment Specialist • 101B Alumni Gym, Office of Planning & Institutional Effectiveness • dlmoor2@email.uky.edu • 859/257-7086 • http://www.uky.edu/LexCampus/; http://www.uky.edu/OPIE/
