
Building Rubrics For Large-scale, Campus-wide Assessment Afternoon Session

Presentation Transcript


  1. Building Rubrics For Large-scale, Campus-wide Assessment – Afternoon Session • Thomas W. Zane tom.zane@slcc.edu • Diane L. Johnson djohnson@new.edu • Jodi Robison jrobison@new.edu

  2. Afternoon Session Agenda • Build a critical thinking rubric designed for adjuncts who teach a GE course. • Build a written communications literacy rubric designed for a group of senior portfolio reviewers. • Build a rubric based on the generic criteria method. • Q&A and Wrap-up.

  3. Afternoon Session Assumptions • You are tasked to take the lead on a large-scale assessment project at your institution. • You face two big issues: • What are the steps for building the scoring rubrics? • How could you guide faculty through the process?

  4. Case #1: Critical Thinking • Our institution has decided to measure critical thinking across the curriculum. Together we form the committee charged with building a way to measure this college-wide outcome.

  5. Critical Thinking Rubric – Early Decisions • Purpose – Measure critical thinking ability based on student submissions in MANY types of courses. • Target of Measurement – Ability level as perceived from written submissions. • Define Critical Thinking – Critical thinking is the conscious and deliberate use of thinking skills and strategies to guide what to think, believe, or do.

  6. Critical Thinking Rubric – Search for Criteria – • Found thousands of articles and rubrics. • Identified primary sources. • Collected criteria. • Defined the criteria.

  7. Critical Thinking – Define the Criteria

  8. Critical Thinking Rubric – Design The Scale • CT is a human ability that increases in cognitive demand • So…build a scale that reflects this! • Four points (standard for our online systems). • Define the scale points.

  9. Scale Definitions • Determine what each score level means to you. • Define the level as best you can.
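
As a concrete illustration of slides 8–9, the sketch below models a four-point scale whose levels carry explicit written definitions. This is not the workshop's rubric; the level names (CTLevel, EMERGING through ADVANCED) and the wording of each definition are illustrative assumptions.

```python
# A minimal sketch of a four-point critical thinking scale with defined levels.
# The level names and definition wording are assumptions for illustration only.
from enum import IntEnum

class CTLevel(IntEnum):
    EMERGING = 1
    DEVELOPING = 2
    PROFICIENT = 3
    ADVANCED = 4

SCALE_DEFINITIONS = {
    CTLevel.EMERGING: "Little or no evidence of deliberate thinking strategies.",
    CTLevel.DEVELOPING: "Some strategies are used, but inconsistently.",
    CTLevel.PROFICIENT: "Strategies are used consciously and appropriately.",
    CTLevel.ADVANCED: "Strategies are used flexibly and are explicitly justified.",
}

if __name__ == "__main__":
    for level, meaning in SCALE_DEFINITIONS.items():
        print(f"{int(level)} - {level.name.title()}: {meaning}")
```

Writing each definition down before scoring begins is what makes a score point mean the same thing to every rater.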

  10. Critical Thinking – Select the Specific Aspects for Each Row • Each of the six areas of critical thinking is still too broad. • We need to break them down into smaller aspects. • We went back to the Internet to find rows in rubrics that fit our definitions.

  11. Created a sample of rows for faculty to use. • Used the same six categories to allow for aggregation of data across the campus. • Split each of the six into different sorts of approaches to fit various disciplines. • Promised to come to the department if a row could not be found. (None have requested this.)
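
The aggregation point in the slide above can be shown with a small sketch: because every department scores against the same six categories, per-category results can be pooled campus-wide. The department names, category labels, and scores below are hypothetical.

```python
# Sketch: pooling 1-4 scores by shared rubric category across departments.
# All data below is hypothetical.
from collections import defaultdict
from statistics import mean

# Each record: (department, shared rubric category, score on the 1-4 scale)
scores = [
    ("Biology",   "Analysis", 3),
    ("Biology",   "Evidence", 2),
    ("History",   "Analysis", 4),
    ("History",   "Evidence", 3),
    ("Sociology", "Analysis", 2),
]

by_category = defaultdict(list)
for _department, category, score in scores:
    by_category[category].append(score)

for category, values in sorted(by_category.items()):
    print(f"{category}: mean = {mean(values):.2f} (n = {len(values)})")
```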

  12. Sample (starting point) Rows

  13. Build Your Rubric • Assume you are a department assessment coordinator now. • Assume our signature assignment is to write a paper about off-roading equipment. (Assignment is in the workbook.) • Review Appendix F to select a row for each of the six categories.

  14. Localize the Descriptors • Add assignment-specific language to the rubric descriptors to support more natural feedback to students.
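
One lightweight way to localize a descriptor is to treat the generic wording as a template and substitute assignment-specific phrases. The sketch below assumes a generic descriptor and draws its substitutions from the off-roading equipment assignment in Case #1; both strings are illustrative, not the workshop's wording.

```python
# Sketch: localizing a generic rubric descriptor with assignment-specific
# language. The generic descriptor and the substituted phrases are assumptions.
GENERIC_DESCRIPTOR = "Identifies and evaluates {evidence} relevant to {task}."

localized = GENERIC_DESCRIPTOR.format(
    evidence="manufacturer specifications and trail reviews",
    task="recommending off-roading equipment",
)

print(localized)
# -> Identifies and evaluates manufacturer specifications and trail reviews
#    relevant to recommending off-roading equipment.
```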

  15. Case #2: Written Communications • Use the same assumptions. • You need a college-wide measure, but you need buy-in from ALL departments. • Let’s build a new rubric using the same methods.

  16. Written Communications Literacy – Early Decisions • Purpose – Measure written communications quality from student submissions. • Target of Measurement – Writing qualities (trait-based). • Define Written Communications Literacy.

  17. Written Communications – Define the Construct • A search for rubrics on this trait yielded over 100K hits. • So, we dropped back and punted. • We used the AAC&U VALUE rubric as a starting point. • In addition, we consulted the commonly used 6+1 Traits Rubric. • From these, we found criteria and potential wording for some rubric rows.

  18. Scale Definitions • We worked with our English teachers to narrow down what the scores should mean. • 1 – Major writing errors necessitating major revision or a rewrite. • 2 – Minor quality errors that could be resolved with minor revisions. • 3 – Competent writing that would pass as is. • 4 – Excellent writing that goes beyond minimum standards.

  19. Define the Criteria • Content Development • Genre and Discipline-Specific Conventions • Claims • Credible Evidence • Analysis • Control of Syntax and Mechanics • Overall Impact

  20. Written Communications Rubric Development Procedure • Print/Open a copy of the Starting Point Rubric for Written Communication. • Review the section of the document that corresponds to each row in the rubric template. • Select ONE approach for measuring that criterion (row) from the “Descriptor Categories” in column one of the examples tables. • Place the description into the “3” cell on that row. • Write a descriptor for the remaining cells along that row.
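
The procedure above can be sketched in code: anchor the row on the descriptor chosen for the "3" (competent) cell, then write the remaining cells around it. The build_row helper and all descriptor text are hypothetical illustrations, not the starting-point rubric's wording.

```python
# Sketch of one rubric row built per the procedure: the chosen "3" descriptor
# anchors the row, and the 1, 2, and 4 cells are written around it.
# The helper name and all descriptor text are hypothetical.
def build_row(criterion, level_3, level_1, level_2, level_4):
    """Return a rubric row keyed by scale point, anchored on the '3' cell."""
    return {
        "criterion": criterion,
        "descriptors": {1: level_1, 2: level_2, 3: level_3, 4: level_4},
    }

claims_row = build_row(
    "Claims",
    level_3="States a clear, focused claim appropriate to the assignment.",
    level_1="No identifiable claim, or the claim contradicts the evidence.",
    level_2="A claim is present but vague or only loosely tied to the assignment.",
    level_4="States a precise, arguable claim and acknowledges its limits.",
)

for point, text in sorted(claims_row["descriptors"].items()):
    print(point, text)
```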

  21. Review Quality of Our Rubric • Turn back to section 7 in the morning session notes. • Review our rubric in light of the checklist.

  22. Case #3: Build From Generic Criteria • Take a look in your workbooks at the generic criteria listing. • We argue that these criteria could lead you to most of the criteria you might want to use in future assignment-based rubrics.

  23. Examples (Workbook) • Two examples are in the workbook. • Bolded criteria were selected as the most important. • Which do you feel are most important?

  24. Now It Is Your Turn • Select a favorite assignment (or one of the examples). • Select 3-4 important criteria. • Complete a rubric with 3-4 rows including: • Criteria definitions. • Score scale definitions. • Descriptors.
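
For participants who want to see the whole shape at once, here is a sketch of a finished three-row rubric and a simple scoring pass over one submission. The criteria names come from slide 19, but the abbreviated descriptors and the sample ratings are assumptions.

```python
# Sketch: a three-row rubric (criterion -> {scale point: descriptor}) and a
# simple total over one submission's ratings. Descriptors are abbreviated
# placeholders and the ratings are hypothetical.
rubric = {
    "Content Development": {1: "Underdeveloped", 2: "Uneven", 3: "Adequate", 4: "Rich"},
    "Credible Evidence":   {1: "Missing", 2: "Weak", 3: "Relevant", 4: "Compelling"},
    "Control of Syntax and Mechanics":
                           {1: "Obscures meaning", 2: "Distracting errors",
                            3: "Few errors", 4: "Error-free and polished"},
}

def score_submission(ratings):
    """Sum per-criterion ratings (1-4) and report the maximum possible score."""
    total = sum(ratings.values())
    return total, 4 * len(ratings)

ratings = {
    "Content Development": 3,
    "Credible Evidence": 2,
    "Control of Syntax and Mechanics": 4,
}
total, maximum = score_submission(ratings)
print(f"Total: {total} / {maximum}")
```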

  25. Wrap Up • Questions? • What Worked? • What Needs Work? • Reminder: If you want feedback on your first attempts at a rubric, send them to tom.zane@slcc.edu
