Building Rubrics For Large-scale, Campus-wide Assessment: Afternoon Session
Thomas W. Zane tom.zane@slcc.edu • Diane L. Johnson djohnson@new.edu • Jodi Robison jrobison@new.edu
Afternoon Session Agenda • Build a critical thinking rubric designed for adjuncts who teach a GE course. • Build a written communications literacy rubric designed for a group of senior portfolio reviewers. • Build a rubric based on the generic criteria method. • Q&A and Wrap-up.
Afternoon Session Assumptions • You have been asked to take the lead on a large-scale assessment project at your institution. • You face two big issues: • What are the steps for building the scoring rubrics? • How can you guide faculty through the process?
Case #1: Critical Thinking • Our institution has decided to measure critical thinking across the curriculum. Together we form the committee charged with building a way to measure this college-wide outcome.
Critical Thinking Rubric – Early Decisions • Purpose – Measure critical thinking ability based on student submissions in MANY types of courses. • Target of Measurement – Ability level as perceived from written submissions. • Define Critical Thinking • Critical thinking is the conscious and deliberate use of thinking skills and strategies to guide what to think, believe, or do.
Critical Thinking Rubric – Search for Criteria • Found thousands of articles and rubrics. • Identified primary sources. • Collected criteria. • Defined the criteria.
Critical Thinking Rubric – Design the Scale • Critical thinking is a human ability that increases in cognitive demand. • So…build a scale that reflects this! • Four points (standard for our online systems). • Define the scale points.
Scale Definitions • Determine what each score level means to you. • Define the level as best you can.
Critical Thinking – Select the Specific Aspects for Each Row • Each of the six areas of critical thinking is still too broad. • We need to break them down into smaller aspects. • We went back to the Internet to find rows in rubrics that fit our definitions.
• Created a sample of rows for faculty to use. • Used the same six categories to allow for aggregation of data across the campus. • Split each of the six into different approaches to fit various disciplines. • Promised to come to the department if a row could not be found. (None have requested this.)
Build Your Rubric • Assume you are a department assessment coordinator now. • Assume our signature assignment is to write a paper about off-roading equipment. (Assignment is in the workbook.) • Review Appendix F to select a row for each of the six categories.
Localize the Descriptors • Add assignment-specific language to the rubric descriptors to support more natural feedback to students (e.g., refer to "your off-roading equipment paper" rather than "the submission").
Case #2: Written Communications • Use the same assumptions. • You need a college-wide measure, but you need buy-in from ALL departments. • Let’s build a new rubric using the same methods.
Written Communications Literacy – Early Decisions • Purpose – Measure written communications quality from student submissions. • Target of Measurement – writing qualities (trait-based). • Define Written Communications Literacy.
Written Communications – Define the Construct • A search for rubrics on this trait yielded over 100K hits. • So, we dropped back and punted. • We used the AAC&U VALUE rubric as a starting point. • In addition, we consulted the commonly used 6+1 Traits Rubric. • From these, we found criteria and potential wording for some rubric rows.
Scale Definitions • We worked with our English teachers to narrow down what the scores should mean. • Major writing errors necessitating major revision or rewrite • Minor quality errors that could be resolved with minor revisions • Competent writing that would pass as is • Excellent writing that went beyond minimum standards
Define the Criteria • Content Development • Genre and Discipline-Specific Conventions • Claims • Credible Evidence • Analysis • Control of Syntax and Mechanics • Overall Impact
Written Communications Rubric Development Procedure • Print/Open a copy of the Starting Point Rubric for Written Communication. • Review the section of the document that corresponds to each row in the rubric template. • Select ONE approach for measuring that criterion (row) from the “Descriptor Categories” in column one of the examples tables. • Place that descriptor into the “3” cell on that row. • Write a descriptor for each of the remaining cells along that row.
Review Quality of Our Rubric • Turn back to section 7 in the morning session notes. • Review our rubric in light of the checklist.
Case #3: Build From Generic Criteria • Take a look in your workbooks at the generic criteria listing. • We argue that this list could lead you to most of the criteria you might want to use in future assignment-based rubrics.
Examples (Workbook) • Two examples are in the workbook. • Bolded criteria were selected as the most important. • Which do you feel are most important?
Now It Is Your Turn • Select a favorite assignment (or one of the examples). • Select 3-4 important criteria. • Complete a rubric with 3-4 rows including: • Criteria definitions. • Score scale definitions. • Descriptors.
Wrap Up • Questions? • What Worked? • What Needs Work? • Reminder: If you want feedback on your first attempts at a rubric, send them to tom.zane@slcc.edu