Developing Rubrics • Dr. Jesus R. Dela Rosa • Instructor, English Language Center • JIC ABET Member • December 05, 2011
Outline • Objectives • Introduction • Discussion • Rubrics • Proposed rubrics for oral presentation, lab work, and written reports • Academic Exchanges
Objectives • Discuss rubrics briefly • Explain a proposed rubric for oral presentation • Gather feedback on its content and form from the participants
Introduction • Delivery of AS programs • Different courses = different assessment tools • Bloom’s learning domains • Cognitive (head), affective (heart), psychomotor (hands)
Introduction • Different tools to assess outcomes • Direct = what students/learners can do • Indirect = what other people think and feel about what students/learners can do or have • Criterion 4: Continuous Improvement • Upper State U model of the Self-Study Report (SSR) • 41 pages out of 100 (41%) cover how we measure student outcomes/performance through the assessment tools clearly specified in our Course Assessment Charts • Some reminders to SSR writers • Use various assessment tools (direct/indirect) • Use of rubrics is encouraged in some courses
Discussion • Why rubrics? • A hype? (a hot issue) • From test to task (performance-based, not knowledge-based) • “Tell me, I listen; teach me, I learn; show me how, I learn and live” • ABET is outcome-based • It does not believe in “one size fits all” assessment (triangulation)
Why rubrics • Are written tests a thing of the past? • Alternative assessment tools • The bottom line • Rubrics ground learning in the real world where students live • Student-centered; standards-driven (criterion-referenced)
What are Rubrics • Rubrics are performance-based assessments that • evaluate student performance on any given task or set of tasks that ultimately leads to a final product or learning outcome (overall assessment) = Student Outcomes • use specific criteria as a basis for evaluating or assessing student performances, turning a subjective marking sheet into a more objective one • provide narrative descriptions (descriptors) of possible performance related to a given task • http://www.teach-nology.com/currenttrends/alternative_assessment
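To make the definition concrete, here is a minimal sketch (in Python) of a rubric as a data structure: each criterion maps performance levels (anchor points) to narrative descriptors. The criterion names and descriptor wording below are illustrative assumptions, not the JIC ABET rubric’s actual content.

```python
# A rubric as data: criterion -> {level: narrative descriptor}.
# Criteria and descriptors here are hypothetical examples, not the
# JIC ABET wording.
rubric = {
    "organization": {
        4: "Ideas sequenced logically; transitions guide the listener.",
        3: "Mostly logical sequence; occasional abrupt transitions.",
        2: "Some structure, but the audience must work to follow.",
        1: "No discernible structure.",
    },
    "delivery": {
        4: "Clear, audible, well paced; consistent eye contact.",
        3: "Generally clear; minor pacing or volume lapses.",
        2: "Frequently unclear or monotone.",
        1: "Inaudible, or read verbatim from notes.",
    },
}

def descriptor(criterion: str, level: int) -> str:
    """Return the narrative description a rater matches a performance against."""
    return rubric[criterion][level]

print(descriptor("delivery", 3))  # "Generally clear; minor pacing or volume lapses."
```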
Rubrics • Types of rubrics (according to rating scales) • Holistic • More global • Views the final product as a set of interrelated tasks contributing to the whole • Anchor points are used to assign value to descriptions of products or performances that contribute to the whole
Holistic rubric • Holistic scoring is efficient and quick • One score provides an overall impression of ability on any given product or work • Often written generically, so one rubric can be used with many tasks • Saves time by minimizing the number of decisions raters must make • Raters, if trained properly, tend to apply them consistently, resulting in more reliable measurement
Holistic rubric • Disadvantages • Scoring does not provide detailed information about student performance in specific areas of content or skill • No specific feedback about students’ strengths/weaknesses • Does little to separate the tasks • Performances may meet criteria in two or more categories, making it difficult to select the one best description • Criteria cannot be differentially weighted
Analytic rubric • Analytic • Scoring breaks down the objective or final product into component parts. • Each part is scored independently. • The total score is the sum of the ratings for all of the parts being evaluated (see the scoring sketch below). • Useful in giving feedback on areas of student performance (strengths/weaknesses). • Dimensions can be weighted to reflect relative importance. • Progress over time can be demonstrated when the rubric is used repeatedly.
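A minimal sketch of analytic scoring, assuming hypothetical criteria and weights (the actual rubric defines its own): each criterion is rated independently on a 1–4 scale and the total is the weighted sum.

```python
# Hypothetical criteria and weights; weights sum to 1 so the total stays
# on the same 1-4 scale as the individual ratings.
WEIGHTS = {"content": 0.4, "organization": 0.3, "delivery": 0.3}

def analytic_total(ratings: dict, weights: dict = WEIGHTS) -> float:
    """Weighted sum of independent per-criterion ratings (1-4 each)."""
    return sum(weights[c] * ratings[c] for c in weights)

ratings = {"content": 3, "organization": 4, "delivery": 2}
print(analytic_total(ratings))  # 0.4*3 + 0.3*4 + 0.3*2 = 3.0
```

With equal weights this reduces to the plain sum described above; unequal weights implement the “relative importance” point.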
Analytic rubric • Disadvantages • More time to prepare (ask any JIC ABET member!) • More possibilities for raters to disagree • More difficult to achieve intra- and inter-rater reliability across all of the criteria/dimensions (one common check is sketched below)
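One common way to quantify the inter-rater reliability mentioned above is chance-corrected agreement, e.g. Cohen’s kappa. A minimal sketch, assuming two raters score the same eight presentations on a 1–4 scale (the scores are invented for illustration):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Agreement between two raters, corrected for chance agreement."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

a = [4, 3, 3, 2, 4, 1, 3, 2]  # rater A's scores
b = [4, 3, 2, 2, 4, 1, 3, 3]  # rater B's scores
print(f"kappa = {cohens_kappa(a, b):.2f}")  # ~0.65: moderate agreement
```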
Holistic or analytic? • Left to the better judgment of the experts • 6 criteria at most (holistic); more criteria (analytic) • Whether the scale is holistic or analytic, important factors in developing effective rubrics are • Clear criteria that will be used to rate a student’s work • A performance that is directly observable • Students informed of the criteria for which they are being held accountable
Proposed rubric for oral presentation (Rbrcjessedit.docx) • Analytic as to type (7 criteria: more? fewer?) • Criteria mapped to performance indicators (PIs) per JIC ABET (to establish consistency) • Four scales (1-Beginning; 2-Developing; 3-Competent; 4-Outstanding) • Provision for scoring each criterion • Assessment report for oral presentation • Results form part of the applicable PIs & SOs (a roll-up sketch follows)
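To illustrate how criterion scores could roll up into the assessment report, here is a sketch under assumed criterion names and an assumed criterion-to-PI mapping; the real 7 criteria and their PI mapping are defined in the JIC ABET documents.

```python
# Hypothetical criterion-to-PI mapping and scores for one presentation.
CRITERION_TO_PI = {"content": "PI-a", "organization": "PI-b",
                   "delivery": "PI-b", "visual aids": "PI-c"}
SCALE = {1: "Beginning", 2: "Developing", 3: "Competent", 4: "Outstanding"}

scores = {"content": 3, "organization": 4, "delivery": 4, "visual aids": 2}

# Group criterion scores by performance indicator, then average.
report: dict[str, list[int]] = {}
for criterion, score in scores.items():
    report.setdefault(CRITERION_TO_PI[criterion], []).append(score)

for pi, vals in sorted(report.items()):
    avg = sum(vals) / len(vals)
    print(f"{pi}: mean {avg:.1f} ({SCALE[round(avg)]})")
```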
Validation and Reliability Concerns • Rubrics, like other assessment tools, have validation and reliability concerns • AS programs, or JIC-ABET, may be left to decide • Content validity • Have colleagues review your rubric (documented/recorded) • Inform students how it works (at the beginning of the semester?) • Check that it is manageable (pilot testing, to be documented)
Summary • AS programs have different courses • Different courses involve different learning domains • Rubrics as direct assessment tools • Why, what, types, differences, advantages/disadvantages, and which rubric to use • Proposed rubric for oral presentation prepared by JIC ABET: parts; assessment report • Validity and reliability concerns going forward • Content validity (colleagues & students) • Other validation/reliability concerns
Conclusions • Rubrics are direct assessment tools that are necessary to measure learning outcomes. • Rubrics require some time to develop and validate. • There are validity and reliability concerns in the development and administration of rubrics. • The JIC-ABET Committee has prepared rubrics for guidance of, and fine-tuning by, the AS programs.