Revised Bloom’s Taxonomy and Designing Effective Rubrics
Jeffrey A. Greene, Ph.D.
Assistant Professor of Educational Psychology, Measurement, and Evaluation
School of Education, University of North Carolina at Chapel Hill
May 27, 2010
Our Learning Objectives
• Understand Revised Bloom’s Taxonomy
  • Relation to learning outcomes and assessment
• Be able to create a Taxonomy Table
• Understand purposes of formative and summative assessment
• Be able to create effective rubrics
Purpose of Revising Bloom’s Taxonomy
• Align educational objectives, instruction, and assessment
• Enable teachers to prepare students for state-wide standardized assessments without “teaching to the test”
• Expand education beyond memorization to “higher-order” cognition
• Highlight that different types of objectives require different types of assessment
Revised Bloom’s Taxonomy
• Verbs (cognitive processes)
• Nouns (kinds of knowledge)
• Each learning objective pairs a noun (what students should know) with a verb (what they should be able to do with that knowledge)
Kinds of Knowledge
• Factual/Declarative: “what” knowledge
• Conceptual: connections, relations among “what” knowledge
  • Also classifications, generalizations, theories, models
• Procedural: “how” knowledge
  • Domain-general, subject-specific
• Metacognitive: “when” or “where” or “for what reason” knowledge
  • Knowledge of strategies
  • Knowledge of tasks
  • Monitoring understanding
  • Self-regulating learning
Cognitive Processes
• Remember
  • Recognizing, recalling
• Understand
  • Interpreting, exemplifying, classifying, summarizing, inferring, comparing, explaining
• Apply
  • Executing, implementing
  • “Near transfer”
Cognitive Processes
• Analyze
  • Breaking down into parts, differentiating, organizing, attributing
  • Requires conceptual knowledge
  • “What happens if the system breaks down?”
• Evaluate
  • Making judgments, checking, critiquing
• Create
  • Generating, planning, producing novel products
Taxonomy Table
• Standard 301.b.1: “Knowledge and understanding of substantive law…necessary for effective and responsible participation…”
• Question: what does “understand” really mean in this standard? Apply, analyze, evaluate?
Let’s Create a Taxonomy Table
• We need a learning objective
Curriculum-Wide Taxonomy Table
• Standard 302.b.i: “Proficiency as an entry level practitioner in legal research”
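The taxonomy table behind these two slides is just a grid: kinds of knowledge down one axis, cognitive processes across the other, with each learning objective classified into a cell by its noun and verb. Below is a minimal sketch of that structure in Python, assuming hypothetical legal-research objectives tied to Standard 302.b.i; the sample objectives and their placements are illustrations, not taken from this presentation.

# A minimal sketch of a taxonomy table as a data structure.
# The two axes come from the slides; the sample objectives are hypothetical.
KNOWLEDGE_TYPES = ["Factual", "Conceptual", "Procedural", "Metacognitive"]
COGNITIVE_PROCESSES = ["Remember", "Understand", "Apply",
                       "Analyze", "Evaluate", "Create"]

# Each cell holds the learning objectives classified under that
# knowledge-type / cognitive-process combination.
taxonomy_table = {
    (k, p): [] for k in KNOWLEDGE_TYPES for p in COGNITIVE_PROCESSES
}

def place_objective(knowledge, process, objective):
    """Classify an objective by its noun (knowledge type) and verb (cognitive process)."""
    taxonomy_table[(knowledge, process)].append(objective)

# Hypothetical placements for legal-research proficiency (Standard 302.b.i):
place_objective("Procedural", "Apply",
                "Execute a research plan using primary and secondary sources")
place_objective("Conceptual", "Analyze",
                "Differentiate binding from persuasive authority")

for (knowledge, process), objectives in taxonomy_table.items():
    if objectives:
        print(f"{knowledge} x {process}: {'; '.join(objectives)}")

Filled in this way, the empty cells make it easy to see whether a course or curriculum stays at remembering and understanding factual knowledge or also reaches the higher-order processes.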
Challenges to Revised Taxonomy
• What happens when learning objectives, instruction, assessment do not line up?
  • Objectives and instruction without assessment
  • Instruction and assessment without objectives
  • Objectives and assessment without instruction
• Does this all take too much time?
Connection to Objectives and Instruction
• Ideally, objectives use verbs from the taxonomy
  • 19 cognitive processes
• Instructional methods can be put into the table’s cells as well
  • What content/activities/skills/knowledge/beliefs are necessary to promote this noun/verb combination?
  • What combinations of nouns/verbs need to be taught in class, and what can be done through technology?
    • Declarative knowledge: remember, understand
    • Class time = practice, conceptual knowledge
Connection to Assessment
• Standard 304.a.2: “Employ a variety of assessment methods and activities, consistent with effective pedagogy, systematically and sequentially throughout the curriculum…”
• Every learning objective in the taxonomy table should have an assessment (or part of an assessment) assigned to it (see the sketch below)
• The emphasis on a variety of cognitive processes and on formative feedback requires an expanded list of assessment types
  • Classification tasks: “Is X an example of class Y?”
  • Essays
  • Experiential/performance-based assessments
  • Professor Penny Wilrich’s presentation this morning
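One way to act on the bullet above is to keep the objective-to-assessment mapping as data and scan it for gaps. A small sketch under that assumption; the objective and assessment names are hypothetical, not drawn from this presentation.

# Hypothetical objective-to-assessment mapping; an empty list marks a gap.
objective_assessments = {
    "Execute a research plan using primary and secondary sources":
        ["Graded research memo", "In-class database exercise (formative)"],
    "Differentiate binding from persuasive authority": [],
}

# Flag every learning objective with no assessment (or part of one) assigned.
for objective, assessments in objective_assessments.items():
    if not assessments:
        print(f"No assessment assigned to: {objective}")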
Additional Assessment Challenges
• How do you assess procedural, conceptual, and metacognitive knowledge?
• How do you assess Standard 302.b.1.i, “Legal analysis and reasoning, legal research, problem solving…”?
• How do you assess emotional intelligence, collaboration skills, and the ability to work in a hierarchy?
• Doing so requires:
  • Assessment conceptualized as both formative and summative
  • Ways of scoring constructed-response items
  • Rubrics
Formative and Summative Assessment
• Standard 304.a.3: “Provide feedback to students periodically and throughout their studies about their progress in achieving its learning outcomes”
• This morning: “People don’t like to be assessed”
  • Assessment culture
• Formative: diagnostic, measure of developing knowledge
  • Informative feedback
  • Repeated assessment: same skills, different contexts
  • Often ungraded or self-graded
• Summative: evaluative, indication of achievement
Rubrics
• Rubrics help clarify important assessment criteria for ill-structured tasks
  • Performances, debates, presentations
• They help with the accuracy and reliability of scoring
• Rubrics should be specific to the assignment
• They provide space for narrative comments
4 Key Dimensions of Rubrics
• Evaluative criteria for scoring: should follow from the learning objectives
• Quality definitions: determine how to discriminate between good, adequate, and poor performance on each criterion
• Scoring strategy (see the sketch below):
  • Holistic: judge the entire product using the combined criteria
  • Analytic: score each criterion individually, then aggregate for a final score
• Scorer(s): professor, peers, self, others?
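To make the scoring-strategy distinction concrete, here is a minimal sketch of the arithmetic behind an analytic score next to a single holistic judgment. The criteria, weights, and point scale are hypothetical illustrations, not taken from this presentation.

# Analytic scoring: rate each evaluative criterion separately, then aggregate.
criterion_scores = {              # 1 = inferior ... 4 = superior (hypothetical scale)
    "Issue identification": 4,
    "Use of authority": 3,
    "Organization": 2,
}
weights = {                       # hypothetical relative importance of each criterion
    "Issue identification": 0.4,
    "Use of authority": 0.4,
    "Organization": 0.2,
}
analytic_total = sum(score * weights[c] for c, score in criterion_scores.items())

# Holistic scoring: one overall judgment of the whole product against the
# combined quality definitions; no per-criterion arithmetic to aggregate.
holistic_score = 3

print(f"Analytic (weighted aggregate): {analytic_total:.1f}")   # 3.2
print(f"Holistic (single judgment):    {holistic_score}")

The analytic version also yields per-criterion scores that can be returned to students as formative feedback, which is one reason to prefer it when the rubric is meant to do more than justify a final grade.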
Rubric Problems
• Too general: “A superior essay is one in which the essay itself is written in a high-quality, excellent manner. An inferior essay is typified by poor quality. An adequate essay falls between the superior and inferior essays in terms of organization.”
• Too specific (and detailed): “A superior essay has (1) a specific statement about the two major components of rubrics, (2) a discussion of the two types of scoring strategies…”
Exercise: Developing Rubrics
• Evaluative criteria
• Quality definitions
Developing and Using Rubrics
• By the professor: iterate and improve
• In collaboration with students
  • Time consuming
  • Can build trust, understanding
• When to distribute rubrics:
  • With the assignment: can help with formative assessments
  • With grades: justifies summative assessments
Advanced Assessment Challenges
• Standard 302.b.2.ii: “Ability to recognize and resolve ethical and other professional dilemmas”
  • Recognize: “Remember” cognitive process
  • Resolve: “Apply” cognitive process
• Practice assessments? Role plays?
Questions?
• How can taxonomy tables and rubrics be used:
  • Within classes
  • Across classes
  • Across curricula
More General Resources
• Assessment & Evaluation in Higher Education journal: http://www.tandf.co.uk/journals/titles/02602938.asp
• Association of American Colleges and Universities resources on assessment: http://www.aacu.org/resources/assessment/index.cfm
• ETS “Culture of Evidence” reports: http://www.ets.org/portal/site/ets/menuitem.1488512ecfd5b8849a77b13bc3921509/?vgnextoid=e35dee84d15e7110VgnVCM10000022f95190RCRD&vgnextchannel=b5e2d4a2394e7110VgnVCM10000022f95190RCRD