Welcome to the UA Assessment Showcase 2012! Sponsored by the Office of the Provost, the Assessment Coordinating Council (ACC), and the Office of Instruction and Assessment
Actionable Assessment of Academic Programs: Principles and Practices for Usable Results Jo Beld, Professor of Political Science, Director of Evaluation & Assessment Assessment Showcase, April 17, 2012
Agenda • Principles: Conceptual frameworks • Practices: Making assessment useful • Practices: Engaging faculty • Politics and policy: The bigger picture
Conceptual frameworks Utilization-focused assessment (Patton, 2008): Focus on intended uses by intended users
Conceptual frameworks Backward design (Wiggins & McTighe, 2005): “Beginning with the end in mind”
Conceptual frameworks Traditional assessment design: • Choose an assessment instrument • Gather and summarize evidence • Send a report to someone
Conceptual frameworks Backward assessment design: • Identify intended users and uses • Define and locate the learning • Choose an assessment approach
Conceptual frameworks Once you’ve defined your outcomes, start planning your assessment project here
Making assessment useful Studio art major • Developed evaluation form for senior exhibit that doubles as assessment instrument • Addressed disconnect between student and faculty criteria for artistic excellence • Revised requirements for the major • Refocused common foundation-level courses
Making assessment useful Chemistry major • Using ACS exam as final in Chem 371: Physical Chem • Students outperform national average and do well in kinetics despite limited coverage in course • Chem 371 being retooled to focus on thermodynamics and quantum mechanics
Making assessment useful History major (chart: % of students rated “exemplary” in ability to…) • Gathering evidence in 2011-12 (voluntarily!) to examine sequencing in the major • Examining ability to understand and work with historiography in new intermediate seminars for the major
Making assessment useful Statistics concentration • Collaboratively designed final exam question and grading rubric in Stats 270 to examine interpretation and communication of results; two faculty graded essays • Instructor adjusted teaching in response to findings
Making assessment useful Management Studies concentration • Quiz scores: Teams outperform the best individual students • Course evaluations: 73% of students believe they learned “much” or an “exceptional amount” by working together in teams • Team-based learning being extended to other courses (chart: Mean Results – Management Studies 251 Course Quizzes)
Making assessment useful Interdisciplinary programs • Collaboratively developed assessment questionnaire • Considering direct assessment of interdisciplinary proficiency using common rubric with program-level portfolio • Will consider whether all programs should have capstone course or experience
Making assessment useful Benefits for individual courses: • Setting priorities for content/instruction • Revising/expanding assignments • Clarifying expectations for students • Enhancing “scaffolding” • Piloting or testing innovations • Affirming current practices
Making assessment useful Benefits for the program as a whole: • Strengthening program coherence • Sending consistent messages to students • Revising program requirements • Extending productive pedagogies • Affirming current practices
Making assessment useful More program benefits: • Telling the program’s story to graduate schools and employers • Enhancing visibility to disciplinary and inter-disciplinary associations • Supporting grant applications • Meeting requirements for specialized accreditation
Making assessment useful Benefits for faculty members: • Efficiencies in curriculum and instruction • Confidence that what you’re doing is working • Collaboration and collegiality within and across departments • Professional development for early-career faculty • Better integration of adjunct faculty
Making assessment useful How might assessment be useful for an individual course, your program as a whole, or your faculty colleagues?
Engaging faculty • Consider your colleagues • De-mystify assessment • Reduce costs and enhance rewards
Engaging faculty Consider your colleagues: Faculty roles, commitments, and disciplinary identities offer both incentives and disincentives for engaging in assessment
Engaging faculty Your colleagues as practitioners of their disciplines: Studio Art
Engaging faculty Your colleagues as practitioners of their disciplines: Chemistry
Engaging faculty Your colleagues as practitioners of their disciplines: Political Science
Engaging faculty Demystifying assessment: • Not scholarship of teaching and learning • Not individual teaching evaluation • Not student satisfaction data • Not necessarily quantitative • Not rocket science (unless that’s what you teach!)
Engaging faculty “Direct” assessment: Evidence of what students actually know, can do, or care about “Indirect” assessment: Evidence of learning-related experiences or perceptions
Engaging faculty Common direct assessment “artifacts” • Theses, papers, essays, abstracts • Presentations and posters • Oral or written examination items • Responses to survey or interview questions that ask for examples of knowledge, practice, or value
Engaging faculty Common indirect assessment “artifacts” • Course mapping, course-taking patterns or transcript analysis • Responses to survey or interview questions about experiences, perceptions, self-reported progress, or impact of program experiences • Reflective journals
Engaging faculty But wait!! Aren’t we observing student work all the time anyway? What’s the difference between grading and assessment?
Engaging faculty Grading summarizes many outcomes for one student; assessment summarizes one outcome for many students
Engaging faculty The purpose of assessment is to provide systematic, summarized information about the extent to which a group of students has realized one or more intended learning outcomes
Engaging faculty Reducing the costs of assessment • Use what you’ve already got • Borrow freely • Integrate assessment into work you are already doing • Share the work broadly • Limit your agenda
Engaging faculty Reaping the rewards of assessment • Address questions that matter to faculty • Build in collaboration • Pair direct with indirect methods • Choose approaches that “multi-task” • Dedicate time for discussion and application
Engaging faculty Plan intentionally for use of results • Borrow strategies from past successes in collective departmental action • Focus reporting on planned actions, not on the evidence itself • Weight Watchers, not the Biggest Loser • Dedicate time and resources for action
Engaging faculty What can you do in your program to: • Link assessment to faculty identities and incentives • De-mystify assessment • Reduce costs OR • Enhance benefits?
The bigger picture Accreditation by a federally recognized accrediting agency is required for access to federal student aid • Recognition requires accreditors to evaluate whether an institution maintains clearly specified educational objectives and is successful at meeting them.
The bigger picture Guiding values of new HLC criteria: “A commitment to assessment would mean assessment at the program level that proceeds from clear goals, involves faculty at all points in the process, and analyzes the assessment results; it would also mean that the institution improves its programs…on the basis of those analyses.”
Table Talk – Questions for Dr. Beld Table Facilitators (a.k.a. FLC members): • Paul Blowers, Chemical & Environmental Engineering • Eliud Chuffe, Spanish & Portuguese • Faiz Currim, Management Information Systems • Wendy Davis, Animal Science • Ryan Foor, Agricultural Education • Herman Gordon, Cellular & Molecular Medicine • Christopher Johnson, Educational Technology • Amy Kimme-Hea, English • Carl Maes, College of Optical Sciences • Katrina Miranda, Chemistry & Biochemistry • John Murphy, Pharmacy Practice & Science • Teresa Polowy, Russian & Slavic Studies • Claudia Stanescu, Physiology • Hal Tharp, Electrical & Computer Engineering • Deb Tomanek, Office of Instruction & Assessment
Assessment & Research in Student Affairs Angela Baldasare, Ph.D. Director, Divisional Assessment & Research baldasar@email.arizona.edu
Starting from Scratch: From Outcomes to Assessment Activities Aurelie Sheehan, Ph.D. Director, Creative Writing asheehan@email.arizona.edu
Unlocking Assessment: Linking Findings to Outcomes David Cuillier, Ph.D. Director, University of Arizona School of Journalism cuillier@email.arizona.edu
Our Process • Define learning outcomes • Measure at overall program level • Link findings specifically to outcomes • Make adjustments (report & faculty retreat) • Feedback loop – see if it worked
Outcome #10: Technology MEASURE: 2009 survey of multimedia knowledge (scale of 0-9): Photoshop 6.24, Final Cut 2.59, Dreamweaver 0.76, Soundslides 0.47, Audacity 0.35, CSS 0.35, Flash 0.24 FINDING: Need more Soundslides/Audacity training ADJUSTMENT: Created multimedia class in 2010 FEEDBACK LOOP: Survey students again in 2012
Outcome #9: Writing MEASURE: Survey of intern supervisors FINDING: Positive trajectory on student writing ADJUSTMENT: Keep doing what we’re doing
Lessons learned • Faculty buy-in through program-level view • One person responsible • Model assessment plans (e.g., Elon) • Explicit reporting – state it clearly • Focus on findings, not methods
Program Assessment: Raw Data to Findings Ingrid Novodvorsky, Ph.D. Director, College of Science Teacher Preparation Program novod@email.arizona.edu
TEACHER PREPARATION PROGRAM Student Learning Outcomes—Core Understandings (These describe attributes of a well-prepared science teacher.)