Actionable Assessment of Academic Programs: Principles and Practices for Usable Results Jo Michelle Beld Professor of Political Science Director of Evaluation & Assessment Integrative Learning and the Departments – July 2013
Agenda • Principles: Conceptual frameworks • Practitioners: Case studies • Practices: Specific strategies
Principles Utilization-focused assessment (Patton, 2008): Focus on intended uses by intended users
Principles Backward design (Wiggins & McTighe, 2005): “Beginning with the end in mind”
Principles Traditional assessment design: • Choose an assessment instrument • Gather and summarize evidence • Send a report to someone
Principles Backward assessment design: • Identify intended uses • Define and locate the learning • Choose an assessment approach
Practitioners Sample departments/programs (pp. 2-3): • History • Chemistry • Studio Art • Management Studies • Interdisciplinary Programs
Practitioners Studio Art (p. 4) • Developed evaluation form for senior exhibit that doubles as assessment instrument • Addressed disconnect between student and faculty criteria for artistic excellence • Revised requirements for the major • Refocused common foundation-level courses
Practitioners Chemistry • Using ACS exam as final in Chem 371: Physical Chem • Students outperformed national average and did well in kinetics despite limited coverage in course • Chem 371 being retooled to focus on thermodynamics and quantum mechanics
Practitioners History (p. 5) [Chart: % of students rated “exemplary” on selected abilities] • Gathered evidence in 2011-12 (voluntarily!) to examine sequencing in the major • Examining ability to understand and work with historiography in new intermediate seminars for the major
Practitioners Statistics • Collaboratively designed final exam question and grading rubric in Stats 270 to examine interpretation and communication of results; two faculty graded essays • Instructor adjusted teaching in response to findings
Practitioners Management Studies [Chart: Mean Results – Management Studies 251 Course Quizzes] • Quiz scores: Teams outperformed the best individual students • Course evaluations: Students believed they learned “much” or an “exceptional amount” in teams (73%) • Team-based learning being extended to other courses
Practitioners Interdisciplinary programs (pp. 6-8): • Collaboratively developed assessment questionnaire • Next: Should all programs have a capstone course or experience? Will conduct direct assessment of interdisciplinary proficiency using a common rubric with a program-level portfolio
Principles (redux) Uses in individual courses: • Setting priorities for content/instruction • Revising/expanding assignments • Clarifying expectations for students • Enhancing “scaffolding” • Piloting or testing innovations • Affirming current practices
Principles (redux) Uses in the program as a whole: • Strengthening program coherence • Sending consistent messages to students • Revising program requirements • Extending productive pedagogies • Affirming current practices
Principles (redux) More program uses: • Telling the program’s story to graduate schools and employers • Enhancing visibility to disciplinary and inter-disciplinary associations • Supporting grant applications • Meeting requirements for specialized accreditation
Principles (redux) What specific use of assessment in your department or program could make assessment more meaningful for your colleagues?
Practices • “Direct” assessment: Evidence of what students actually know, can do, or care about • “Indirect” assessment: Evidence of learning-related experiences or perceptions
Practices Common direct assessment “artifacts” • Theses, papers, essays, abstracts – individually or in a portfolio • Presentations and posters • Oral or written examination items • Responses to survey or interview questions that ask for examples of knowledge, practice, or value
Practices Common indirect assessment “artifacts” • Course mapping, course-taking patterns or transcript analysis • Responses to survey or interview questions about experiences, perceptions, self-reported progress, or impact of program experiences • Reflective journals
Practices But wait!! Aren’t we observing student work all the time anyway? What’s the difference between grading and assessment?
Practices Grading summarizes many outcomes for one student; assessment summarizes one outcome for many students.
Practices The purpose of assessment is to provide systematic, summarized information about the extent to which a group of students has realized one or more intended learning outcomes
Practices Options to consider: • Use an instrument developed by someone else • Adapt an existing instrument • Add to something you’re already doing • Connect to institutional-level evidence • Invent something new
Practices Rubrics • Inter-institutional level (p.8) • Institutional level (p. 9) • Department/program level (pp. 10-12) • Course level (p. 13) • Some brief advice (pp. 14-15)
Practices Externally-developed tests • Biology: ETS Major Field Exam • Chemistry: American Chemical Society exams • Classics: Eta Sigma Phi Latin and Greek exams • French: ACTFL Oral Proficiency interviews • Nursing: California Critical Thinking Skills Test • Psychology: Area Concentration Achievement Tests
Practices Locally-developed tests or test items • Chemistry: Safety quiz for all lab students • Physics: Programming test items in introductory and advanced seminar courses • Statistics: Collectively-written final exam essay question in intermediate gateway course
Practices Institutional-level surveys (e.g., NSSE, RPS) • Analysis and/or administration of selected items by department/division (p. 16) • Triangulation (pp. 17-19) • Reflection (p. 20)
Practices Locally-developed questionnaires • Program-level learning outcomes survey (Mathematics – p. 21) • Knowledge and attitudes survey (Environmental Studies – students in on-campus course and undergraduate research field experience; pre/post in both; paired with direct assessment of student work – p. 22)
Practices Course-embedded outcomes reporting (pp. 23-24) • Identify one or more outcomes of interest • Identify one or more assignments that develop and demonstrate the outcome(s) • Rate each student’s work just in relation to that outcome • Aggregate results and suggest significance for course/program
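To make the final aggregation step concrete, here is a minimal sketch in Python (the outcome label, rating scale, and data are hypothetical and not drawn from the presentation) of how per-student ratings on a single outcome might be tallied for a course- or program-level summary:

```python
from collections import Counter

# Hypothetical ratings of each student's work on ONE outcome of interest
# (assumed three-level scale: exemplary / proficient / developing)
ratings = [
    "exemplary", "proficient", "proficient", "developing",
    "exemplary", "proficient", "exemplary", "developing",
]

counts = Counter(ratings)
total = len(ratings)

print(f"Outcome: interpret and communicate results (n = {total})")
for level in ("exemplary", "proficient", "developing"):
    pct = 100 * counts[level] / total
    print(f"  {level:>10}: {counts[level]:2d} students ({pct:.0f}%)")
```

The sketch only illustrates the principle stated earlier: the report aggregates one outcome across many students, rather than many outcomes for one student.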
Practices Where possible, pair indirect observations of processes and perceptions with direct observations of outcomes (pp. 22, 25).
Practices The dual goals of sampling: Representativeness and Manageability
Practices Examples involving comprehensive sampling: • Survey of all senior majors • Application of rubric to all research abstracts in all seminars • Application of rubric to all work submitted for senior art show
Practices Examples involving selective sampling: • Application of rubric to randomly-selected subset of final papers in capstone course • Pre/post administration of locally-developed quiz in required sophomore methods course • End-of-course survey in one introductory and one senior-level course • Aggregation of results on selected items in an evaluation form for student work
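Where a randomly-selected subset is used (as in the first example above), the draw itself can be scripted so it is reproducible. A minimal illustrative sketch, with hypothetical file names and sample size:

```python
import random

# Hypothetical pool of final papers from a capstone course
papers = [f"capstone_paper_{i:02d}.pdf" for i in range(1, 41)]  # 40 papers

random.seed(2013)                     # fixed seed so the same subset can be re-drawn
sample = random.sample(papers, k=10)  # rubric will be applied to 10 of the 40 papers

print("Papers selected for rubric scoring:")
for name in sorted(sample):
    print(" ", name)
```

The fixed seed is optional; it simply documents which papers were scored so the sample can be reconstructed later.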
Practices Which of these assessment strategies might be appropriate for the course- or program-level use of assessment evidence you identified earlier?
A final thought…. Less really is more!