Actionable Assessment of Academic Programs: Principles and Practices for Usable Results


Presentation Transcript


  1. Actionable Assessment of Academic Programs: Principles and Practices for Usable Results Jo Michelle Beld, Professor of Political Science, Director of Evaluation & Assessment Integrative Learning and the Departments – July 2013

  2. Agenda • Principles: Conceptual frameworks • Practitioners: Case studies • Practices: Specific strategies

  3. Principles Utilization-focused assessment (Patton, 2008): Focus on intended uses by intended users

  4. Principles Backward design (Wiggins & McTighe, 2005): “Beginning with the end in mind”

  5. Principles Traditional assessment design: • Choose an assessment instrument • Gather and summarize evidence • Send a report to someone

  6. Principles Backward assessment design: • Identify intended uses • Define and locate the learning • Choose assessment approach

  7. Practitioners Sample departments/programs (pp. 2-3): • History • Chemistry • Studio Art • Management Studies • Interdisciplinary Programs

  8. Practitioners Studio Art (p. 4) • Developed evaluation form for senior exhibit that doubles as assessment instrument • Addressed disconnect between student and faculty criteria for artistic excellence • Revised requirements for the major • Refocused common foundation-level courses

  9. Practitioners Chemistry • Using ACS exam as final in Chem 371: Physical Chem • Students outperformed national average and did well in kinetics despite limited coverage in course • Chem 371 being retooled to focus on thermodynamics and quantum mechanics

  10. Practitioners History (p. 5) • Gathered evidence in 2011-12 (voluntarily!) to examine sequencing in the major • Examining ability to understand and work with historiography in new intermediate seminars for the major [Chart: % “exemplary” ability to…]

  11. Practitioners Statistics • Collaboratively designed final exam question and grading rubric in Stats 270 to examine interpretation and communication of results; two faculty graded essays • Instructor adjusted teaching in response to findings

  12. Practitioners Management Studies • Quiz scores: Teams outperformed best individual students • Course evaluations: Students believed they learned “much” or “exceptional amount” in teams (73%) • Team-based learning being extended to other courses [Chart: Mean Results – Management Studies 251 Course Quizzes]

  13. Practitioners Interdisciplinary programs (pp. 6-8): • Collaboratively developed assessment questionnaire • Next: Should all programs have a capstone course or experience? Will conduct direct assessment of interdisciplinary proficiency using a common rubric with a program-level portfolio

  14. Principles (redux) Uses in individual courses: • Setting priorities for content/instruction • Revising/expanding assignments • Clarifying expectations for students • Enhancing “scaffolding” • Piloting or testing innovations • Affirming current practices

  15. Principles (redux) Uses in the program as a whole: • Strengthening program coherence • Sending consistent messages to students • Revising program requirements • Extending productive pedagogies • Affirming current practices

  16. Principles (redux) More program uses: • Telling the program’s story to graduate schools and employers • Enhancing visibility to disciplinary and inter-disciplinary associations • Supporting grant applications • Meeting requirements for specialized accreditation

  17. Principles (redux) What specific use of assessment in your department or program could make assessment more meaningful for your colleagues?

  18. Practices • “Direct” Assessment: Evidence of what students actually know, can do, or care about • “Indirect” Assessment: Evidence of learning-related experiences or perceptions

  19. Practices Common direct assessment “artifacts” • Theses, papers, essays, abstracts – individually or in a portfolio • Presentations and posters • Oral or written examination items • Responses to survey or interview questions that ask for examples of knowledge, practice, or value

  20. Practices Common indirect assessment “artifacts” • Course mapping, course-taking patterns or transcript analysis • Responses to survey or interview questions about experiences, perceptions, self-reported progress, or impact of program experiences • Reflective journals

  21. Practices But wait!! Aren’t we observing student work all the time anyway? What’s the difference between grading and assessment?

  22. Practices Grading summarizes many outcomes for one student; assessment summarizes one outcome for many students

  23. Practices The purpose of assessment is to provide systematic, summarized information about the extent to which a group of students has realized one or more intended learning outcomes

  24. Practices Options to consider: • Use an instrument developed by someone else • Adapt an existing instrument • Add to something you’re already doing • Connect to institutional-level evidence • Invent something new

  25. Practices Rubrics • Inter-institutional level (p. 8) • Institutional level (p. 9) • Department/program level (pp. 10-12) • Course level (p. 13) • Some brief advice (pp. 14-15)

  26. Practices Externally-developed tests • Biology: ETS Major Field Exam • Chemistry: American Chemical Society exams • Classics: Eta Sigma Phi Latin and Greek exams • French: ACTFL Oral Proficiency interviews • Nursing: California Critical Thinking Skills Test • Psychology: Area Concentration Achievement Tests

  27. Practices Locally-developed tests or test items • Chemistry: Safety quiz for all lab students • Physics: Programming test items in introductory and advanced seminar courses • Statistics: Collectively-written final exam essay question in intermediate gateway course

  28. Practices Institutional-level surveys • Analysis and/or administration of selected items by department/division (p. 16) – NSSE, RPS • Triangulation (pp. 17-19) • Reflection (p. 20)

  29. Practices Locally-developed questionnaires • Program-level learning outcomes survey (Mathematics – p. 21) • Knowledge and attitudes survey (Environmental Studies – students in on-campus course and undergraduate research field experience; pre/post in both; paired with direct assessment of student work – p. 22)

  30. Practices Course-embedded outcomes reporting (pp. 23-24) • Identify one or more outcomes of interest • Identify one or more assignments that develop and demonstrate the outcome(s) • Rate each student’s work just in relation to that outcome • Aggregate results and suggest significance for course/program
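  The aggregation step described above can be as simple as tallying how many students reach each rubric level on the outcome of interest. A minimal sketch in Python, where the rating labels and the sample ratings are hypothetical rather than taken from the handout:

```python
from collections import Counter

# Hypothetical rubric ratings for ONE outcome, one rating per student
ratings = ["exemplary", "proficient", "proficient", "developing",
           "exemplary", "proficient", "developing", "proficient"]

counts = Counter(ratings)
total = len(ratings)

# Report the share of students at each level for the course/program
for level in ["exemplary", "proficient", "developing"]:
    share = counts[level] / total
    print(f"{level}: {counts[level]} of {total} students ({share:.0%})")
```

  The same tallies can then be compared across assignments, sections, or years when suggesting significance for the course or program.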

  31. Practices Where possible, pair indirect observations of processes and perceptions with direct observations of outcomes (pp. 22, 25).

  32. Practices The dual goal of sampling: Representativeness and Manageability

  33. Practices Examples involving comprehensive sampling: • Survey of all senior majors • Application of rubric to all research abstracts in all seminars • Application of rubric to all work submitted for senior art show

  34. Practices Examples involving selective sampling: • Application of rubric to randomly-selected subset of final papers in capstone course • Pre/post administration of locally-developed quiz in required sophomore methods course • End-of-course survey in one introductory and one senior-level course • Aggregation of results on selected items in an evaluation form for student work
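  For the randomly-selected-subset strategy above, the draw itself is straightforward to document and reproduce. A minimal sketch, assuming a simple list of capstone papers (the file names and sample size are hypothetical):

```python
import random

# Hypothetical list of student artifacts from the capstone course
papers = [f"capstone_paper_{i:02d}.pdf" for i in range(1, 41)]

random.seed(2013)  # fixed seed so the sample can be reproduced and reported
sample = random.sample(papers, k=10)  # score 10 of the 40 papers with the rubric

print(sample)
```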

  35. Practices Which of these assessment strategies might be appropriate for the course- or program-level use of assessment evidence you identified earlier?

  36. A final thought…. Less really is more!
