
Assessment for Research Universities and Graduate Programs


Presentation Transcript


  1. Assessment for Research Universities and Graduate Programs Barbara Wright Associate Director, WASC bwright@wascsenior.org

  2. Our roadmap: • Why is assessment a challenge? • What is assessment, really? • How can it work in a research or graduate environment? • A case study

  3. Why is assessment a challenge? • Complex learning goals • Fewer models, less information • Less interest • Less felt need • Competition for faculty time, attention • Dominance of research as a value

  4. The Assessment Loop: 1. Goals, questions → 2. Gathering evidence → 3. Interpretation → 4. Use

  5. So what is assessment? It’s a systematic process of 1) setting goals for or asking questions about student learning, 2) gathering evidence, 3) interpreting it, and 4) using it to improve the effects of college on students’ learning and development, at any level of analysis from the individual student to the course, program, or institution.

  6. Assessment is not • Just the first step • Just the first and second steps. It’s the WHOLE process.

  7. Methods matter • Descriptive data – good • Indirect evidence – better • Direct evidence of learning – best

  8. Direct methods include … • Portfolios • Capstones (e.g., dissertations) • Performances (e.g., poster or conference presentations) • Assignments (e.g., papers, research projects) • Secondary readings • Comps, qualifying exams • Commercial tests

  9. Four dimensions of learning: • What students learn (cognitive as well as affective, social, civic, professional, spiritual and other dimensions) • How well (thoroughness, complexity, subtlety, agility, transferability) • What happens over time (cumulative, developmental effects; “value added”) • Is this good enough? (the ED question)

  10. Thinking about standards ... • Absolute standards: the knowledge/skill level of champions, award winners, top experts • Contextual standards: appropriate expectations for, e.g., a 10-year-old, a college student, an experienced professional • Developmental standards: amount of growth or progress over time, e.g., 2 years of college, 5 years • Institutional, regional, national standards?

  11. Keys to successful assessment • Student learning outcomes (SLOs) are meaningful • Methods align with learning goals and questions • Methods are embedded, not add-ons • Validity is in balance with reliability • Analysis is collective, collegial • Findings are actionable, lead to improvement

  12. Successful assessment, cont. • Student learning is part of program review • Faculty get support, development • Rewards are provided • Innovations are planned for, funded • The approach is sustainable – and intellectually interesting
