The Bigger Picture: The Next Level of Assessment Practice
Barbara D. Wright
Associate Director, Western Association of Schools and Colleges
bwright@wascsenior.org
AAC&U Conference, Miami, FL
Our roadmap . . .
• What are your goals for this workshop? Questions?
• Some key points about assessment
• The hierarchy of specificity: an exercise
• Break
• Supporting structures
• Case studies
• Your plans
• Wrap-up, workshop evaluation
The Assessment Loop
Goals, questions → Gathering evidence → Interpretation → Use → back to goals, questions
What exactly is assessment?
It’s a systematic process of 1) setting goals for or asking questions about student learning, 2) gathering evidence, 3) interpreting it, and 4) using it to improve the effects of college on students’ learning and development – at any level of analysis, from the individual student to the course, program, or institution.
Other (subordinate) steps in the assessment process . . .
• Planning
• Mapping goals onto curriculum
• Adding outcomes to syllabi
• Offering faculty development
• Reporting
• Communicating
• Adding assessment to program review
• Assessing the assessment
Mapping outcomes onto curriculum and pedagogy -- it can reveal . . .
• where the skill is taught
• how it is taught
• how consistently it is reinforced
• where there are intervention points
• (But don’t obsess over syllabi or course descriptions. Ultimately, they’re just inputs, not outcomes.)
A faculty lament . . .
• We already test and assign grades. We flunk the ones who don’t measure up. Why do we have to do assessment?
Testing and Grading vs. Assessment
Testing and grading:
• Evaluation first, feedback second
• Quality assurance
• Individuals
• Private
• Follow-up random, serendipitous
• Follow-up not supported
• Focus on the student, the course
Assessment:
• Feedback first, evaluation second
• Quality improvement
• Samples
• Collective, collegial
• Follow-up systematic, expected
• Follow-up supported, rewarded
• Focus on the program, the institution
Levels of Assessment
• Individual student learning within courses
• Individual student learning across courses
• Courses
• Programs
• The institution
From: Levels of Assessment, Miller and Leskes, AAC&U, 2005
From Levels of Assessment, Miller and Leskes, AAC&U, 2005
“Evidence of student learning should be used for multiple levels of assessment … The best evidence … comes from direct observation of student work rather than from an input inventory (e.g., list of courses completed) or summary of self reports … Course-embedded assignments provide the most valid evidence for all levels of analysis.”
From Levels of Assessment, Miller and Leskes, AAC&U, 2005
“The ways of sampling, aggregating, and grouping the evidence for analysis … depend on the original questions posed. The questions will also determine how the data are interpreted to produce action.”
Sample questions at the institutional level:
► Just how information-literate are our graduates?
► Do they make steady progress throughout their college career?
From Levels of Assessment, Miller and Leskes, AAC&U, 2005
“Faculty members and staff accomplish aggregation by describing standards, translating them into consistent scoring scales, and anonymously applying the resulting rubrics to the evidence at hand. Such a process does not assign a grade to an individual student but rather attempts to understand better the learning process and how to improve.”
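To make the aggregation step concrete, here is a minimal sketch in Python, assuming artifacts from a sample of students have already been scored anonymously on a shared 1–4 rubric scale. The criteria names and scores are invented for illustration; they are not taken from the Miller and Leskes report.

```python
# A minimal sketch of program-level aggregation, assuming each sampled,
# anonymized artifact has already been scored 1-4 on shared rubric
# criteria. All names and numbers here are hypothetical.
from statistics import mean

# One dict per sampled artifact (criterion -> score), no student identifiers.
sampled_scores = [
    {"thesis": 3, "evidence": 2, "citation": 4},
    {"thesis": 4, "evidence": 3, "citation": 3},
    {"thesis": 2, "evidence": 2, "citation": 3},
]

# Aggregate by criterion, not by student: the result characterizes the
# program's learning patterns rather than assigning anyone a grade.
for criterion in sampled_scores[0]:
    scores = [artifact[criterion] for artifact in sampled_scores]
    print(f"{criterion}: mean {mean(scores):.2f} (n={len(scores)})")
```

A real process would also track sample size and inter-rater agreement; the point here is only that aggregation happens per criterion, across students.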
Some complex learning goals --
• Communication
• Critical thinking
• Information literacy
• Quantitative problem-solving
• Team and leadership skills
• Intercultural competence
• Ability to transfer knowledge, skills
• Exercise of civic, social responsibility
Methods for complex outcomes
• are open-ended
• pose authentic, compelling tasks
• stimulate student engagement, creativity
• require integration of knowledge, skills, dispositions
• demonstrate cumulative learning
• are educative for students and educators alike
• provide meaningful info for improvement
Methods for complex outcomes … at any level of analysis --
• Portfolios
• Capstones
• Performances
• Common assignments
• Secondary readings
• Course management programs
• Local tests
• Student self-assessment
Four dimensions of learning --
• What students learn (cognitive learning, skills, dispositions)
• How well? (thoroughness, complexity, subtlety, agility, transferability)
• What happens over time? (cumulative, developmental effects)
• Is this good enough? (federal concern)
The rubric – it
• defines what we’re looking for
• offers a set of scoring guidelines
• tells where to look
• tells what to look for (criteria) and
• provides descriptors of each level of quality
In other words, it’s a tool for determining “what” and “how well.”
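As a concrete illustration of criteria plus level descriptors, here is a minimal sketch of a rubric represented as a data structure, in Python. The two criteria and their four-level descriptors are invented examples, not a published rubric.

```python
# A minimal sketch of a rubric as data: criteria, each with descriptors
# for four levels of quality. Criteria and wording are hypothetical.
from dataclasses import dataclass

@dataclass
class Criterion:
    name: str
    descriptors: list[str]  # descriptors[i] describes quality level i + 1

writing_rubric = [
    Criterion("Thesis", [
        "No identifiable thesis",
        "Thesis present but vague",
        "Clear, focused thesis",
        "Clear thesis that frames the entire argument",
    ]),
    Criterion("Evidence", [
        "Claims largely unsupported",
        "Some relevant evidence",
        "Consistent, relevant evidence",
        "Well-chosen evidence, critically weighed",
    ]),
]

def report(rubric: list[Criterion], scores: dict[str, int]) -> None:
    """For each criterion, state what the assigned level means."""
    for criterion in rubric:
        level = scores[criterion.name]
        print(f"{criterion.name}: level {level} - {criterion.descriptors[level - 1]}")

# Example: what scores of Thesis=3, Evidence=2 say about a paper.
report(writing_rubric, {"Thesis": 3, "Evidence": 2})
```

The design point is that the descriptors, not the numbers, carry the meaning: the same structure answers both “what” (the criteria) and “how well” (the level descriptors).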
The hierarchy of specificity
Institution-wide goals → College-wide goals → Department- & program-wide goals → Course-level goals
The hierarchy of specificity
Oral & written communication → Professional communication → Ability to write for business → Ability to write a business plan
Now you try it --
• Communication
• Critical thinking
• Information literacy
• Quantitative problem-solving
• Team and leadership skills
• Intercultural competence
• Ability to transfer knowledge, skills
• Exercise of civic, social responsibility
• ?
Think horizontally as well as vertically . . .
Vertically: Oral & written communication → Professional communication → Ability to write for business → Ability to write a business plan
Horizontally: Internship * Student government * Business courses * Gen Ed
Now you try it . . . across the college experience
• Communication
• Critical thinking
• Information literacy
• Quantitative problem-solving
• Team and leadership skills
• Intercultural competence
• Ability to transfer knowledge, skills
• Exercise of civic, social responsibility
• ?
Remember – when you’ve got data, you’re only halfway there
• Data are not information; information is not knowledge
• An inclusive “community of interpretation” is needed to make meaning of the findings
• Interpretation should lead to plans for improvement, shared commitment
• Reports, plans, and recommendations do not equal action
• Action requires resources
• Faculty can’t do it all on their own
Who needs to be involved?
• Faculty
• Student affairs, library, academic support, IR, director of TLC, etc.
• Students
• Chairs, deans, VPAA
• Advisory board members, external experts, faculty
• ?
“Institutionalizing assessment” – 2 aspects:
• The PLAN for assessment (i.e., shared definition, purpose, values, vocabulary, communication, use of findings)
• The STRUCTURES and RESOURCES that make the plan doable
Three alternatives for institutionalizing assessment: Centralized
Positives:
• Ability to focus on desired priorities, level of analysis
• Administrative control
• Efficiency
• Economies of scale
Negatives:
• Central administration grows
• There are administrative, PR costs
• Connection to grassroots lacking
• Faculty are alienated or uninterested: assessment is an administrator’s job
Three alternatives for institutionalizing assessment: Decentralized
Positives:
• Minimal additional costs
• Close to classroom, sites where learning occurs
• High faculty ownership, involvement
• Discipline-appropriate approaches
Negatives:
• Perception: “Sure, just dump it on the faculty”
• Communication challenging
• Duplication, inefficiency are likely
• There are opportunity costs
• There is no uniform approach
• Higher levels of analysis problematic
Three alternatives for introducing and institutionalizing assessment: A middle course
Identify elements that can be handled at the institutional level, e.g.
• Overall coordination
• Data storage, reporting (IR)
• Articulation of policy, parameters, levels
• Implementation (e.g., budget for software, consultants, conferences)
• Provision of rewards (individual, program)
• Communication (faculty, publics, accreditors, state, etc.)
Three alternatives for introducing and institutionalizing assessment: A middle course
Delegate elements best handled “on the ground” in individual programs, e.g.
• Definition of course-level outcomes
• Development of methods
• Interpretation of findings
• Recommendations for improvement
• Reflection on what has been learned
• Contributions to higher-level analysis
Assessment at higher levels requires --
• A formal structure
• Assessment acknowledged in policy, contracts, other documents
• Dedicated resources
  – Budget
  – Personnel
  – Location
• Accountability
• Etc.
How to institutionalize --
• Make assessment a freestanding function
• Attach to an existing function, e.g.
  – Accreditation
  – Academic program review
  – Annual reporting process
  – Center for Teaching Excellence
  – Institutional Research
Make assessment freestanding -- Positives and Negatives
Positives:
• Maximum flexibility
• Minimum threat, upset
• A way to start
Negatives:
• Little impact
• Little sustainability
• Requires formalization eventually, e.g. Office of Assessment
• May be difficult to connect to higher levels of analysis
Attach to accreditation -- Positives and Negatives
Positives:
• Maximum motivation
• Likely compliance
• Resources available
• Staff, faculty assigned
• Clear cause/effect
• May be easier to connect
Negatives:
• Resentment of external pressure
• Us/them dynamic
• Episodic, not ongoing
• Reporting, gaming, not improving
• Little faculty involvement
• Little connection to the classroom, learning
• Main focus: inputs
Attach to Center for Teaching Excellence -- Positives and Negatives
Positives:
• Strong impact possible
• Ongoing
• Close connection to faculty, classroom, learning
• Maximum responsiveness to “use” phase
Negatives:
• Impact depends on how broadly assessment is done
• No enforcement
• Little/no reporting, communicating
• Rewards, recognition vary, may be lip service
Attach to program review -- Positives and Negatives
Positives:
• Some impact (depending on stakes)
• Some compliance
• Some resources available
• Staff, faculty assigned
• Cause/effect varies
Negatives:
• Impact depends on how well PR is done
• Episodic, not ongoing
• Inputs, not outcomes
• Reporting, not improving
• Generally low faculty involvement
• Weak connection to the classroom, learning
Attach to annual report -- Positives and Negatives
Positives:
• Some impact (depending on stakes)
• Ongoing
• Some compliance
• Habit, expectation
• Closer connection to classroom, learning
• Cause/effect possible
• Allows flexibility
Negatives:
• Impact depends on how seriously, how well AR is done
• No resources
• Reporting, not improving, unless specified
• Chair writes; faculty involvement varies
How can we increase weighting of learning & assessment in PR? E.g., from:
• Optional part
• One small part of total PR process
• “Assessment” vague, left to program
• Various PR elements of equal value (or no value indicated)
• Little faculty involvement
to:
• Required
• Core of the process (emphasized in instructions)
• Assessment expectations defined
• Points assigned to PR elements; student learning gets 50% or more
• Broad involvement
Don’t confuse program-level assessment and program review
• Program-level assessment means we look at learning on the program level (not the individual student or course level) and ask what the key learning experiences of a program add up to.
• Program review looks for program-level assessment of student learning but goes beyond it, also examining other components of the program, e.g. mission, faculty, facilities, demand, etc.
New trends for PR/assessment (cf. WASC accreditation process)
• Create a program portfolio
• Keep program data continuously updated
• Do assessment on annual cycle
• Enter assessment findings, uses, by semester or annually
• For periodic PR, review portfolio and write reflective essay on student AND faculty learning
Budget items
• Released time, support staff time
• Equipment, supplies (e.g. purchased instruments, software, computers, servers, books, photocopying)
• Development (e.g., consultants, workshops)
• Incentives and rewards (e.g. faculty mini-grants, travel, stipends, merit)
• Communications
So what’s your plan?
• What learning goal is a priority for higher-level analysis?
• What’s your question?
• What’s already in place?
• What do you need?
• Who needs to be involved?
• What’s the timeline?