
Closing the Loop: How to Do It and Why It Matters

June 2008. Minneapolis, MN.





Presentation Transcript


    1. Closing the Loop: How to Do It and Why It Matters Barbara D. Wright Associate Director, Senior Commission Western Association of Schools and Colleges (WASC) bwright@wascsenior.org

    2. Our roadmap --
       - What does “closing the loop” really mean?
       - What are the impediments?
       - What can we do about them?
       - Where does my institution stand?

    3. The steps of the assessment process:
       1. Goals, questions
       2. Gathering evidence
       3. Interpretation
       4. Use

    4. Other (subordinate) steps in the assessment process . . .
       - Planning
       - Mapping goals onto curriculum
       - Adding outcomes to syllabi
       - Offering faculty development
       - Reporting
       - Communicating
       - Adding assessment to program review
       - Assessing the assessment

    5. Where do we get stuck? Step 1? Step 2? Step 3? Step 4? Revisiting? Somewhere else?

    6. Why do we get stuck?

    7. Why the loop doesn’t get closed . . .
       - Incomplete definition of assessment
       - Premature planning
       - Lack of leadership
       - Inadequate resources
       - Trying to do too much
       - Attention not sustained
       - A compliance mentality

    8. Why the loop doesn’t get closed, cont. . . .
       - Philosophical resistance
       - Conflict with other faculty duties
       - “Assessment fatigue” after #1, #2
       - Discomfort with collaboration
       - It’s someone else’s job – IR, maybe
       - Cynicism about the whole enterprise
       - ??

    9. Shifts in our understanding of assessment
       - Isolated facts, skills → a full range of knowledge, skills, dispositions
       - Memorization, reproduction → problem solving, investigating, reasoning, applying, communicating
       - Comparing performance against other students → comparing performance to established criteria

    10. Shifts in assessment, cont.
       - Scoring right/wrong answers → looking at the whole reasoning process
       - A single way to demonstrate knowledge (e.g., m/c or short-answer test) → multiple methods & opportunities (e.g., open-ended tasks, projects, observations)
       - Simplified evidence → complex evidence

    11. Shifts in assessment, cont.
       - A secret, exclusive & fixed process → open, public & participatory
       - Reporting only group means, normed scores → disaggregation, analysis, feedback
       - Scientific → educative
       - A filter → a pump
       - An add-on → embedded

    12. Shifts in assessment, cont.
       - “Teacher-proof” assessment → respect, support for faculty & their judgments
       - Students as objects of measurement → students as participants, beneficiaries of feedback
       - Episodic, conclusive → continual, integrative, developmental
       - Reliability → validity

    13. A related problem . . . The too complete definition of assessment:

    14. Step #2: Learning goals and the hierarchy of specificity

    15. Levels of specificity – an example

    16. Thinking horizontally as well as vertically about evidence (step #2)...

    17. Step #3: Interpretation
       - An inclusive, collegial “community of judgment”
       - Meaning out of data
       - Shared understanding of strengths, weaknesses, needs
       - Decisions, planning for action
       - Communication about planned action
       - Catalyst for change in campus culture

    18. Step #4: Using the findings
       - Defining the action
       - Planning for implementation:
         Who will manage? Contribute?
         What expertise, support will it take?
         What funding is needed?
         How will we get what we need?
       - Implementing

    19. Closing the loop: Back to Step #1
       - What do the findings tell us now?
       - Did our “treatment” improve learning?
       - What else do the findings show?
       - What’s the next step?
       - What have we learned about our assessment process? Infrastructure?
       - What can be improved?

    20. Why the loop doesn’t get closed
       - Incomplete definition of assessment
       - Premature planning
       - Lack of leadership
       - Inadequate resources
       - Inconsistent attention
       - A compliance mentality

    21. Premature planning leads to . . .
       - Faculty frustration
       - Resentment, cynicism
       - A plan of dubious value

    22. Mapping can be useful. It . . .
       - shows where desired outcomes are already taught, and at what level
       - reveals gaps
       - promotes cross-course & cross-disciplinary conversation, collaboration
       - makes course, program outcomes clearer to students
       - shows where to make interventions

    23. Mapping also carries risks. It . . .
       - plays into faculty focus on inputs (i.e., new courses, course content)
       - assumes we know what to do before we’ve looked at evidence, findings
       - can take a lot of time (e.g., syllabus review, course development, course approval, effectiveness, etc.)
       - tempts us to remove outcomes that can’t be readily mapped or taught

    26. Why the loop doesn’t get closed: a faculty perspective . . .
       - Philosophical resistance
       - Conflict with other faculty duties
       - “Assessment fatigue” after #1, #2
       - Discomfort with collaboration
       - It’s someone else’s job – IR, maybe
       - Cynicism about the whole enterprise
       - ??

    27. Conflict with faculty duties: Is assessment . . .
       - Teaching?
       - Research?
       - Service?
       - All of the above?
       - Something else?

    28. Why the loop should be closed
       - Return on the investment in #1, #2
       - Improvement of learning
       - Stronger programs
       - Fewer silos, more integration
       - More collegiality
       - Happier, more successful students
       - Happier, more satisfied employers
       - More willing donors

    29. Why close the loop, cont.
       - Better retention, graduation rates
       - More successful accreditation review
       - Shared campus understanding of mission, learning goals, and what is being done to achieve them
       - Clearer, more substantive communication with the public

    30. So where does your institution stand?
       - Where are you on the loop in terms of assessing general education?
       - What are the major impediments that you need to address?
       - How can you address them?
       - What resources can you draw on?
       - What strategies can you use?
