Student Learning Assessment: Closing the Loop
Leanne Charlesworth & Mary Louise Gerek
Nazareth College of Rochester
February 9, 2010
“It is pointless simply to ‘do assessment’; the results of assessment activities should come full circle to have a direct impact on teaching and learning and on the institution’s strategic plan to fulfill its mission…if the results of assessment are not used to improve student learning, assessment becomes at best a descriptive set of data about students and, at worst, a useless exercise.” (MSCHE, 2007, p. 59)
Assessment as a Four-Step, Continuous Cycle
1. Establish Learning Goals
2. Provide Learning Opportunities
3. Assess Student Learning
4. Use the Results
(Suskie, 2009, p. 4)
Nazareth College Planning Cycle: Planning, Implementation, Evaluation, and Improvement
• Planning Step 1: Development/review of unit or program
• Planning Step 2: Establish measurable outcomes with performance indicators
• Implementation Step 1: Design programs & services
• Implementation Step 2: Delivery of programs/services (provide learning opportunities)
• Evaluation Step 1: Gather the data/evidence required to measure outcomes
• Evaluation Step 2: Analyze evidence and report results to internal and external stakeholders
• Improvement: Propose improvements based on findings from the analysis
Adapted from NACUBO EHE Workbook, 2009, and T. Banta, IUPUI
Evidence Diagram • Trudy Banta Handout
Loop Closing & Potential Focal Points:
• Curriculum or student programming
• Teaching methods or student practices
• Resource allocations
• Assessment approach or measures
• Learning goals or outcomes
Examples: National Context
• Student Learning Assessment: Options and Resources (2nd ed.), MSCHE
Examples: Nazareth Context
• History
• Social Work
• Inclusive Childhood Education
• Academic Advising
Examples: Nazareth Context
History
• 2002 program review identified a weakness in historiography through the review of comprehensive exams and portfolio artifacts.
• Response: Created a new sophomore course required of all students; revamped all courses to include more historiography.
• 2008–2009 academic assessment: Implemented the “Towson State” assessment methodology (little bubbles) and applied its score sheet to assess historiography performance in portfolios and research papers.
• Results of assessment show improvements in student performance.
Examples: Nazareth Context
Social Work
• Multiple indirect and direct measures within the assessment process; one direct measure is the senior field evaluation.
• 2005, 2006: Senior field evaluations fell below the program benchmark.
• Response: Implemented a mid-semester senior field evaluation and a new policy for students falling below a mid-semester benchmark.
• Results: 2007–2009, meeting senior final field evaluation benchmarks.
• New loop: Examining new approaches to field performance evaluation, focusing on validity and consistency, including consistency across educational levels and area institutions.
Examples: Nazareth Context
Inclusive Childhood Education
Multiple measures for assessing claims:
• Student teaching evaluations
• NYS content specialty tests (required for certification)
• Portfolio content assessed through departmental rubrics
Results have been used to:
• Offer training to cooperating teachers on evaluating student teachers
• Evaluate curricular, pedagogical, and programmatic options to ensure strong student writing
• Change rubrics to improve the assessment process
Examples: Nazareth Context
Academic Advisement
• Developed seven SLOs for the advisement process; assess two to three each year.
• Innovations, based on results about students’ knowledge of degree requirements:
• AA is offering a new workshop for juniors this spring.
• Developed a resource for advisors, “A Guide to Teaching New Advisees,” to remind them what to teach advisees so they become self-sufficient in their educational planning.
• Survey of registering freshmen: Changed the registration process in response to the questions raised; continue to monitor issues as they arise.
Case Scenario: Criminal Justice Case Study
• Take 2–3 minutes to read through the summary information.
• Turn to a partner and, focusing on “closing the loop,” share observations and recommendations.
Barriers to Closing the Loop
[Diagram contrasting the desired assessment cycle (Planning → Data Collection and Analysis → Utilize Data → Follow-up) with the typical cycle]
Barriers to Closing the Loop
• Philosophical resistance or differences.
• Viewed as someone else’s job (e.g., IR & A, chairperson, junior faculty).
• Inappropriate initial focus and approach; information seems to have no use.
• Not rewarded or valued.
• Overload; conflict with other faculty duties.
• Assessment fatigue, after completing earlier steps in the cycle.
• Innovation and change are not part of the infrastructure.
So why do it?
• Improved faculty communication and collaboration.
• Clearer communication with the public and students.
• A step toward shared departmental and campus understanding of mission, learning goals, and the best strategies to achieve them.
• Validate or improve your work and its impact.
• Improved student experiences and learning; student success!
Strategies for Overcoming Barriers
• Plan for action early on in the assessment process; collaborate; prepare for change.
• Develop a long-term assessment cycle and plans.
• Focus on feasibility and sustainability throughout the assessment process.
• Widely share and discuss results; engage students in the assessment process; engage in shared decision-making.
• Identify the “who, what, when” of each step, including implementation and follow-through.
• Clarify the process for budget prioritization; provide resources for action and implementation.
• Keep the focus on the purpose: student learning and success.
If a tree falls in a forest and no one is there to hear it, does it make a sound?
Document, Document, Document!
“Tell a Story” (Suskie, 2009)
• Address key questions:
• What have you learned about your students’ learning?
• What are you going to do about what you have learned?
• When, where, and how are you going to do it?
• Focus on “big news”: for example, recurring themes, statistically significant findings, or trends over time.
Assessment Documentation and Summaries
• Who (internally and externally) needs to know?
• Why? What decisions will be made?
• What do they need to know or see to make those decisions?
• Focus on brief summaries highlighting important information; share details when needed or as requested.
• Share information with Nazareth IR & A, but the IR & A reporting requirement should not drive the process.
What is “Good” Assessment?
• Clear and important goals
• Reasonably accurate and truthful results
• Used
• Cost-effective
• Valued
(Suskie; MSCHE)
References
• Maki, P. L. (2004). Assessing for learning: Building a sustainable commitment across the institution. Sterling, VA: Stylus Publishing.
• Middle States Commission on Higher Education (2005). Assessing student learning and institutional effectiveness: Understanding Middle States expectations. Philadelphia, PA: Author.
• Middle States Commission on Higher Education (2007). Student learning assessment: Options and resources (2nd ed.). Philadelphia, PA: Author.
• Suskie, L. (2009). Assessing student learning: A common sense guide (2nd ed.). San Francisco, CA: Jossey-Bass.
• Walvoord, B. (2004). Assessment clear and simple: A practical guide for institutions, departments, and general education. San Francisco, CA: Jossey-Bass.
• Wright, B. D. (2008). Closing the loop: How to do it and why it matters. AAC&U Institute on General Education. Retrieved from http://www.aacu.org/meetings/institute_gened/documents/WRIGHT_ClosingLoopAACU_6-08.ppt