Re-visioning the General Education System
Dr. Darby Hiller, Office of Research, Planning & Effectiveness
Northwestern Michigan College
Michigan Association for Institutional Research
November 6-8, 2013, Grand Rapids, Michigan
Ancient History 2000-2005
In the beginning…there was Accreditation
• Develop common learning outcomes
  • Communications
  • Critical Thinking
  • Cultural Perspectives
• Develop methods to assess them
  • Artifact Method + Rubrics
• Monitoring Report to HLC (2003)
Then IR happened…
CHECK
• Rotating one outcome per semester
• Larger artifact samples: at least 350 for near-graduates
• Big scoring day: 30-50 faculty members
• Surveys of student perception: current students, graduates
• ACT’s CAAP test
  • Communications
  • Critical Thinking
Driving proof of ADJUST
• College-Wide Assessment Team becomes Scholarship Action Group
• Progress Report to HLC (2005)
Ideal Quality Improvement
[Diagram: ideal quality-improvement cycle, with roles for academic leaders, academic faculty, and IR with the Scholarship Action Group]
Our process led to lots of Assessment Results
Artifact Scoring Results
• 2013 - Quantitative Reasoning Artifact Results
• 2012 - Critical Thinking Artifact Results
• 2012 - Communications Artifact Results
• 2011 - Quantitative Reasoning Artifact Results
• 2011 - Critical Thinking Artifact Results
• 2010 - Communications Artifact Results
• 2009 - Critical Thinking Artifact Results
• 2008 - Communications Artifact Results
• 2007 - Critical Thinking Artifact Results
• 2006 - Communications Artifact Results
• 2005 - Artifact Results (Spring 2005)
  • Inter-reader reliability (see the sketch after this list)
• 2004 - Artifact Results (Fall 2004)
  • Inter-reader reliability
• 2003 - Artifact Results (Fall 2003)
  • Inter-reader reliability
Assessment Updates
• Assessment Update 2008
• What We Have Learned 2007
• What We Have Learned 2006
• What We Have Learned 2005
• What We Have Learned 2004
  • 2004 addendum
• Assessment Newsletter - Fall 2003
• Assessment Newsletter - Fall 2002
• Assessment Newsletter - Spring 2002
• Assessment Newsletter - Fall 2001
CAAP Critical Thinking Test Results
• 2009 - CAAP After-action report
• 2007 - CAAP After-action report
• 2005 - CAAP After-action report
• 2003 - CAAP After-action report
• 2002 - CAAP After-action report
  • 2002 - CAAP Report addendum
CAAP Writing Test Results
• 2002 - CAAP After-action report
Student Perceptions of Learning Survey
• 2006 - After-action report
• 2004 - After-action report
• 2003 - After-action report
• 2002 - After-action report
Faculty frequently joked about winning the lottery again.
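The inter-reader reliability reports above summarize how consistently two faculty readers scored the same artifacts. The deck does not name the statistic used; a common chance-corrected choice is Cohen's kappa. A minimal sketch, with illustrative data, assuming two readers scoring the same artifacts on a 1-4 rubric:

```python
from collections import Counter

def cohens_kappa(scores_a, scores_b):
    """Chance-corrected agreement between two readers' rubric scores."""
    n = len(scores_a)
    # Observed agreement: fraction of artifacts both readers scored identically.
    p_o = sum(a == b for a, b in zip(scores_a, scores_b)) / n
    # Expected chance agreement, from each reader's score distribution.
    freq_a, freq_b = Counter(scores_a), Counter(scores_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Illustrative rubric scores (1-4) from two readers on ten artifacts.
reader1 = [3, 2, 4, 3, 1, 2, 3, 4, 2, 3]
reader2 = [3, 2, 3, 3, 1, 2, 4, 4, 2, 3]
print(f"kappa = {cohens_kappa(reader1, reader2):.2f}")  # kappa = 0.71
```

Values near 1 indicate agreement well beyond chance; values near 0 mean the readers agree no more often than random scoring would predict.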
Some Adjustment 2005-2009
Adjusting the “Plan”
• Cultural Perspectives went away
  • Not pervasive enough in the curriculum
  • Research showed graduates might not be exposed to it
  • Content vs. skill
• It became a degree requirement instead: students select a qualifying course
Adjusting the “Plan”
• Added Quantitative Reasoning
• With a new scoring method
  • More one-piece flow
  • Scoring in the discipline
  • Instructor as first scorer
• Logistical issues
Adjusting the “Check”
• CAAP went away
  • The assessment method did not match our outcomes
• Student Perceptions Survey went away
  • Indirect perceptions did not match the direct results
• Assessment Coordinator added
  • A half-time faculty member joined IR
Revolutionary Adjustment 2010-2013
Current State Value Stream Mapping
WASTE: compensating steps, batched processes
Who is the Customer?
• Higher Learning Commission
• VP for Educational Services
• Faculty
• Students
• Community stakeholders
• Other?
Systems Thinking
[Diagram: “the box kite” of nested levels: Institution, Program, Course, Learner]
Systems Thinking
[Same box-kite diagram, annotated “YOU ARE HERE”]
Systems Thinking
[Diagram: Plan-Do-Check-Adjust cycle mapped onto the levels: Institution (aggregated results), Program, Course (artifacts from courses that support the outcome), Learner (sample of near-graduates)]
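To make the roll-up in the diagram concrete: artifacts are scored in the course, and the same scores aggregate upward into a single institution-level result for the outcome. A minimal sketch, with hypothetical course names and scores:

```python
from statistics import mean

# Hypothetical rubric scores (1-4) for one outcome, grouped by course.
course_scores = {
    "ENG 111": [3, 2, 4, 3],
    "ENG 112": [2, 3, 3],
    "COM 231": [4, 3, 2, 3, 4],
}

# Course level: each course sees its own result for the outcome.
for course, scores in course_scores.items():
    print(f"{course}: mean = {mean(scores):.2f}")

# Institution level: the same artifacts rolled up into one aggregated result.
all_scores = [s for scores in course_scores.values() for s in scores]
print(f"Institution: mean = {mean(all_scores):.2f}")
```

The point of the box-kite view is that nothing new is collected at the top: the institution-level “Check” is just the course-level evidence, aggregated.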
Strategic Re-Visioning 2012-2013
Strategic Goal
Make it so…that we can use the results of assessment to improve student learning.
(Gen Ed Re-visioning Team, a sub-team of the Curriculum Committee)
Incubating the Process
• Fall 2013 – Communications
• Sample of near-graduates selected (N=380)
• Full-time faculty in selected courses
  • First scorers for all students in their class
• Adjunct faculty
  • First scorers for the sample of selected students
• Send scores and artifacts to IR
  • N=1,110; near-graduates are a subset of this group
Incubating the Process
• Spring 2014 – Critical Thinking
  • Same process
• May 2014 – General Education Day
  • Faculty score only the sample of near-graduate artifacts, as second scorers (and third if needed); a sketch of this double-scoring flow follows
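A minimal sketch of that double-scoring flow (the data structures, threshold, and adjudication rule are assumptions for illustration; the deck only says a third scorer is used “if needed”):

```python
from dataclasses import dataclass, field

@dataclass
class Artifact:
    student_id: str
    near_graduate: bool                           # in the sampled near-graduate subset?
    scores: list = field(default_factory=list)    # rubric scores, one per reader

def needs_third_score(artifact, tolerance=1):
    """Flag artifacts whose first two readers disagree by more than
    `tolerance` rubric points (this adjudication rule is an assumption)."""
    s = artifact.scores
    return len(s) == 2 and abs(s[0] - s[1]) > tolerance

# Every artifact gets a first score from its instructor; only the
# near-graduate sample is routed to General Education Day for second scoring.
all_artifacts = [
    Artifact("s001", near_graduate=True,  scores=[3]),
    Artifact("s002", near_graduate=False, scores=[2]),
    Artifact("s003", near_graduate=True,  scores=[4]),
]
second_queue = [a for a in all_artifacts if a.near_graduate]

# After General Education Day, second scores are recorded...
second_queue[0].scores.append(3)   # s001: readers agree
second_queue[1].scores.append(2)   # s003: readers disagree by 2 points
# ...and disagreements are flagged for a third reader.
third_queue = [a for a in second_queue if needs_third_score(a)]
print([a.student_id for a in third_queue])   # ['s003']
```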
What We’ll Get, Incrementally
Results over time: learners will know exactly where they are and how far they’ve come.
Challenges and Rewards
• Perceived changes in faculty workload
• Full-time vs. part-time faculty participation
• Data collection: the “Kaizen” technology is not available yet
  • Scannable rubrics
  • Electronic worksheets for faculty
• Fuel for discussions in the discipline about improving student learning: how do we document the evidence of the follow-up and adjust?
• The value proposition for learners: “Here is what you’ll get…and we can prove it.”
Thank you! What questions do you have?
Darby Hiller, dhiller@nmc.edu