Explore the importance of improvement in education assessment, with examples and strategies to enhance student learning outcomes. Learn how to differentiate between change and improvement in program evaluations.
A Simple Model for Improvement: Weigh Pig, Feed Pig, Weigh Pig. For: Palo Alto College. Keston Fulcher, Ph.D., James Madison University. 1/9/2018.
Improvement by JMU’s Computer Information Systems: 2015 Seniors vs. 2016 Seniors [chart comparing cohort ratings across rubric levels: Beginning, Developing, Competent, Excellent, Outstanding, Experienced Professional]
Where is Learning Improvement?
• Banta, Jones, and Black (2009): 6% of programs could demonstrate improvement
• Blaich and Wise (2011): Even with top-tier methodological support, little evidence of action taken on results
• Banta and Blaich (2011): Could not find enough examples of learning improvement to write an article about it
Assessment Cycle [diagram highlighting the step: Use Results for Improvement]
Why Improvement Matters
• We “sell” assessment this way
• Accountability: its spirit is often improvement
• More interest within programs/units
Confusion over Definition
• What counts as improvement?
• Making a change to the assessment process?
• Making a change to the program (e.g., a curricular or pedagogical modification)?
Model for Improvement (Learning Example): when a program first modifies its curriculum or pedagogy, that is only a change. Only after re-assessment shows better student learning can we say the change is an improvement.
Model for Improvement (Administrative Example): likewise, a modification to an administrative process is at first only a change. Only after re-assessment shows better results can we say the change is an improvement.
“Change” v. “Improvement”: Improvement or not? The Country Music B.A. program at Houston College has been conducting student learning outcomes assessment for several years. Based on three years of assessment data, faculty from the Country Music B.A. program found that the rubric they use to rate students’ pedal steel guitar performances does not yield reliable scores. Furthermore, while discussing the rubric during a staff meeting, faculty realized they were interpreting and using the rubric in varied ways. With the help of an assessment expert, the faculty revised the rubric and now engage in rater training on it every semester.
“Change” v. “Improvement”: Improvement or not? Based on ratings from their revised rubric, the Country Music B.A. program faculty find that students’ pedal steel guitar performances could improve. Students are unable to change the pitch of one or more strings while other strings stay at the same pitch, and they struggle to change the pitch of the strings at differing rates. Faculty who teach CM-450 (Advanced Guitar Technique), the course in which students learn to play the pedal steel guitar, revised the curriculum to include three additional lessons on changing the pitch of the pedal steel guitar. They also changed one of the required class recital assignments to include a song that requires multiple pitch changes, so students practice this technique as part of a graded assignment and receive instructor feedback.
“Change” v. “Improvement”: Improvement or not? Two years after making these programmatic changes, faculty from the Country Music B.A. program examine their pedal steel guitar assessment data. They discover that senior Country Music majors received higher scores on the revised rubric than senior Country Music majors from two cohorts ago. Improved learning!
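The cohort-to-cohort comparison in this scenario is the “weigh pig” step done twice. As a rough illustration only, here is a minimal sketch, assuming hypothetical rubric scores and the SciPy library (neither comes from the presentation), of how a program might check whether the later senior cohort scored meaningfully higher than the earlier one.

```python
# Minimal sketch: comparing rubric scores from two senior cohorts.
# The score lists below are hypothetical; a real analysis would use the
# program's actual re-assessment data and a more careful design.
from statistics import mean, stdev
from scipy import stats

cohort_before = [2.1, 2.4, 2.0, 2.6, 2.3, 2.2, 2.5]  # seniors before the change (hypothetical)
cohort_after = [2.9, 3.1, 2.7, 3.3, 3.0, 2.8, 3.2]   # seniors two cohorts later (hypothetical)

# Independent-samples t-test: is the difference larger than chance alone would suggest?
t_stat, p_value = stats.ttest_ind(cohort_after, cohort_before)

# Cohen's d: how large is the difference in practical terms?
pooled_sd = (((len(cohort_before) - 1) * stdev(cohort_before) ** 2 +
              (len(cohort_after) - 1) * stdev(cohort_after) ** 2) /
             (len(cohort_before) + len(cohort_after) - 2)) ** 0.5
cohens_d = (mean(cohort_after) - mean(cohort_before)) / pooled_sd

print(f"t = {t_stat:.2f}, p = {p_value:.3f}, d = {cohens_d:.2f}")
```

A higher mean alone is only part of the case; as the definition later in this deck notes, the design must also rule out alternative explanations such as sampling differences or student motivation.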
[Diagram: matching the level of the problem (individual faculty, program faculty, program) with the response (faculty development, program assessment, LID), with example course numbers 210, 250, 280, 301, 310, 390, and 402]
LID Integration
• Integrate faculty/staff development & assessment
• Work with colleagues
• Ensure support from administration
• Focus on 1 (yes, 1) outcome
Definition: distinguish between change and improvement.
Prioritize: make improvement a university-level priority.
Focus: select one outcome to improve.
If We Succeed…
• Superior student learning
• More effective processes
• Positive faculty and staff outcomes
• Better accountability
• IMPROVED HIGHER EDUCATION
Evidencing Improvement: A Definition. Strong evidence supporting substantive improvement due to program/unit modifications. The program responded to previous assessment results, made modifications, re-assessed, and found improvement. The rationale and explanation of the modifications leading to the improvement are clearly laid out. The methodology is strong enough that most reasonable alternative hypotheses can be ruled out (e.g., sampling concerns, validity issues with the instrument, or student motivation). In essence, the improvement interpretation can withstand reasonable critique from faculty, staff, curriculum experts, assessment experts, and external stakeholders (modified from Fulcher, Sundre, Russell, Good, & Smith, 2015).
Special Thanks to… Pig Paper co-authors: Rodgers-Good, Coleman, and Smith.
Acknowledgements…
• Cara Meixner, CFI Executive Director
• Carol Hurney, former CFI Executive Director
• Jeanne Horst, CARS; Steve Harper, CFI
• College of Business and CIS, especially Diane Lending & Tom Dillon
• College of Arts and Letters and School of Communication Studies, especially Corey Hickerson, Lori Britt, and Annick Dupal
• Provost Jerry Benson
• Vice Provosts Linda Halpern and Marilou Johnson; former Vice Provost Teresa Gonzalez
• Deans Mary Gowan and David Jeffrey
• Kristen Smith and Megan Rodgers Good
Questions…
• fulchekh@jmu.edu
References
• First Pig Paper: http://learningoutcomesassessment.org/documents/Occasional_Paper_23.pdf
• Second Pig Paper: http://www.rpajournal.com/return-of-the-pig-standards-for-learning-improvement/