How Campuses are Closing the GE Assessment Loop: A GEAR Report
Presentation to the Association of Institutional Research and Planning Officers
Buffalo, New York
June 11, 2009
Presenters
• Patricia Francis, SUNY Oneonta (GEAR Co-Chair)
• Melanie Vainder, Farmingdale State College (GEAR Co-Chair)
• Michael S. Green, Hudson Valley Community College
• Hedva Lewittes, College at Old Westbury
Session Topics
• GEAR Group History and Charge
• Rationale for Triennial Review/Closing the Loop Process
• Review Criteria for Closing the Loop Reports
• Good Closing the Loop Practice – Hudson Valley Community College
• Good Closing the Loop Practice – College at Old Westbury
GEAR Group: History, Charge, and Triennial Review/Closing the Loop Process
Patricia Francis, Associate Provost for Institutional Assessment and Effectiveness, SUNY Oneonta
GEAR Group Role
• Formation of GEAR Group in January 2001
  • To create guidelines for campus GE assessment plans
  • To provide initial and ongoing review of these plans, focusing on how campuses were using assessment results to improve their GE program
• Implementation of Strengthened Campus-Based Assessment in 2004
• Triennial Review/CTL Process (October 2007)
  • Plan updates triennial, not biennial (19 due each year, on March 1)
  • To document substantive changes to existing campus plans
  • To demonstrate that campuses were indeed using assessment data to improve teaching, learning, and programs
Review Criteria for Closing the Loop Reports
Melanie Vainder, Professor of Professional Communications, Farmingdale State College
Development of Guidelines
• Triennial Updates
  • Campuses asked to provide minimal information unless substantive changes had been made to the plan
  • GEAR's review of updates is evaluative (i.e., leading to approval or recommendations for revision)
• Closing the Loop Reports
  • GEAR's review of CTL reports is "for feedback only"
  • Close alignment between GEAR and Middle States expectations ("Closing the Loop Tips")
  • GEAR's intention that good CTL practices could be identified and shared with other campuses
Making Changes Based on Assessment
• Criterion #2
  • Did faculty members meet to review and discuss assessment results as compared to a priori standards?
  • Did these discussions lead to conclusions about strengths and weaknesses and, as appropriate, suggestions for programmatic or pedagogical change?
• Criterion #3
  • Were proposed changes clearly linked to assessment findings?
• Criterion #4
  • Were recommended changes in curriculum/teaching actually made?
Documenting the Assessment Process
• Criterion #6
  • Are assessment data based on documented results (i.e., not informal observation or anecdote)?
  • Are intended changes based on these results documented?
  • Does the institution provide evidence that closing the loop activities are taking place on an ongoing basis and that these activities are having a subsequent impact on student learning?
Institutional Support for Assessment
• Criterion #7
  • Does the institution provide professional development opportunities that will help faculty make programmatic changes intended to improve student learning?
  • Does the institution support faculty members in their attempts to learn more about the assessment of student learning?
• Criterion #8
  • Are resources allocated specifically as a result of documented assessment data?
  • Does the institution establish budgeting priorities based on assessment practices and results?
Implications for the Assessment Process
• Criterion #5
  • Do documented assessment findings lead to changes in the campus' assessment plan or processes?
  • Are results from student performance evaluated and compared to results from prior assessments?
• Criterion #9
  • Is the assessment process itself evaluated and revised for the next round (with faculty governance input as appropriate)?
  • Are all appropriate members of the campus community involved in this evaluation process?
  • Are evaluation results and proposed changes shared with the larger campus community?
Dissemination of Assessment Results
• Criterion #1
  • Does the institution provide evidence that student learning outcomes data are shared with all appropriate faculty and staff members for the purpose of review and discussion?
• Criterion #10
  • Are student learning outcomes data shared, in aggregate form, with faculty, departments, support staff, and administrators?
Closing the Loop at Hudson Valley Community College
Michael S. Green, Executive to the President for Institutional Effectiveness and Strategic Planning, Hudson Valley Community College
Closing the Loop Report – Structure
• Section I – Describes Changes to General Education Assessment Plan
• Section II – Describes Use of the Assessment Process and Results
Section I – Changes to Plan
• Describe Changes to Approved Plan and Provide Rationale for Changes
• Describe How Plan Undergoes Review
  • When does review take place?
  • Who does the review?
  • How are suggested changes approved and implemented?
• How Are Changes to Be Assessed?
Section I – Changes to Plan (cont.)
• Describe Results of Any Pilot Studies
• Describe Plans for Immediate Future
Section II – Use of Process and Results
• Describe Who Receives Results and How Results Are Communicated to the Campus
• Describe How Results Are Used at Department/Program and/or Campus Level(s) (Provide Specific Examples)
Section II – Use of Process and Results (cont.)
• Describe Any Professional Development Activities at Department and/or College Level(s) Related to Instruction and Assessment
• Describe the Linkages Between Assessment, Planning, and Resource Allocation
Suggestions
• Make Collection of GE Assessment Results, and of Steps Taken in Response to Them (e.g., Changes to Instruction and Courses, Professional Development Activities), Part of Regular Assessment Processes
• Communicate Directly with Department Chairs and Faculty to Get Information About How Results Are Used
• Share Steps Taken in Response to Assessment Results with the Campus Community (No Need to Reinvent the Wheel)
Closing the Loop at the College at Old Westbury
Hedva Lewittes, Professor of Psychology and Director of Academic Assessment, College at Old Westbury
Starting the Process: Reviewing GE Summary Report Forms
• Compare Results from Two Completed Cycles
• Review Recommended Actions
  • Which were accomplished and which were problematic?
  • Describe the process and people involved
• Use Section on "Program Improvements Made" from Second Cycle of GE Assessment as Outline for CTL Report
  • Include improvements at the institutional level that address findings and recommendations across knowledge areas
Example: Western Civilization
• First Assessment Round Revealed Target Not Met for Western Civilization Outcome #2
• Results Disseminated to GE Committee Representatives for Knowledge Area and Chairs (Criterion #1)
• Improvements Were Made Based on Review and Discussion (Criteria #2, #3)
  • Explicit incorporation of outcomes into syllabi and assignments (especially with adjunct faculty)
  • Development of new course, History of Math
• Clear Improvement in Student Performance in Second Assessment Round
Example: Basic Communication [Written]
• Writing Assessments Indicated Need to Improve Writing in Knowledge Areas, Including American History, Social Sciences, and Humanities (Criteria #1, #2, #3)
• Development and Filling of New Position (Director of the Writing Center) (Criterion #8)
• Actions to Be Assessed in Next Round (Criterion #5)
  • Referring students to the Writing Center
  • Bringing classes to the Center for instruction
Example: Critical Thinking
• Revision of Assessment Procedures Across Knowledge Areas to Accommodate SUNY Rubrics (Criteria #5, #9)
• Rubric Training Workshop Held, Co-sponsored by Dean and Teaching for Learning Center (Criteria #7, #8)
• Results Revealed Fewer Students Meeting Standards Compared to Previous Assessment (Criteria #1, #2)
• New Workshop Planned (Criteria #3, #4, #7, #8)
  • Will emphasize teaching, integrating critical thinking into the disciplinary curriculum, linking it with assignments/exams, and collaboration
  • To be funded by Provost
Example: Mathematics
• Results from Initial Mathematics Assessment (Criteria #1, #2, #3)
  • Percentage of students meeting standards high compared to other measures (e.g., grades, withdrawal rates)
  • Higher-level problem-solving skills not assessed adequately
  • Need to accommodate students at different ability levels
• Subsequent Professional Development and Funding Activities (Criteria #7, #8)
  • Faculty attendance at SUNY Conference and rubrics workshops
  • Submission of course redesign grant
  • Funding of power math format course
Examples: Criteria #6 (Mechanisms in Place to Document Changes) and #9 (Assessing the Assessment)
• Actions Taken to Clarify/Simplify Process and Encourage Feedback
  • Development of Uniform Course Report Format, allowing faculty to provide assessment results and comment on process
  • Development of Status Log Report
• Actions Taken to Build More Collaborative Process
  • Plans for "Domain Dinners"
    • Provide opportunity to develop uniform measures
    • Bring faculty together from different departments who teach in same GE area
    • Serve to "celebrate" assessment