
Student Success Plans Regional Meeting February 9, 2007








  1. Student Success Plans Regional Meeting, February 9, 2007
  Youngstown State University Office of Assessment
  Sharon Stringer sastringer@ysu.edu
  Heather DiGregorio hldigreg@gmail.com

  2. Structure of the Assessment Process at YSU

  3. Assessment Process Overview
  • Step One: Fall 2003
    • Design and Review Student Learning Outcomes
  • Step Two: Spring 2004
    • Design, Review, and Revise Assessment Plans
  • Step Three: 2004-2005
    • Implementation of Assessment Plans: Monitor Progress and Visit/Assist Departments

  4. Assessment Process Overview cont’d
  • Step Four: 2005-2006 and on an annual basis thereafter
    • Review of Data on Student Learning in Departmental Assessment Reports and Provision of Feedback to Academic Departments

  5. Step One: Design and Review Student Learning Outcomes
  • During the Fall semester of 2003, departments submitted their learning outcomes for each of their degree programs
  • Assessment Council members reviewed each submission, using a rubric to evaluate whether the learning outcomes were clear, distinct, and measurable for each degree program
  • The Council provided written feedback to each department by the end of February 2004

  6. Step Two: Design, Review, and Revise Assessment Plans
  • During the Spring semester of 2004, departments submitted assessment plans
  • The Assessment Council reviewed and provided feedback on the assessment plans by May 2004

  7. Step Two: Design, Review, and Revise Assessment Plans cont’d
  • Plans were to include:
    • A minimum of four program learning outcomes
    • Links between the learning outcomes and departmental Mission and Goals
    • Tools for measuring student learning outcomes
    • A timeline for implementing the assessment plan
    • Methods for data aggregation
    • Descriptions of the feedback loop for program improvement

  8. Step Three: Implementation of Assessment Plans
  • The primary goal was to help all departments revise (if improvements were needed) and implement their assessment plans
  • The Office of Assessment collated examples of planning tools and data aggregation on student learning at YSU

  9. Step Three: Implementation of Assessment Plans
  • An example of a planning tool (PASS Map)
  • An example of a data aggregation form (Excerpt from the Spring 2004 Assessment Plan, Department of Psychology)

  10. Step Three: Implementation of Assessment Plans cont’d
  • While sharing these models, the Assessment Council continued to send feedback and periodic reminders of deadlines so that data on student learning would be collected on an ongoing basis
  • These data would be summarized in the assessment reports due September 30 of every academic year

  11. Primary Goal for 2005-2006
  • Implemented Step Four: Reviewed data on student learning in departmental assessment reports and provided feedback to academic departments
  • The Assessment Council carefully reviewed incoming departmental reports that provided aggregate data on student learning in undergraduate and graduate programs
  • The Council continued to improve the ongoing feedback cycle

  12. Background Information
  • Every department used the same standard template to write its assessment report
  • Departments that undergo accreditation review in their primary discipline were asked to provide relevant sections of their most recent accreditation report to the Council
  • Assessment Council members worked in one of five teams to review the departments’ reports using a standard rubric

  13. Examples from YSU Assessment Reports

  14. Presentation of Data

  15. Presentation of Data (Sociology) (Excerpt from the Fall 2006 Assessment Report, Department of Sociology)
  • The results for students’ general knowledge of sociology showed that juniors and seniors performed better on all Learning Outcomes than entry-level students. The results are shown in the table below:

  16. Presentation of Data (Sociology) (Excerpt from the Fall 2006 Assessment Report, Department of Sociology)
  • A separate assessment of Learning Objective 2 also showed that participating students did significantly better on the post-test than on the pre-test. The results are shown in the table below:

  17. Presentation of Data (Telecommunications) (Excerpt from Fall 2006 Assessment Report, Telecommunications Program)

  18. Presentation of Data (Telecommunications) (Excerpt from Fall 2006 Assessment Report, Telecommunications Program)

  19. Evaluation of Data

  20. Presentation of Strengths (Dietetics) (Excerpt from the Fall 2006 Assessment Report, Dietetic Technology Program)
  • As identified from student/graduate surveys and course evaluations:
    • Curriculum
    • University facilities and faculty
    • Seamless articulation from the 2-year program to the 4-year program
    • Program fills a need for the non-traditional student
    • Supervised practice sites and preceptors
    • Credit for prior learning and work experience
  • As identified by faculty/preceptors:
    • University support of the programs
    • Dedicated preceptors
    • Varied supervised practice sites
    • University support of the preceptors
  • As identified by employers:
    • The program fills community workforce needs
    • Graduates have a good base of knowledge
    • Graduates display professionalism

  21. Presentation of Weaknesses (Dietetics) (Excerpt from the Fall 2006 Assessment Report, Dietetic Technology Program)
  • As identified from student/graduate surveys and course evaluations:
    • Lack of preparation for the registration exam
    • Lack of clinical orientation (classroom) prior to supervised practice
    • Inconsistent preceptor preparation for students
    • Diversity education
  • As identified by faculty/preceptors:
    • Student evaluation system
    • Variance in preparation and knowledge base
    • Time constraints on university faculty
    • Contact time with the program director
  • As identified by employers:
    • People skills such as workplace etiquette
    • Clinical support skills such as medical abbreviations

  22. Dissemination of Results
  • Faculty:
    • Fall Semester: Annual departmental meeting to design and discuss implementation of action plan(s) to improve the program
    • Spring Semester: Annual departmental meeting to discuss results on student learning

  23. Dissemination of Results
  • Students:
    • Shared during particular courses (such as the capstone) as appropriate
    • Shared in the department newsletter
    • Shared during advisement, with information on curriculum sheets

  24. Dissemination of Results
  • Other constituents:
    • Conveyed through a department newsletter
    • Shared on the departmental website
    • Shared through personal communication
    • Data submitted to a national association or an accrediting body that, in turn, publishes the information

  25. Closing the Loop

  26. Closing the Feedback Loop (Counseling) (Excerpt from the Fall 2006 Assessment Report, Department of Counseling)

  27. Closing the Feedback Loop (Counseling) (Excerpt from the Fall 2006 Assessment Report, Department of Counseling)
  • Course additions and revisions identified in last year’s report have substantially enhanced program quality
  • The hiring of two tenure-track faculty members has significantly reduced the percentage of courses taught by part-time instructors and enhanced opportunities for student learning

  28. Conclusion
  • A learning process
  • A shared collaboration
  • An emphasis on continual improvement
