Assessment Reports YSU Office of Assessment October 9 & 10, 2012
Goals of Workshop • Current assessment context • Assessment report evaluation process • Overview of new Higher Learning Commission Criteria • Review key items on the assessment report template • Walk through new online reporting form
Assessment Context Accreditation Context • Last year participating in the Higher Learning Commission’s (HLC) Academy for the Assessment of Student Learning • New HLC Criteria effective January 1, 2013 Quality of Assessment Processes • Strengths • Excellent participation 2011-12 -- LO review (100%), curriculum maps (95%) • Quality/participation in reporting process improving over time – but 100% needed • Challenges • Continuous collection of data (no years off!) • Stepping back for the big picture—what is the impact on learning?
Assessment Process • Review Process • Focused rubric – emphasis on assessment priorities • Team of 2 reviewers: • Assessment Council Member • Assessment volunteer – good service opportunity • Final review by Director • Feedback via email and/or meeting • Strengths of plan or report • Suggestions for next year • Revisions, if needed • Quality levels • Exemplary, proficient – high quality • Progressing – developing expertise • Inadequate – request revision
Assessment Process, cont. Assessment Reporting Priorities • Focus on use of data • Reflect on changes and impact on learning • Continuous data collection (i.e. every year!) • Streamline reporting – focus on process vitality, not form What’s New This Year • Online reporting • Fewer questions • Focused rubrics on priority areas Future Goals • Longer reporting cycle • Possible integration with program review
Keeping a Student Learning Archive • Accreditation Archives • Departments need to keep a student learning archive for 10 years • Plan and report submissions kept in OOA for 10 years • Archive Examples • Summaries of data on student learning • Representative student work examples at different performance levels • Student work evaluation criteria, e.g., rubrics • Assessment plans and reports • Newsletters • Website screenshots • Meeting minutes on assessment
HLC New Criteria for Accreditation • New Criteria at: www.higherlearningcommission.org • Guiding Values include: 1. Focus on student learning 4. Culture of continuous improvement 5. Evidence-based institutional learning and self-presentation 9. Mission-centered evaluation • The Five Criteria • Mission • Integrity: Ethical and Responsible Conduct • Teaching and Learning: Quality, Resources, and Support • Teaching and Learning: Evaluation and Improvement • Resources, Planning, and Institutional Effectiveness
New HLC Criteria Relevant to Practice 4.B. The institution demonstrates a commitment to educational achievement and improvement through ongoing assessment of student learning. 1. The institution has clearly stated goals for student learning and effective processes for assessment of student learning and achievement of learning goals. 2. The institution assesses achievement of the learning outcomes that it claims for its curricular and co-curricular programs. 3. The institution uses the information gained from assessment to improve student learning. 4. The institution’s processes and methodologies to assess student learning reflect good practice, including the substantial participation of faculty and other instructional staff members.
New HLC Criteria Relevant to Practice, cont. 3A2: The institution articulates and differentiates learning goals for its undergraduate, graduate, post-baccalaureate, post-graduate, and certificate programs. 3A3: The institution’s program quality and learning goals are consistent across all modes of delivery and all locations (on the main campus, at additional locations, by distance delivery, as dual credit, through contractual or consortial arrangements, or any other modality). 5C2: The institution links its processes for assessment of student learning, evaluation of operations, planning, and budgeting.
Upcoming Assessment Workshops Developing an Assessment Plan Wednesday, October 10th, 1-2 pm Completing the Assessment Report Wednesday, October 10th, 10-11 am Note: workshops/forms overlap
Assessment Plans vs. Reports • Assessment Plans: • Plans and methods to cover all SLOs in 3-4 year cycle • Criteria for at least the 1st year • Plans for sharing results with major stakeholders • Assessment Reports: • Data from previous year • Two methods and data summary for two SLOs • Analysis of student learning for strengths and challenges • Action steps based on data • Sharing of data and results • Both Plans and Reports: • Engagement of faculty • Impact on learning from previous action steps
Completing the Assessment Report Form Everything you wanted to know but didn’t want to ask
Assessment Templates Section 1: Identifying and Contact Information Section 2: Outside Accreditation Section 3: Assessment and Evaluation of Student Learning Outcomes Section 4: Use of Data
Sections 1 & 2: Identifying/Accreditation Information • Save time and fill out only online! • Note the difference between degree, program, and track:
Section 3: Assessment and Evaluation of Student Learning Outcomes
Section 3: Learning Outcomes • State the student learning outcome assessed: • One per column for a total of two • OOA considers the quality of the SLO, but recognizes limitations such as accreditation restrictions • For more on revising learning outcomes, see the OOA website for the assessment plan workshop
Section 3: Methods 2. What methods/measures did you use to assess student learning? • Provide two methods for each SLO • One must be a direct method • Methods should include/attach: • Where the method was administered (e.g., capstone) • Performance criteria (e.g., rubrics) • The same method can span multiple SLOs (e.g., you could use the same method for both SLOs)
Assessment Method Definitions *For an example of rubrics, see the AAC&U’s VALUE Rubric Project: http://www.aacu.org/value/rubrics
Section 3: Data Summary 3. What were the data resulting from these methods? • What were your results? • Include: • Number of students • Aggregate/group data patterns • Note: data patterns may or may not point to conclusions
Section 3: Successes and Challenges 4. What successes and challenges do you see in students’ learning as a result of these assessments? • At least one strength and one challenge • Not about pedagogy or curriculum • Looks at data patterns (with contextual knowledge) to reach conclusions (about students’ learning)
Section 3: Action Steps 5. How did you use the data: e.g., what recommendations and action steps for the program have resulted from reviewing the data, and where is the department in this process? • Any decisions should be based on data • The data may not indicate the need for change; that is fine, just explain your conclusion • Could include pedagogy or curriculum impact or issues here
Section 4: Sharing Results 6. How are you sharing the results of the data discussed in section three with your students, your college, and other stakeholders? • Include both internal and external stakeholders • Examples: • Students’ review of aggregate data • College wide assessment committees • Discuss in advisory group meeting • Share with foundational subject departments (e.g., Engineering Dept. shares findings with Mathematics Dept.)
Section 4: SLOs and Curricular Maps 7. How did the assessment activities in 2011-12 (i.e., reviewing learning outcomes and completing curriculum maps) impact your program? • No correct answer; just experience of departments • Questions to consider: • Did you streamline learning outcomes? • Did they foster faculty discussion? • Were gaps in learning or assessment practices uncovered? • Did you find more efficient ways to collect data?
Section 4: Impact on Learning 8. In the past several years (e.g., 2008-11), you have analyzed data and identified action steps for learning outcomes. Considering action steps from previous years, what has been the impact on student learning as a result of (one of) those action steps? • Refer to past assessment reports (2008-11) • Focus on how the action step impacted student learning • No specific supporting data needed at this stage, just professional judgment
Section 4: Engaging Faculty 9. How is your department working to engage all faculty in the assessment process? • All department faculty should be meeting at least once per year to discuss assessment results and decide on action steps • Collective responsibility • Not just one person’s job
Section 4: Additional Information 10. Optional: Is there anything else you would like to share regarding your assessment report and/or is there any particular area on which you would like assistance or feedback? • Is there more to “the story” than reflected in the report? • Something the Office of Assessment or Assessment Council can assist you with? • Examples: • Involving students in review of data • Increasing faculty participation
web.ysu.edu/assessment/templates Template Submission link
New Online Reporting Form Online Assessment Report Submission Form: http://www.jotformpro.com/ysuassessment/2012acadreport Note: if you have new or revised undergraduate learning outcomes, they should also be sent to Jean Engle at jsengle@ysu.edu.
Thank you for your participation! To view Assessment Plan or Report forms and scoring rubrics, as well as this presentation, visit: http://web.ysu.edu/assessment/templates Contact Info: Hillary Fuhrman, x2453 hlfuhrman@ysu.edu Office of Assessment, x2014 ysuassessment@ysu.edu