
Course Level Assessment and Peer Review Process



Presentation Transcript


  1. Course Level Assessment and Peer Review Process

  2. HLC/NCA Position on Assessment: Focus on Student Learning
  • Strategy for understanding, confirming, and improving student learning
  • More than a response to demands for accountability
  • More than a means for curricular improvement
  • Only beneficial if used to make a difference in student learning

  3. Five Fundamental Questions and Focus on Student Learning
  • How are your stated student learning outcomes appropriate to your mission, programs, and degrees?
  • What evidence do you have that students achieve your stated learning outcomes?
  • In what ways do you analyze and use evidence of student learning?
  • How do you ensure shared responsibility for assessment of student learning?
  • How do you evaluate and improve the effectiveness of your assessment of student learning efforts?

  4. Peer Review
  • Provide evidence that students are achieving stated student learning outcomes
  • Provide a snapshot of progress in implementing course assessment
  • Identify best practices at Northern
  • Identify faculty development needs
  • Close the feedback loop
  • Provide a mechanism to assure continuous improvement of the assessment process

  5. Peer Review at Northern Pilot Project: Summer 2007
  • Developed evaluation rubric
  • Conducted pilot project for peer review
  • Examined findings from the process
  • Evaluated process and developed implementation plan

  6. Peer Review at Northern Pilot Project: Summer 2007
  • Eight reports selected for pilot project review
  • Reports reviewed by at least 2 reviewers
  • Peer reviews completed according to rubric criteria
  • Composite reports compiled for each course reviewed
  • Composite reports reviewed by committee
  • Evaluated process

  7. Lessons Learned: Strengths
  • Course assessment is occurring throughout most of the College
  • 100+ course assessment activities completed since plan revision in 2005
  • Assessment is being used to document/validate student learning
  • Assessment is being used to improve curriculum and instructional strategies
  • Reports demonstrate collaboration and shared responsibility among faculty
  • Some assessment activities build upon previous assessment

  8. Lessons Learned: Challenges
  • Strengthen assessment reporting
  • Insufficient information on reports:
    • SLOs being assessed
    • Performance standards
    • Methodology
  • Data not provided for all SLOs selected for assessment
  • Recommendations not clearly supported by data provided

  9. Lessons Learned: Challenges
  • Clarify purpose of report and report audience
    • Insufficient detail for use by others not directly involved in the assessment activity
  • Incorporate previous assessment activities/results
  • Strengthen feedback loop
  • Standardize reporting format

  10. Peer Evaluation Rubric: Four Components
  • Student Learning Outcomes Assessed
  • Method of Assessment/Data Collection
  • Assessment Results: Summarization and Analysis
  • Recommendations/Action Plan

  11. Peer Evaluation Rubric: Levels of Development
  • Undeveloped
  • Developing
  • Established

  12. Student Learning Outcomes Assessed
  • Undeveloped
    • Student Learning Objectives
  • Developing
    • To have an 85% pass rate on the National Exam of 75% or above.
  • Established
    • The specific MCG SLPOs are: (report included a list of the specific outcomes covered in the assessment project)

  13. Methods of Assessment/Data Collection
  • Undeveloped
    • None listed
  • Developing
    • Formative
      • Class discussions during assigned class periods
      • Post-clinical question and answer sessions
    • Summative
      • Competency evaluations
      • Daily clinical evaluations
      • Midterm and final evaluations

  14. Methods of Assessment/Data Collection
  • Established
    • The (XXX) assessment consisted of 12 problems with a total of 15 questions. These questions were a common part of the final exam for each of the (course) sections on all campuses. There was one (course) section on each campus and … taught by a full-time faculty member. A total of 43 students participated in the assessment. (Note: Questions were linked with the SLOs being assessed.)

  15. Assessment Results: Data Summarization and Analysis
  • Undeveloped
    • …Students were learning the vast majority of the outcomes desired. …lack of learning … came from one or two students.
  • Developing
    • 1. 92% of the students answered the questions correctly. 2. 97% of the students answered the question correctly and 100% of the students completed the case study. 3. 94% of the students got the questions correct and 100% of the students completed the case study.

  16. Assessment Results: Data Summarization and Analysis
  • Established
    • Provided a table that included:
      • Question/indicator used
      • MCO/SLPO
      • Number/% of students earning full credit
      • Number/% of students earning partial credit
      • Number/% of students earning no credit

  17. Recommendations/Action Plan
  • Undeveloped
    • … recommend … continue teaching in the manner that has worked for the past twenty years and still seems to work.
  • Developing
    • Content covered in the textbook and classroom supports the learning objectives. The only change at this time is that the textbook has a new edition … would recommend the new edition.

  18. Recommendations/Action Plan
  • Established
    • Since at least 74% of the students earned full credit or partial credit on 10 of the 15 questions, no recommendation is necessary for those 10 questions. The MCG SLPOs for the other 5 questions should have more emphasis placed on them, and those MCG SLPOs should be evaluated again the next time (course) is scheduled to be assessed. The 5 MCG SLPOs needing more emphasis … are SLPOs … which correspond to assessment questions … respectively.
    • Comparison of 2006 and 2004 assessment results:
      • Three of the 2006 SLPOs that were identified as needing more emphasis were also outcomes identified in the 2004 assessment…

  19. Peer Review: Samples
  • Application Session
  • Packet of 4 samples
  • Peer Review Scoring Rubric

  20. Sample 1
  • Student Learning Outcomes Assessed (Undeveloped)
    • Need a copy of the MCG when all SLOs are utilized in the assessment
    • No SLOs were listed
    • No SLOs or performance standards were listed
  • Method of Assessment/Data Collection (Undeveloped)
    • No method of assessment was given
    • This section of the report was blank, although the results section refers to the use of a survey

  21. Sample 1 (continued)
  • Assessment Results: Summarization and Analysis (Undeveloped)
    • Undeveloped analysis
    • Vague percentage information given
    • Results not directly compared to earlier assessments
    • No link to specific learning outcomes or performance standards … subjective statement "meet our expectations"
    • Reference made to a prerequisite change, but not clear how that impact is assessed
  • Recommendations/Action Plan
    • Undeveloped
      • Given the lack of information regarding SLOs and performance standards, it is difficult to determine whether the recommendation is supported by the data collected.
    • Developing
      • Developing recommendation
      • To support the conclusion about the success of the prerequisite requirement, it will be necessary to reassess with students who take the course after completing the required prerequisites.

  22. Sample 2
  • Student Learning Outcomes Assessed (Undeveloped)
    • No SLOs listed on the report
    • No info on the SLOs being assessed; given the info in the results section, SLOs could have been referenced
    • Student learning outcomes not identified … this collection of assessments identified effectiveness of teaching as an outcome
  • Method of Assessment/Data Collection (Developing)
    • Utilizes the "comparison of 'A' portfolios from the past 10 years" but does not note what an "A" portfolio includes
    • Use of portfolios and a comparison standard set, but not clear what the standards are. Likely to be a sound assessment process, just not enough info provided.

  23. Sample 2 (continued)
  • Assessment Results: Summarization and Analysis (Developing)
    • Notes the number of portfolios meeting the instructor's assessment criteria for an "A" and also the information lacking in the other portfolios
    • As noted previously, no SLOs listed, therefore no results linked to SLOs
    • No comparison to earlier assessment results is given
    • Data provided … but not clear regarding linkage to SLOs and performance standards. The minimum performance standard should be stated rather than the highest standard … no comparison to earlier assessment results is given.
    • Stated results of comparison but no analysis of data. Data set was small … need to expand upon this.
    • A comparison with measurable results: 33.3% of portfolios were comparable to previous years, with 66.6% lacking in the following areas: …
  • Recommendations/Action Plan
    • Recommendations do address teaching and student learning. Data, if identified, would likely support the recommendation.
    • Does provide a recommendation for future action, but not clear as to which SLOs were targeted
    • Does note a comparison to earlier portfolios, but not clear if this is based on an assessment activity
    • Results do not appear to be shared with other faculty, and other faculty do not appear to participate in the project

  24. Sample 3
  • Student Learning Outcomes Assessed (Established)
    • Report includes eight clearly stated and measurable SLOs
    • These are clearly and concisely written, describing specifically what the students will know upon course completion
    • Clearly stated learning outcomes; performance measure addressed in methods section
  • Method of Assessment/Data Collection (Established)
    • Methods of assessment are clearly described and utilize multiple and appropriate approaches to assess student learning. Four types of assessment methods are utilized in this course, ranging from informal in-class activities to formal activities including exam questions.
    • Methods are well defined and incorporate multiple measures. Since embedded activities are used, assessment becomes a seamless process for students.

  25. Sample 3 (continued)
  • Assessment Results: Summarization and Analysis (Established)
    • The assessment results yield data on each of the stated SLOs. Specific numerical data is provided noting the number of students mastering each concept. The specific methods utilized in the assessment of each SLO are described in this area. No comparison to earlier data noted. Also, there is no benchmark level noted for mastery of each SLO.
    • Numerical data is clear regarding attainment. Would be great if this could be completed across all sections of (course). Attaching assessment instruments would be helpful for others using the report.
  • Recommendations/Action Plan (Established)
    • Reference to the obtained data is utilized in the recommendation section. The instructor notes the SLO with the fewest students mastering the concept and describes specific changes to be made in the lecture during the next course offering in order to increase mastery of the concept.
    • No evidence of shared data, but does allude to comparing results from the next assessment cycle of this course to this current report
    • Recommendation appears to be supported by the data presented and relevant to the SLOs presented
    • Does not appear to be an assessment activity conducted across multiple sections or by other faculty

  26. Sample 4
  • Student Learning Outcomes Assessed
    • Developing
      • Given the info provided, it seems the SLOs are likely to be well defined, but the info was not available in the report. Attaching the MCG or listing the outcomes will help resolve this.
      • Stated that all SLOs were being assessed. May need to explain the purpose of assessing all of the Student Learning Outcomes. Performance indicator identified as "covered and learned" for all objectives.
    • Established
      • The outcomes identified are clearly written; however, only the 4 lower-rated outcomes are specifically identified. Should the other 52 outcomes be attached to the document rather than just numbered?
  • Method of Assessment/Data Collection
    • Developing
      • Clear that a survey was used as an indirect method of assessment; would be helpful to see the instrument or a sample of questions supporting the SLOs being measured.
      • Multiple sections used to obtain data for evaluation.
    • Established
      • This assessment is based on a combination of student surveys of covered/learned objectives and multiple choice questions. These are addressed separately in the statistical analysis. What was the process for selecting the objectives to assess by multiple choice question? Was this based on prior assessment results?
      • Student survey method is identified and is an indirect method of assessment. Good job clearly identifying measurable objectives of "covered" and "learned" and the multiple choice method of question analysis. Identified when the survey was completed.

  27. Sample 4 (continued)
  • Assessment Results: Summarization and Analysis (Established)
    • Results of the student survey of covered/learned objectives are compiled separately from the results of the assessment of multiple choice questions. Results are clearly listed with emphasis placed on objectives with lower success rates. It is recommended that re-evaluation be done in Fall 2008 on the "normal 2-year cycle." Are results available to compare these Fall 2006 results with a Fall 2004 evaluation?
    • Multiple sections used as sample
    • Data clearly presented
    • Grouped by performance standards; identified outcomes and content areas requiring additional focus
    • Good assessment results. Identified not only student learning of objectives but also a summarization of data collection difficulties. Measurable, clear percentage analysis of student perception.
  • Recommendations/Action Plan (Established)
    • The recommendations effectively analyze the lecture and lab components of the objectives and take into account the lack of some material in the text. Identifies appropriate steps to be taken in relation to lecture re-emphasis. Good identification of possible negative factors (percentage of students involved and omission by adjunct faculty) and the effect these could have had on the results of the evaluation.
    • Recommendations supported by data
    • Noted shortcomings of methodology and their potential impact on analysis and recommendations
    • Attempted to involve adjunct faculty in addition to full-time faculty
    • Recommendations were clear, reasonable, and related to student learning and assessment techniques

  28. Peer Evaluation: Next Steps
  • Complete Course Assessment Peer Review (2006 Courses)
  • Timeline:
    • Individual Reviews: September 15
    • Composite Course Reports: November 15
    • Peer Review Summary/Recommendations
  • Eliminate backlog by end of Fall 2007

  29. Why Assessment?
  • Support student learning
  • Ensure student academic achievement
  • Improve instruction
  • Improve instructional programming

  30. Course Assessment
  • Determines student attainment of intended course-level learning outcomes
  • Identifies need to refine course materials and learning outcomes
  • Involves multiple sections of the same course
  • Examines role of the course in meeting program-level learning outcomes

  31. Final Notes: Course Assessment
  • Focus assessment efforts on key areas of concern
    • What do you really want to know?
    • Don't attempt to assess everything at once
    • Plan to assess learning outcomes over a brief period of time
  • Use multiple assessment techniques
    • Direct and indirect measures
  • Assessment activities will evolve over time
    • If something is not working or new questions arise, it is OK to change assessment activities
  • Close the feedback loop: note concerns and successes, and share results with other faculty
