Tips for Writing SACS Program Assessment Reports
Office of Planning, Institutional Research, and Assessment (PIRA)
March 2013
Relation Between Existing Assessment and SACS Reports
• Ideally you already assess students' learning
• Ideally you already improve your program to increase student achievement
• SACS reports should simply describe these activities
• Reports should follow SACS guidelines and use SACS terminology to assist SACS reviewers
• Data or other findings that measure student learning should be included, as should interpretation of findings
• But don't create a special data-collection process for SACS; just summarize existing processes
• Initiatives to improve should be included
Ensure that SACS Reviewers Will See Clear Evidence that You Have . . .
• defined desired mission, student learning outcomes (SLOs), and related measures,
• collected and evaluated results from ongoing assessment,
• undertaken actions to continuously improve learning.
• Help reviewers find key components quickly & easily.
[Assessment cycle diagram: Define SLOs & Measures → Collect Findings → Evaluate Results → Implement Change (Improve)]
Use PIRA Checklist to Ensure Key Elements Are Included:
• mission and program outcomes (objectives)
• student learning outcomes (3+) and related measures (2+ each, 1 should be direct)
• assessment findings: results for measures of student learning from multiple years (if feasible)
• discussion of results: faculty review of findings, including whether performance of students meets expectations
• discussion of changes: initiatives to improve student learning and/or program
• evidence continuous improvement has occurred (new for 2013)
• clear narrative and organization to make compliance obvious (does everything make sense?)
Program Assessment at the University of Miami
[Flowchart: Mission Statement & Program Outcomes/Objectives → Student Learning Outcomes 1-3 (each defined) → Assessment Measures, both direct (capstone reviewed with faculty-developed rating grid; exam questions that clearly relate to outcomes; Graduate School Dissertation & Thesis Rating Grid; other direct measures) and indirect (Graduating Student Surveys; Course Evaluations; other indirect measures) → Assessment Findings (data for EACH measure for 2+ years) → Discussion for Continuous Improvement (Faculty Review: Do findings show continuous improvement? Program Improvement: What changes should be made?)]
• A program should have 3-5 measurable outcomes, each tied to the program mission.
• Student learning outcomes relate to attainment of knowledge, skills, behaviors, or values.
• Common outcomes include: knowledge of theory and research in the field, ability to think critically about the field of study, oral and written communication skills.
• Your mission statement and program outcomes (objectives) should align with the mission of the University and your program's strategic plan.
• For each outcome 2-3 measures are required; at least one must be a direct measure (direct measures appear in dark blue in the diagram, indirect in light blue).
• A single measure (e.g., rating grid) can assess more than one outcome.
• Build operationally realistic assessments into your annual departmental calendar.
• Assessment findings should assist in identifying areas for improvement within programs.
• Identified and resolved changes should be reflected in the discussion section of reports to PIRA.
When Writing Your Mission Statement You Should . . .
• tie it to the UM Mission and to your strategic plan. The UM Mission: "The University of Miami's mission is to educate and nurture students, to create knowledge, and to provide service to our community and beyond. Committed to excellence and proud of the diversity of our University family, we strive to develop future leaders of our nation and the world."
• describe program outcomes/objectives (e.g., prepare graduates to . . ., teach gen-ed courses, research, service)
When Writing Student Learning Outcomes (SLOs) You Should . . .
• describe reasonable expectations for student learning (knowledge, skills, values, and behaviors)
• include at least 3 SLOs, each with correct structure and language
• make SLOs easy to identify (e.g., use bolding & numbering) and clearly stated (follow expected structure)
Most common error: Programs describe what they do.
Solution: Describe what you want students to learn.
Structure of SLOs
• Start with words like: Students… Graduates… We want students to…
• Include verbs or phrases like: will demonstrate… should have ability to… will analyze and synthesize…
• Include words like: …breadth of understanding of… …mastery of… …a capacity for…
• Describe expected competence (e.g., broad knowledge, communication, critical thinking)
Examples of Bad and Good SLOs:
Instead of . . . "Help students develop research skills by providing opportunities for supervised laboratory practice."
write "Graduates will demonstrate the ability to conduct laboratory research."
Instead of . . . "Students will participate in interpersonal, interpretative, and presentational communicative activities and be guided in the development of literacy skills in the language of study through the communicative acts of reading, writing, and creating discourse around texts of all types."
write "Students will demonstrate literacy skills in the language of study through the communicative acts of reading, writing, and creating discourse around texts of all types."
Possible SLOs
• Students should demonstrate an overall knowledge and understanding of the core concepts in [insert program here], including the essential skills to conduct research in [insert program here].
• We want students to graduate with strong written [and/or oral] communication skills.
• Our doctoral students should be able to conduct independent research worthy of publication.
• Graduates should have an understanding of, and the capability to work with, the systems and hardware components that support software.
• Students should demonstrate critical thinking, including the ability to analyze, synthesize, and draw valid conclusions.
When Writing Measures, You Should . . .
• ensure each SLO has 2+ measures
• ensure at least 1 direct measure (objective outside source—see p. 4 of Resources)
• ensure each indirect measure (usually a self-reported measure) is accompanied by a direct one—see p. 4 of Resources
• instead of course grades or pass rates (which SACS discourages), substitute project grades (plus a description relating the exam/project to the SLO)
• consider rating grids, since they are easier to trend over time and 1 grid can be used for all SLOs—see pp. 8 & 9 of Resources
Most common error: Programs describe how faculty provide feedback to help individual students.
Solution: Describe aggregate measures used to evaluate student learning.
Examples of Bad and Good Measures:
Instead of . . . "Students are given tests…"
write "Grades from tests that measure the students' ability to [describe what test is for] will be used to assess [SLO]."
Instead of . . . a table of grades for a course,
use a table of grades for the final paper (plus a description of the assignment using the language of the SLO).
Good Graduate Program Measures (Can Rewrite SLOs to Correspond)
• Graduate School Rating Grid at final defense (programs are already supposed to be using this); fast and easy (PIRA will analyze—see pp. 8 & 9 of Resources)
• Same rating grid, but used for the proposal defense (and/or for each year in the program)—use the same standards for both to show students' progress
• Qualifying/comprehensive exam (but explain what's tested so the link to the SLO is clear)
• Rating grids from supervisors of TAs, RAs, GAs, internships
• Ratings from the audience for presentations on student research
• Number of publications, conference presentations, grants
• Graduating Master's Student Survey (items similar to ones on p. 10 of Resources available from PIRA)
Good Undergraduate Measures (Can Rewrite SLOs to Correspond)
• Graduating Senior Survey—very easy (PIRA/Toppel collect, analyze, send); small programs should use combined years (green column) rather than trends (orange columns)—see p. 10 of Resources
• Rating grids for capstone papers, projects, etc. (see p. 8 of Resources for a sample you can adapt)
• Grades from items on tests or assignments that directly measure a given SLO
• Rating grids from supervisors of internships, practica
• Additional items relating to improvement in each SLO that are added to faculty evaluations or final exams
• Existing items on the New General Form for faculty/course evaluations that relate to critical thinking or communicating on the subject
When Writing Assessment Findings, You Should . . .
• ensure each measure has corresponding findings (and no findings without an earlier measure)
• insert the corresponding outcome/measure as a heading for each set of results
• ensure multiple years of data, or insert an explanation that data are not provided for a new program/revised measures: "As part of the major three-year 'continuous improvement update' of our program assessment report in 2012, we decided to start using rating grids in conjunction with XXX [e.g., senior projects] to allow us to more easily monitor changes in student learning over time. Because this is a new measure, we have data for only the 2012-13 academic year, but we will continue to update the data in upcoming years to monitor continuous improvement in student learning."
When Writing Assessment Findings, You Should . . .
• if a measure is a narrative rather than data, ensure a summary plus sample evaluations, or insert the statement (see p. 6 of Resources)
• ensure results are presented clearly (tables)
• decide if an appendix of data, survey instrument, etc., is necessary (usually not)
• put findings related to Program Outcomes (new for 2013) under a new sub-heading: Findings Relating to Program Outcomes
Most common errors: Programs simply state they evaluate student learning, or omit measure(s).
Solution: You should provide evidence of assessment activity (table/text summary of findings).
When Writing Discussion Section, You Should Ensure . . .
• a statement that faculty as a group reviewed the findings (e.g., dates/minutes of meeting)
• discussion of whether faculty think students demonstrated the desired level of learning
• initiatives you implemented to improve student learning—see p. 6 of Resources
• whenever possible, an indication of which SLO is affected
• whether improvements seem to be working (new for 2013)
In Discussion Section . . .
Most common errors:
• No statement indicating faculty reviewed
• No statement of how faculty think students are doing
• No mention of which SLO is affected by improvement initiatives
• No mention of whether there has been improvement over time
Solutions include:
• Dates or minutes of faculty meetings
• Evaluation of how well each SLO is achieved
• Which SLO will benefit from an improvement (if relevant)
• Effectiveness of prior initiatives and how learning will be improved
Format/Organization/Wording Help SACS Reviewers Find What They Need
• Add bold, indents, and/or underlines to assist reviewers
• Nest measures under related SLOs
• Label/nest Outcomes/Measures in Findings section
• Include discussion of improvements/changes in Discussion section, not in SLO or Findings sections
• Remove yellow template instructions
• Use SACS terminology (Student Learning Outcomes, Measures of SLOs, etc.)
• Delete extraneous text and data (clarity more important than length)
• Expand acronyms (e.g., RSMAS, PRISM)
• Spell-check; fix typos
Tips for Writing an Efficient Report
• Study resources and checklist before starting
• Use existing assessments and available student work whenever possible (saves time and effort)
• Consider developing a rating grid with 1-2 items per learning outcome—see p. 8 of Resources
• Contact PIRA for a summary of results from the Graduate School Rating Grid; email PIRA scanned forms for students we don't have
• Use Graduating Senior Survey (GSS) or Graduating Master's Student Survey (GMSS) summary
• Consider starting with measures and then writing SLOs to go with them, instead of the traditional order
Alert: Changes for 2013
• Need to provide evidence of improvement based on initiatives, wherever possible (though sometimes hard to see, especially with small Ns and short time periods)
• New emphasis from SACS: Need to add material (Findings, Improvements, and Discussion) related to Program Outcomes (NOT Processes). Examples:
  • Retention/graduation rates, average time to degree (from PIRA)
  • Ratings from GSS or GMSS (from PIRA)
  • Job placement (from Grad Program Review profile)
  • Graduate program review, professional accreditation
  • Efforts to improve quality, interdisciplinarity, etc.
Questions for PIRA? Contact:
Dr. David E. Wiles
SACS Liaison and Executive Director, Assessment and Accreditation
(305) 284-3276