Tips for Writing SACSCOC Non-Academic Program Assessment Reports
Office of Planning, Institutional Research, and Assessment (PIRA)
Fall 2014
Relation Between Existing Assessment and SACSCOC Reports
• Ideally, you already evaluate your unit's effectiveness
• Program Assessment Reports should describe these activities using SACSCOC guidelines and terminology
• Include data or other findings that measure operational and/or student learning outcomes, along with your interpretation of those findings
• Include initiatives to improve
Relation Between Existing Assessment and SACSCOC Reports
• But don't create a special data collection process for SACSCOC; just summarize existing processes
• Save yourself time and unnecessary work by adapting your existing annual report to the SACSCOC Program Assessment Report template
Types of Non-Academic Units
• Administrative support services
• Academic and student support services
• Research
• Community/public service
Larger Administrative Units . . . may prefer to submit a Program Assessment Report (PAR) for each office within the division, particularly if outcomes are not the same across those offices.
Ensure that Reviewers Will See Clear Evidence that You Have . . .
• defined desired mission, program outcomes or objectives, and related measures,
• collected and evaluated results from ongoing assessment (multiple years),
• undertaken actions to continuously improve outcomes.
• Help reviewers find key components quickly & easily.
[Assessment cycle: Define Outcomes & Measures → Collect Findings → Evaluate Results → Implement Change (Improve)]
Use PIRA Checklist to Ensure Key Elements Are Included:
• mission and program outcomes (objectives)
• operational and/or student learning outcomes (2+) and related measures (2+ each, at least 1 of which should be a direct measure)
• assessment findings: results of measures from multiple years (if feasible)
• discussion of results: review of findings, including whether performance meets expectations
• discussion of changes: initiatives to improve the program and whether continuous improvement has occurred
• clear narrative and organization to make compliance obvious (does everything make sense?)
When Writing Your Mission Statement You Should . . .
• tie it to the UM Mission and to your strategic plan. The UM Mission: "The University of Miami's mission is to educate and nurture students, to create knowledge, and to provide service to our community and beyond. Committed to excellence and proud of the diversity of our University family, we strive to develop future leaders of our nation and the world."
• describe program outcomes/objectives (e.g., purpose of the unit, type of support for students, including any research or service components)
When Writing Operational Outcomes You Should . . .
• describe reasonable expectations in measurable terms (efficiency, accuracy, effectiveness, comprehensiveness, etc.)
• include at least 2 outcomes
• make outcomes easy to identify (e.g., use bolding & numbering) and clearly stated (follow the expected structure)
An Operational Outcome Should
• Focus on a current service or process
• Be under the control or responsibility of the unit
• Be measurable
• Lend itself to improvements
• Be singular, not "bundled"
• Be meaningful and not trivial
• Not lead to a "yes/no" answer
Source: Mary Harrington, Univ of Mississippi
Possible Operational Outcomes
• Efficiency: The Registrar's Office processes transcript requests in a timely manner.
• Accuracy: Purchasing accurately processes purchase orders.
• Effectiveness: Human Resources provides effective new employee orientation services.
• Comprehensiveness: Financial Aid provides comprehensive customer service.
Source: Mary Harrington, Univ of Mississippi
A Student Learning Outcome (SLO)* Should . . .
*if appropriate for your area
• Start with words like: Students… Graduates… We want students to…
• Include verbs or phrases like: will demonstrate… should have the ability to…
• Include words like: …mastery of… …a capacity for…
• Describe expected competence (e.g., practical skills, communication, leadership, multi-cultural awareness)
Possible Learning Outcomes (not necessarily SLOs) for Non-Academic Units
• Library: Students will have basic information literacy skills.
• Career Services: Students will be able to create an effective resume.
• Information Technology: Staff will know how to use the student information system.
• Human Resources: New employees will be familiar with the benefits package.
Source: Mary Harrington, Univ of Mississippi
Examples of Measures
• Research: number of grants, total funding, number of peer-reviewed publications, conference presentations
• Administrative support: timeliness in processing orders, budget growth (or savings), complaint tracking/resolution, public safety improvements, audits
• Academic/student support: number of students counseled, job placements, scholarship awards, seminar participation, leadership training participation
• Community/public service: number of patients seen, community event participation, annual volunteer commitments
Do You Have Survey Data?
• Often, non-academic units use survey data for their assessment
• Surveys are indirect measures of student learning, but they are direct measures of customer (client, employee, patient, student) experience
Source: Mary Harrington, Univ of Mississippi
When Writing Assessment Findings, You Should . . .
• ensure each measure has corresponding findings (and no findings without an earlier measure)
• insert the corresponding outcome/measure as a heading for each set of results
• ensure multiple years of data, or insert an explanation that data are not provided for a new program or revised measures, e.g.: "As part of the major three-year continuous improvement update of our program assessment report in FY 2013, we decided to start using customer satisfaction surveys in conjunction with service requests. Because this is a new measure, we have data for only FY 2014, but we will continue to update the data in upcoming years to monitor continuous improvement."
When Writing Assessment Findings . . .
• if a measure is a narrative rather than data, provide a summary plus sample evaluations, or insert an explanatory statement
• ensure results are presented clearly (use tables)
• decide whether an appendix of findings, the survey instrument, etc. will be necessary (usually not)
Common error: Programs simply state that they evaluate outcomes, or they omit measure(s).
Solution: Provide evidence of assessment activity (a table or text summary of findings).
When Writing the Discussion Section, You Should Provide . . .
• a statement as to why these particular assessment instruments were used
• analysis of the assessment findings
  • How are periodic reviews used for improvements?
  • How does the use of assessment results improve your services?
  • What changes have been implemented or will be developed to improve your operational and/or student learning outcomes?
• evidence of improvement
  • general trends
  • specifically in response to improvement initiatives
Avoid These Common Errors in Writing Your Discussion Section
• When describing initiatives to improve outcomes: the report simply lists initiatives.
  Solution: Include brief commentary on which outcome will benefit.
• When describing continuous improvement: the report does not include any evidence of improvement over time.
  Solution: At least discuss efforts to improve outcomes.
Format/Organization/Wording Help SACSCOC Reviewers Find What They Need
• Add bold, indents, and/or underlines to assist reviewers
• Nest measures under related outcomes
• Label/nest outcomes and measures in the Findings section
• Include discussion of improvements/changes in the Discussion section, not in the Outcomes or Findings sections
• Remove the yellow template instructions
• Delete extraneous text and data (clarity is more important than length)
• Expand acronyms (e.g., RSMAS, PRISM, UMHC)
• Spell check; fix typos
Tips for Writing an Efficient Report
• Study the resources and template before starting
• Use existing assessments, available documentation, and your current reports whenever possible (saves time and effort)
• Consider starting with measures and then writing outcomes to go with them, instead of the traditional order
Questions for PIRA?
Contact: Dr. David E. Wiles
Executive Director, Assessment and Accreditation
Institutional Accreditation Liaison
(305) 284-3276