
Institutional Effectiveness Plans: Measuring Performance using Outcomes-Based Decision Making



  1. Institutional Effectiveness Plans: Measuring Performance using Outcomes-Based Decision Making Presented by Dr. Cindy Dutschke, Director, Office of Institutional Effectiveness and Planning Spring 2012

  2. Workshop Outline • Overview of IEPs • Overview of Expectations • Overview of Time Frame • Resources for IEP Preparation • How to Prepare an IEP • How to Write Mission, Goals, and Outcomes • How to Choose Proper Assessment Tools • How to Set “Criteria for Success” • Discussion and Questions

  3. What is Effectiveness? • The power to be effective; the quality of being able to bring about an effect • Having an intended or expected effect • The degree to which objectives or outcomes are achieved and the extent to which targeted problems are solved. In contrast to efficiency, effectiveness is determined without reference to costs and, whereas efficiency means "doing the thing right," effectiveness means "doing the right thing."

  4. Ask Yourself These Questions (and you have Outcomes-Based Decision Making): • What decisions did you make about your unit last year? • What evidence did you use to inform that decision? • What was it that you were trying to influence, or change, about your unit when making the decision with the stated evidence?

  5. The 4 main purposes of creating Unit IE Plans using evidence-based decision making: • To demonstrate a commitment to systematic examination of the quality of all that the unit does to improve itself. • To inform decision-makers of the contributions and impact of the unit to the development and growth of your institution. • To prove and demonstrate what the unit is accomplishing to stakeholders. • To provide support for campus decision-making activities such as strategic planning, as well as external accountability activities such as accreditation.

  6. An effective unit IEP should answer these questions: “What are we trying to do?” “How well and how efficiently are we doing it?” “How do we use what we learn from the answers to the first two questions to improve what we are doing, to inform policy discussions and resource allocations?” “What and how does our administrative unit contribute to the development and growth of students?”

  7. Costs of Not Measuring Performance by Administrative Units • Decisions could be based on assumption rather than fact • Possible failure to meet customer expectations • Reliability • Efficiency • Quality • Cost • Delivery • Possible failure to identify potential problem areas • Lack of optimum progress toward institution vision

  8. Goals for IEPs using Outcomes-Based Decision-Making for Administrative Units • Strengthen our ability to say that our unit is efficiently and effectively carrying out its mission • Increase our confidence that we are putting our time and energy into activities that result in the outcomes the unit and the institution value • Gather and display data that will allow us to make a strong case for increased university funding for our units.

  9. Typical Components of Part I of an IEP • 1. Unit Name • 2. Unit Mission or Purpose • 3. Intended Outcomes • 4. Assessment or Measurement Methods (both direct and indirect measures) • 5. Criteria for Success – identify a threshold level at which the unit will decide that they have successfully met the particular outcome.

  10. Typical Components of Part II of an IE Report • 1. Program Name • 2. Outcomes • 3. Assessment or Measurement Methods • 4. Criteria for Success • 5. Actual Results (summarize results for each outcome) • 6. Use of Results to Make Decisions and Recommendations – summarize the decisions/recommendations that came out of each assessment of each outcome; identify the groups who participated in the discussion of the evidence that led to the recommendations and decisions; summarize the suggestions for improving the assessment process; identify those responsible for implementing the recommended changes; identify when the outcome will be re-evaluated (if it is to be retained) • 7. Develop an Action Plan to Implement Improvement Strategies

  11. IEP Form (Excel) • a. INTENDED OUTCOME – 3 to 8 major outcomes for each unit • b. MEANS OF ASSESSMENT – How will you measure this outcome? • c. TARGET (CRITERIA FOR SUCCESS) – At what level, or to what degree, do you expect this outcome to be present in order to consider the attainment level successful? • d. ACTUAL RESULTS OBTAINED – When you measured (or observed, surveyed, etc.), what did you find? • e. USE OF RESULTS FOR IMPROVEMENT – What impact did this have on the unit and how will it be used for unit improvement?
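The five columns of the form above amount to a simple record per outcome. As a minimal sketch (the class, field names, and example values are illustrative, not part of the official Excel form), one row could be modeled as:

```python
from dataclasses import dataclass

@dataclass
class IEPOutcomeRow:
    """One row of the five-column IEP form (columns a-e); names are hypothetical."""
    intended_outcome: str     # (a) what the unit intends to achieve
    means_of_assessment: str  # (b) how the outcome will be measured
    target: str               # (c) criteria for success, set before assessment
    actual_results: str       # (d) what the measurement found
    use_of_results: str       # (e) how results drive unit improvement

# Hypothetical row drawn from the advising example later in the deck
row = IEPOutcomeRow(
    intended_outcome="Advising requests are scheduled promptly",
    means_of_assessment="Advising office request log",
    target="90% scheduled within 24 hours",
    actual_results="72% scheduled within 24 hours",
    use_of_results="Develop a priority tracking process; request two advisor positions",
)
print(row.target, "->", row.actual_results)
```

Keeping each outcome as one structured record of this shape is what lets tools like WEAVE Online (discussed later) tie outcomes to goals and reports.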

  12. Using WEAVE Online to track QAPs/IEPs • (Training on WEAVE Online will be held next Spring!) Why WEAVE Online? • User-friendly, web-based program that makes tracking of IE at AUS accessible to all • Contains areas for University Mission, unit mission, goals, outcomes, means of assessment, criteria for success, actual results, a statement of how those results will be used to improve the unit, and action plans where requests for resources can be prioritized • Allows tying unit goals and outcomes to each of the following (which greatly facilitates the accrediting process): • Specific General Education Outcomes • University Strategic Goals • CAA Standards for Program and/or University Re-licensure • Middle States Standards for University Re-accreditation

  13. Why do administrative units need to prepare an annual IEP? Administrative units should define their mission, establish goals and determine how to measure outcomes associated with those goals so that key processes that meet the needs and expectations of students, parents, employers, faculty and other stakeholders can be improved on a continual basis.

  14. Assessment relates to measuring critical administrative processes in order to gather data that provides information about how the institution is meeting stakeholders’ needs and expectations. A benefit of measuring performance in administrative units is that it provides the basis by which AUS employees can gain a sense of what is going right and what is going wrong within the institution. This process ultimately establishes direction for improving quality and constituent satisfaction.

  15. Preparing Your IEP What is a Mission Statement? • Mission: The mission statement includes the unit’s most specific values, goals and purpose; it provides the most important context for understanding the unit’s contribution to what is important at AUS; why do you do what you do?

  16. Examples of Unit Mission Statements • The mission of the human resources department is to contribute to organizational success by developing effective leaders, creating high-performance teams, and maximizing the potential of individuals. • The mission of corporate security is to provide services for the protection of corporate personnel and assets through preventive measures and investigations. • Provide excellence in information technology solutions and services that will facilitate the vision, objectives and goals of the university.

  17. Preparing Your IEP What are Outcomes? Objectives or Outcomes: Brief, clear statements that describe the desired quality (timeliness, accuracy, responsiveness, etc.) of key functions and services within the unit; what you plan to achieve. Outcomes should focus on the impact or result of your efforts, and not the process that you use to get there. Operational outcomes (as opposed to learning outcomes) define exactly what the services should promote (understanding, knowledge, awareness, appreciation, etc.). Outcomes can relate to operations and processes of the unit and may include a consideration of demand, quality, efficiency, and effectiveness.

  18. Preparing Your IEP (cont.) What are Student Learning Outcomes? Objectives or Outcomes: Brief, clear statements that describe something specific that the students will know once they complete the program. Learning outcomes should focus on the impact or result of your efforts, and not the process that you use to get there. What your students are supposed to have learned by the time they graduate should be demonstrable.

  19. Developing Outcomes Be SMART: Specific – associate the outcome with key process and services provided; SLO: what your graduates should know Measurable – it must be feasible to collect accurate and reliable data considering your available resources Aggressive but Attainable – what types of things are you striving for? what direction do you want to move? What would you like to accomplish? Results-oriented – it should aid in identifying where unit or program improvements are needed and describe where you would like to be in a specified period of time Time-bound – it should indicate a timeframe for assessment

  20. Quality Check: Outcomes Statements • The outcome describes a result rather than a process. • The outcome describes what the recipient of the service will be able to demonstrate. • The outcome is measurable. • The outcome is specific. • The outcome addresses no more than a single result (uses no conjunctions!). • The outcome uses action verbs that specify definite, observable behaviors. • The outcome is clear: faculty, students, administrators, and people outside the unit are able to understand it. • The outcome is validated by unit/program colleagues. • The outcome is clearly linked to unit goals. • The outcome is reasonable, given the staff/faculty of the unit.

  21. Form and Structure of Outcomes • The SLOs are specific to the program they are associated with. • The SLOs focus on what is critical to the program. • The SLOs describe the knowledge, skills and dispositions that students are expected to gain as a result of their completion of the program. Example: “English graduates are able to…” vs. “The English program provides students with…”. • The focus is on what students should achieve and not on what faculty is going to do or what the program offers. • The SLOs are clear and understandable to both faculty and students. • The SLOs are written to an appropriate level of specificity while still allowing a certain amount of interpretation leeway so that faculty members can reach consensus. Example: “English graduates are able to critique a brief draft essay pointing out the grammatical, spelling and punctuation errors and offer appropriate suggestions for correction of deficiencies” vs. “English graduates know how to provide students with feedback on written essays”. Generally, highly prescriptive curriculums have more specific outcomes while curriculums that allow students a lot of choice in how they meet the requirements usually use broader outcomes. • The SLOs use action verbs. It is better to use concrete verbs such as define, classify or formulate rather than vague verbs like understand or know. Otherwise, it may take more time for faculty to reach consensus about the criteria that need to be used to determine whether a student “knows” something. A table showing various verbs for knowledge, skills, and dispositions is available below. • The SLOs are realistic given the typical student who enters the program, the expected level of rigor in program courses, and the resources available to support student learning. • The SLOs are assessable. It should be feasible to measure the outcome. • How Many Are Too Many? A unit or a program may have only a few student learning outcomes or a long list of them.
A total of five to seven student learning outcomes is typical. The number of student learning outcomes a program has is not as important as the number of student learning outcomes the program is trying to assess in any one year.

  22. Examples of Verbs for Student Learning or Operational Outcomes Knowledge Acquisition and Application • Add Apply Arrange Calculate Categorize Change Chart Choose Classify Complete Compute Construct Count Define Demonstrate Describe Discover Discuss Distinguish Divide Dramatize Draw Duplicate Employ Examine Explain Express Graph Identify Illustrate Indicate Inform Interpolate Interpret Label List Locate Manipulate Match Memorize Modify Name Operate Order Outline Point Predict Prepare Produce Quote Rank Read Recall Recite Recognize Record Relate Repeat Report Reproduce Restate Review Select Show Solve Specify State Stimulate Subtract Summarize Translate Use Higher Order Thinking Skills • Adapt Analyze Assess Calculate Categorize Classify Combine Compare Compile Compose Contrast Create Criticize Defend Design Devise Diagram Differentiate Dissect Estimate Evaluate Explain Formulate Generate Group Infer Integrate Invent Investigate Judge Justify Modify Order Organize Plan Prescribe Produce Propose Rate Rearrange Reconstruct Reflect Relate Reorganize Research Review Revise Rewrite Select Separate Specify Summarize Survey Synthesize Test Transform

  23. Examples of Verbs for Student Learning or Operational Outcomes (cont) Psychomotor Skills • Activate Adapt Adjust Align Alter Apply Arrange Assemble Calibrate Change Check Choose Clean Combine Compose Conduct Connect Construct Correct Create Demonstrate Describe Design Detect Differentiate Dismantle Display Dissect Distinguish Employ Follow Identify Install Isolate Locate Make Manipulate Measure Operate Originate Perform Prepare Produce React Rearrange Relate Remove Reorganize Repair Replace Respond Revise Select Separate Set Show Sketch Sort Test Transfer Troubleshoot Tune Use Vary Attitude, Values, & Dispositions • Accept Acclaim Accommodate Act Adhere Adopt Advocate Alter Answer Applaud Approve Arrange Ask Assist Associate Assume Attend Balance Believe Challenge Change Choose Classify Combine Complete Comply Conform Cooperate Debate Defend Deny Describe Develop Differentiate Display Endorse Enjoy Establish Express Follow Form Formulate Give Greet Have Help Hold Identify Influence Initiate Integrate Interpret Invite Join Judge Justify Listen Obey Organize Participate Perform Persuade Practice Present Propose Protest Qualify Question Reflect Report Resolve Respect Revise Select Serve Share Show Solve Subscribe Support Tell Use Verify Volunteer Weigh Work

  24. Selecting Assessment Methods Be MATURE: Match – match the outcome with the appropriate assessment method; what will you measure? Appropriate – choose methods that are appropriate (direct or indirect), making sure they are good assessors of the effectiveness of the unit’s service Target – each measure should specify the desired level of performance Useful – the method should provide useful and usable information Reliable – the measure is based on tested, known methods Effective and Efficient – each approach accurately and concisely measures the outcome

  25. Choosing Assessment or Measurement Methods for Operational Outcomes Direct Measures: methods that assess demand, quality, efficiency and effectiveness Example: efficiency may address completion of service, productivity of service and efficiency of individual points of service (academic advising, computer assistance, etc.) Indirect Measures: methods that measure experience (perception) rather than knowledge or skills; the perceptions of functions and critical processes. Example: assessment of perception of services (e.g., orientation, financial aid, etc.)

  26. Direct Measures SLO Direct Measures: methods that assess knowledge learned • Using a rubric (setting your criteria ahead of time) to evaluate how well each student has achieved an outcome (through a project, a portfolio, a specific test question, etc.) • Locally developed pre- and post-tests (freshman to senior) • Course-embedded assessment • Comprehensive exams • Major Field Achievement Tests • GRE subject exams • Certification/licensure exams • Senior thesis / major project • Portfolio evaluation • Case studies • Reflective journals • Capstone courses • Internship evaluations • External examiners/peer review Administrative Units Direct Measures • Tracking use of services (attendance, clients, etc.) • Timelines and budgets • Measuring efficiency (time to complete projects, cost of utilities, etc.)

  27. Indirect Measures Indirect Measures: methods that measure experience (perception) rather than knowledge or skills; the perceptions of functions and critical processes. • Exit interviews • Alumni survey, employer survey • Curriculum and syllabi analysis • Faculty and staff satisfaction surveys

  28. Challenges to Selecting Assessment or Measurement Methods • Realize differences between units within a division – each unit has a unique and distinct mission, goals, and outcomes; some methods will work for one unit and not so well for others • Start small – when developing and using a new measurement method, start small and test it so you do not waste valuable time and resources if it does not work • Allow for continuous feedback – discuss methods with key staff members • Match the assessment method with the outcome and not the reverse – develop and write your unit goals and outcomes BEFORE selecting assessment methods.

  29. Examples of DIRECT Assessment or Measurement Methods • Locally developed tests • Example: tests administered before and after a service is provided • Embedded questions • Example: specific questions designed to see if clients experienced something you expected them to • Rubrics • Example: participants produce a product or performance, and the criteria for evaluating each aspect of it have been determined ahead of time and listed with guidelines for judging its quality • Behavioral Observations • Example: a supervisor observes specific occurrences of a behavior that is one of the targets for a group involved in a particular experience

  30. Examples of INDIRECT Assessment or Measurement Methods • Customer Satisfaction Surveys – measure whether the unit’s customers were satisfied with the services • Faculty Satisfaction Surveys – one or more questions on a general faculty survey that would measure the perception of the unit’s services • Staff Satisfaction Surveys – one or more questions on a general staff survey that would measure the perception of the unit’s services • Student Satisfaction Surveys – one or more questions on a general student survey that would measure the perception of the unit’s services

  31. Setting Criteria for Success Ask yourself: what level is acceptable as evidence of success? You must state the target or criteria at the beginning of the process, not after you have assessed it. It should be a targeted value that you feel your unit could attain if it were fulfilling its mission effectively and efficiently.
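Deciding success then reduces to comparing the measured result against the pre-set criterion. A worked illustration, with hypothetical numbers loosely based on the 80% survey target in the library example later in the deck:

```python
def meets_criterion(actual: float, target: float) -> bool:
    """Return True when the measured result reaches the pre-set criterion for success."""
    return actual >= target

# Criterion stated at the START of the cycle (hypothetical): 80% of seniors
# agree that the Gen Ed requirements built their research skills.
target_pct = 80.0
actual_pct = 76.5  # hypothetical survey result gathered during the year

if meets_criterion(actual_pct, target_pct):
    print("Outcome met: document the result in the IE report.")
else:
    print("Outcome not met: plan improvement actions and re-assess.")
```

The point of fixing the threshold first is exactly what this sketch makes mechanical: the comparison is decided before the data arrive, so the result cannot be rationalized after the fact.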

  32. Example of an IEP using WEAVE Online Software – American University of Sharjah, 2011-2012 Library Mission / Purpose The AUS Library plays an active and integral role in furthering the educational mission of the University by connecting students and faculty to the world of information and ideas. Librarians, in curricular partnership with the faculty, provide instruction, resources and services that strengthen student research skills, promote critical reflection and foster academic excellence. O 1: Basic info literacy (IL) competencies in locating, evaluating and managing information resources. AUS students will demonstrate basic information literacy (IL) competencies in locating, evaluating and managing information resources. Relevant Associations: Strategic Plans: American University of Sharjah 1.1 AUS will continuously improve the quality of undergraduate and graduate teaching and learning. Related Measure: M 1: Senior Exit Survey Senior Exit Survey. Source of Evidence: Student satisfaction survey at end of the program Target: The Senior Exit Survey shows that 80% or more of students believe the Gen Ed Requirements provide the research skills that enabled the identification, location, retrieval and evaluation of information resources. M 2: LibQUAL Survey LibQUAL Survey. Source of Evidence: Academic indirect indicator of learning - other Target: The LibQUAL gap analysis survey questions about Information Literacy Outcomes are greater than 6 on a 1-9 point scale.

  33. M 3: Rubrics for WRI 102 and ENG 204 Rubrics used by WRI 102 and ENG 204 faculty. Source of Evidence: Academic direct measure of learning - other Target: Faculty in WRI 102 & ENG 204 using the library-designed rubric report a satisfactory level of student competency in IL. O 2: Curricular and targeted research needs of students and faculty. Curricular and targeted research needs of students and faculty will be met by acquiring and providing access to print and digital resources. Relevant Associations: Strategic Plans: American University of Sharjah 1.1 AUS will continuously improve the quality of undergraduate and graduate teaching and learning. 1.3 AUS will enhance the quality, volume and reputation of research produced by AUS faculty and students. Related Measure: M 4: LibQUAL Survey LibQUAL Survey. Source of Evidence: Academic indirect indicator of learning - other Target: The LibQUAL gap analysis survey shows that most faculty and students are satisfied with available resources. M 5: Document delivery, ILL, and acquisition of physical resources. Calculate turnaround time on document delivery, ILL, and acquisition of physical resources. Source of Evidence: Evaluations Target: 90% of ILL and document delivery requests will be filled within 5 working days. 90% of physical items ordered are received within 6 weeks.
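A turnaround target like the ILL one above ("90% filled within 5 working days") can be checked against a request log with a short script. This sketch assumes a Monday-Friday work week (the actual AUS work week may differ) and uses made-up dates:

```python
from datetime import date, timedelta

def working_days(start: date, end: date, weekend=(5, 6)) -> int:
    """Count working days from start (exclusive) to end (inclusive),
    skipping the weekday numbers in `weekend` (default: Sat=5, Sun=6)."""
    count, d = 0, start
    while d < end:
        d += timedelta(days=1)
        if d.weekday() not in weekend:
            count += 1
    return count

# Hypothetical ILL log: (date requested, date filled)
requests = [
    (date(2012, 3, 4), date(2012, 3, 7)),   # filled in 3 working days
    (date(2012, 3, 5), date(2012, 3, 14)),  # filled in 7 working days (late)
    (date(2012, 3, 11), date(2012, 3, 13)), # filled in 2 working days
]
on_time = sum(working_days(req, filled) <= 5 for req, filled in requests)
pct = 100 * on_time / len(requests)
print(f"{pct:.0f}% filled within 5 working days; 90% target met: {pct >= 90}")
```

The same pattern (compute the percentage, compare against the pre-stated target) covers the six-week acquisitions target and the advising and admissions examples later in the deck.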

  34. Gathering Data • During the majority of the academic year, you will be collecting the data that you stated you were going to use to measure your intended outcome. • This will be done periodically (weekly, monthly, by semester, etc.) or just once – depending on what means of measurement you selected.

  35. Reporting Actual Results Actual results include reporting and analysis of assessment findings that identify strengths and weaknesses of a unit/program. Examples of SLO Results: • Senior research papers in seminars scored an average of 86 on the research rubric • No student fell below 80 on any of the ten program evaluative criteria

  36. Examples of Unit Results • 96.2% of all reports from this office were created before the stated deadline. • 72% of students requesting advising appointments were scheduled within 24 hours of the request • 81% of all applications were processed within one week of receiving them.

  37. Use of Results: “Closing the Loop” Use of Results is a very important component of the assessment cycle, and it should include specific actions or changes implemented, or to be implemented, based on assessment findings for continuous quality assurance. This part of the Annual QA/IE Plan is completed at the end of the cycle.

  38. Examples of use of results Examples of reporting the use of results of SLO assessment: • Assessment findings of the senior research papers indicated several areas to be emphasized for program improvement: (1) use of a range of resources for the assigned research, (2) ability to evaluate sources cited as to authority, accuracy, and currency, (3) ability to produce a scholarly work that is more than a collation of ideas, (4) use of oral and written language appropriate to the discipline and audience. • Results from scoring rubrics indicated that there is a need to strengthen students’ writing proficiency. The rubric results have been incorporated into a presentation at a department meeting for the departmental faculty. Faculty of research seminars will continue to use the scoring rubric to address weaker performance areas and compare their assessments with those of others.

  39. Examples of use of results • The Academic Advisor’s Committee will develop a priority tracking process that will assist them in returning calls more quickly. Additionally, two new positions for academic advisors are requested during the budget cycle. • A new admissions module for the computing system is requested during the next budget cycle. This module will track applications from point of first contact until entry and should facilitate more rapid processing of applications.

  40. Summary Outcomes assessment is here to stay! This type of assessment will result in a process that will allow the institution to continually evaluate the institutional effectiveness of all academic programs as well as all administrative units. This, in turn, allows the institution to evaluate its progress toward fulfilling its mission.
