Not Just an AIRS Standard: Program Evaluation & Quality Assurance
Robert McKown, CIRS, Director of Evaluation and Accountability
Sherri Vainavicz, CIRS, Manager, UW’s 2-1-1
Heart of West Michigan United Way, Grand Rapids, Michigan
Workshop Objectives • Understand and utilize quality assurance measures for managing and strengthening I&R Services
Community Impact • What does your I&R service provide your community? • What is the “impact”? • How do you know this?
Program Evaluation “…the systematic collection of information about the activities, characteristics, and outcomes of programs to make judgments about the program, improve effectiveness, and/or inform decisions about future programming” (Patton, 1997) Source: Salvatore Alaimo, PhD – Grand Valley State University
Program Evaluation is… • A means for organizational learning • Time and effort well spent ensuring: • The effectiveness of programs • The organization’s ability to adapt to a changing environment Source: Salvatore Alaimo, PhD – Grand Valley State University
Program Evaluation is not… • An episodic event (it is an ongoing, developmental process) • Something you do only to satisfy a funder • Something you do only to promote your work • A test or a punishment Source: Salvatore Alaimo, PhD – Grand Valley State University
Purpose of Evaluation • Accountability to the public and funding entities
Terminology • Outcomes – benefits or changes for program participants • Outputs – direct products (summations, volume) of program activities • Activities – what the program does • Inputs – all of the resources necessary to deliver the program Source: Salvatore Alaimo, PhD – Grand Valley State University
Logic Model [diagram linking inputs, activities, outputs and outcomes to the MISSION] • Example activities: counseling, mentoring, feeding, sheltering, building, entertaining, educating Source: Salvatore Alaimo, PhD – Grand Valley State University
Quality Assurance Tools • Call accounting data (computed in the sketch after this list) • Abandonment • Average time to answer • Average time on call • Call trends (scheduling) • Monitoring calls • Agency feedback • Secret shopper • Database quality checks
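To make the call accounting metrics concrete, here is a minimal sketch in Python, assuming a hypothetical record layout; the wait_sec, talk_sec and abandoned fields are illustrative names, not fields from the presenters' actual phone system.

```python
# Hypothetical call records exported from a phone system.
calls = [
    {"wait_sec": 12, "talk_sec": 310, "abandoned": False},
    {"wait_sec": 45, "talk_sec": 0, "abandoned": True},
    {"wait_sec": 8, "talk_sec": 190, "abandoned": False},
]

answered = [c for c in calls if not c["abandoned"]]

# The three headline call-accounting metrics from the slide above.
abandonment_rate = (len(calls) - len(answered)) / len(calls)
avg_time_to_answer = sum(c["wait_sec"] for c in answered) / len(answered)
avg_time_on_call = sum(c["talk_sec"] for c in answered) / len(answered)

print(f"Abandonment rate:   {abandonment_rate:.1%}")
print(f"Avg time to answer: {avg_time_to_answer:.0f} sec")
print(f"Avg time on call:   {avg_time_on_call:.0f} sec")
```

In practice these figures would be computed per day or per shift so they can also feed the call-trend and scheduling analysis mentioned above.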
Be Practical Quality assurance tools work together. No single quality assurance tool measures or demonstrates every component of the I&R program. A gap or question identified by one tool may be filled in, answered, affirmed or contradicted by another tool in your quality assurance toolkit.
Standards • The I&R service has a process for examining its viability as an organization, the effectiveness of its services, its appropriate involvement in the community and its overall impact on the people it serves. (Standard 29 – Quality Indicator 1)
Accreditation Standards • …method for tracking call volume, average speed of answer, abandoned calls, average call handling time and incoming call patterns (Standard 29 – Quality Indicator 2) • …creates internal reports to assess operational effectiveness (Standard 29 – Quality Indicator 3)
Standards • …conducts an annual evaluation of I&R activities (including the resource database and website) that involves inquirers, service providers… (Standard 29 – Quality Indicator 4) • The I&R conducts regular customer satisfaction surveys (Standard 29 – Quality Indicator 5)
Standards • The I&R service involves inquirers, service providers and others… in the evaluation process; and modifies the program in response to evaluation… (Standard 29 – Quality Indicator 6)
Follow-up Overview
Definition: Follow-up • Telephone call to I&R inquirers to gather information about their 2-1-1 experience • Allows for evaluating program effectiveness • Results used to make better strategic decisions about service delivery
Measurements: What Does a Follow-up Measure?
How Follow-up Data is Used • Identification of community service gaps • Identification of incorrect or outdated agency/database information • Identification of reasons callers are not receiving services • Identification of I&R program strengths and potential staff training needs (a tallying sketch follows this list)
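A minimal sketch, assuming hypothetical follow-up records and illustrative reason codes (received_service and reason_not_served are invented names, not the program's actual survey fields), of how follow-up results can be tallied to surface the gaps listed above:

```python
from collections import Counter

# Hypothetical follow-up call results; categories are illustrative only.
followups = [
    {"received_service": True, "reason_not_served": None},
    {"received_service": False, "reason_not_served": "waiting list"},
    {"received_service": False, "reason_not_served": "ineligible"},
    {"received_service": False, "reason_not_served": "waiting list"},
]

served = sum(1 for f in followups if f["received_service"])
print(f"Received service: {served}/{len(followups)} ({served / len(followups):.0%})")

# Ranking the reasons callers were not served points to service gaps
# and to agency/database records that may need correction.
reasons = Counter(
    f["reason_not_served"] for f in followups if not f["received_service"]
)
for reason, count in reasons.most_common():
    print(f"  Not served ({reason}): {count}")
```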
Agency Survey
Definition: Agency Survey • Questionnaire mailed to a sample of community agencies to gauge their perception of, and experience with, the I&R program
What Does the Agency Survey Measure? • Accuracy of referrals • Agencies’ perception of and experience with the I&R program
Agency Survey Process • Survey link mailed to 20% of the local agencies in the database (see the sampling sketch below) • Agencies asked to track the referral source for new clients for one month and to identify those referred by the I&R program (previous surveys) • Agencies complete the survey • Analyze the results
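A minimal sketch of the 20% sampling step, assuming a placeholder agency list; real identifiers would come from the resource database.

```python
import random

# Placeholder agency IDs; e.g., 250 agencies in the resource database.
agencies = [f"agency_{i:03d}" for i in range(1, 251)]

sample_size = round(len(agencies) * 0.20)  # the 20% described above
survey_sample = random.sample(agencies, sample_size)

print(f"Mailing the survey link to {sample_size} of {len(agencies)} agencies")
```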
Definition: Silent Monitoring • Observation of I&R calls to determine and measure how well established call standards and elements are met
What Does Silent Monitoring Measure? • Quality of the I&R communication, whether essential elements were completed, familiarity with the phone system and database, and the general performance of the I&R specialist • Identifies best practices and strengths • Identifies gaps in knowledge about community resources and other areas for staff development
Silent Monitoring Process • Callers hear a message that their conversation may be monitored • The I&R manager logs on to listen to calls on an I&R specialist’s phone extension, or listens to recorded calls • The I&R manager listens and records which call elements were completed during the call • The I&R manager shares the observations with the I&R specialist • The I&R manager and team look for trends to identify strengths or gaps
Benchmarks • Average silent monitoring score of 80% of the possible total score (88 out of a possible 110 points) • 1% of calls monitored
Silent Monitoring Outcomes • Average score on silent monitoring: 91 (83% of the possible total score; see the arithmetic sketch below)
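For reference, a small sketch that checks the benchmark arithmetic from the two slides above; the monthly call volume is an assumed figure used only to illustrate the 1% monitoring quota.

```python
MAX_POINTS = 110
BENCHMARK_PCT = 0.80

benchmark_score = MAX_POINTS * BENCHMARK_PCT    # 88 points, as on the slide
observed_score = 91
observed_pct = observed_score / MAX_POINTS      # 0.827..., reported as 83%

monthly_calls = 4000                            # assumed volume, illustration only
calls_to_monitor = round(monthly_calls * 0.01)  # the 1% benchmark

print(f"Benchmark: {benchmark_score:.0f}/{MAX_POINTS} ({BENCHMARK_PCT:.0%})")
verdict = "meets" if observed_pct >= BENCHMARK_PCT else "misses"
print(f"Observed:  {observed_score}/{MAX_POINTS} ({observed_pct:.0%}), {verdict} the benchmark")
print(f"Monitor {calls_to_monitor} of {monthly_calls} calls per month")
```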
Applying Quality Assurance Adjustments and changes made by the presenters’ organization
Adjustments and Changes Identified by Participants Describe a change in policy or procedure in your program that was based on evaluation. What was measured and what was the change?
Adjustments • Agency presentations and site visits • Schedules adjusted to ensure the right number of staff at the right times • Increased silent monitoring to gain a more objective measure, in response to agency survey input that referrals were not as accurate as desired • Added temporary staff to update the resource database • Hired a bilingual candidate when filling a vacant position • Found additional resources for staff
Quality Assurance Data • Group analysis of the dashboard • Identify: strengths, gaps, next steps, solutions
How to Find an Evaluator… • American Evaluation Association (AEA) evaluator search – http://www.eval.org/find_an_evaluator/evaluator_search.asp • Local affiliates of AEA • Michigan Association for Evaluation • Local colleges and universities Source: Salvatore Alaimo, PhD – Grand Valley State University
For More Information Contact:
Robert McKown, Sr. Director of Evaluation & Accountability, (616) 752-8639, rmckown@hwmuw.org
Sherri Vainavicz, 2-1-1 Program Manager, (616) 752-86341, svainavicz@hwmuw.org
Thank you!