A Continuous Quality Improvement Approach to Evaluation
Smaller World Communications, Barb van Maris
Health Communications Unit Special Topics Workshop, January 29, 2002
Learning Objectives
1. Participants will understand the similarities between evaluation and quality improvement.
2. Participants will be able to explain the benefits of approaching evaluation from a quality improvement perspective.
3. Participants will understand how evaluation 'fits in' to quality improvement.
4. Participants will be able to describe the Model for Improvement.
5. Participants will identify different methods and tools used in a CQI approach to evaluation.
Tools We Will Cover Today
• Needs assessment
• SWOT analysis
• Flow charts
• Fishbone diagram
• Affinity diagram
• Brainstorming
• Prioritization matrix
• Various measurement tools
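Most of these tools are facilitation aids, but the prioritization matrix is essentially arithmetic, so a small sketch may help. This is a minimal illustration in Python; the criteria, weights, options, and scores are invented for the example, not taken from the workshop.

```python
# Minimal prioritization-matrix sketch: score candidate improvements
# against weighted criteria and rank them. All names, weights, and
# scores are hypothetical examples.

criteria_weights = {"impact": 0.5, "effort": 0.3, "cost": 0.2}

# Each option is scored from 1 (worst) to 5 (best) on every criterion
# (so a low-effort, low-cost option scores high on effort and cost).
options = {
    "Redesign intake form":   {"impact": 4, "effort": 5, "cost": 5},
    "Add evening clinic":     {"impact": 5, "effort": 2, "cost": 2},
    "Staff training session": {"impact": 3, "effort": 4, "cost": 4},
}

def weighted_score(scores):
    """Sum of criterion score times criterion weight."""
    return sum(scores[c] * w for c, w in criteria_weights.items())

# Rank options from highest to lowest weighted score.
for name, scores in sorted(options.items(),
                           key=lambda kv: weighted_score(kv[1]),
                           reverse=True):
    print(f"{name}: {weighted_score(scores):.2f}")
```

The team still has to agree on the criteria and do the scoring together; the arithmetic only makes the agreed priorities explicit.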
Presentation Outline
• Overview of evaluation and quality improvement
• Discussion… how does this relate to your practice?
• Challenges experienced with the traditional approach to program evaluation
• Benefits
• Creating a CQI approach to evaluation
• The Model for Improvement
• Drawbacks
• Conclusions
Smaller World Communications
• Performance measurement and evaluation
• Primarily in the health care and health promotion fields
• Clients include health units, hospitals, not-for-profit organizations, professional colleges, and funding agencies
• Some work in the private sector
The evolution of an evaluation practice
Program Evaluation
• Applied research skills to evaluating health promotion programs
• Evaluation design
• Indicators
• Goals/objectives/outcomes
• Internal evaluation
• Lack of experimental control
Clinical Research
• Controlled experiments in the lab
• Clinical trials in hospitals
• Research design
• Threats to validity/reliability
• Highly controlled
• Multivariate statistics
The evolution of an evaluation practice
Evaluation of PT Clinics / QA Programs for Professional Colleges
• Adapted to the language of 'standards' and the RHPA
• Move from QA to continuous learning
• Demonstrate effectiveness for accountability (insurance companies)
• Demonstrate evaluation of practice for College requirements
Performance Measurement
• Applied market research skills and program evaluation in a hospital setting
• Indicators
• Client and employee opinions
• Clinical outcomes
• Balanced scorecards
The evolution of an evaluation practice
Organizational Development
• Natural progression to increase utilization of performance measurement data
• Facilitation skills
• Team building
• Business strategy
• Leadership
• Continuous Quality Improvement
Overview: Evaluation and CQI
• The two disciplines developed in parallel with the same goal: "to improve our services and programs".
• But they stemmed from two different areas of study
• Program evaluation - social sciences research
• CQI - organizational development/business
Evaluation
• Application of social science methods
• Emerged from the education and public health fields prior to WWI
• By the 1930s, applied social science and program evaluation grew at a rapid rate
• Evaluation of government programs first took off in the U.S., with Canada not far behind
Evaluation • “Evaluation research is the systematic application of social research procedures for assessing the conceptualization, design, implementation, and utility of social intervention programs.” • Rossi and Freeman, 1993
Evaluation
• The increase in popularity of program evaluation emerged from the need for government programs to be accountable
• Although most textbooks state there are many purposes or uses for evaluation, the underlying tone is still to demonstrate effectiveness or the proper use of resources
Evaluation
Treasury Board Policy on the Evaluation of Programs… Guidelines
Program evaluation in federal departments and agencies should involve the systematic gathering of verifiable information on a program and demonstrate evidence on its results and cost-effectiveness. Its purpose should be to periodically produce credible, timely, useful and objective findings on programs appropriate for resource allocation, program improvement and accountability. (1981)
Treasury Board of Canada (1981). Guide on the Program Evaluation Function
Evaluation
• Evaluation helps you to make decisions about:
- the optimal use of time and resources
- determining if the program/service is meeting the needs of participants
- ways of improving a program/service
- demonstrating the effectiveness of a program to funders and other stakeholder groups
Evaluation
• Evaluation's focus is on 'measurement'
• In order to collect valid and reliable information, evaluators utilize, as much as possible, the scientific method for collecting information.
• It was quickly recognized that what we need to measure depends on the program's stage of development
Evaluation
• Types of Program Evaluation
- Formative
- Process
- Summative (outcome)
Evaluation
• Formative - information you collect to help you plan or implement a program/service
- needs assessments
- pre-testing program materials
- audience analysis
Evaluation
• Process - studies program implementation
- tracking the quantity and a description of the people who are reached
- tracking the quantity and types of services
- description of how services are provided
- quality of services provided
- was everything implemented the way you intended?
Evaluation
• Outcome - studies the outcomes of the program/service
- changes in attitudes, knowledge or behaviour
- changes in morbidity or mortality rates
- changes in policy
- program results in relation to program costs
Evaluation
• Different approaches emerged
- External evaluation - highly rigorous and objective
- Internal evaluation
- Participatory evaluation
- Empowerment evaluation
- Utilization-focused evaluation
Program Evaluation
Internal Evaluation Approach
Definition: Carried out by "persons who are responsible for the evaluation process" in an organization (John Mayne, 1992)
Benefits
• An organization better understands its own programs and environment by doing the evaluation itself
• There is greater acceptance of the changes required
Evaluation
Participatory Evaluation
• engages stakeholders in all or key aspects of the evaluation
• collective learning
• sharing of power
Steps in Evaluating HP Programs
Step 1: Clarify Your Program
Step 2: Engage Stakeholders
Step 3: Assess Resources for the Evaluation
Step 4: Design the Evaluation
Step 5: Determine Appropriate Methods of Measurement and Procedures
Introduction to Evaluating Health Promotion Programs - HCU
Steps in Evaluating HP Programs
Step 6: Develop Work Plan, Budget and Timeline for Evaluation
Step 7: Collect the Data Using Agreed-upon Methods and Procedures
Step 8: Process and Analyze the Data
Step 9: Interpret and Disseminate the Results
Step 10: Take Action
Introduction to Evaluating Health Promotion Programs - HCU
Continuous Quality Improvement
• Stems from work in the organizational development field
• 1950s - Dr. W. Edwards Deming introduced Total Quality Control to Japanese manufacturers
• 1980s - Total Quality Management begins in the U.S.
• 1990s - TQM fades as a fad, yet the focus remains on continually improving products and services (CQI)
Continuous Quality Improvement
Principles of CQI
• Develop a strong customer focus
• Continually improve all processes
• Involve employees
• Mobilize both data and team knowledge to improve decision making
Brassard, M., and Ritter, D. (1994). A Pocket Guide of Tools for Continuous Improvement and Effective Planning
Continuous Quality Improvement
• 3 Key Questions
• What are we trying to accomplish?
• How will we know that a change is an improvement?
• What changes can we make that will result in an improvement?
Plan - Do - Study - Act
• Plan: Identify purpose and goals, formulate theory. Define how to measure. Plan activities.
• Do: Execute the plan, applying our best knowledge to the pursuit of our desired purpose and goals.
• Study: Monitor the outcomes. We study the results for signs of progress or success, or unexpected outcomes.
• Act: Integrate the lessons learned and adjust the program. Do we need to reformulate the theory? Identify what more we need to learn.
Scholtes, 1998. The Leader's Handbook (based on the work of Dr. W. Edwards Deming)
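To make the cycle concrete, here is a minimal sketch of PDSA as a loop in Python. Everything in it (the theory, the functions, the fabricated numbers) is a hypothetical stand-in for real program work; it illustrates the cycle's structure, not Deming's or Scholtes's method itself.

```python
# Illustrative PDSA loop: each cycle tests a theory of change and
# feeds the lessons learned back into the next plan. All functions
# and values here are hypothetical stand-ins for real program work.

def make_plan(theory):
    # Plan: identify purpose and goals, define how to measure, plan activities.
    return {"theory": theory, "measure": "participants reached per month"}

def execute(plan):
    # Do: carry out the plan. Here we simply fabricate an observation.
    return {"measure": plan["measure"], "observed": 42, "expected": 60}

def study(results):
    # Study: compare outcomes to expectations; look for gaps and surprises.
    return results["observed"] - results["expected"]

def act(theory, gap):
    # Act: integrate lessons learned; revise the theory if we fell short.
    return theory if gap >= 0 else theory + " + better outreach"

theory = "Evening workshops increase participation"
for cycle in range(1, 4):
    gap = study(execute(make_plan(theory)))
    theory = act(theory, gap)
    print(f"Cycle {cycle}: gap = {gap}, revised theory = {theory!r}")
```

The point of the loop is that the theory going into each Plan is the one that came out of the previous Act, which is exactly what makes the learning continuous.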
PDSA cycle creates continuous learning
[Diagram: a sequence of repeated PDSA cycles, alternating between theories (Plan/Act) and application (Do/Study); "the nature of true learning".]
Activity #1 • Select someone in your group to: • facilitate the discussion • record your ideas on the flip chart • keep track of time
Activity #1 • Identify where group members are already doing program evaluation. • What are some of the challenges you are facing?
Challenges to the Traditional Approach
• Program staff are resistant!
• Program staff focus on showing effectiveness rather than looking at what needs to be improved
• They also get 'hung up' on what design and statistical techniques are needed, which in many cases are beyond their skills or unnecessary for the evaluation
• Programs are expected to be effective in an unrealistic time frame… it takes time for programs to evolve
Levels of Accomplishment
• Levels of accomplishment
- Issue mapping
- Capacity building
- Environmental shift
- Behaviour change
• Within each level of accomplishment, identify relevant performance indicators that might signal progress toward health promotion goals.
Michael Hayes (1999), Ontario Tobacco Research Unit
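As a purely hypothetical illustration of that last point, the sketch below pairs each of Hayes's levels with invented example indicators for a tobacco-style program; the indicator wordings are assumptions, not from the presentation.

```python
# Hypothetical example indicators for each level of accomplishment.
# The four levels come from Hayes (1999); the indicators themselves
# are invented here purely for illustration.

indicators_by_level = {
    "Issue mapping": [
        "number of stakeholder interviews completed",
        "key determinants documented",
    ],
    "Capacity building": [
        "number of staff trained",
        "number of partner organizations engaged",
    ],
    "Environmental shift": [
        "smoke-free policies adopted by workplaces",
        "media coverage of the issue",
    ],
    "Behaviour change": [
        "self-reported quit attempts",
        "smoking prevalence in the target population",
    ],
}

for level, indicators in indicators_by_level.items():
    print(f"{level}:")
    for indicator in indicators:
        print(f"  - {indicator}")
```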
Programs Evolve
[Diagram (Kellogg Foundation, CES Conference 1999): programs evolve from need, through activities, to short-term, intermediate, and long-term outcomes and impact, across three stages (1. Relationships & capacity; 2. Quality & effectiveness; 3. Magnitude & satisfaction), with evaluation shifting from formative & process, to some summative, to summative and extended impact analysis ("realistic evaluation").]
A CQI approach to evaluation
• The focus is not on showing what we did well, or whether the program passed or failed, but on what we can do better and the changes we can make to improve our work!
• You measure what you need to know to improve your program and to determine whether it works (process and outcome)
• All evaluation becomes formative in some way
• Staff are encouraged to look for what is not working, and why not
A CQI approach to evaluation
• Methods of measurement are still the same, but there are additional tools we can adapt: root cause analysis, flow diagrams, affinity diagrams, etc.
• Measurement becomes a continuous aspect of any program, where staff can utilize results and see the benefits
• The measurement-to-decision-making cycle is faster
• The approach, and in some cases the language used, is different
A CQI approach to evaluation
• A CQI approach does not mean an ineffective program should never be terminated.
A CQI approach to evaluation
• Need to create a 'learning culture'
- make it "safe" to measure
- increase understanding and benefits of ongoing measurement
- debunk the myth that 'measurement' is difficult
- begin by measuring what you can in the best way possible, then improve on it as you go
- utilize staff observations and hard data
- empower staff to critically assess and observe their programs
A CQI approach to evaluation
• Focus staff on 'the positive change' they are trying to create and not on their defined program and activities
• Key short-term evaluation questions:
- What information will help us improve our program?
- Think about this month or the next 6 months
Activity #2 • Review Case Study • Each table is going to focus on one of the key elements we just discussed and brainstorm about strategies or ways the staff of this program could incorporate them • 5 minutes to write ideas down independently on post-it notes • 10 minutes to put ideas on flip chart
Activity #2
1. Making it 'safe' to measure
2. Increase understanding and benefits of measurement
3. Debunk the myth that measurement is difficult
4. Empower staff to critically assess and observe their program
5. Focus staff on the change they are trying to create and not on the defined program and activities
6. What would some of your key short-term evaluation questions be?
A CQI approach to evaluation
• Small-scale changes to make improvements
• Measure processes and monitor outcomes
• Built-in process for changing the program based on what is learned
What to Measure
• Audience reached: numbers reached; who are you reaching, who are you not reaching, and why not?
• Activities implemented: what activities did you do? What did you not do, and why not? How well did you do them? What could you have done better? Challenges to implementation.
• Client satisfaction
• Outcomes achieved and not achieved: the effect of the program on those reached; were there any unintended effects? What is it changing? What is it not changing, and why not?
• Costs (in-kind, staff time and $$)
• External influences on program success
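One lightweight way to make this kind of measurement routine is a simple record that pairs every count with the matching "why/why not?" reflection. The sketch below is a minimal, assumed structure (field names invented), not a prescribed instrument.

```python
# Sketch of a monthly measurement record pairing each count with its
# CQI-style "why/why not?" reflection. Field names are hypothetical.

from dataclasses import dataclass, field

@dataclass
class MonthlyRecord:
    month: str
    reached: int                      # numbers reached
    not_reached_why: str              # who are we not reaching, and why not?
    activities_done: list = field(default_factory=list)
    activities_skipped_why: str = ""  # what did we not do, and why not?
    satisfaction_avg: float = 0.0     # e.g., mean of a 1-5 client rating
    unintended_effects: str = ""
    cost_dollars: float = 0.0         # track in-kind and staff time separately

record = MonthlyRecord(
    month="2002-01",
    reached=120,
    not_reached_why="Shift workers cannot attend daytime sessions",
    activities_done=["workshop", "newsletter"],
    activities_skipped_why="Radio spot delayed by budget approval",
    satisfaction_avg=4.2,
)
print(record)
```

Even a paper form with these columns would do; the point is that the "why not?" fields get filled in every month, not just the counts.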
Program Evaluation
Treasury Board Evaluation Policy
The Government of Canada is committed to becoming a learning organization. Evaluation supports this aim by helping to find out what works and what does not, and by identifying cost-effective alternative ways of designing and improving policies, programs and initiatives. (February 1, 2001)
Treasury Board of Canada Secretariat (2001). Evaluation Policy
Key Definitions • Evaluation • Performance measurement • Benchmarking • Results based management • Quality improvement
Continuous Quality Improvement
[Diagram: benchmarking, performance measurement, and evaluation, alongside clinical or academic research, inform organization/program/policy development as part of continuous quality improvement.]
Improvement Model
• What are we trying to accomplish? AIM
• How will we know that a change is an improvement? INDICATORS
• What changes will result in an improvement? ACTIVITIES or SOLUTIONS
• Improvement or program development cycle: Plan, Do, Study, Act
Langley, Nolan et al., The Improvement Guide
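The second question ("How will we know that a change is an improvement?") is ultimately a comparison of indicator data before and after a change. Below is a deliberately simple sketch; the numbers are invented, and a real program might prefer a run chart across many PDSA cycles to a single before/after comparison of means.

```python
# Minimal "is this change an improvement?" check: compare an
# indicator's post-change values to its baseline. The data and the
# simple mean comparison are illustrative assumptions only.

baseline = [52, 48, 55, 50]      # e.g., weekly attendance before the change
after_change = [58, 61, 57, 63]  # the same indicator after the change

def mean(xs):
    return sum(xs) / len(xs)

improved = mean(after_change) > mean(baseline)
print(f"baseline mean = {mean(baseline):.1f}, "
      f"after mean = {mean(after_change):.1f}, improvement = {improved}")
```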
Setting Aims
What are we trying to accomplish?
1a. Understand your program/service
- client needs and expectations
- goals and objectives
- what are you currently doing?
- strengths: what is working
- challenges: what is not working… why not?
Setting Aims
What are we trying to accomplish?
Tools
- needs assessment
- SWOT analysis
- flowcharting (illustrates a process)
- fishbone diagram: cause/effect (getting to the root causes of a problem)
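As a closing illustration, a fishbone diagram is just an effect plus causes grouped into categories, so it can be captured in a few lines even without drawing software. The problem, categories, and causes below are invented for the example.

```python
# Text rendering of a fishbone (cause/effect) diagram: an effect (the
# problem) with causes grouped under categories. The example problem,
# categories, and causes are invented purely for illustration.

effect = "Low attendance at smoking-cessation workshops"
fishbone = {
    "People":    ["no evening facilitators", "staff turnover"],
    "Process":   ["registration requires a phone call"],
    "Place":     ["venue far from public transit"],
    "Promotion": ["posters displayed only in clinics"],
}

print(f"Effect: {effect}")
for category, causes in fishbone.items():
    print(f"  {category}:")
    for cause in causes:
        print(f"    - {cause}")
```

A team would normally build this on a flip chart by repeatedly asking "why?" about the effect; the data structure simply records where that questioning landed.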