Evaluation. This PPT and other resources from: http://homepage.mac.com/johnovr/FileSharing2.html. John Øvretveit, Director of Research, Professor of Health Innovation and Evaluation, Karolinska Institutet, Stockholm, Sweden
POINT 1: Evaluation means different things to different people
Evaluation definition (Øvretveit 1998, 2002) • judging the value of something • by gathering information about it • in a systematic way • and by making a comparison, • for the purpose of making a better-informed decision.
What I will cover • Who is the evaluation for and their questions? • Three approaches • Their answers to different questions • Their ways of maximising validity • Theory-driven case evaluation • When and how to document and study context
Examples • Standardisable treatment or change to an organisation • New chronic care model • Breakthrough collaborative • Joint Commission accreditation
Who is the evaluation for, and what are their questions? User-focused evaluation • Who is the main customer for the evaluation, and what are their questions? • Design your evaluation to give the information they need to make the decisions they need to make • Vs a literature-based focus: the evaluation is to fill gaps in scientific knowledge
Key questions for evaluations • Does it work? (outcomes caused by the intervention) • Would it work here, locally? • How did they implement this change? (description) • Which context factors helped and hindered implementation? (attribution) • In which range of settings and conditions did it work? (generalisation certainty) • How do I adapt it, or the context, to implement it? (adaptation)
Three approaches answer different questions • Controlled experimental: does it work? • Quasi-experimental: does it work? (less certain, but easier and less costly) • Theory-informed case study evaluation: does it have effects which may lead to patient outcomes? How does it work?
Case study uses a model of a chain of effects • Programme theory: ideas about the sequence of actions and situation factors leading to intermediate changes and the ultimate outcome (Cretin et al. 2004 model)
Their ways of maximising validity: internal validity of the evaluation • = how certain are we that the outcomes are due to the intervention and not something else? (attribution) • Experimental: control for other explanations • Comparison with no-intervention patients or providers • Time trends • Case study: causal chain, multiple data sources
External validity of the evaluation • = with the intervention given to other patients or providers, how certain are we that we would find the same outcomes? (generalisation) • Experimental: repeat the evaluation with other targets, or use "representative" targets • Case study: the causal chain diagram is a theory which allows decision makers to think through whether it would work in their setting, from ONE CASE • Analytic generalisation (not statistical)
Can you grow pineapples in Sweden? • The analogy: seed; gardener (planting & nurture); climate/soil • Your change? Change idea/evidence (0-5?) + implementation actions (0-5?) + context (0-5?): local and wider
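A minimal sketch in Python of the 0-5 scoring idea on this slide. The function name, the simple summing of the three ratings, and the example numbers are illustrative assumptions; the slide itself only proposes rating each factor from 0 to 5.

```python
# Illustrative sketch only: the slide proposes 0-5 ratings for the change idea
# ("seed"), the implementation actions ("gardener") and the context ("climate/soil").
# Adding them into one score is an assumption made here for illustration.

def rate_change(evidence: int, implementation: int, context: int) -> int:
    """Sum 0-5 ratings for evidence, implementation actions, and (local and wider) context."""
    for name, score in (("evidence", evidence),
                        ("implementation", implementation),
                        ("context", context)):
        if not 0 <= score <= 5:
            raise ValueError(f"{name} must be rated 0-5, got {score}")
    return evidence + implementation + context  # 0-15: higher = better chance of "growing"

# Example: strong evidence (4), weak implementation support (2), unhelpful context (1).
print(rate_change(evidence=4, implementation=2, context=1))  # -> 7
```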
French 2009 review - factors affecting implementation success: categories and measures • Climate, e.g. openness, respect, trust • OL culture • Vision • Leadership • Knowledge need • Acquisition of new knowledge • Knowledge sharing • Knowledge use
How do you take account of context? • Experimental: you don't - you get rid of its interference, but you should describe the intervention, implementation and settings • Case study: • Before: define context factors which might influence implementation (example in a collaborative?) • Collect data on these and assess their influence • Build a model/theory with context, not just the causal chain (see the sketch below)
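A minimal sketch of the case-study steps just listed (define context factors beforehand, collect data on them, assess their influence). The factor names and the helped/hindered judgements below are invented for illustration, not taken from the presentation.

```python
# Hypothetical example: factor names and judgements are invented for illustration.
context_factors = [
    {"factor": "senior leadership support", "defined_before": True, "influence": "helped"},
    {"factor": "staff time for the collaborative", "defined_before": True, "influence": "hindered"},
    {"factor": "concurrent IT system change", "defined_before": False, "influence": "hindered"},
]

# Summarise which context factors helped and which hindered implementation.
helped = [f["factor"] for f in context_factors if f["influence"] == "helped"]
hindered = [f["factor"] for f in context_factors if f["influence"] == "hindered"]
print("Helped:", helped)
print("Hindered:", hindered)
```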
Summary • Who is the evaluation for and their questions? • Three approaches • Their answers to different questions • Their ways of maximising validity • Theory-driven case evaluation • When and how to document and study context
Your reactions and questions • Any surprises… • Not certain about… • This could be useful…
DETAILS
What I will cover • How evaluation is similar to, and different from, research and monitoring • Challenges all research faces, and how different evaluation designs address these • Naturalistic/non-experimental evaluation • Programme evaluation • Case study evaluation • Realist evaluation • Example of an evaluation of an HIV/AIDS programme in Zambia
All research has these five challenges • Users' wants vs evaluators' views about what is important • My focus is on user-driven but theory-informed research - the data are driven by the users' decisions • Data validity: are the data we collect valid and reliable? Reducing data bias; replicability • Cost of data gathering and its value for the evaluation users: how much extra value for these extra data?
All research has these challenges (continued) • Attribution: how do we know the outcomes were due to the intervention and not something else? • Generalisation: to which other patients, organisations or settings do we have confidence that the same findings might be observed?
Three types of evaluation • Experimental controlled - outcome • Compare those getting the intervention with another group • Experimental no controls - outcome • Only look at those getting the intervention before and after • Naturalistic – describe and document different impacts
Next - experimental, no control group: just before/after outcome • Single before/after (B/A) intervention to patients • Single before/after (B/A) intervention to providers
3) Evaluation of an intervention to a service: impact on providers
3) Evaluation of an intervention to a service: impact on patients
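To make the attribution point behind these designs concrete, here is a small Python sketch with invented numbers: a single before/after comparison attributes the whole change to the intervention, while a no-intervention comparison group (a simple difference-in-differences) removes the change that would have happened anyway. The data and variable names are assumptions for illustration only.

```python
# Invented example data: mean outcome scores before and after, for an intervention
# group and a no-intervention comparison group.
intervention_before, intervention_after = 60.0, 72.0
comparison_before, comparison_after = 61.0, 66.0

# Single before/after design: all of the observed change gets attributed to the intervention.
uncontrolled_effect = intervention_after - intervention_before  # 12.0

# Controlled design: subtract the change seen without the intervention
# (a simple difference-in-differences), leaving the part more plausibly
# caused by the intervention itself.
controlled_effect = uncontrolled_effect - (comparison_after - comparison_before)  # 7.0

print(f"Before/after only: {uncontrolled_effect:+.1f}")
print(f"With comparison group: {controlled_effect:+.1f}")
```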
Next - non-experimental, process or naturalistic designs • Describe the intervention, e.g. a new service for people with chronic disease, with multiple components • How the service evolves and why • Some effects (on the pathway towards outcomes), e.g. changes in staff practice, work organisation and attitudes
Change chain or influence pathway: "programme theory" or "logic model" • 1) Intervention: training on baby health care. Short-term (ST) result: changes nurses' knowledge, skills and motivation >>> 2) Nurses then train mothers. Medium-term (MT) result: changes mothers' knowledge, skills and motivation >>> 3) Mothers then behave differently. Long-term (LT) result: baby health is better • What was the intervention? (three) Which intervention should you evaluate? How? • What is the outcome of the intervention? (three) • Point: find out whether each intervention was carried out fully, and with what results
Model (helping and hindering factors) • Action >>>> Change 1 >>>> Change 2: e.g. education about how to assess dementia >>>> personnel do better dementia assessment >>>> dementia onset slower • Indicator of each change? • Helping: personnel are given time for the education • Hindering: shortage of personnel
Point: many things we evaluate have change chains • One thing changes another, then this change changes another thing • Not just a drug treatment (one intervention), but many interventions, sometimes in sequence
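One possible way to write down such a change chain, using the baby health care example above. The data-structure layout and the indicator wording are assumptions made here for illustration, not part of the presentation: each link records the action, the expected change, and an indicator for checking whether that link actually occurred.

```python
# Change chain for the baby-health example; indicator wording is illustrative.
change_chain = [
    {"action": "train nurses in baby health care",
     "expected_change": "nurses' knowledge, skills and motivation improve",
     "indicator": "nurse knowledge test scores"},
    {"action": "nurses train mothers",
     "expected_change": "mothers' knowledge, skills and motivation improve",
     "indicator": "mother knowledge and behaviour survey"},
    {"action": "mothers change care behaviour",
     "expected_change": "baby health improves",
     "indicator": "infant health outcomes"},
]

# The evaluation question for each link: was it carried out fully, and did the
# expected change show up on its indicator?
for i, link in enumerate(change_chain, start=1):
    print(f"{i}) {link['action']} -> {link['expected_change']} (check: {link['indicator']})")
```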
Case studies: programme theory and concepts for describing change • Programme theory: ideas about the sequence of actions and situation factors leading to intermediate changes and to output and outcome changes (their theory, our theory) • (Cretin et al. 2004 model)
3) Deductive hypothesis-testing (non-intervention) • Cycle shown in the diagram: theory >>> specific hypotheses >>> researcher gathers data to test the hypotheses (often with a survey) >>> raw data >>> analyse data >>> revise theory • The box is the subject area or sample of people; the timeline runs from study start to study finish (2003-2004)
Four key questions to plan an evaluation • What is the intervention? • Who is the evaluation for? • Which data do they need about the intervention, its effects and the situation? • How do you know the effects are due to the intervention and not something else?
Does your research study an intervention? • What is the intervention? • What are the different implementation actions you (or others) are taking, at different times?