Measuring Implementation Success Workshop 2008 QUERI National Meeting December 11, 2008 Carol VanDeusen Lukas, EdD VA HSR&D Center for Organization, Leadership & Management Research
Workshop objectives • Stimulate discussion about the measurement of implementation success • Explore approaches to measuring implementation success • Offer practical lessons for strategies & tools for measurement.
Questions to consider • What do we mean by measuring implementation success? • Why do we care? • How do we do it? • Conceptual considerations • Practical considerations
Implementation research is Research that supports the movement of evidence-based interventions and approaches from the experimental, controlled environment into the actual delivery contexts where the programs, tools, and guidelines will be utilized, promoted, and integrated into the existing operational culture (Rubenstein & Pugh, 2006)
What do we mean by measuring implementation success? • Implementation success is the presence of the intervention in the delivery context & the consistent use of new practices. • Measuring implementation success is formally determining that presence & use
Boundaries of implementation success depend on what you are testing & what you want to learn
Why do we care? • Contributes to understanding success or failure of intervention outcomes • Implementation success as independent variable • Interventions often fail because they are not implemented • Intention-to-treat analysis loses information • Assumption of detailed replication neither realistic nor appropriate
Why do we care? • Essential to judging effectiveness of strategies for implementing innovative programs • Implementation success as dependent variable • Understanding factors affecting implementation
Inventory of HSR&D implementation studies • Began with two portfolios • Implementation & management research • Service-directed projects • Screened for implementation studies • Interventions • Studies of factors affecting • Contacted PIs • Reviewed documents • Conducted telephone interviews
Recognition of implementation success as a distinct issue varies • No recognition • Conflate implementation & outcomes • Assume innovation in place • Recognize variation but don’t measure • Know anecdotally what is happening • Don’t worry about variation because expect treatment/control differences overall • Recognize and develop measures
How do we measure implementation success? • Conceptual considerations • Practical considerations
Conceptual considerations: Type of intervention • Levels of specification & complexity • Guidelines v programs v locally developed • Boundaries of the intervention • Clinical innovation • Clinical innovation + implementation strategies
Type of intervention guides focus of implementation success measure (ISM) • Guideline only • E.g., hypertension guidelines • ISM: guideline adherence • Tools/strategy to support guideline use • E.g., CRC electronic screen notification, telehealth/ IVR to prompt colonoscopy screening • ISM: guideline & strategy adherence or instead of guideline • Program or clinical area • E.g., supported employment, diabetes expert team teleconference • ISM: adherence to program elements • Strategy to implement program – • E.g., TIDES collaborative care for depression • ISM: adherence to program elements and strategy
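For the simplest case above (a guideline-only intervention), the implementation success measure reduces to an adherence rate. A minimal sketch of such a measure, with illustrative record fields that are assumptions for the example rather than anything from the workshop:

```python
def adherence_rate(records):
    """Share of eligible patient records where the guideline was followed.

    Each record is a dict with hypothetical keys 'eligible' and
    'guideline_followed' -- field names are illustrative only.
    """
    eligible = [r for r in records if r["eligible"]]
    if not eligible:
        return 0.0  # no eligible patients: adherence is undefined; report 0
    followed = sum(1 for r in eligible if r["guideline_followed"])
    return followed / len(eligible)

# Toy data: 3 eligible records, guideline followed in 2 of them.
records = [
    {"eligible": True, "guideline_followed": True},
    {"eligible": True, "guideline_followed": False},
    {"eligible": False, "guideline_followed": False},
    {"eligible": True, "guideline_followed": True},
]
print(adherence_rate(records))  # 2/3 ≈ 0.667
```

For interventions that pair a guideline with a support strategy, the same pattern extends to a second rate for strategy adherence, reported alongside or instead of the guideline rate.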
Conceptual considerations: Operationally defining the intervention • Key elements • Replication v tailoring/ customization/ adaptation • Which features are core to the intervention? • Which features have to be present for the intervention to maintain its effectiveness? • Which can vary without compromising effectiveness?
Conceptual considerations: Standards of implementation success • How do we judge success? • Fidelity – extent to which intervention is in place as designed • Scope – how widely used • Intensity – quantity and depth of use • Sustainability – persistence over time • RE-AIM – reach, effectiveness, adoption, implementation, maintenance
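The fidelity, scope, and intensity standards above can each be operationalized as a simple ratio per site. The sketch below is illustrative only: the element counts, user counts, and usage frequencies are assumed data, not measures from the workshop.

```python
from dataclasses import dataclass

@dataclass
class SiteData:
    # Hypothetical site-level counts for one intervention site.
    elements_in_place: int        # core intervention elements present
    elements_total: int           # core elements in the design
    users_active: int             # clinicians actually using the practice
    users_targeted: int           # clinicians expected to use it
    uses_per_week: float          # observed frequency of use
    expected_uses_per_week: float # frequency implied by the design

def fidelity(s: SiteData) -> float:
    """Extent to which the intervention is in place as designed."""
    return s.elements_in_place / s.elements_total

def scope(s: SiteData) -> float:
    """How widely the intervention is used."""
    return s.users_active / s.users_targeted

def intensity(s: SiteData) -> float:
    """Quantity/depth of use, capped at 1.0."""
    return min(1.0, s.uses_per_week / s.expected_uses_per_week)

site = SiteData(8, 10, 15, 20, 30.0, 40.0)
print(fidelity(site), scope(site), intensity(site))  # 0.8 0.75 0.75
```

Sustainability would repeat these measurements over time; RE-AIM adds population reach and maintenance dimensions beyond this single-site snapshot.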
Practical considerations: Data sources • Secondary v primary data? • Availability of existing data • Resource & logistical constraints of collecting new data • Qualitative v quantitative • Who collects? • Scale of intervention & role of researcher • Expectations/burden of implementation staff collection
Practical considerations: Creating the measures • Who scores/rates? • Participant, researcher, consultant • Incorporating different perspectives • How scored? • Summary indicators • Components of complex innovations • Role of informed judgment • Rating, ranking, categories
Practical considerations: Using the measures • How do we incorporate them in analyses of • intervention effectiveness • success of implementation strategies • factors affecting implementation & outcomes? • Single v. multiple measures • Value & limits of composite indicators • Categorize sites • Covariates in quantitative analyses
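One way to put these considerations together is a weighted composite of component scores, then categorizing sites for use as a covariate. The weights and cutoffs below are assumptions for illustration; as the slide notes, a composite has limits, since it can mask which component drove a site's score.

```python
def composite(scores, weights):
    """Weighted mean of component scores, each on a 0-1 scale.

    `scores` and `weights` are dicts keyed by component name;
    component names and weights here are hypothetical.
    """
    total_w = sum(weights.values())
    return sum(scores[k] * weights[k] for k in weights) / total_w

def categorize(score, low=0.4, high=0.7):
    """Bin a composite score into implementer categories (assumed cutoffs)."""
    if score >= high:
        return "high implementer"
    if score >= low:
        return "partial implementer"
    return "low implementer"

scores = {"fidelity": 0.8, "scope": 0.75, "intensity": 0.75}
weights = {"fidelity": 0.5, "scope": 0.25, "intensity": 0.25}
c = composite(scores, weights)
print(round(c, 3), categorize(c))  # 0.775 high implementer
```

The resulting category (or the continuous composite) can then enter quantitative analyses as a covariate, or serve to stratify sites when comparing intervention outcomes.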
Closing thoughts and take-away lessons… • Conceptual considerations • Type of intervention • Operationally defining the intervention • Standard of implementation success • Practical considerations • Data sources • Creating the measures • Using the measures
Elements of VISN organizational collaboration model rated by research team members