Program Evaluation Presentation by: Kathleen Tebb, PhD Assistant Professor of Pediatrics Division of Adolescent Medicine University of California, San Francisco 11.12.09
Introduction Eval. Research Examples: • Influence of formula gift packs on breastfeeding cessation • Intervention to Improve Infant & Toddler Immunizations • Mentoring Program for Teen Mothers & Babies • Improving Chlamydia Screening
Presentation Objectives • Review Research Program that Utilized 3 Different Eval. Components • Understand & Appreciate the Value of Different Types of Program Eval. • Emphasis on Value of Process Evaluation
Evaluation of Intervention to Improve CT Screening in Teens Utilizes Three Types of Evaluation: 1. Formative 2. Process 3. Outcome/Summative
Problem: CT the Silent Epidemic • CT is the most common reportable bacterial infection • Rates highest among 15-25 yo females • Most infections (70-80%) are asymptomatic • Untreated CT can cause PID & its sequelae • Costs the US health system $3-4 billion/yr • Easy to test, easy to treat • Screening rates remain unacceptably low
What to do? Goal: Increase CT Screening Rates Setting: Kaiser Permanente, Northern California (KP) • Large Patient Population • Large # of Clinics (allows randomization) • Data Infrastructure, but individual clinics similar to small group practices • Existing Relationship with key champion
Formative Evaluation Step 1: Needs assessment: • Clinician/Staff Barriers • Sexual activity (SA) rates • CT screening rates
Formative Evaluation Clinicians Findings: • Discomfort speaking to teens about sexual activity • Difficulty establishing confidentiality • Time constraints – competing priorities • Misperceptions about teen SA & CT screening
Formative Evaluation Admin data: • Very poor overall CT screening rates • Site specific screening rates
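A minimal sketch of how the site-specific screening rates could be pulled from administrative data, assuming a per-visit extract; the file name and columns (`visits.csv`, `clinic_id`, `ct_screened`) are illustrative assumptions, not the study's actual schema:

```python
# Hypothetical sketch: compute overall and per-site CT screening rates
# from administrative visit records. File and column names are assumptions.
import pandas as pd

# One row per eligible teen visit; 'ct_screened' is 1 if a CT test was done.
visits = pd.read_csv("visits.csv")  # columns: clinic_id, patient_id, ct_screened

overall_rate = visits["ct_screened"].mean()
site_rates = visits.groupby("clinic_id")["ct_screened"].agg(["mean", "size"])

print(f"Overall screening rate: {overall_rate:.1%}")
print(site_rates.rename(columns={"mean": "rate", "size": "n_visits"}))
```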
Formative Evaluation Step 2: Pilot Test Intervention • Friendly/receptive site • Close proximity • Work out major kinks (implementation/data)
RCT: Methods • Step 3: Randomized Clinical Trial of 10 clinics • Intervention: 5 Clinics • Control: 5 Clinics
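A minimal sketch of a clinic-level (cluster) randomization consistent with the design above; the clinic IDs and fixed seed are illustrative assumptions, since the slides do not describe the actual allocation procedure (e.g., any stratification by clinic size):

```python
# Hypothetical sketch of the clinic-level randomization: 10 clinics,
# 5 assigned to intervention and 5 to control.
import random

clinics = [f"clinic_{i:02d}" for i in range(1, 11)]  # placeholder IDs
rng = random.Random(2009)  # fixed seed so the allocation is reproducible

shuffled = clinics[:]
rng.shuffle(shuffled)
intervention, control = shuffled[:5], shuffled[5:]

print("Intervention:", sorted(intervention))
print("Control:     ", sorted(control))
```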
Clinical Practice Improvement Model Four stages: Engage → Team Building → Re-Design Clinical Practice → Sustain the Gain
Clinical Practice Improvement Model: Engage • Leadership • Best practices • Define gap • Raise awareness
Clinical Practice Improvement Model: Team Building • ACTeam • Skills & tools
Clinical Practice Improvement Model: Re-Design Clinical Practice • Customize • Define success & measures for it
Clinical Practice Improvement Model: Sustain the Gain • Monitor performance • Continuous improvement
Rapid Cycle Changes • Establish ACTeam • Monthly Meeting • Set Goal • Identify barriers • Decide solution • Try it out • Assess • Repeat "cycle" [Chart: % change in STD screening rate vs. time in months, relative to status quo]
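To make the cycle metric concrete, a small sketch of the "% change in STD screening rate" tracked each month against the status-quo baseline; all rates below are made-up illustrations, not trial data:

```python
# Illustrative arithmetic for the rapid-cycle metric: percent change in
# screening rate per month relative to the pre-intervention baseline.
baseline = 0.10  # assumed status-quo screening rate (made up)

monthly_rates = {"month_1": 0.14, "month_2": 0.19, "month_3": 0.26}

for month, rate in monthly_rates.items():
    pct_change = (rate - baseline) / baseline * 100
    print(f"{month}: rate={rate:.0%}, change vs. status quo={pct_change:+.0f}%")
```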
[Figure: Average screening rates & 95% CI by time and group. X-axis: months post-intervention (-2 to 0, 0 to 3, 3 to 6, 6 to 9); y-axis: proportion screened (0.0-0.6); groups: Treated vs. Control] Shafer, Tebb, et al. JAMA. 2002
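A sketch of the figure's summary statistic: proportion screened with a 95% CI per group and period, using a normal-approximation (Wald) interval; the counts are invented, and the trial's actual interval method is not stated in the slides:

```python
# Sketch: proportion screened with a 95% Wald confidence interval.
import math

def prop_ci(successes: int, n: int, z: float = 1.96):
    """Return (proportion, lower, upper) for a normal-approximation 95% CI."""
    p = successes / n
    se = math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - z * se), min(1.0, p + z * se)

# e.g., 120 of 400 eligible teens screened in one group/period (made-up counts)
p, lo, hi = prop_ci(120, 400)
print(f"Proportion screened: {p:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```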
2. Process Evaluation • Process evaluation examines each component of the intervention implementation • How was the intervention implemented? • Was it implemented as planned? • Resources used, activities, quality, etc.
Process Eval: Clinic Flow Chart Five stages: Cue Charts → Vitals → Provider Encounter → Urines To Lab → Follow-Up CT+
• Cue Charts: ID eligible teens (age/gender); charts stamped with cue
• Vitals: Obtain & record sex hx; confirm confidential contact #; cue MD with lab slip
• Provider Encounter: SA teens give urine sample; MD gives teen CT info & confirms confidential contact #
• Urines To Lab: MA refrigerates urine; MA enters info in log book; runner takes urines to lab; lab runs CT test
• Follow-Up CT+: RN contacts CT+ teen via confidential #; teen comes to clinic for Rx; RN enters Rx in STD log book
Process Eval. Cont. • Admin Data – clinic records, log books, etc. • Regular mtgs with Providers/staff • SRA observations • Chart review
Process Evaluation Components CT Study: Multiple Methods • Direct observation recorded in log books • 1:1 interviews & monthly team mtgs • Anonymous surveys: staff, providers, teens • Admin data: clinic records, chart reviews
Chart Review Ex. • Central laboratory database used to identify a consecutive sample • Retrospective chart review by an independent clinician • Standardized data tracking form
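A hypothetical sketch of the consecutive-sample step: pulling the first N consecutive CT-positive results from a central lab database for retrospective review; the table, column names, and N are assumptions for illustration:

```python
# Hypothetical sketch: select a consecutive sample of CT-positive results
# from a central laboratory database for retrospective chart review.
import sqlite3

conn = sqlite3.connect("lab.db")  # placeholder database
rows = conn.execute(
    """
    SELECT patient_id, result_date
    FROM ct_results
    WHERE result = 'positive'
    ORDER BY result_date        -- consecutive = in order received
    LIMIT 100                   -- first N consecutive positives (N assumed)
    """
).fetchall()

# Each sampled chart is then abstracted onto the standardized tracking form.
print(f"{len(rows)} consecutive CT+ results selected for chart review")
```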
Does Identification Lead to Follow-up? Tracking Form Based on CDC Guidelines • Appropriate antibiotics • Counseling on safer sex • Partner notification and treatment • Lab tests for other STIs • Re-test at 3 months & as needed
Process Eval. Lesson Learned Revealed Important Quality Gaps • CT screening led to Rx but… • Successful identification does not always lead to successful management • Also… only 1/3 of teens had a well-care visit (WCV) in a given year Hwang L., Tebb K., et al. Archives of Pediatrics & Adolescent Med. 2005
NEXT STEP CT Screening in Urgent Care (UC) Similar intervention approach Similar evaluation methods
Process Evaluation cont. Determining Failure: Implementation vs. Theory Implementation failure: • Program is not implemented as planned Theory failure: • Program is implemented as planned • Intervention does not produce intermediate results, and/or desired outcome
What about the teens? • Outcome eval. provided info about CT screening rates, but no info from the patient perspective • Anonymous post-UC-visit survey (N=365) • Clinician Communication & Teen Acceptability
What about the teens? • High acceptability • Sexual Health 84% • Urine CT test 80% • Acceptability significantly associated with: • Clinician explained confidentiality • Knows how to “talk to teens like me” • “Listened carefully as I explained my concerns” Miller K, Tebb K., et al. Archives of Pediatrics & Adolescent Med. 2007
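The kind of model behind "acceptability significantly associated with" could look like the sketch below: a logistic regression of teen-reported acceptability on the clinician-communication items. The variable names, survey file, and choice of logistic regression are assumptions; see Miller K, Tebb K., et al. (2007) for the actual analysis:

```python
# Illustrative sketch: regress teen-reported acceptability (0/1) on
# clinician-communication items. Variable names are assumptions.
import pandas as pd
import statsmodels.formula.api as smf

surveys = pd.read_csv("teen_surveys.csv")  # one row per anonymous survey

model = smf.logit(
    "acceptable ~ explained_confidentiality + talks_to_teens_like_me + listened_carefully",
    data=surveys,
).fit()

print(model.summary())  # exponentiate model.params for odds ratios
```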
Program Evaluation Challenges • Dealing with the "politics" of a program • Program design/policy changes mid-course • Balancing tensions between rigor and practicality (for decision-makers) • Multiple stakeholders: clients, clinicians, parents… • Obtaining funding & support for strong designs
Lessons Learned: Valuing the Process • Gave over-worked staff a sense of importance, success & control over their workplace • Flexible: one solution does not fit all • UC more challenging than well care (WC): different settings, different results • Identification of specific component processes & resources supports TRIP (translating research into practice)
Lessons Learned cont. • Multiple evaluation components: • led to a better intervention design • informed the interpretation of results along the way • yielded multiple sub-studies, publications, preliminary data & funding support
RE-AIM (Glasgow, 2001) • Reach, Effectiveness, Adoption, Implementation & Maintenance • Approach to address translation of research into practice • Examines the robustness or consistency of results across patient, setting, and clinician subgroups, as well as costs • www.re-aim.org for more info & sample studies