Improving the Jobs of Direct Care Workers in Long Term Care: Findings from the Better Jobs Better Care Demonstration
Penn State Evaluation Team: Peter Kemper, Diane Brannon, Brigitt Heier, Amy Stott, Monika Setia, Joseph Vasey, Jungyoon Kim, and Candy Warner
June 8, 2008
Presented at the annual meeting of AcademyHealth. We thank The Atlantic Philanthropies, The Robert Wood Johnson Foundation, and the Office of the Assistant Secretary for Planning and Evaluation for funding (Contract No. HHSP23320044303EC).
BJBC Demonstration
• Goal: Improve direct care workers’ job quality and reduce turnover
• Direct care workers: Provide help with personal care
• Interventions: BJBC training and technical assistance intended to improve management practices
• Where: Five state projects; 124 providers in skilled nursing, assisted living, home care, and adult day services
BJBC’s Intended Effects: Basic Framework
BJBC Interventions → Providers (Implementation, Management Practices) → Direct Care Workers (Job outcomes, Turnover)
Approach to Evaluation
• Evaluation not designed with a control group
  • Before-after evaluation design and data
  • Sought methods of strengthening the design
• “Let a thousand flowers bloom”: demonstration interventions not standardized or known
  • Measured a range of management practices
  • Developed measures of extent of implementation
Methods for Estimating Effects
• Basic approach: before-after comparison of means (a code sketch follows this list)
• Post-intervention trends compared with national trends
• Difference-in-difference: compare changes in
  • States with and without specific interventions
  • Providers that did and did not implement
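To make these comparisons concrete, here is a minimal sketch of the before-after change in means and a two-group difference-in-difference, assuming a hypothetical provider-level panel; the file name and column names (wave, state_has_intervention, turnover_rate) are placeholders, not the BJBC data dictionary.

```python
import pandas as pd

# Hypothetical provider-level panel: one row per provider per wave (1 or 2).
df = pd.read_csv("provider_panel.csv")

# Before-after comparison of means (Time 1 vs. Time 2).
wave_means = df.groupby("wave")["turnover_rate"].mean()
before_after_change = wave_means.loc[2] - wave_means.loc[1]

# Difference-in-difference across states with and without a specific
# intervention (coded 1/0): the difference of the two groups' changes.
cells = (
    df.groupby(["state_has_intervention", "wave"])["turnover_rate"]
      .mean()
      .unstack("wave")
)
did = (cells.loc[1, 2] - cells.loc[1, 1]) - (cells.loc[0, 2] - cells.loc[0, 1])

print(before_after_change, did)
```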
Analyses Presented
BJBC Interventions → Providers (Implementation, Management Practices) → Direct Care Workers (Job outcomes, Turnover)
Data
• Telephone interviews with project managers
• Survey of clinical managers
• Survey of frontline supervisors
• Survey of direct care workers
• Hiring and termination information system
Measuring Implementation in Formative Evaluations: Using Data from Multiple Perspectives
Peter Kemper, Brigitt Heier, Joe Vasey, Diane Brannon
June 8, 2008
Presented at the annual meeting of AcademyHealth. The authors are grateful for support from The Atlantic Philanthropies, The Robert Wood Johnson Foundation, and the Office of the Assistant Secretary for Planning and Evaluation (Contract No. HHSP23320044303EC).
Motivation
• Variation in implementation observed in early site visits
• Mid-course correction: add implementation measures
• Goal: develop a summary index of the extent of provider implementation for use in the impact analysis
Measures from Three Perspectives
• Practice manager (state project level)
• Clinical manager (provider level)
• Frontline supervisors (provider level)
Practice Manager Perspective
• “Make a mark on the scale that best describes this provider’s current degree of implementation”
  • 0 = Implementation of interventions has not yet started
  • 100 = Interventions are fully implemented and sustainable
Clinical Manager Perspective
• “Indicate the level of progress your organization has made in implementing the most important intervention”
  • 0 = Implementation of the intervention has not yet started
  • 10 = The intervention is fully implemented and sustainable
• “The programs that are part of BJBC have been well executed in your organization”
  • Five-point scale from strongly disagree to strongly agree
Frontline Supervisor Perspective
• “The programs that are part of BJBC have been well executed in your organization”
  • Five-point scale from strongly disagree to strongly agree
  • Averaged across supervisors within each provider
Methods
• Exploratory factor analysis (see the sketch after this list)
  • Principal components extraction method
  • Extracted the component with an eigenvalue greater than 1
  • Included items with factor loadings of .6 or greater
• Imputed values when one or two items were missing, using a maximum likelihood procedure
• Sample size: 92 providers
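As an illustration of these steps, the sketch below standardizes a set of hypothetical implementation items, extracts the first principal component if its eigenvalue exceeds 1, flags items with loadings of .6 or greater, and min-max rescales the resulting score to the 0-1 range used in later slides. The column names are placeholders, and missing items are simply dropped rather than imputed by maximum likelihood, so this is a simplified sketch rather than the evaluation team’s actual procedure.

```python
import numpy as np
import pandas as pd

# Hypothetical provider-level file with one implementation item per column.
items = pd.read_csv("implementation_items.csv")[
    ["pm_rating", "cm_progress", "cm_executed", "fls_executed"]
].dropna()  # the actual analysis imputed 1-2 missing items instead

# Standardize items and extract principal components of the correlation matrix.
z = (items - items.mean()) / items.std()
corr = z.corr().to_numpy()
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]               # largest eigenvalue first
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Kaiser criterion: keep the first component only if its eigenvalue exceeds 1.
assert eigvals[0] > 1.0
loadings = eigvecs[:, 0] * np.sqrt(eigvals[0])  # loadings on the first component
keep = np.abs(loadings) >= 0.6                  # retain items loading .6 or higher
print(pd.Series(loadings, index=items.columns))

# Factor score: component weights applied to the retained standardized items
# (eigenvector sign is arbitrary), then min-max rescaled to the 0-1 range.
score = z.loc[:, keep].to_numpy() @ eigvecs[keep, 0]
index_01 = (score - score.min()) / (score.max() - score.min())
```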
Factor Loadings
The factor has an eigenvalue of 2.2 and explains 55% of the variance.
Distribution of Factor Scores
Mean: 0.00   Median: -0.05   Minimum: -2.84   Maximum: 2.18   Skewness: -0.19
Factor Score Re-scaled to 0-1 Range
Mean: 0.56   Median: 0.56   Minimum: 0.00   Maximum: 1.00   Skewness: -0.23
Implementation Index Is Related to Underlying Measures
Fitted line: Y = -1.85 + 0.032X (an illustrative fitting sketch follows)
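The slide reports only the fitted line and does not label the axes; purely as an assumption for illustration, the sketch below treats Y as the implementation factor score and X as one of the underlying measures (e.g., the practice manager’s 0-100 rating) and recovers a least-squares line from simulated data built around the reported coefficients.

```python
import numpy as np

# Illustrative only: x stands in for an underlying implementation measure and
# y for the factor score; both are simulated, not the BJBC data.
rng = np.random.default_rng(0)
x = rng.uniform(0, 100, size=92)                  # 92 providers, as in the sample
y = -1.85 + 0.032 * x + rng.normal(0, 0.4, 92)    # line from the slide plus noise

slope, intercept = np.polyfit(x, y, 1)            # ordinary least squares line
print(f"Y = {intercept:.2f} + {slope:.3f} X")
```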
How We’ll Use the Index in Analyzing Effects
• Difference-in-difference approach (see the sketch below)
• Divide providers into two groups: above and below median implementation
• Compare the difference between the two groups in changes between Time 1 and Time 2
• Extend to a continuous measure in regression
• Note: the method does not identify effects but may identify the absence of effects
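A minimal sketch of both versions of this analysis, assuming a hypothetical provider-level panel where implementation is the 0-1 index, wave marks Time 1 or Time 2, and outcome is a direct care worker job outcome aggregated to the provider; the file and column names are placeholders, not the evaluation team’s code.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("provider_panel.csv")            # hypothetical file and columns
df["post"] = (df["wave"] == 2).astype(int)
df["high_impl"] = (df["implementation"] > df["implementation"].median()).astype(int)

# Version 1: difference-in-difference with a median split on the index.
cell_means = df.groupby(["high_impl", "post"])["outcome"].mean().unstack("post")
did = (cell_means.loc[1, 1] - cell_means.loc[1, 0]) - (
    cell_means.loc[0, 1] - cell_means.loc[0, 0]
)

# Version 2: continuous index; the interaction coefficient is the estimate.
model = smf.ols("outcome ~ post * implementation", data=df).fit()
print(did, model.params["post:implementation"])
```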
Summary
• Assessments of implementation from the three perspectives are similar
• A summary index was developed successfully
• Uses of the implementation index:
  • Will be used to strengthen the analysis of BJBC effects
  • Most useful in confirming the absence of effects
  • Could be used to analyze factors affecting implementation