A Strategic Measurement and Evaluation Framework to Support Worker Health
COMMITTEE ON DHS OCCUPATIONAL HEALTH AND OPERATIONAL MEDICINE INFRASTRUCTURE
June 10-11, 2013
Ron Z. Goetzel, Ph.D., Emory University and Truven Health Analytics
Workplace Health Promotion/Health Protection Programs: What Should Be Evaluated?
• Structure
• Process
• Outcomes
LOGIC MODEL: WORKSITE HEALTH PROMOTION/PROTECTION PROGRAMS
[Figure: logic model linking STRUCTURE and PROCESS (employees) to OUTCOMES. Modified from the Worksite Health Promotion (Assessment of Health Risk with Follow-Up) logic model adopted by the CDC Community Guide Task Force.]
Program Structure
Structure defines the program: how does it work? The WHAT, HOW, and WHEN.
• Individual components, e.g., HRA, feedback reports, mailings, internet services, high-risk counseling, referral to community resources, incentives
• Environmental components, e.g., organizational policies, cafeteria/vending machine choices, time off for health-promoting activities, senior management support, access to physical activity programs, walking paths, shower/change facilities, healthy company culture
Environmental Assessment Tool J Occup Environ Med. 2008 Feb;50(2):126-37.
Leading By Example Assessment Am J Health Promot. 2010 Nov-Dec;25(2):138-46.
HERO SCORECARD
Sample results based on ABC Inc.'s response and database average as of [May 1, 2009]. http://www.the-hero.org/scorecard_folder/scorecard.htm accessed 5/12/12.
CDC WORKSITE HEALTH SCORECARD http://www.cdc.gov/dhdsp/pubs/worksite_scorecard.htm
PROGRAM PROCESS
Program process evaluation defines how well the program is carried out:
• Participation rates
• Satisfaction with the program/process/people
• Completion rates
PROGRAM PROCESS COMPONENTS
• GOAL: To summarize program implementation and to form hypotheses about how implementation may affect program outcomes
• To monitor progress during program implementation and to inform adjustments that improve program quality
• Program Fidelity (quality): how well the program was implemented
• Dose Delivered (completeness): frequency and intensity of the program
• Dose Received (satisfaction): how participants react to the intervention
• Program Reach (participation rate): the proportion of eligible employees who participated in the various components of the program (a simple calculation is sketched below)
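A minimal sketch of how these process metrics can be computed, using hypothetical counts rather than data from the presentation; the variable names and figures are illustrative assumptions only.

```python
# Process metrics sketch: reach, dose delivered, completion rate.
# All counts below are hypothetical, for illustration only.
eligible_employees = 2_000
enrolled = 900                   # employees who participated in at least one component
sessions_planned = 12            # intended frequency/intensity of the program
sessions_delivered = 10          # sessions actually held
completed_program = 540          # enrollees who finished all required components

reach = enrolled / eligible_employees                    # Program Reach (participation rate)
dose_delivered = sessions_delivered / sessions_planned   # Dose Delivered (completeness)
completion_rate = completed_program / enrolled

print(f"Reach (participation rate): {reach:.0%}")
print(f"Dose delivered: {dose_delivered:.0%}")
print(f"Completion rate: {completion_rate:.0%}")
```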
PROGRAM OUTCOMES
Program outcomes are evaluated by determining whether program objectives are achieved, at a given level of quality, and within a defined time frame.
• Health outcomes: behavior change, risk reduction
• Medical care outcomes: health care utilization, health care costs
• Productivity outcomes: absenteeism, disability, workers' compensation/safety, presenteeism
RESEARCH DESIGN
• Pre-experimental
• Quasi-experimental
• True experimental
Validity of results increases as you move down this list. All are tools that can help assess the impact of the program.
NON-EXPERIMENTAL (PRE-EXPERIMENTAL) DESIGN
[Figure: single-group trend with the program start marked.]
GENERAL TREND OR PROGRAM EFFECT?
[Figure: observed trend with the program start marked.]
PROBLEMS WITH A PRE-EXPERIMENTAL DESIGN: REGRESSION TO THE MEAN
[Figure: costs for the same people before the intervention and during the intervention period, with apparent "savings" marked.]
• The simplest analysis may produce the wrong answers (see the simulation sketch below)
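To make the regression-to-the-mean problem concrete, a minimal simulation with hypothetical cost distributions (not the study data) shows how selecting high-cost employees in a baseline year produces apparent "savings" the following year even when no program exists at all.

```python
import numpy as np

# Illustrative simulation of regression to the mean; all numbers are hypothetical.
rng = np.random.default_rng(0)
n = 10_000

# Each person's annual cost = a stable personal cost level + year-to-year noise.
true_level = rng.gamma(shape=2.0, scale=2_000, size=n)   # underlying cost level
year1 = true_level + rng.normal(0, 1_500, size=n)        # observed pre-period cost
year2 = true_level + rng.normal(0, 1_500, size=n)        # observed post-period cost, NO intervention

# "Pre-experimental" analysis: select the high-cost people in year 1
# and follow the same people into year 2.
high_cost = year1 > np.percentile(year1, 80)
print(f"Year 1 mean cost (high-cost group): ${year1[high_cost].mean():,.0f}")
print(f"Year 2 mean cost (same group):      ${year2[high_cost].mean():,.0f}")
# Year 2 is lower even though nothing was done: apparent "savings"
# that are pure regression to the mean.
```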
RESEARCH DESIGN: QUASI-EXPERIMENTAL
Pretest-posttest with comparison group:
O1   X   O2     Experimental group
--------------------------
O1        O2     Comparison group
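A sketch of how this design is analyzed in practice, using made-up per-employee costs: the comparison group's pre-to-post change estimates the secular trend, and the difference in changes is attributed to the program (a simple difference-in-differences calculation).

```python
# Difference-in-differences for a pretest-posttest design with comparison group.
# Per-employee annual costs below are hypothetical, for illustration only.
pre_experimental, post_experimental = 4_200, 4_500   # O1, O2 for the program group
pre_comparison, post_comparison = 4_100, 4_900       # O1, O2 for the comparison group

change_experimental = post_experimental - pre_experimental   # +300
change_comparison = post_comparison - pre_comparison         # +800

# The comparison group's change estimates the background cost trend;
# the remaining difference is attributed to the program.
program_effect = change_experimental - change_comparison     # -500, i.e. $500 saved/employee
print(f"Estimated program effect: ${program_effect:,} per employee per year")
```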
ANNUAL GROWTH IN NET PAYMENTS
[Figure: annual growth in costs at Highmark, Inc. for matched participants and non-participants over four years, with the start of the program marked.]
RETURN ON INVESTMENT AND NET PRESENT VALUE
Return on Investment (ROI) = Savings / Program Cost (break-even at $1)
Net Present Value (NPV) = Savings − Program Cost (break-even at $0)
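A minimal sketch of the two break-even calculations as defined on this slide, using hypothetical savings and cost figures rather than numbers from the presentation.

```python
# ROI and NPV as defined on the slide; the dollar amounts are hypothetical.
savings = 750_000        # estimated savings attributable to the program
program_cost = 400_000   # total cost of running the program

roi = savings / program_cost   # break-even at $1.00 returned per $1.00 spent
npv = savings - program_cost   # break-even at $0

print(f"ROI: ${roi:.2f} returned per $1.00 invested")
print(f"NPV: ${npv:,.0f}")
```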
Cost-Benefit (ROI) Analysis
[Figure: wellness program costs at Highmark, inflation-adjusted to 2005 dollars.]
Assessing Causality
[Figure: HP Program (cause) → Proximate Outcomes → Ultimate Outcomes (effect).]
Evaluators must explicitly state the intervention pathway and the metrics used to measure:
• The "cause", i.e., the actual intervention
• The "effect", i.e., the proximate and/or ultimate outcomes that result from the intervention
Hypotheses that outcomes are "caused" by the HP program must be articulated and tested.
CRITICAL STEPS TO SUCCESS
[Figure: stepwise chain from Awareness → Participation → Increased Knowledge → Improved Attitudes → Behavior Change → Risk Reduction → Reduced Utilization → Financial ROI.]
HEALTH RISKS: BIOMETRIC MEASURES (ADJUSTED)
[Figure: results adjusted for age, sex, and region; * p<0.05, ** p<0.01.]
HEALTH RISKS: HEALTH BEHAVIORS (ADJUSTED)
[Figure: results adjusted for age, sex, and region; * p<0.05, ** p<0.01.]
ADJUSTED MEDICAL AND DRUG COSTS VS. EXPECTED COSTS FROM COMPARISON GROUP
Average savings 2002-2008 = $565 per employee per year
Estimated ROI: $1.88 to $3.92 per $1.00 invested
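As a rough consistency check only: if ROI is taken as savings divided by program cost (the simple definition from the earlier slide), the quoted ROI range together with $565 per employee per year in savings would imply a program cost of roughly $144 to $300 per employee per year. The sketch below just performs that back-calculation; the implied cost is an inference, not a figure reported in the presentation.

```python
# Back-of-the-envelope check, assuming ROI = savings / program cost.
# The implied program cost is an inference, not a reported figure.
savings_per_employee = 565.0
roi_low, roi_high = 1.88, 3.92

implied_cost_high = savings_per_employee / roi_low    # about $300/employee/year
implied_cost_low = savings_per_employee / roi_high    # about $144/employee/year
print(f"Implied program cost: ${implied_cost_low:,.0f} to ${implied_cost_high:,.0f} "
      "per employee per year")
```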
Summary
• Evaluation of health promotion/protection programs is doable, but tricky
• Know your audience: the level of sophistication in conducting financial analyses varies significantly, and well-done studies are complex and expensive
• It's easy to come up with the "wrong" answer if the proper research design is not used
• Ask for help: good evaluation studies require a team of individuals with diverse backgrounds and skill sets
• Tell the truth, the whole truth, even if it means saying the program didn't work