Conciliating scientific rigor & pragmatics in Outcome Evaluation Research for human service professionals. Theoretical underpinnings and implementation techniques. Network for Health & Welfare Studies. Dr. Charles C. Chan, Dr. Amy P. Y. Ho, Mr. Kevin Chan. February 25, 2004.
Dr. Charles C. Chan Convenor Network for Health & Welfare Studies Associate Professor Department of Applied Social Sciences
Applied research can be of better service to the human service industry, especially in a quality-conscious era, if it pays balanced attention to the dialectical relationship between the professional/researcher and the service recipient. Preamble
Strange bed-fellows or estranged couples? Quality and Effectiveness:
The overall goal of placing priority on both quality and effectiveness of health programs is to document the synergistic contribution of their component parts to a holistic improvement in the health and welfare of the population. WHO-EURO Working Group on Health Promotion Evaluation (1998). Quality and Effectiveness:
The focus should not be on documenting quality or evaluating effectiveness of isolated interventions as end points, but rather on the relationship of a given intervention to the other components of the health promotion strategy. Such an analysis may indeed provide a fresh approach to the very issue of quality. Quality and Effectiveness:
BARD is a tested approach in applied research to address the demands of the scholarship of application and community benefits. It recognizes problems central to applied research. It contributes by directing decision making in the research process. Definition of the Balanced Applied Research Development Model (BARD):
Placing value on a balanced approach to the scientific requirements and the pragmatics of outcome evaluation in human services Main features of BARD:
Emphasizing the integration of research into practice by transferring data-capture skills to professionals or lay persons in service delivery. Main features of BARD:
Analyzing, as much as possible, the triadic relationship between the professional, the lay person, and the service recipient, and sustaining intervention effects through supportive supervision. Main features of BARD:
Efficacy – the intervention does more good than harm under optimum conditions. Effectiveness – the intervention does more good than harm under real-world conditions (Flay, 1986). Efficacy vs. Effectiveness
Reach, Efficacy or Effectiveness, Adoption, Implementation, Maintenance and cost. The RE-AIM framework (Glasgow, 1999)
Health Services Research Fund (1995 - 2003) Health and Health Services Research Fund (2003 - present) Research Fund on Control for Infectious Diseases (2003 – present) Community Investment & Inclusion Fund (2003 - present) Outcome Evaluation Research in Hong Kong – Financial Landscape
Capital amount of $55M. Total number of submissions: 1,096. Total number of approved HSRC projects: 224. Health Services Research Fund (1995–2003)
Capital amount of $10M. Total number of submissions: 187. Total number of approved projects: 4 (ESGAA meeting in July 2003). Health and Health Services Research Fund (2003 – present)
Capital amount of $500M ($50M given to the PRC's Ministry of Health for SARS research). Open call and commissioned research activities: a tentative $30M to HKU for basic and epidemiological modeling research, and $25M to CUHK for research on public health and emerging infectious diseases drug and treatment development. Remaining sums are reserved for the establishment of the Centre for Health Protection. Research Fund on Control for Infectious Diseases (2003 – present)
Community Investment & Inclusion Fund - Funding information -Cont’d
Dr. Amy P. Y. Ho Member Network for Health & Welfare Studies Senior Lecturer Department of Applied Social Sciences
“Promoting health and well-being of elderly patients with chronic illness: A coordinated medical and social service program”, funded by the Health Care & Promotion Fund. OER in a human service setting – An example
Feasibility of blinding in human service research. Artifacts introduced by confounders in an unblinded study. Issues in unblinded studies
Contrasting the notion of “experimental group” in the context of medical research and human services research. The “black box” phenomenon in health promotion research (Moore, 2003). Standardization of intervention
Between-group imbalance in the control of extraneous variables. Incorporating extraneous variables in the analysis. Control of extraneous variables
Staff turnover. Artifacts due to unblinded group assignment. Reliability of self-reported outcomes. Data quality and reliability
Loss to follow-up. Causes of participant attrition in human services OER. Effect of attrition on data analysis. Issues in data analysis
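As a hedged illustration of examining attrition before analysis, the following Python sketch (not part of the original presentation; all column names and drop-out rates are invented) tests whether attrition is differential across study arms and whether drop-outs differ from completers at baseline.

```python
# Illustrative sketch only: invented data, hypothetical column names.
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(3)
n = 240
df = pd.DataFrame({
    "group": rng.integers(0, 2, n),             # 1 = intervention, 0 = control
    "baseline_score": rng.normal(60, 10, n),    # e.g. SF-36 at baseline
})
# Simulate heavier loss to follow-up in the control arm
df["dropped_out"] = rng.random(n) < np.where(df["group"] == 0, 0.30, 0.15)

# Is attrition associated with group assignment (differential attrition)?
chi2, p_group, _, _ = stats.chi2_contingency(pd.crosstab(df["group"], df["dropped_out"]))

# Do drop-outs differ from completers at baseline (selective attrition)?
t, p_baseline = stats.ttest_ind(df.loc[df["dropped_out"], "baseline_score"],
                                df.loc[~df["dropped_out"], "baseline_score"])

print(f"attrition by group: p = {p_group:.3f}; baseline difference: p = {p_baseline:.3f}")
```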
Mr. Kevin Chan Research Associate Network for Health & Welfare Studies Department of Applied Social Sciences Honorary Scientific Officer The Hong Kong Childhood Injury Prevention & Research Association
OER in human services more often than not violates the assumptions required by the traditional RCT convention. Such differences should not be viewed as deviations from the conventional approach, but rather as a call for applied research with stronger ecological validity. Presumption of the experimentalist ideal (Smith, 1985)
The difficulties associated with random assignment. The ethical and administrative objections to randomization (de Raeve, 1994). That inputs are rarely stable. That awareness of difference introduces bias. That it is almost impossible to exclude extraneous variables. That the research is unable to say why the changes detected have occurred (Newell, 1992). In fulfilling the requirements set for the RCT, we often undermine the following (Rolls, 1999):
Dealing with less-than-perfect random assignment. Adjusting for unstable input in human services OER. Statistical control for unblinded studies. The inclusion of extraneous variables as covariates. Narrowing down to a closer approximation of the causal relationship. Troubleshooting barriers – OER in the human service setting
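The slide above mentions statistical control for unblinded studies and the inclusion of extraneous variables as covariates. A minimal sketch of that idea, assuming simulated data and hypothetical variable names (sf36_post, sf36_baseline, age), contrasts an unadjusted group comparison with an ANCOVA-style adjusted model using statsmodels.

```python
# Illustrative sketch only: simulated data, hypothetical variable names.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "group": rng.integers(0, 2, n),             # 1 = intervention, 0 = control
    "age": rng.normal(75, 6, n),                # extraneous variable
    "sf36_baseline": rng.normal(60, 10, n),     # baseline outcome score
})
# Simulate a post-test score with a modest intervention effect
df["sf36_post"] = (0.6 * df["sf36_baseline"] - 0.2 * df["age"]
                   + 5 * df["group"] + rng.normal(0, 8, n))

# Unadjusted between-group comparison
unadjusted = smf.ols("sf36_post ~ group", data=df).fit()
# Extraneous variables entered as covariates (ANCOVA-style adjustment)
adjusted = smf.ols("sf36_post ~ group + sf36_baseline + age", data=df).fit()

print(f"unadjusted effect: {unadjusted.params['group']:.2f}")
print(f"covariate-adjusted effect: {adjusted.params['group']:.2f}")
```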
Balancing incentives for follow-up of the control group. Intention-to-treat analysis. As-treated analysis. Complier-Average Causal Effect (CACE) analysis. Troubleshooting barriers – Group assignment in OER for the human service setting
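The following sketch (invented data, hypothetical column names) contrasts the intention-to-treat and as-treated estimates named above: ITT compares participants by randomized assignment regardless of uptake, while the as-treated comparison groups them by the intervention actually received.

```python
# Illustrative sketch only: invented data, hypothetical column names.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 300
assigned = rng.integers(0, 2, n)                                          # randomized assignment
complied = np.where(assigned == 1, (rng.random(n) < 0.7).astype(int), 0)  # ~70% uptake
outcome = 50 + 6 * complied + rng.normal(0, 10, n)                        # benefit only if treated
df = pd.DataFrame({"assigned": assigned, "complied": complied, "outcome": outcome})

# Intention-to-treat: analyse participants by the group they were randomized to
itt = df.groupby("assigned")["outcome"].mean()
itt_effect = itt.loc[1] - itt.loc[0]

# As-treated: analyse participants by the intervention actually received
at = df.groupby("complied")["outcome"].mean()
as_treated_effect = at.loc[1] - at.loc[0]

print(f"ITT effect: {itt_effect:.2f}; as-treated effect: {as_treated_effect:.2f}")
```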
Balancing incentives for follow-up of the control group: despite the lack of intervention, control group participants can be offered a “non-specific” intervention or substantial support to maintain their interest in complying with the data collection process. Troubleshooting barriers – Group assignment in OER for the human service setting – Cont’d
Balancing incentives for follow-up of the control group. “Non-specific” intervention – e.g. delivery of printed health promotion materials. Substantial support – e.g. safety devices, health supplements. Troubleshooting barriers – Group assignment in OER for the human service setting – Cont’d
Complier-Average Causal Effect (CACE) analysis (Angrist, 1996; Little, 1998): expanding the conventional regression equation with a new term – compliance with the intervention. Troubleshooting barriers – Group assignment in OER for the human service setting – Cont’d
Complier-Average Causal Effect (CACE) analysis (Angrist, 1996; Little, 1998): the difference in the evaluation outcome (e.g. SF-36 score, probability of sustaining a home injury) between compliers and non-compliers, with other missing information (e.g. covariates such as age, gender, health status, attitude toward the research question) adjusted for and stratified by compliance category. Troubleshooting barriers – Group assignment in OER for the human service setting – Cont’d
Complier-Average Causal Effect (CACE) analysis (Angrist, 1996; Little, 1998): CACE accounts for the potential interaction between randomized group assignment and compliance, and allows a more accurate estimate of program effectiveness by increasing the number of “usable” cases and by “subtracting” the artifact of compliance that moderates the outcome variable. Troubleshooting barriers – Group assignment in OER for the human service setting – Cont’d
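As a hedged illustration, the sketch below estimates a CACE with the simplest instrumental-variable (Wald) estimator, using randomized assignment as the instrument for compliance and assuming one-sided non-compliance; the fuller approach in Little (1998) uses maximum-likelihood mixture models with covariates, which this toy example does not attempt.

```python
# Illustrative sketch only: invented data; assumes one-sided non-compliance
# (control participants cannot access the intervention).
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
n = 400
assigned = rng.integers(0, 2, n)                                           # instrument: random assignment
complied = np.where(assigned == 1, (rng.random(n) < 0.65).astype(int), 0)  # compliance observed in programme arm
outcome = 55 + 8 * complied + rng.normal(0, 12, n)                         # effect accrues to compliers
df = pd.DataFrame({"assigned": assigned, "complied": complied, "outcome": outcome})

# ITT effect of assignment on the outcome
itt_outcome = (df.loc[df["assigned"] == 1, "outcome"].mean()
               - df.loc[df["assigned"] == 0, "outcome"].mean())
# ITT effect of assignment on compliance (the compliance rate in the intervention arm)
itt_compliance = (df.loc[df["assigned"] == 1, "complied"].mean()
                  - df.loc[df["assigned"] == 0, "complied"].mean())

# Wald / instrumental-variable estimate of the Complier-Average Causal Effect
cace = itt_outcome / itt_compliance
print(f"CACE estimate: {cace:.2f}")
```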
Explicit definition of the intervention protocol: behavioral intervention, substantial input, personnel input, temporal data (date, time, duration, frequency). Defining the boundary of the intervention. Troubleshooting barriers – Adjusting for unstable input in human services OER
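A minimal sketch, assuming a hypothetical record structure not taken from the original protocol, of how frontline staff might capture the temporal data of each intervention contact so that the dose actually delivered can later be quantified.

```python
# Hypothetical record structure for logging intervention contacts.
from dataclasses import dataclass
from datetime import date

@dataclass
class InterventionContact:
    participant_id: str
    contact_date: date
    duration_minutes: int        # temporal data: duration of the contact
    activity: str                # behavioral intervention component delivered
    delivered_by: str            # personnel input (e.g. nurse, social worker)
    materials_given: str = ""    # substantial input, if any

# Example log kept by a frontline worker for one participant
log = [
    InterventionContact("HK001", date(2004, 2, 10), 45, "medication counselling", "nurse"),
    InterventionContact("HK001", date(2004, 2, 24), 30, "fall-prevention exercise", "physiotherapist"),
]
total_minutes = sum(c.duration_minutes for c in log)
print(f"Dose delivered to HK001: {len(log)} contacts, {total_minutes} minutes")
```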
Process analysis: process evaluation fills the “black box” (Moore, 2003) left void in health promotion research and strengthens the internal validity of the intervention under investigation. Troubleshooting barriers – Adjusting for unstable input in human services OER – Cont’d
Process analysis: process analysis identifies variance in protocol implementation and participants’ receptiveness to the prescribed intervention. A process analysis covers: count of activities attended; perceived effectiveness of the intervention; compliance with the intervention protocol; met and unmet needs related to the intervention. Troubleshooting barriers – Adjusting for unstable input in human services OER – Cont’d
Process analysis (Cont’d): the information gathered from the process research should be integrated into the outcome research, rather than set aside as an auxiliary document in the discussion of study results. Troubleshooting barriers – Adjusting for unstable input in human services OER – Cont’d
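A minimal sketch of integrating process data into the outcome file: raw attendance logs are turned into process indicators (sessions attended, attendance rate, a compliance flag usable in a CACE analysis) and merged with the outcome dataset. The eight-session target and the 75% compliance threshold are assumptions for illustration only.

```python
# Illustrative sketch only: the 8-session target and 75% threshold are assumptions.
import pandas as pd

attendance = pd.DataFrame({
    "participant_id": ["HK001", "HK001", "HK002", "HK003", "HK003", "HK003"],
    "session": [1, 2, 1, 1, 2, 3],
})
PLANNED_SESSIONS = 8

process = (attendance.groupby("participant_id").size()
           .rename("sessions_attended").reset_index())
process["attendance_rate"] = process["sessions_attended"] / PLANNED_SESSIONS
process["complier"] = process["attendance_rate"] >= 0.75   # compliance flag for CACE

# Merge the process indicators into the outcome dataset, e.g.:
# outcomes = outcomes.merge(process, on="participant_id", how="left")
print(process)
```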
Quality assurance exercise. Organizational requirements. Staff involvement. Troubleshooting barriers – Quality assurance in OER for the human service setting
Unanticipated effects of health promotion on social capital and cohesion (Raphael, 2000). Economic analysis (including cost-effectiveness analysis or cost-benefit analysis). Maintenance and cost in OER
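As a hedged illustration of the economic analysis mentioned above, the sketch below computes an incremental cost-effectiveness ratio (ICER), the basic quantity in a cost-effectiveness analysis comparing the program against usual care; all figures are invented.

```python
# Illustrative sketch only: all figures are invented.
cost_intervention = 1_800.0    # mean cost per participant, program arm
cost_control = 1_100.0         # mean cost per participant, usual care
effect_intervention = 0.78     # mean effect per participant (e.g. QALYs gained)
effect_control = 0.72

# Incremental cost-effectiveness ratio: extra cost per extra unit of effect
icer = (cost_intervention - cost_control) / (effect_intervention - effect_control)
print(f"ICER: {icer:.0f} dollars per additional unit of effect")
```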