The why behind the what: A process evaluation of Healthy@Work
Dr Theresa Doherty
Dr Fiona Cocker
Seminar Aims
• Introduce a methodology for process evaluation
• Apply this to the H@W program
• Identify key evaluation questions
• Demonstrate what existing program data can answer
• Identify what other data we need
• Explain how we are going to collect it
Defining Evaluation
• The systematic acquisition and assessment of information to provide useful feedback about some object
• ‘The process of determining the worth, merit, or value of things … or the result of that process.’ (Scriven 1991)
• ‘The systematic collection of information about the activities, characteristics, and/or outcomes of programs to make judgements about the program, improve program effectiveness, and/or inform decisions about future programming.’ (Patton 1997)
Purpose of Evaluation
‘Evaluation determines the merit, worth, or value of things. The evaluation process identifies relevant values or standards that apply to what is being evaluated, performs empirical investigation (often using techniques from the social sciences), and then integrates conclusions with the standards into an overall evaluation or set of evaluations.’ (Scriven 1991)
Evaluation vs. Research
• The main difference between research and evaluation is that research is conducted with the intent to generalise the findings from a sample to a larger population (Priest 2001)
• Evaluation focuses on an internal situation, with no intent to generalise the results to other settings and situations
• Research generalises; evaluation particularises
Same or Different?
‘Evaluation and research are closely related and should be synergistic, but they serve different purposes.’ (Fain)
Meaningful Evaluation
The more interesting endeavour is how to do rigorous and robust evaluation. We need a plan so we can be confident that:
• conclusions are sound
• recommendations will improve program delivery and outcomes
Evaluation Plan Checklist
1. Program clarification
2. Purpose of evaluation
3. Stakeholder assessment
4. Program logic underpinning this program
5. Scope of evaluation
6. Data collection
7. Data analysis
8. Timeline for evaluation
9. Report and dissemination
10. Budget requirements
Program Logic Approach
• Clarify the purpose of a project – what change do you want to bring about?
• Plan how to bring about this change
• Explore and test the assumptions underlying your approach
• Test how well your approach is working and what, if anything, needs changing
• Find ways of demonstrating that your approach is working
The Logic Model
INPUTS → OUTPUTS → OUTCOMES
• Inputs – program investments: what we invest
• Outputs – activities and target reach: what we do or produce, and who we want to reach
• Outcomes – short-term, medium-term and long-term: what are our results?
Program Logic Approach
• Provides a program description that guides our evaluation process
• Helps match the evaluation to the program
• Helps identify what and when to measure – outputs and outcomes
• Helps identify key data needs
• Makes evaluation an integral part of program planning and implementation
Logic Model & Evaluation
[Diagram: types of evaluation mapped onto the stages of the logic model – University of Wisconsin-Extension, Program Development and Evaluation]
Process evaluation
• Good ideas do not always yield good results
• Process evaluation looks at how a program was implemented
• It assesses the extent to which planned activities were carried out
• Ideally this happens AS a program is being implemented, NOT retrospectively
• It alerts us to changing conditions that may impact on the program
• It allows us to make adjustments to implementation to keep on track, OR to rethink whether the program design is the right one
pH@W Process Evaluation Project
• Process evaluation was not in the original pH@W project plan
• No specific funding was allocated, BUT
• Investigators decided it was an important part of understanding Healthy@Work
Purpose of the Evaluation
What do we want to know?
• How was Healthy@Work implemented across the Tasmanian State Service?
• What happened, and in what circumstances, that led to particular outcomes?
• In light of these findings, would we implement a workplace H&WB program differently?
The Issue of Scope
What is achievable given time and resource limitations?
• How was the H@W program conceptualised and developed?
• To what extent did TSS Agencies engage with and implement H@W as designed?
• What are the implications for sustainable health-promoting workplaces?
Process Evaluation Activities
• Review and analyse H@W files – the ‘formal’ story
• Interview key informants – the ‘informal’ story
• Synthesise these data – a deeper understanding of H@W
• Report the findings – what are the implications of this ‘new’ knowledge?
• Disseminate the findings – who are the stakeholders?
Existing Data Sources
• Annual audit data
• Healthy@Work agency reports
• Grants program – applications, progress/final reports
• H@W project review and closure report
What they reveal
• Baseline H&WB capacity
• The sort of activities implemented
• Where the grant money was allocated
Performance Indicator: Needs Assessment
• All agencies completed a needs assessment (2010–2012)
• Were the organisation’s H&WB issues/needs identified?
Performance Indicator: Committee/Staffing Support?
Examples of responsible positions:
• 2008 – Senior HR officer (0.5 hr/week); Senior HR consultant (2 hr/week)
• 2010 – H@W Policy & Program Officer (23 hr/week)
• 2011 – H&WB co-ordinator (up to 20 hr/week)
• 2012 – OH&S Manager (7 hr/week); HR co-ordinator (4 hr/week)
Still to come
Further examination of:
• Annual audit data – analysis by agency
Examination of:
• Healthy@Work agency reports
• Grants program – applications, reports
• H@W project review and closure report
H@W project review and closure report
• Project performance: project vision, measured outcomes, outputs, budget
• Lessons learnt: what worked well, what could be improved, challenges; recommendations to guide future TSS H&WB activities
• Highlights and innovations:
  – Organisational change and development
  – Agencies with a H&WB program (3 in 2010, 14 in 2012)
  – All agencies report H&WB activities (14 in 2009, 21 in 2010)
  – Working in partnerships: DHHS, Workcover, UTAS, Menzies Research Institute Tasmania (pH@W), Healthy Workers (NPAPH)
H@W grants program
• Number of grant rounds
• Who applied for money, and in which round?
• What type of project?
• Who was successful?
• How much money did they get?
• Were progress/final reports provided?
• Any sustainable outcomes? (A minimal tabulation sketch follows below.)
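Purely as an illustration (not part of the original seminar), questions like these could be answered by tabulating a grants spreadsheet. In the sketch below, the file name grant_applications.csv and the columns round, agency, successful and amount_awarded are hypothetical stand-ins, not the actual H@W grants records.

```python
# Hypothetical sketch: summarising grant applications by round.
# "grant_applications.csv" and all column names are assumptions,
# not the real H@W grants data.
import pandas as pd

grants = pd.read_csv("grant_applications.csv")

# Applications, successful grants and total funding awarded per round
by_round = grants.groupby("round").agg(
    applications=("agency", "count"),
    successful=("successful", "sum"),   # assumes a 0/1 success flag
    funding=("amount_awarded", "sum"),
)
by_round["success_rate"] = by_round["successful"] / by_round["applications"]
print(by_round)
```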
Purposive Data Collection
• What happened in the implementation of H@W is only part of the story…
• The context in which a program operates can be important data for evaluation
• Semi-structured interviews with key informants are one method for collecting these data
What we would like to know (examples)
• How did you (your agency) gain management support?
• What sort of coordination mechanism was used to manage H@W?
• How much did employees engage with H@W?
• How useful/relevant was the information on the H@W website?
• How did your organisation review and update the H&WB Program?
• How did you promote, market and communicate the H&WB Program to employees?
• Has your agency continued to resource workplace H&WB?
• Based on the H@W experience, what factors do you believe influence employee participation in workplace H&WB programs?
Data Analysis
Analysis
• Qualitative and quantitative methods, with descriptive statistics where appropriate (a minimal illustrative sketch follows below)
Interpretation
• Interpretivist methods to analyse the content of interviews and documents
• What understandings or conclusions can be gained?
• What does it mean for the program/project?
• How does this inform the evaluation purpose?
The analysis aims to:
• Reveal the complexities of implementing a large-scale, multi-agency, workplace-based health promotion initiative
• Identify enablers and barriers to successful implementation, and implications for transferability to non-state-sector workplaces
• Inform program design and implementation of future WHP strategies
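As a purely illustrative sketch (not from the seminar), descriptive statistics on the annual audit data might be produced along these lines; the file audit_data.csv and the columns agency, year and audit_score are assumed, hypothetical names rather than the real pH@W dataset.

```python
# Hypothetical sketch: descriptive statistics of audit scores by agency
# and year. "audit_data.csv" and all column names are assumptions.
import pandas as pd

audit = pd.read_csv("audit_data.csv")

# Count, mean, spread and range of scores for each agency in each year
summary = (
    audit.groupby(["agency", "year"])["audit_score"]
         .agg(["count", "mean", "std", "min", "max"])
         .reset_index()
)
print(summary)
```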
Who is this evaluation for?
• Primary audience: pH@W partners
• Other interested parties: SSMO, Tasmanian State Service Agencies, TSS employees, key informants
Still to come
• Ethics application prepared for key informant interviews
• Interviews planned for June/July 2013
Importance of Process and Implementation Evaluation
• Uses scientific rigor to understand the etiology of gaps between expected results and observed outcomes
• Information can be gathered during the implementation of a program or trial
• Explores the relationship between process and outcomes
• Generates evidence to guide future implementation and improve program outcomes
(JOEM, Volume 55, Number 5, May 2013)