Heads you’re in; tails you’re out: How RCTs have evolved in DWP Jane Hall jane.hall@dwp.gsi.gov.uk
Background • Increasing use of RCTs • Emphasis on evidence-based policy • Limited UK experience in social policy arena • Practical lessons not theoretical debate
Overview • Chronology of RCTs in DWP • Site selection and preparation • Identifying the eligible population • Dealing with resistance • Performing the random assignment • Monitoring take-up
Chronology of RCTs • Restart • Various New Deals • Employment Zones • JRRP • ERA • JSA Intervention Pilots • ND50+ Mandatory IAP
Site Selection and Preparation • Need commitment from the top • All parties need to buy-in • Set-up is resource intensive • Personal visits • Pilot the approach
Identifying and recruiting the eligible population • Can they be easily identified? • Self-selection • Suitability of the population • Monitor P & C Group characteristics • Selling techniques • Sample sizes: sub-group analysis (see the sketch below)
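As a rough illustration of why sample sizes must anticipate sub-group analysis, here is a minimal sketch using a standard two-proportion power calculation; the effect sizes and the sub-group share are invented for the example and are not ERA or pilot parameters.

```python
# Illustrative only: approximate per-arm sample size to detect a difference in
# two proportions at 5% significance and 80% power.  Effect sizes are made up.
from math import ceil
from statistics import NormalDist

def n_per_arm(p_control, p_treatment, alpha=0.05, power=0.80):
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    p_bar = (p_control + p_treatment) / 2
    num = (z_a * (2 * p_bar * (1 - p_bar)) ** 0.5
           + z_b * (p_control * (1 - p_control)
                    + p_treatment * (1 - p_treatment)) ** 0.5) ** 2
    return ceil(num / (p_control - p_treatment) ** 2)

print(n_per_arm(0.30, 0.35))      # per-arm size for a whole-sample comparison
print(4 * n_per_arm(0.30, 0.35))  # rough intake needed if the sub-group of
                                  # interest is only ~25% of the caseload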
Dealing with resistance • Busting the myth • Significant investment in training at all levels • Aids and FAQs • Scripts
Performing the Random Allocation • Needs to be sophisticated • Not open to sabotage/gaming • Block allocation: Maintain P:C ratio • Different techniques • NINO • Call Centre • On-line algorithm • Random numbers
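A minimal sketch of the block-allocation point above, assuming a simple Python helper rather than any of DWP's actual mechanisms (NINO-based, call centre or on-line algorithm): fixing the P:C ratio within every block keeps the overall ratio exact throughout intake, while the order inside each block stays unpredictable, which guards against sabotage or gaming.

```python
# Illustrative block randomisation: the Programme:Control ratio is exact
# within every block, but the sequence inside a block is shuffled.
import random

def block_allocate(n_participants, block_size=8, p_to_c=(1, 1), seed=None):
    rng = random.Random(seed)
    p_share, c_share = p_to_c
    per_unit = block_size // (p_share + c_share)
    block = ["P"] * (per_unit * p_share) + ["C"] * (per_unit * c_share)
    allocations = []
    while len(allocations) < n_participants:
        rng.shuffle(block)
        allocations.extend(block)
    return allocations[:n_participants]

assignments = block_allocate(20, block_size=4)
print(assignments)                                    # e.g. ['C', 'P', 'P', 'C', ...]
print(assignments.count("P"), assignments.count("C"))  # ratio held at 50:50
```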
Monitoring take-up • Keep track of P & C Group • Ensure only P Group receive the treatment • Monitor key characteristics of P & C Group • Be prepared to redesign the random allocation
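One way to operationalise the "only the P Group receives the treatment" check above, as a minimal sketch; the record fields are hypothetical and not taken from DWP systems.

```python
# Illustrative contamination check: flag any Control-group customer whose
# record shows the treatment being delivered.  Field names are hypothetical.
customers = [
    {"id": "A1", "group": "P", "received_treatment": True},
    {"id": "B2", "group": "C", "received_treatment": False},
    {"id": "C3", "group": "C", "received_treatment": True},   # contamination
]
contaminated = [c["id"] for c in customers
                if c["group"] == "C" and c["received_treatment"]]
print(contaminated)   # ['C3'] -> investigate and feed back to delivery sites
```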
Expect the Unexpected • Results may not be what you anticipate • A fair allocation of resources? • Participation rates can be disappointing
Operational Challenges: The ERA Experience Jenny Carrino
Overview • The ERA Policy • Key Challenges • Random Assignment (RA) Process • Customer Understanding of RA • Creation of ‘Informal’ Refusers • Jobcentre Plus Target Structure • Technical Assistance
The ERA Policy • To test interventions to improve retention and advancement • Adviser support • Funding for training • Financial Incentives • 6 Jobcentre Plus districts • Three customer groups: NDLP, ND25+, WTC • To test the effectiveness of using RA to evaluate social policy in the UK
Random Assignment 1 • Issue: The random allocation process • Lessons Learnt • Importance of transparency in the allocation process • Avoiding contamination • Outcomes • Most customers and staff viewed random assignment as fair and justified
Random Assignment 2 • Issue: The Informed Consent Process • Lessons Learnt • Standardisation - adviser scripts and leaflets • Outcomes • Not everyone fully understood what they had signed up for • Too much information at initial interviews – conduct RA as a stand-alone interview
Random Assignment 3 • Issue: Creation of a group of ‘informal’ refusers
What do we mean by Informal Refusers? [Diagram: the eligible population divides into customers RA'd or on system, formal refusers, and informal refusers; the informal refusers may be significantly different from those assigned]
Informal Refusers • Why this happened • The decision to use RA • RA to ERA was voluntary • Influences from both advisers and customers • Outcomes • Creation of a ‘third’ group • Analysis to identify whether this group is different from the ERA population • Lesson Learnt • If possible, monitor intake closely against the eligible population
Ensuring a Treatment - Targets • Issue: The Jobcentre Plus Target Structure • Some adviser behaviour negatively affected • Senior management buy-in affected • Lessons Learnt • Policies need to reflect the organisation's reward system • Need to be able to monitor and feed back to implementation managers
Technical Assistance • Issue: Ensuring the effective delivery of RA • Lessons Learnt • US model of on-site RA assistance • Avoiding contamination • Monitoring Performance • Outcomes • Advisers felt supported during the RA period • Initial confusion over the role of TAs • Some districts deferred responsibility for ERA implementation
Summary of Key Challenges • RA Process • Informed Consent • Creation of informal refusers • Ensuring a Treatment • Providing effective support to delivery agents
Jobseeker’s Allowance (JSA) Intervention Pilots Jayne Middlemas
JSA Intervention Pilots • JSA Intervention Regime • The Pilots • Evaluation • Random assignment • The data • Did Random Assignment work? • Results
JSA Intervention Regime • First Contact • New Jobseeker Interview (NJI) • Financial Assessor Interview • Fortnightly Jobsearch Review (FJR) • 13 week review
The Pilots • Introduced in January 2005 • 108 Jobcentres in 10 Districts took part • Each Jobcentre took part in a single pilot • Aim to deliver resource savings on the FJR without reducing unemployment off-flow rates.
The pilots (cont) Five different approaches: • Excusal of signing for first 13 wks of claim • Excusal of signing for first 7 wks of claim • Telephone signing • Shortened FJR • Group signing
The pilots (cont) Some groups excluded: • Part-time workers • 16 and 17 year olds • People with no fixed abode • People known to have had a fraudulent claim in the past
Evaluation • Customers randomly allocated • Work study to record resources used • Comparison of off-flows • Qualitative evaluation
Random Assignment • 50% programme, 50% control • ORC International Call Centre • Two methods: • Adviser calls immediately prior to each NJI • Jobcentre calls at start of day with details of all clients due to attend an NJI that day • ORC also provided random call-in date
Data • Data collected during random assignment • JUVOS data – derived from the Jobseekers Allowance Payment System (JSAPS) • HMRC Employment Data
Did Random Assignment Work? • 66,600 randomly assigned • 33,100 programme & 33,400 control • All pilots and Districts close to 50/50 split • Was everyone assigned? • Difficult to answer precisely • Number randomly assigned around 90% of total new claims; excluded groups likely to account for 8 to 12%
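As a quick illustration of how close the split is to 50/50, here is a rough normal-approximation check on the two counts above; this is an illustrative calculation, not the pilots' published analysis.

```python
# Illustrative check: is a 33,100 / 33,400 split consistent with a fair
# 50/50 allocation?  Normal approximation to the binomial.
from math import sqrt
from statistics import NormalDist

programme, control = 33_100, 33_400
n = programme + control
expected, sd = n / 2, sqrt(n * 0.25)
z = (programme - expected) / sd
p_value = 2 * (1 - NormalDist().cdf(abs(z)))
print(f"z = {z:.2f}, two-sided p = {p_value:.2f}")   # ~ z = -1.16, p = 0.24
```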
Were People Wrongly Assigned? • 19% had no new JSA claim during the pilot • Incorrect NI numbers may mean we can’t find some claims • Jobcentres didn’t always inform us of those who failed to attend • Can’t identify excluded groups in the data
Internal Validity • Compared characteristics for programme and control groups • Very little difference was found by gender, age or ethnic origin • Concluded that the control group is well suited to providing a counterfactual for the programme group
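A hedged sketch of the kind of comparison described above: a chi-squared balance test on one characteristic across the programme and control groups. The counts are placeholders, not pilot data.

```python
# Illustrative balance check on one characteristic (e.g. gender) between the
# programme and control groups.  Counts below are placeholders.
from scipy.stats import chi2_contingency

#                    male    female
programme_counts = [21_000, 12_100]
control_counts   = [21_300, 12_100]

chi2, p_value, dof, expected = chi2_contingency([programme_counts, control_counts])
print(f"chi2 = {chi2:.2f}, p = {p_value:.2f}")
# A large p-value is consistent with the groups being balanced on this
# characteristic; repeat for age bands, ethnic origin, etc.
```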
External Validity • Pilot Jobcentres account for small proportion of all new JSA claims across the country • Gender, age & ethnicity of new claimants in pilot areas different to country as a whole • Some difference in local unemployment rates • Weighted results to take account of differences
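A minimal sketch of the reweighting idea above, assuming simple post-stratification cells defined by gender and age band; the cell shares are invented for illustration and are not the pilot or national figures.

```python
# Illustrative post-stratification weights: national share of each cell
# divided by its share in the pilot sample.  Shares below are made up.
pilot_shares    = {"male_18_24": 0.22, "male_25plus": 0.42,
                   "female_18_24": 0.12, "female_25plus": 0.24}
national_shares = {"male_18_24": 0.18, "male_25plus": 0.45,
                   "female_18_24": 0.14, "female_25plus": 0.23}

weights = {cell: national_shares[cell] / pilot_shares[cell]
           for cell in pilot_shares}

def weighted_mean(records):
    """records: list of (cell, outcome) pairs, e.g. claim length in days."""
    total_w = sum(weights[cell] for cell, _ in records)
    return sum(weights[cell] * outcome for cell, outcome in records) / total_w

print(weights)
```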
Results: 13-week excusal pilot
Results • Average length of claim is 5.9 days greater in programme group than in control group • Weighting the results to be representative nationally suggests an increase of 6.1 days in average length of claim • No difference in the proportions who moved into work
Results (cont) 4 reasons for difference in length of claim: • Some people take longer to find work • Some people take longer to tell us they have found work • Some control group customers fail to attend and have to start a new claim • Some people fail to sign off for other reasons
Results (cont) • Work study provided estimates of savings • Extra benefits paid as a result of increase in average length of claim exceed savings • Qualitative evaluation suggested that the pilot was implemented well • Customers were happy not to have to attend every fortnight
More information • DWP Research Report 300: The Qualitative Evaluation of the JSA Intervention Regime Pilots • DWP Research Report 382: Jobseeker’s Allowance Intervention Pilots Quantitative Evaluation • Available on DWP Website: http://www.dwp.gov.uk/asd/asd5/rrs-index.asp
Job Retention and Rehabilitation Pilot: Lessons learnt in running an RCT James Holland
Structure • Background to JRRP • Results of the trial • Hypotheses • Conclusions: Importance of complementary methodologies
Design • 4-way trial • To test the effectiveness of a person-centred case management approach and an increased range of treatments in helping people retain work • Health care focused • Workplace focused • Combined health care and/or workplace focused • Control group • Four service providers in six parts of the country • Participants were people off work sick and unlikely to return to work without help
Routes through the trial [flow diagram]: Project marketing approach → Contact Centre (explanation and eligibility screening; customers may decline or be screened out as ineligible) → Screened in: randomisation to Health 25%, Work 25%, Combined 25% or Control 25% → Providers make contact → Written consent → Assessment and intervention → Surveys → Return to work / out of work
Evaluation Design • Impact and process evaluation • Cost-benefit analysis • Components of the evaluation • Survey of those screened out and the control group • Outcome survey • Panel study • Focus studies • Database of contacts and treatments • Costs exercise
Impact Measures • Primary impact measure – 13 week return to work • Secondary impact measures • Health • Household income • Costs and benefits • Operation of JRRP as an RCT