TAKING PART SURVEY: LONGITUDINAL WEB PANEL Alex Björkegren, Olivia Christophersen; DCMS
The Taking Part Survey: longitudinal web panel • What is the Taking Part survey? • Why did we set up the web panel? • How does the web panel work? • Challenges and future work
What is the Taking Part survey? • DCMS’s flagship survey, run with its three funding partners: Sport England, Arts Council England and Historic England • Aim: estimate the number of people taking part in leisure, cultural and sporting activities in England • ...also covers a vast array of other subjects • 2017/18 survey topics included: libraries, museums, heritage sites, archives, sports, First World War commemorations, volunteering, charitable giving, newspapers, wellbeing, employment...
What is the Taking Part survey? • Face-to-face survey (Ipsos MORI & NatCen) • Adults (16+), youths (11-15) and children (5-10) • Households randomly selected from the Post Office address list within ~700 areas • Nationally representative (England) • Now in its 14th year (started in 2005/06) • Survey year runs from April to March (financial year) • Designated a National Statistic
What is it used for? One of DCMS’s five objectives is to maximise cultural and sporting participation, and social action; this is measured using the Taking Part Survey • Only robust internal government source for social media use • Used to evaluate policies, e.g. First World War centenary commemorations • Other users: ONS (child wellbeing), UK Sport (attendance at sporting events), The Football Association, the Crafts Council, The Royal Horticultural Society, charities, small businesses, other government departments, academics, journalists
Why did we set up the web panel? • Objective: identify reasons for changes in participation in leisure, cultural and sporting activities in England over time. • Longitudinal panel: interviews from 2011/12 to 2016/17 • Face-to-face interviews are expensive! • Longitudinal component of the survey was becoming increasingly biased – potentially affecting headline results.
Why did we set up the web panel – survey structure • [Diagram: 2011/12 – 2016/17 design: face-to-face survey with a fresh sample plus a legacy (longitudinal) sample. Current ‘mixed mode’ design, 2018/19 – ???: the face-to-face fresh sample remains the source for headline measures, while the web panel replaces the legacy sample.]
How does the web panel work? • Respondents are asked at interview whether they would like to join the web panel • If the email address is valid, they are sent a link to a registration questionnaire (£5 incentive) • Panelists are invited to complete a short (10 min) questionnaire each quarter (£2.50 incentive) • Two further email reminders, then a postal reminder • Active web panelists: those completing at least one quarterly questionnaire per annual cycle • Inactive panelists are removed at the end of the year • Panelists must have £5 ‘banked’ before they can cash it out • Questionnaires change each quarter
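The flow above amounts to a small set of membership and incentive rules. As a hedged illustration only, here is a minimal Python sketch of those rules; the class and constant names (WebPanelist, CASH_OUT_THRESHOLD, etc.) are invented for this example and do not reflect the actual panel system.

```python
from dataclasses import dataclass

# Illustrative values taken from the slide; the real system is not shown here.
REGISTRATION_INCENTIVE = 5.00   # £5 for completing the registration questionnaire
QUARTERLY_INCENTIVE = 2.50      # £2.50 per completed quarterly questionnaire
CASH_OUT_THRESHOLD = 5.00       # must have £5 'banked' before cashing out

@dataclass
class WebPanelist:
    """Hypothetical record for one web panel member (field names are assumptions)."""
    email: str
    banked: float = 0.0
    quarters_completed_this_year: int = 0

    def complete_registration(self) -> None:
        self.banked += REGISTRATION_INCENTIVE

    def complete_quarter(self) -> None:
        self.quarters_completed_this_year += 1
        self.banked += QUARTERLY_INCENTIVE

    @property
    def is_active(self) -> bool:
        # Active = completed at least one quarterly questionnaire in the annual cycle.
        return self.quarters_completed_this_year >= 1

    @property
    def can_cash_out(self) -> bool:
        return self.banked >= CASH_OUT_THRESHOLD

def end_of_year_review(panel: list[WebPanelist]) -> list[WebPanelist]:
    """Drop inactive panelists and reset the annual completion counter for the rest."""
    retained = [p for p in panel if p.is_active]
    for p in retained:
        p.quarters_completed_this_year = 0
    return retained
```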
How does the web panel work? • Some questionnaire topics change each quarter • Others are ‘core’: subjective well-being; free time; life events; sport screener • Questionnaires repeat annually
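To make the “core plus rotating topics, repeating annually” structure concrete, a minimal sketch follows; only the four core modules are taken from the slide, and the rotating topic allocation shown is purely hypothetical.

```python
# Core modules appear in every quarterly questionnaire (taken from the slide).
CORE_MODULES = ["subjective_wellbeing", "free_time", "life_events", "sport_screener"]

# Hypothetical rotating topic blocks, one per quarter of the year; the real
# allocation is not shown on the slide, so these names are placeholders.
ROTATING_MODULES = {
    0: ["libraries", "museums"],
    1: ["heritage_sites", "archives"],
    2: ["volunteering", "charitable_giving"],
    3: ["newspapers", "first_world_war"],
}

def modules_for(quarter_index: int) -> list[str]:
    """Modules fielded in a given quarterly questionnaire. Because content
    repeats annually, only the position within the year (index mod 4) matters."""
    return CORE_MODULES + ROTATING_MODULES[quarter_index % 4]

# e.g. a quarter and the same quarter a year later carry the same rotating block:
assert modules_for(0) == modules_for(4)
```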
Results and Challenges • Recruitment • Retention • Data structure • Development burden
Challenges: recruitment • [Chart: % of all face-to-face respondents saying they would be willing to join the web panel]
Recruitment: results so far • [Chart: profile of the adult population and of those willing to join the web panel]
Recruitment • Refusal ‘counter points’: 32.5% - no time*; 19.4% - lacks internet skills; 18.3% - has done enough already; 10.8% - doesn’t complete questionnaires on the internet; 10.4% - other reasons • Introduce targeted and/or unconditional incentives: raise incentives for all, just for target groups (e.g. youth, BAME), or just for initial refusals • Review/improve interview briefing content • Review/improve web panel leaflet
Challenges: retention • [Chart: completion by stage - invited but not completed; completed on invitation, 1st (email) reminder, 2nd (email) reminder or 3rd (postal) reminder; still open]
Retention • Under investigation! Aim to address both overall numbers and representation of the general population • Planned analysis includes: an experiment carried out with SMS reminders; the link between retention and reported interest in the questions
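As one hedged illustration of how the second planned analysis might look, the sketch below fits a simple logistic regression of quarterly completion on reported interest; the data are simulated and the variable names are assumptions, not the team’s actual analysis.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical panel data: did the panelist complete the next quarterly
# questionnaire (1/0), and how interested were they in the questions (1-5)?
rng = np.random.default_rng(0)
interest = rng.integers(1, 6, size=200)
completed = (rng.random(200) < 0.3 + 0.1 * interest).astype(int)

# Logistic regression of retention on reported interest (illustrative only).
X = sm.add_constant(interest.astype(float))
model = sm.Logit(completed, X).fit(disp=False)
print(model.summary())
```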
Challenges: Data structure • A challenging dataset • Stakeholders with very different analytical needs and capabilities • Interested in: • Data from all quarters vs. one quarter per year • Different comparisons: • within year comparisons, e.g. Year 1, 1st cohort, Q1 vs Q2 • between year comparisons, e.g. 1st cohort, Q1 Year 2 vs Q1 Year 3 • between cohort comparisons, e.g. Year 1, 1st cohort, Q1 vs Year 2, 2nd cohort Q1 • ...many other permutations! • No dedicated analysts vs a full analytical team • Different software
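One way to tame these permutations is a single long-format table keyed on cohort, panel year and quarter, so that every comparison becomes a filter on the same keys. A minimal pandas sketch under that assumption follows; the column names and values are illustrative, not the released data structure.

```python
import pandas as pd

# Hypothetical long-format responses table: one row per panelist per completed quarter.
responses = pd.DataFrame({
    "panelist_id": [1, 1, 1, 2, 2, 3],
    "cohort":      [1, 1, 1, 1, 1, 2],   # 1st cohort joined 2016/17, 2nd in 2017/18
    "year":        [1, 1, 2, 1, 2, 2],   # overall panel year
    "quarter":     [1, 2, 1, 1, 1, 1],   # quarter within the year
    "wellbeing":   [7, 8, 7, 6, 6, 9],   # illustrative core measure
})

# Within-year comparison: Year 1, 1st cohort, Q1 vs Q2.
within_year = responses.query("cohort == 1 and year == 1 and quarter in [1, 2]")

# Between-year comparison: 1st cohort, Q1 in Year 2 vs Q1 in Year 3.
between_year = responses.query("cohort == 1 and quarter == 1 and year in [2, 3]")

# Between-cohort comparison: Year 1, 1st cohort, Q1 vs Year 2, 2nd cohort, Q1.
between_cohort = responses.query(
    "(cohort == 1 and year == 1 and quarter == 1) or "
    "(cohort == 2 and year == 2 and quarter == 1)"
)
```

Because each comparison reduces to a filter on the same keys, stakeholders using different software can reproduce the same cuts from one archived dataset.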
Challenge: Development burden • [Diagram of the quarterly questionnaire schedule by cohort (content: Ipsos MORI, November 2017). Year 1: the 1st cohort (2016/17) completes registration (Reg) plus quarters Q0-Q4. Year 2: the 1st cohort continues with Q5-Q8 while the 2nd cohort (2017/18) completes Reg and Q0-Q4. Year 3: the 1st cohort reaches Q9-Q12, the 2nd cohort Q5-Q8, and the 3rd cohort (2018/19) completes Reg and Q0-Q4.]
Development burden: solutions • Agree a data structure that minimises the need for repeat processing (i.e. data sets should be archivable) • Automate/structure checking as much as possible, e.g. guided QA logs, RAP (Reproducible Analytical Pipelines) • Develop quarterly questionnaires which apply to all cohorts (not as simple as it sounds!)
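As a hedged example of the “automate checking” point, the sketch below shows the kind of rule-based validation a Reproducible Analytical Pipeline might run on each quarterly delivery; the checks and column names are assumptions for illustration, not the survey’s actual QA rules.

```python
import pandas as pd

def qa_checks(df: pd.DataFrame) -> list[str]:
    """Return a list of QA failures for a quarterly web panel delivery.
    Column names (panelist_id, quarter, wellbeing) are illustrative assumptions."""
    failures = []
    if df.duplicated(subset=["panelist_id", "quarter"]).any():
        failures.append("duplicate panelist/quarter rows")
    if not df["quarter"].between(0, 12).all():
        failures.append("quarter outside expected Q0-Q12 range")
    if not df["wellbeing"].dropna().between(0, 10).all():
        failures.append("wellbeing scores outside 0-10 scale")
    return failures

# Example: run the checks and write a simple QA log entry for one quarter.
if __name__ == "__main__":
    delivery = pd.DataFrame({
        "panelist_id": [1, 2, 2],
        "quarter": [3, 3, 3],
        "wellbeing": [7, 11, 11],
    })
    for problem in qa_checks(delivery) or ["no issues found"]:
        print(f"Q3 delivery: {problem}")
```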
What next? • Consider/implement strategies to improve recruitment and retention • Harmonise questionnaires • Get the datasets • Produce statistics! • Annual reports • Impact of DCMS sector engagement on wellbeing • Impact of survey mode – face to face vs. phone vs. online
Any questions? Alex.Bjorkegren@culture.gov.uk