This presentation provides an overview of the methodology used in the London Business Survey (LBS) 2014, a survey commissioned by the GLA to gather information on businesses in London. It explores the reasons for commissioning the survey, the sample design, questionnaire structure, and the reliability of the results. Recommendations for future surveys are also discussed.
The new London Business Survey: Methodology
Sarah Levy, 29 September 2015
Overview
• The London Business Survey (LBS) 2014 is a new, experimental survey
• ONS was commissioned to design and run it, providing a service to the GLA, which would 'own' the data
• It was funded by the London LEP (SME Working Group)
• The survey took place in March-April 2014 (pilot stage) and May-July 2014 (main stage)
• The reports (including an ONS Methodology Report) were published on 10 November 2014 on www.london.gov.uk
• The 250+ data tables are available on the London Datastore
• This presentation looks at survey methods and response
Why was the LBS commissioned?
London generates over a fifth of the UK's GDP, and the Mayor of London needed more information:
• How many businesses are there in London?
• Why have they chosen London?
• How much do London businesses export?
• Can SMEs get enough finance?
• What is the outlook for business growth?
Why was it run by the ONS?
• A key requirement of the client (the GLA) was for the survey to be representative of the population of private sector businesses in London
• ONS offered to draw a sample from the Inter-Departmental Business Register (IDBR)
• The results can be linked to data from official business surveys and to the IDBR itself, within the Virtual Micro-data Laboratory (VML)
• ONS proposed an innovative, experimental survey design to produce results for London, which is much harder than producing UK-level results
Multi-site businesses
Three configurations were considered (diagram):
• Head office inside London & sites only in London
• Head office inside London & sites everywhere
• Head office outside London & sites everywhere
The sample
• We decided to sample at site (Local Unit) level instead of head office (Reporting Unit/enterprise) level
• We drew a sample of 10,000+ sites from a population of 444,870 in the IDBR (relevant industry sectors, May 2014)
• The sample was a stratified simple random sample with 36 'main strata' based on:
  • The size of the enterprise to which the site belonged:
    • Micro enterprises: 0-9 employees
    • Other SMEs: 10-249 employees
    • Large enterprises: 250+ employees
  • The LBS industry sector of the site (12 sectors); a sketch of this kind of stratified draw follows below
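A minimal sketch of how such a stratified simple random draw could be implemented. The file name, the column names ('size_band', 'sector') and the proportional allocation are illustrative assumptions, not the actual LBS sample design:

    # Illustrative stratified simple random sample over 36 strata
    # (3 enterprise size bands x 12 LBS sectors)
    import pandas as pd

    TOTAL_SAMPLE = 10_000

    frame = pd.read_csv("idbr_sites.csv")   # hypothetical IDBR site extract

    strata = frame.groupby(["size_band", "sector"])

    # Proportional allocation for illustration only; the LBS set its
    # own per-stratum sample sizes
    counts = strata.size()
    n_h = (counts / counts.sum() * TOTAL_SAMPLE).round().astype(int).clip(lower=1)

    # Simple random sample without replacement within each stratum
    sample = pd.concat(
        g.sample(n=min(n_h[key], len(g)), random_state=1) for key, g in strata
    )

Sampling within strata guarantees that every size band and sector combination is represented, which is what makes sub-national (London-level) estimates feasible.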
Designing the LBS: Industry sectors
(The slide lists the 12 LBS industry sectors by SIC section.)
Note: Sections A, B, D, E, T and U were excluded because they are a very small part of the London business economy. Sections O and P were excluded because they are predominantly public sector.
The questionnaires
• We sent out 3 questionnaires (the routing rule is sketched below):
  • Form 1: head office level questions only, for head offices in multi-site enterprises (approx 2,800)
  • Form 2: site level questions only, for sites in multi-site enterprises (approx 4,000)
  • Form 3: all questions (head office and site level), for single-site enterprises (approx 6,000)
• Total number of forms = nearly 13,000
• The questionnaires contain two types of question:
  • Categorical (yes/no) & multiple choice, e.g. choices, attitudes
  • Numeric (values), e.g. no. of employees, value of sales, turnover
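A minimal sketch of the form-routing rule just described, with hypothetical boolean flags standing in for whatever the despatch system actually recorded:

    # Route a sampled unit to Form 1, 2 or 3 based on the rules above.
    # The flag names are hypothetical.
    def choose_form(is_head_office: bool, multi_site: bool) -> int:
        if not multi_site:
            return 3                       # single-site: all questions on one form
        return 1 if is_head_office else 2  # split head-office / site forms

    assert choose_form(is_head_office=True, multi_site=True) == 1
    assert choose_form(is_head_office=False, multi_site=True) == 2
    assert choose_form(is_head_office=False, multi_site=False) == 3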
The results
• All results include breakdowns by:
  • size of the enterprise to which the business unit belongs (micro, other SME and large)
  • LBS industry sector
• Results are reported for the number (or %) of business units (sites), even if the data were collected from related head offices
• Results are weighted up to the population in the IDBR, May 2014 (a minimal weighting sketch follows below)
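A minimal sketch of the weighting step, assuming simple design weights of N_h / r_h per stratum (IDBR population count over achieved responses); the stratum label and counts are made up for illustration:

    # Weight each responding site by N_h / r_h so that weighted results
    # sum to the May 2014 IDBR population in every stratum
    def design_weights(population_counts, respondent_counts):
        return {h: population_counts[h] / respondent_counts[h]
                for h in population_counts}

    # Hypothetical stratum: 18,500 sites on the frame, 410 usable
    # responses -> each response represents roughly 45 sites
    weights = design_weights({"micro_retail": 18_500}, {"micro_retail": 410})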
Reliability
There are two types of error to be aware of:
• Sampling error: does the estimate from the sample reflect the 'true value' in the population?
  • 95% Confidence Intervals (CIs) were produced for all of the results - users should take note!
  • 95% CIs indicate that if the survey were repeated again and again, 95% of the computed intervals would contain the true value (see the sketch below)
• Non-sampling error:
  • Non-response: in particular, we had difficulty reaching sites belonging to large firms in financial and insurance services
  • Mistakes by respondents: in particular, reporting values for the whole business rather than the selected site (turnover, sales etc.)
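A minimal sketch of a 95% confidence interval for an estimated proportion, using the normal approximation with a finite population correction. The published LBS intervals would also reflect the stratified design and the weights, so this is illustrative only:

    import math

    def ci_95(p_hat, n, N):
        """95% CI for a proportion from n responses out of a population
        of N units, assuming simple random sampling."""
        fpc = (N - n) / (N - 1)          # finite population correction
        se = math.sqrt(fpc * p_hat * (1 - p_hat) / n)
        return p_hat - 1.96 * se, p_hat + 1.96 * se

    # Hypothetical: 62% of 400 sampled sites, population of 20,000
    low, high = ci_95(0.62, 400, 20_000)  # roughly (0.573, 0.667)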
Conclusion
• The survey design based on selecting sites does work well for sub-national areas like London
• By collecting information directly at site level, we avoid the need to apportion values such as turnover and sales to regions
• Some improvements are needed if the survey is run again:
  • Use data from the 2014 survey to fine-tune the survey design in order to improve the precision of estimates (reliability)
  • Automate the system for handling duplicate forms
  • Improve instructions for turnover (etc.) questions to emphasise reporting for the selected site, not for the business as a whole
  • Try to boost response rates for sites in large multi-site firms (in particular those in the financial and insurance services sector)
  • Build in time for validation of responses through re-contacting respondents