APS REVIEW & online NES Yana Litovsky Jonathan Francis Carmona
2013 Online NES
• 31 teams used the online NES.
• 3 teams used additional regional online NES surveys.
• 5 teams partially collected data with the online NES.
• 3 teams requested the online NES but did not use it.
• GEM created online surveys in 13 languages.
• All teams should use the online NES in 2014.
• We will send samples to all teams that did not use the online NES in 2013.
GEM 2013 Participation: 70 economies, 240,000 individual interviews
2013 Lateness Overview [charts: APS Proposal lateness; APS Data lateness]
2013 vs. 2012 Lateness [chart: APS Data lateness, 2013 vs. 2012]
2013 APS Proposals
• National teams consulted with the GEM Data Team before finalizing their proposals (selection of vendor; partial survey reports)
• Survey Report personalized for each team; includes the previous year's Data Quality Report
• More teams approved with conditions (send interim data, keep track of quotas, check sampling distribution)
• Using a new team of Data Quality Analysts (InnovAccer)
2013 APS Proposals: issues
1. Late proposal submission
  • And no communication from the team
2. Missing information
  • Not all requested methodology data provided
  • Survey vendor proposal not submitted
  • Translated APS not submitted
3. Incorrect information
  • Outdated population data
  • Incorrect education categories
  • Proposed survey methods changed after the proposal was approved
2013 APS Proposals: issues (continued)
4. Methodology issues
  • Data quality issues from previous years not addressed
  • Insufficient number of callbacks/contact attempts indicated
  • Biased methodology proposed
  • Overly strict quota use
  • Over-reliance on fixed-line telephones
  • Not all geographic areas included
2013 APS Proposals: approval
• 2013: 33% (23 out of 70) of proposals approved without any revisions
• 2012: 55% (38 out of 69) of proposals approved without any revisions
• 2011: 73% (40 out of 55) of proposals approved without any revisions
Proposal quality is down only very slightly; the falling approval rate mostly reflects:
• Data Quality standards increase each year
• Teams are held to higher standards as they gain experience
• National contexts change
• More thorough investigation of proposals
2013 APS Pilot/Interim Datasets
• In 2013, 30% (21 out of 70) of countries sent pilot or interim data (most were interim datasets)
• In 2012: 56% (39 out of 69) of countries
• In 2011: 29% (16 out of 55) of countries
• Review of the pilot test added to the Survey Report; teams confirmed that mistakes made during pilot testing would be avoided during the rest of APS data collection
• 1 team's pilot data was not accepted due to problems
• 2 teams did not collect the required pilot data
2013 APS Datasets: issues
54 teams were asked to revise data due to formatting, recording, or data collection issues (53 countries were asked in 2012):
• Missing data
• Miscoded values
• Incorrect education categories in the survey report
• Weights not submitted right away
• Calculated weights not representative
• Duplicate ID values
• Income and education values not defined
• Income ranges poorly constructed
• Callback data not collected
IT IS SIMPLE & CRITICAL TO CHECK & FIX THESE ERRORS IN THE DATA (see the sketch below)
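Most of these problems can be caught mechanically before submission. The sketch below, in Python with pandas, shows the kind of pre-submission check a team could run; the column names (`respondent_id`, `gemeduc`, `weight`) and the set of valid education codes are hypothetical placeholders, not actual GEM variable names.

```python
# A minimal pre-submission check, assuming hypothetical column names;
# real GEM APS variable names and code lists may differ.
import pandas as pd

def check_aps(aps: pd.DataFrame) -> list[str]:
    problems = []

    # Duplicate respondent IDs
    n_dupes = aps["respondent_id"].duplicated().sum()
    if n_dupes:
        problems.append(f"{n_dupes} duplicate respondent IDs")

    # Missing or miscoded education values (codes 1-5 assumed valid)
    bad_educ = ~aps["gemeduc"].isin({1, 2, 3, 4, 5})
    if bad_educ.any():
        problems.append(f"{bad_educ.sum()} missing/invalid education codes")

    # Weights present and positive
    if "weight" not in aps.columns:
        problems.append("no weight column submitted")
    elif (aps["weight"].isna() | (aps["weight"] <= 0)).any():
        problems.append("missing or non-positive weights")

    return problems

print(check_aps(pd.read_csv("aps_2013.csv")))  # hypothetical file name
```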
2013 APS Datasets: issues (continued)
• 2013: 50 (out of 70) teams had no skip logic errors; 25 (out of 70) had no APS data format errors
• 2012: 47 (out of 69) teams had no skip logic errors; 20 (out of 69) had no APS data format errors
• 2011: 36 (out of 55) teams had no skip logic errors; 21 (out of 55) had no APS data format errors
• 2010: 33 (out of 60) teams had no skip logic errors; 20 (out of 60) had no APS data format errors
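Skip logic errors are also straightforward to detect automatically. Below is a minimal sketch assuming a hypothetical gate question `bstart` (1 = yes) that routes respondents to a follow-up `bstart_age`; the real APS routing and variable names differ.

```python
# Flag rows where the answer pattern contradicts the skip logic:
# a follow-up answered without its gate, or a gate with no follow-up.
import pandas as pd

def skip_logic_errors(aps: pd.DataFrame) -> pd.Series:
    gated = aps["bstart"] == 1          # respondents routed to the follow-up
    answered = aps["bstart_age"].notna()
    return (answered & ~gated) | (gated & ~answered)

aps = pd.read_csv("aps_2013.csv")       # hypothetical file name
print(f"{skip_logic_errors(aps).sum()} skip logic errors")
```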
2013 APS Samples
• In 2013, 32 teams were flagged as having an unrepresentative sample
• In 2012, 29 teams were flagged as having an unrepresentative sample
• Most flags required further investigation to determine severity
• Failure to follow the approved methodology was the most common reason
• 4 teams were required to collect additional data
• Representativeness used to be checked later in the cycle, and problems were often not investigated further; it is now checked as soon as APS data are submitted
MANY UNREPRESENTATIVE FINAL SAMPLES COULD HAVE BEEN AVOIDED IF THE DATA HAD BEEN MONITORED (see the sketch below)
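A first-pass representativeness check compares the achieved sample's demographic distribution with census shares. This is a rough sketch using a chi-square goodness-of-fit test; the strata and population shares are illustrative, and it assumes the same strata appear in both the sample and the census table.

```python
# Compare the sample's gender-by-age distribution against census shares.
import pandas as pd
from scipy.stats import chisquare

census_shares = pd.Series({          # illustrative population shares
    ("F", "18-34"): 0.18, ("F", "35-64"): 0.22, ("F", "65+"): 0.10,
    ("M", "18-34"): 0.19, ("M", "35-64"): 0.21, ("M", "65+"): 0.10,
})

aps = pd.read_csv("aps_2013.csv")    # hypothetical file name
observed = aps.groupby(["gender", "age_band"]).size()
expected = census_shares * len(aps)
observed, expected = observed.align(expected, fill_value=0)

stat, p = chisquare(observed, expected)
print(f"chi-square = {stat:.1f}, p = {p:.4f}")  # a small p flags imbalance
```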
2013 Weights
• 2013: 67 (out of 70) teams provided weights; 5 had to be revised
• 2012: 40 (out of 69) teams provided weights; 3 had to be revised
• 2011: 37 (out of 55) teams provided weights; 3 had to be revised
• 2010: 39 (out of 60) teams provided weights; 20 had to be revised
• Teams that did not provide weights were sent calculation formulas by GEM
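The usual calculation is post-stratification: each respondent's weight is their stratum's population share divided by that stratum's share of the achieved sample. The sketch below assumes hypothetical `gender` and `age_band` columns and a `pop_shares` series indexed by the same strata; it is an illustration, not GEM's official formula.

```python
# Post-stratification weights: population share / sample share per stratum.
import pandas as pd

def poststrat_weights(aps: pd.DataFrame, pop_shares: pd.Series) -> pd.Series:
    sample_shares = aps.groupby(["gender", "age_band"]).size() / len(aps)
    stratum_w = pop_shares / sample_shares              # weight per stratum
    # Map each respondent to the weight of their stratum
    key = pd.MultiIndex.from_frame(aps[["gender", "age_band"]])
    return pd.Series(stratum_w.reindex(key).to_numpy(),
                     index=aps.index, name="weight")
```

A weighted rate is then `(w * flag).sum() / w.sum()`, so weights far from 1 for large strata are an early warning that the sample is unbalanced.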
Popular Team-Added Demographic Questions
• Language
• Marital status
• Ethnicity, religion
• Number of children
• Profession
• Details about the respondent's house
• Ownership (cars, computers, credit cards, etc.)
Continuing improvements in 2014
• Data quality: we will continue to stress interim dataset submission and monitoring, and data quality controls will remain very strict
• Deadlines will be strictly enforced; the Data Team will not wait to release APS results
• Teams that submit too late for the Data Team to process and review their results will not be included in the Global Report
• Emphasis on core APS variables
COMMUNICATION WITH THE GEM DATA TEAM IS KEY
GEM IS ONLY AS GOOD AS ITS DATA
EACH TEAM'S LATENESS AFFECTS THE ENTIRE GEM CYCLE
Survey Futures Jeff Seaman
GEM APS Requirements
• An unbiased, representative sample of the adult population, of sufficient size to provide reliable estimates of the GEM measures
• GEM does NOT mandate a particular sampling method or data collection technique
• National teams (and their survey vendors) determine the solution for their unique situation
• The GEM Data Team must review and approve
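For intuition on "sufficient size": a simple random sample's 95% margin of error for an estimated rate is roughly z * sqrt(p(1-p)/n). The figures below are illustrative arithmetic, not GEM's formal requirements.

```python
# 95% margin of error for a rate p estimated from a simple random sample.
from math import sqrt

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    return z * sqrt(p * (1 - p) / n)

for n in (500, 1000, 2000):
    print(f"n = {n}: +/-{margin_of_error(0.10, n):.1%} around a 10% rate")
```

At n = 2000 the margin is about 1.3 percentage points around a 10% rate, which is one reason base samples of roughly that size appear later in this deck.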
The Issues
• It is becoming harder (and more expensive) to conduct GEM APS surveys:
  • Lower fixed-line telephone coverage
  • "Survey fatigue"
  • A more mobile population
• Increased reliance on face-to-face interviews
• Increased need for mobile telephone sampling
Background
• 2007: What is the quality of GEM data? No quality measures existed.
• Monitor and ensure quality:
  • Structured RFP process
  • Quality test of submitted APS data
  • Feedback to teams
• Scale the processes: grow from 42 teams to 70+ at the same staffing level
The Evolution
• Then: improve the quality of GEM APS data without large increases in cost.
• Now: reduce the cost of collecting GEM APS data while maintaining our commitment to quality.
How do we reduce costs?
• Better tools (tablets and smartphones)
• Use online data collection
• Make the questionnaire shorter
• Change initial attempts and callbacks
• Alternative callback data collection
• Sampling changes
Face-to-face surveys
• An increasing proportion of APS data collection
• The highest error rates of all survey types:
  • Skip logic errors
  • Respondent selection errors
  • Incorrectly coded variables
Tablet Requirements
• Supports the APS questionnaire (question types, skip logic, data validation, etc.)
• Easy to add or modify questions
• Support for multiple languages
• Free or low cost
• Runs on readily available devices (tablets/smartphones)
Conduct interview
• GEM is testing the Open Data Kit application in Malawi, Chile, and South Africa
• It shows great promise for reducing entry errors
• The main cost saving is the reduced need to fix errors or resample
• Many other options exist (World Bank investigation; GEM national teams)
Select the Respondent
• Respondent selection is the greatest cause of error; several teams were required to resample
Packaged Tablet Solution?
• The majority of new teams entering GEM:
  • Will require face-to-face sampling
  • Have limited budgets
  • Have little experience conducting scientific samples
  • Have limited survey vendor alternatives
• Provide a packaged solution:
  • Free or low-cost software, programmed for the basic GEM APS
  • Runs on low-cost hardware
  • Documentation and training (video)
Online Data Collection
• Far less costly
• Approved for two teams in 2013
• The quality of the email list is critical
• The sample list requires testing and approval
• Not an option for most teams: no suitable sample exists
• GEM will cost-share with teams wishing to experiment
Make questionnaire shorter
• Most important for telephone surveys
• The core APS has 48 questions, but only 12 are asked of everyone
• Virtually all teams add additional modules
Shorter survey
• Review additional modules for length
• New optional question sets:
  • Drawn from across all modules
  • Most important questions only
  • Better support for global special-topic reports
• An overall shorter questionnaire
Attempts and callbacks
• GEM has used the 3/5 rule for years; we are moving away from the "one size fits all" model
• This requires that the vendor record the number of attempts and the number of callbacks for every respondent
• GEM can then test the sensitivity of results to the number of attempts and callbacks, as sketched below
• Two teams were approved for reduced callbacks in 2013
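A rough sketch of that sensitivity test: recompute a headline rate while restricting the sample to respondents reached within k attempts. The `n_attempts` and `tea` (0/1) columns are hypothetical stand-ins for whatever the vendor records.

```python
# How much does the estimate move as the allowed number of attempts shrinks?
import pandas as pd

aps = pd.read_csv("aps_2013.csv")   # hypothetical file name
for k in range(1, 6):
    sub = aps[aps["n_attempts"] <= k]
    print(f"attempts <= {k}: TEA = {sub['tea'].mean():.1%} (n = {len(sub)})")
# If the rate stabilizes at a low k, fewer callbacks may be defensible.
```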
Alternative Callbacks
• Face-to-face: leave a mail-back questionnaire
• Possible use of an online version
Sampling changes
• GEM APS sample requirements are designed to provide an estimate of critical rates (e.g., TEA)
• They are NOT designed for detailed examination of the characteristics of entrepreneurs
• Most respondents are only used for the denominator
• National sample:
  • Base sample of 2,000
  • Oversample of just entrepreneurs: use the screening questions from blocks 1 and 2, and only interview those who qualify (see the sketch below)
  • Works only for national samples, not regional ones
  • Works only for oversamples
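A minimal sketch of how the two pieces could be combined, under illustrative assumptions: a `tea` flag marks entrepreneurs, and the oversample file contains only screened-in entrepreneurs. File and column names are hypothetical.

```python
# Headline rates come from the base sample alone; the oversample would
# otherwise inflate the numerator. Entrepreneur profiles can pool both.
import pandas as pd

base = pd.read_csv("aps_base.csv")         # hypothetical: ~2,000 adults
extra = pd.read_csv("aps_oversample.csv")  # hypothetical: entrepreneurs only

tea_rate = base["tea"].mean()              # unbiased national rate

entrepreneurs = pd.concat([base[base["tea"] == 1], extra])
print(f"TEA = {tea_rate:.1%}; entrepreneur profile n = {len(entrepreneurs)}")
```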
Questions & Comments • data@gemconsortium.org