Explore the rise of smartphone surveys and the utilization of Big Data in survey methodology. Understand the benefits and challenges of respondent-driven and researcher-driven use of smartphones for data collection.
Smartphone surveys, behavior tracking, and Big Data – Opportunities and challenges of new modes of survey data collection. Florian Keusch, MA. POSM Workshop, December 13, 2016, Office fédéral de la statistique, Neuchâtel
Acknowledgement • Many of the thoughts, ideas, and concepts used in this talk originate from personal and written conversations with colleagues such as Chris Antoun, Trent Buskirk, Mario Callegaro, Mick Couper, Rachel Horwitz, Peter Lugtig, Frauke Kreuter, Bella Struminskaya, Ting Yan, and many others…
Agenda • Why We Have to Rethink Our (Web Survey) Data Collection Approaches • Respondent-Driven Use of Smartphones • Researcher-Driven Use of Smartphones • Survey Methodology in the Age of Big Data • Summary
Why We Have to Rethink Our (Web Survey) Data Collection Approaches
Web Has Become an Established Survey Mode • Dropping response rates (esp. in CATI) and high costs (of CAPI) have led to vast adoption of Web surveys • In commercial sector, Web #1 data collection method in many countries • Mostly driven by rise of nonprobability online panels
Rise of Web Surveys 33% of worldwide marketing research sales come from web surveys (ESOMAR 2015) Source: https://www.adm-ev.de/zahlen/
Web Has Become an Established Survey Mode • Dropping response rates (esp. in CATI) and high costs (of CAPI) have sped up the rise of Web surveys • In commercial marketing research, Web is the clear #1 data collection method in many countries • Mostly driven by rise of nonprobability online panels • Proliferation of DIY Web survey platforms and widespread use of in-house Web survey tools • SurveyMonkey reports 90M survey completes per month (Bort 2015) • Qualtrics distributes 1B surveys annually (Callegaro & Yang in press) • More and more large-scale survey projects experimenting with/adding a Web survey component/supplement • UKHLS, HRS, ALLBUS, ESS, SCA, ACS, ANES • Establishment of (academic) probability online panels
(Academic) Probability Online Panels • German Internet Panel (GIP): http://reforms.uni-mannheim.de/internet_panel/home/ • Gesis Panel: http://www.gesis.org/en/services/data-collection/gesis-panel/ • Longitudinal Study by Internet for the Social Sciences (ELIPSS): https://www.elipss.fr/ • Longitudinal Internet Studies for the Social Sciences (LISS): https://www.lissdata.nl/lissdata/About_the_Panel • KnowledgePanel: http://join.knpanel.com/about.html • AmeriSpeak: http://www.norc.org/Research/Capabilities/Pages/amerispeak.aspx • American Trends Panel: http://www.pewresearch.org/methodology/u-s-survey-research/american-trends-panel/ • Understanding America Study: https://uasdata.usc.edu/
Another Technological Revolution? “If you're doing a Web survey, you're doing a mobile survey.” Michael Link, Nielsen (@AAPOR 2013)
Device Ownership Among German Adults Respondents without general qualification for university entrance are significantly more likely to only have access to the Internet via smartphone. Source: GIP Wave 22 (March 2016); n=2,845
Two Separate Phenomena • Completion of Web surveys on mobile Web devices (i.e., respondent-driven use of smartphone) • PC-optimized Web surveys completed by some on mobile devices • “Unintentional mobile Rs” (Peterson 2012) • Hope to increase coverage or reduce nonresponse, without affecting data quality • Use of mobile Web for new methods of data collection (i.e., researcher-driven use of smartphones) • Examples: ecological momentary assessment (EMA), diary studies, travel studies, health monitoring, passive mobile data collection • Based on volunteers, who have to download and install an app
Respondent-Driven Use of Smartphones
Empirical Evidence for Unintentional Mobile Rs • Non-probability online panels: • Peterson (2012): between 1% and 30% of all Rs in U.S., depending on target population • Kinesis (2013): 51% of marketing research surveys started on mobile device in U.S., 10% in Europe • Revilla et al. (2015): 7.1% of all Netquest panel members used smartphones and 1.8% tablets in 2013 and 2014; strong increase over time • Probability online panels: • de Bruijne & Wijnant (2014): share of unintended mobile increased in LISS panel from 3.1% in March 2012 (0.4% smartphones, 2.6% tablets) to 10.9% in September 2013 (1.6% smartphones, 9.3% tablets) • Struminskaya et al. (2015): 8% of Rs used mobile device at least once in GESIS Pilot Panel’s 8 survey waves (2.1% mobile users in at least 7 waves) • Pew Research Center (2015a): 27% of Rs in American Trends Panel completed most recent survey on a smartphone, 8% used tablet
What Makes Mobile Web Different from Regular Web for Surveys? Source: Couper (2013), Antoun (2015)
Nonresponse in Mobile Web Surveys • Evidence that RR lower and break-off rate higher for mobile Web than PC Web1,2, even when optimizing for mobile devices3,4,5,6,7 • Longer response times1,2,3,6,7,8 • Higher burden for participation • Much of difference due to within-page times (i.e., answering, scrolling), not between-page times (i.e., connection speed)9 • Smartphone Rs younger1,6,7,8, female7,8, heavier mobile Web users1, and primarily rely on smartphones to access Internet7 Source: 1Mavletova (2013); 2Mavletova & Couper (2013); 3Antoun (2015); 4Buskirk & Andrus (2014); 5Stapleton (2013); 6Toepoel & Lugtig (2014); 7Wells et al. (2013); 8de Bruijne & Wijnant (2013); 9Couper & Peterson (2016)
Keusch & Yan (2016): Break-offs N=1,691 U.S. iPhone owners on MTurk
Keusch & Yan (2016): Completion Time* N=1,186 U.S. iPhone owners on MTurk who finished the study *Differences remain significant after controlling for socio-demographics and Web survey experience
Keusch & Yan (2016) N=1,186 U.S. iPhone owners on MTurk who finished the study
Measurement Error in Mobile Web Surveys • General cognitive processing seems to be the same as in other modes1 • Survey completion on a mobile device (especially a smartphone) differs from survey completion on a desktop/laptop • Effects on item omission2,3,4 and primacy effects1,2,5 • Tablets seem to be more similar to desktops/laptops than smartphones • As long as care is taken in design, very few (reliable) differences in responses to mobile Web and regular Web after controlling for self-selection and nonresponse4,6,7 Source: 1Peytchev & Hill (2010); 2Lugtig & Toepoel (2016); 3Mavletova & Couper (2014); 4de Bruijne & Wijnant (2013); 5Stapleton (2013); 6Toepoel & Lugtig (2014); 7Peterson (2012)
Keusch & Yan (2016): Item Missing* N=1,186 U.S. iPhone owners on MTurk who finished the study *Differences remain significant after controlling for socio-demographics and Web survey experience
Keusch & Yan (2016): Survey Answers and Satisficing • Mean ratings of four individual items and means of three indexes do not significantly differ across comparisons (p>0.05) • Mode differences in straightlining* • No effect on acquiescence and mid-point responding N=1,186 U.S. iPhone owners on MTurk who finished the study *Differences remain significant after controlling for socio-demographics and Web survey experience
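Straightlining of the kind examined here is commonly operationalized as the absence of variation across a respondent's answers to a battery of grid items. A minimal illustrative sketch of one such indicator (not the metric actually used by Keusch & Yan):

```python
def straightlining_share(grid_responses):
    """Share of respondents who gave the identical answer to every
    item in a grid question (a simple straightlining indicator)."""
    flagged = [r for r in grid_responses if len(set(r)) == 1]
    return len(flagged) / len(grid_responses)

# Each inner list holds one respondent's answers to a 4-item battery.
answers = [
    [3, 3, 3, 3],  # straightliner: no variation across items
    [1, 4, 2, 5],
    [2, 2, 3, 2],
]
share = straightlining_share(answers)  # 1 of 3 respondents flagged
```

More refined versions weight by within-battery variance rather than flagging only perfectly identical response strings.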
What We Know So Far… • For an increasing number of people, smartphones are the only way to access the Internet • If we don't allow them to participate in Web surveys, or make it difficult, we might introduce nonresponse bias • Smartphones are used differently than PCs • If a Web survey design does not consider smartphone use, we might introduce measurement error • Track what device your respondents are using! • Design your web survey to be smartphone-friendly! • Do methodological research! • e.g., questionnaire design, modularizing Web surveys, location- and behavior-based survey invitations
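Tracking the respondent's device, as recommended above, is typically done server-side from the HTTP User-Agent string the browser sends with every survey page request. A minimal sketch, assuming the survey platform logs that string; the regex fragments are illustrative, not exhaustive:

```python
import re

# Illustrative (not exhaustive) User-Agent fragments per device class.
# Android tablets typically lack the "Mobile" token that Android phones carry.
TABLET_PATTERN = re.compile(r"iPad|Android(?!.*Mobile)")
MOBILE_PATTERN = re.compile(r"Mobi|Android|iPhone", re.IGNORECASE)

def classify_device(user_agent: str) -> str:
    """Rough device classification from an HTTP User-Agent string."""
    if TABLET_PATTERN.search(user_agent):
        return "tablet"
    if MOBILE_PATTERN.search(user_agent):
        return "smartphone"
    return "desktop"
```

In production one would rather use a maintained User-Agent parsing library, since these strings change constantly; the point is that the device class can be recorded as paradata alongside each response.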
Researcher-Driven Use of Smartphones
Using Apps for Passive Mobile Data Collection Compared to surveys, passive data collection has the potential to… • …provide richer data • Because it is collected at higher frequencies • …decrease respondent burden • Because fewer survey questions are needed • …reduce measurement error • Because of fewer memory errors and less social desirability bias http://www.qualitytimeapp.com/
Using Apps for Passive Mobile Data Collection • Limited published research so far • Sonck & Fernee (2013): iPhone app modeled after the Harmonized European Time Use Survey (HETUS) • Link et al. (2014): TV-viewing diary • Pew Research Center (2015b): Ecological momentary assessment • Revilla et al. (2016): Smartphone browsing behavior tracking • Sugie (2016): Job-seeking behavior of parolees • Several large-scale research projects under way • SOEP has started to use a mobile app to keep in touch with its refugee cohort • CBS and Utrecht University have just formed a large-scale Innovation Network (WIN) for data collection innovation with smartphones • IAB will use mobile device measures for labor market research (MoDeM) • Modernizing Migration Measures – Combining survey and tracking data collection among refugees (MZES)
MoDeM • Collaboration of the University of Mannheim (F. Keusch & F. Kreuter) with the IAB (F. Kreuter, M. Trappmann, S. Bähr, & G.-C. Haas) • Funded by the Institute for Employment Research (IAB) • Research question: Can we use passive mobile data collection via smartphones for labor market research? • Combining behavior and location tracking with short mobile Web surveys • Both substantive and methodological questions
MoDeM: Research Design • Recruitment of participants from the “Labour Market and Social Security” panel study (PASS) in spring 2017 • Invitation to download an app to their Android smartphone • App runs in the background • Sends information about user activity on the smartphone and other information (e.g., geolocation, signal strength, network provider) • App records activities on the smartphone but not the content generated • Incentives • €10 for downloading the app to the smartphone • €10 for leaving the app installed until the end of the field period • Push notifications with invitations to very short surveys • Linking of app data, survey data, PASS data, and administrative data from the Federal Employment Agency
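To make the design concrete, here is a hypothetical sketch of the kind of record such an app might transmit. All field names are illustrative assumptions, not the actual MoDeM data format; note that the record carries which app was used and the sensor readings listed above, but never the content generated:

```python
from dataclasses import dataclass, asdict
from typing import Optional
import json
import time

@dataclass
class AppEvent:
    """One passively collected measurement (hypothetical schema)."""
    participant_id: str          # pseudonymous ID, enables linkage with PASS data
    timestamp: float             # Unix time of the measurement
    event_type: str              # e.g. "app_opened", "location_ping"
    app_package: Optional[str]   # which app was used, never its content
    latitude: Optional[float]
    longitude: Optional[float]
    signal_strength_dbm: Optional[int]
    network_provider: Optional[str]

event = AppEvent("p-0042", time.time(), "location_ping", None,
                 49.4875, 8.4660, -85, "ExampleNet")
payload = json.dumps(asdict(event))  # what the app would send to the server
```

Keeping the schema this sparse is itself a design choice: the less raw content leaves the phone, the easier the consent and privacy questions discussed below become.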
MoDeM: Substantive Research Questions • Social participation of the unemployed – Marienthal 2.0 • Do data from the sensors built into smartphones allow inferences about the activities and daily routines of unemployed people? • Measuring social networks • How do measures of social embeddedness/social networks generated from app data compare to standard measures used in PASS? • Formal and informal ways of job-seeking • Can we learn more about the job-seeking process from smartphone data? • Can we use geolocation data to trigger surveys on job-seeking?
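Geolocation-triggered survey invitations of the kind asked about here can be sketched as a simple geofence check: fire an invitation when the phone's reported position comes within a given radius of a point of interest, such as a job center. An illustrative sketch, not the MoDeM implementation:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))

def should_trigger_survey(position, poi, radius_km=0.2):
    """True when the phone's reported position lies within radius_km
    of a point of interest (e.g., a job center)."""
    return haversine_km(*position, *poi) <= radius_km
```

In practice the trigger would also be rate-limited (see the push-notification questions on the next slide), since repeated invitations at the same location would quickly exhaust respondents' goodwill.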
MoDeM: Methodological Research Questions • Optimizing push notifications for surveys • When and how often can and should smartphone users be invited to participate in short surveys? • Coverage error and nonresponse error of mobile passive tracking • Do PASS members who own an Android smartphone differ from PASS members who do not own an Android smartphone? • Do respondents and nonrespondents differ in our tracking study?
Challenges so far... • Legal and ethical concerns • Consent, data linkage, and privacy • Rapid advancement of technology • Constant updates of hardware and software • Limitation to Android smartphones • Coverage error? • GIP: Android users are older than users of other OSs • New skills required that survey methodologists usually don't have • Technical know-how to set up an app • Working with Big Data
Survey Methodology in the Age of Big Data
Big Data • Aka “organic” or “found” data • “Organic” (Groves 2011) – not from designed survey • “Found” – not collected by researchers Source: http://www.rosebt.com/blog/data-veracity
Forms of Big Data [Slide shows example images of different forms of Big Data: social media buzz, Internet of Things devices, paper questionnaires, contactless payment transactions, administrative data (IAB)] Source: Callegaro & Yang (in press)
So Will Surveys Go Away?
Maybe…
• Censuses replaced by administrative records in some countries
• In the US, sales from “automated digital/electronic” methods (35%) already far ahead of online surveys (20%) (ESOMAR 2015)
• Web scraping allows predictions of, for example, inflation (e.g., Billion Prices Project, PriceStats)
• …
Probably not…
• Bigger data not always better data
• Most Big/Found/Organic data not the output of instruments designed to produce valid and reliable data for scientific analysis
• Data generation in many cases a “black box”
• Surveys will be needed in the future…
• …in combination with other forms of data
• …as a benchmark to gauge the success of social media research
• …for attitudinal data
What Can Survey Methodologists Bring to the Table? Source: AAPOR (2015)
Total Big Data Error Source: AAPOR (2015)
Summary
Summary • For an increasing number of people, smartphones are the only way to access the Internet • Design your web survey to be smartphone-friendly! • Researchers can use smartphones for more than just Web surveys • Location- and behavior-based survey invitations • Combining mobile web surveys with passive data collection • We need to better understand what drives people to participate or not participate in passive mobile data collection • More research needed to understand the influence of smartphones on data quality (coverage, nonresponse, measurement) • Survey methodologists need to embrace their role in a changing environment • Experts on data generation and data quality
Thank You! Florian Keusch University of Mannheim School of Social Sciences Statistics and Methodology f.keusch@uni-mannheim.de http://floriankeusch.weebly.com/
Literature AAPOR (2015). AAPOR Report on Big Data. Antoun, C. (2015). Mobile web surveys: A first look at measurement, nonresponse, and coverage errors. Dissertation at the University of Michigan, Ann Arbor, MI. Bort, J. (April 19, 2015). The incredible career of David Goldberg – Business Insider. http://uk.businessinsider.com/the-increadible-career-of-david-Goldberg-2015-4. Buskirk, T. D., & Andrus, C. (2014). Making mobile browser surveys smarter: Results from a randomized experiment comparing online surveys completed via computer or smartphone. Field Methods, 26, 322–342. Callegaro, M., & Yang, Y. (in press). The role of surveys in the era of “Big Data.” In D. L. Vannette & J. A. Krosnick (Eds.), The Palgrave Handbook of Survey Research. New York: Palgrave. Couper, M. P., & Peterson, G. (2016). Why do web surveys take longer on smartphones? Social Science Computer Review. Published online before print February 11, 2016. doi:10.1177/0894439316629932. de Bruijne, M., & Wijnant, A. (2013). Comparing survey results obtained via mobile devices and computers: An experiment with a mobile web survey on a heterogeneous group of mobile devices versus a computer-assisted web survey. Social Science Computer Review, 31, 482–504. de Bruijne, M., & Wijnant, A. (2014). Mobile response in web panels. Social Science Computer Review, 32, 728–742. ESOMAR (2015). Global Market Research 2015. An ESOMAR Industry Report. Amsterdam: ESOMAR. Groves, R. (2011). Three eras of survey research. Public Opinion Quarterly, 75(5), 861–871. Keusch, F., & Yan, T. (2016). Web versus mobile web: An experimental study of device effects and self-selection effects. Social Science Computer Review. Published online before print November 2, 2016. doi:10.1177/0894439316675566.
Literature Kinesis. (2013). Online survey statistics from the mobile future. http://www.kinesissurvey.com/wp-content/uploads/2014/05/UPDATED-with-Q3-2013-Data-Mobile-whitepaper.pdf. Link, M. W., Lai, J., & Bristol, K. (2014). Not so fun? The challenges of applying gamification to smartphone measurement. In A. Marcus (Ed.), Design, user experience, and usability. User experience design practice (pp. 319–327). Cham, Switzerland: Springer. Lugtig, P., & Toepoel, V. (2016). The use of PCs, smartphones, and tablets in a probability-based panel survey: Effects on survey measurement error. Social Science Computer Review, 34, 78–95. Mavletova, A. (2013). Data quality in PC and mobile web surveys. Social Science Computer Review, 31, 725–743. Mavletova, A., & Couper, M. P. (2013). Sensitive topics in PC web and mobile web surveys: Is there a difference? Survey Research Methods, 7, 191–205. Mavletova, A., & Couper, M. P. (2014). Mobile web survey design: Scrolling versus paging, SMS versus e-mail invitations. Journal of Survey Statistics and Methodology, 2, 498–518. Peterson, G. (2012). What we can learn from unintentional mobile respondents. CASRO Journal, 2012-2013, 32–35. Pew Research Center. (2015a). Technology device ownership: 2015. http://www.pewinternet.org/files/2015/10/PI_2015-10-29_device-ownership_FINAL.pdf. Pew Research Center. (2015b). App vs. web for surveys of smartphone users. http://www.pewresearch.org/files/2015/03/2015-04-01_smartphones-METHODS_final-3-27-2015.pdf. Peytchev, A., & Hill, C. A. (2010). Experiments in mobile web survey design: Similarities to other modes and unique considerations. Social Science Computer Review, 28, 319–335.
Literature Revilla, M., Toninelli, D., Ochoa, C., & Loewe, G. (2015). Who has access to mobile devices in an online opt-in panel? An analysis of potential respondents for mobile surveys. In D. Toninelli, R. Pinter, & P. de Pedraza (Eds.), Mobile Research Methods: Opportunities and Challenges of Mobile Research Methodologies (pp. 119–139). London, England: Ubiquity Press. Revilla, M., Ochoa, C., & Loewe, G. (2016). Using passive data from a meter to complement survey data in order to study online behavior. Social Science Computer Review. Published online before print March 17, 2016. doi:10.1177/0894439316638457 Sonck, N., & Fernee, H. (2013). Using smartphones in survey research: A multifunctional tool. Implementation of a time use app: A feasibility study. The Hague, Netherlands: The Netherlands Institute for Social Research. Stapleton, C. E. (2013). The smartphone way to collect survey data. Survey Practice, 6. http://www.surveypractice.org/index.php/SurveyPractice/article/view/75/html. Struminskaya, B., Weyandt, K., & Bosnjak, M. (2015). The effects of questionnaire completion using mobile devices on data quality. Evidence from a probability-based general population panel. Methods, Data, Analyses, 9, 261–292. Sugie, N. (2016). Utilizing Smartphones to Study Disadvantaged and Hard-to-Reach Groups. Sociological Methods & Research. Published online before print January 18, 2016, doi: 10.1177/0049124115626176. Toepoel, V., & Lugtig, P. (2014). What happens if you offer a mobile option to your web panel? Evidence from a probability-based panel of Internet users. Social Science Computer Review, 32, 544–560. Wells, T., Bailey, J. T., & Link, M. W. (2013). Filling the void: Gaining a better understanding of tablet-based surveys. Survey Practice, 6. http://www.surveypractice.org/index.php/SurveyPractice/article/view/25/html.