
How to Design Effective Web Surveys



Presentation Transcript


  1. How to Design Effective Web Surveys 2012-2013 Workshop in Methods October 19, 2012 Kevin Tharp, Lilian Yahng, and Ashley Bowers

  2. Outline • Overview of Exciting Opportunities and Challenges of Web Surveys • Effective Layout and Design of Web Surveys • Implementation of Web Surveys to Maximize Data Quality • Considerations in Selecting a Web Survey Tool

  3. Opportunities of Web Surveys • Improved survey measurement and reduced error • Elimination of interviewer effects • Features of a computer-assisted instrument (customized wording, automated skips/routing, edit checks, randomization) • Can engage the respondent • Can present rich multimedia information • Anyone can build even a fairly sophisticated survey quickly and easily
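The computer-assisted features listed above (automated skips/routing, randomization, edit checks) can be sketched in a few lines. This is a minimal illustration, not any particular survey tool's API; the question IDs, field names, and plausibility range are hypothetical.

```python
import random

def next_question(answers):
    """Automated skip/routing sketch: pick the next question based on
    answers so far. Question IDs ('employed', 'q_seeking_work',
    'q_job_title') are hypothetical."""
    if answers.get("employed") == "no":
        return "q_seeking_work"  # skip the job-detail block entirely
    return "q_job_title"

def randomized_options(options, respondent_seed):
    """Show response options in a per-respondent random order to reduce
    order effects; seeding by respondent keeps the order stable if the
    page is reloaded."""
    rng = random.Random(respondent_seed)
    shuffled = list(options)
    rng.shuffle(shuffled)
    return shuffled

def edit_check_birth_year(value):
    """Simple range edit check: flag implausible birth years
    (bounds here are illustrative)."""
    return 1900 <= value <= 2012
```

A real survey platform wires these checks into page rendering, but the logic is the same: routing is a function of prior answers, and randomization is seeded so each respondent sees one consistent order.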

  4. B6. How important have each of the following sources been in helping you learn how to perform your job? If you were offered or made aware of a source but did not use it, please select “Did not use”. If you were not offered or made aware of a source, please indicate “Not available”.

  5. Complicated Skips and Customized Wording

  6. Randomization

  7. Ease of Adding Access to Other Information • (THANK YOU SCREEN) • Thank you for your participation in this survey! • We would like to provide you with some resources that would be valuable to refer to while you are working at SRO. • For a list of SRO acronyms, please go to https://webtrak.isr.umich.edu/sro/ • For a general understanding of survey research: Survey Methodology (2nd edition), by Robert M. Groves et al., and the Encyclopedia of Survey Research Methods, edited by Paul Lavrakas.

  8. Completeness Check

  9. http://www.270towin.com/

  10. http://media3.surveycenter.com/FlashDevs/NPD_products/demo/black/MediaPlayers/analyzer/player.asp

  11. http://media4.surveycenter.com/FlashDevs/WebSite/EmotionPicker/index.html

  12. Opportunities of Web Surveys (2) • Lowers cost • Greater speed • Environmentally friendly • Reduces respondent burden - complete survey at convenience and over multiple time periods • Wealth of survey process data for analysis

  13. Challenges of Web Surveys • Reduced quality of survey measurement • Lack of interviewer presence • Constrained format (versus paper) • Look and feel not under direct control, which may affect measurement • Browser, connection type and speed, font size • Type of device used (smartphone, iPad, laptop, desktop) • Anyone can easily and quickly put together a survey

  14. http://www.quizrocket.com/pirate-vs-ninja-quiz

  15. Optimized for Device • Please text TOLUNA8 to the following #: 91318

  16. http://www.polleverywhere

  17. Challenges of Web Surveys (2) • Poor reporting behavior (a person responding multiple times, straightlining) • Complexity – more time and money • Lower response rates • No comprehensive list of email addresses for the US population

  18. Web Survey Layout and Design Strategies for maximizing response and minimizing error

  19. Case study in web survey design: NSSE (National Survey of Student Engagement) • Annual survey of college students • 600+ institutions, with customization • Large sample (1 million+) • Administered since 2000 • Mostly paper originally • Change in web design every few years

  20. Scales • 2008 • 2010

  21. Scales (2012)

  22. Birth year • 2003 • 2008

  23. Birth year (2012)

  24. Major (2000)

  25. Major (2003)

  26. Major (2008)

  27. Major (2012)

  28. Pagination • nsse 2000 • nsse 2003 • nsse 2008 • nsse 2010 • nsse 2012 (2.0 pilot)

  29. 2013? • Progress indicator • Submit on click • More paths • Mobile challenges

  30. Web Survey Implementation Strategies for maximizing response and minimizing error

  31. “Implementation” – data collection protocols Implementation Aims: • Maximize unit response (response rate) • Minimize “error” (deviations in survey processes that lead to inaccuracy or bias) • With regard to implementation: nonresponse error But this is not the only source of error! High response rates do not guarantee the absence of response bias, nor do low response rates guarantee its presence. Center for Survey Research

  32. TOTAL SURVEY ERROR Groves R M , Lyberg L Public Opin Q 2010;74:849-879 © The Author 2011. Published by Oxford University Press on behalf of the American Association for Public Opinion Research. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com

  33. High response rates reduce the risk of response bias. So, how to turn sample members into respondents? • (without introducing more potential for error) Center for Survey Research

  34. Think strategically about your target population • What would induce them in particular to take the survey? • What would facilitate their giving accurate responses? • Monitor your data during collection to inform responsive measures • Look at the survey data: response distributions by subsample, variable crosstabs, open-response items, etc. • Look at process data: track individuals, break-offs, times of completion • Construct responsive (possibly corrective) strategies • Try experimenting, particularly if there are multiple administrations or years Center for Survey Research
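The process-data checks above (break-offs, times of completion) can be computed from simple paradata records. A minimal sketch, assuming each case is a dict with hypothetical `started`, `completed`, and `minutes` fields; real survey platforms export these under different names.

```python
from statistics import median

def breakoff_rate(cases):
    """Share of started interviews abandoned before completion.
    'started'/'completed' are hypothetical paradata flags."""
    started = [c for c in cases if c["started"]]
    if not started:
        return 0.0
    return sum(1 for c in started if not c["completed"]) / len(started)

def median_duration(cases):
    """Median completion time (minutes) among completes; a sudden drop
    across the field period can signal straightlining or a broken item."""
    times = [c["minutes"] for c in cases if c["completed"]]
    return median(times) if times else None
```

Running these daily during the field period gives early warning signs that a responsive (possibly corrective) strategy is needed.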

  35. Implementation Planning Checklist • Sample preparation • Data storage and archiving • Survey distribution, timing, field period • Follow-up and nonresponse protocols • Email message text and signature • Incentives • Leave time for testing • And… do a pilot, if possible! Center for Survey Research

  36. Sample Preparation • How is the sample to be acquired? How current is the contact information? How reliable are the email addresses? • Need to assign case ID numbers? • Append other (e.g., demographic or subsample) information? • Check the list for blank fields, duplicate emails, conflicts of interest, ineligibles, etc. • Data management: storage and archiving plan • What level of security is needed for the project/grant? • IU Libraries Data Management Resources Center for Survey Research
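The list-checking steps above (blank fields, duplicate emails, case ID assignment) are straightforward to automate. A sketch under the assumption that the sample arrives as a list of dicts with a hypothetical `email` field; other fields are carried through unchanged.

```python
def clean_sample(records):
    """Sample-preparation sketch: drop blank or duplicate email
    addresses (after lowercasing) and assign sequential case IDs.
    Field names here are hypothetical, not a real tool's schema."""
    seen = set()
    cleaned = []
    for rec in records:
        email = (rec.get("email") or "").strip().lower()
        if not email or email in seen:
            continue  # blank or duplicate: exclude from the sample
        seen.add(email)
        case = {k: v for k, v in rec.items() if k != "email"}
        case["email"] = email
        case["case_id"] = len(cleaned) + 1
        cleaned.append(case)
    return cleaned
```

Checks that require judgment (conflicts of interest, ineligibles) still need manual review, but mechanical cleaning like this is worth scripting so it can be rerun whenever the list is updated.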

  37. Email Messages Write to your audience (target population): • Content/composition: Survey duration, sponsor, purpose, study contact info. First person singular/plural? Length of message? Deadline? • Branding your survey: Logo? Survey title? • Signature: Who should sign it? An individual vs. an entity? • Survey link: Placement? Masked link? • Subject and From lines • Tech considerations: HTML vs. plain text, images, emphasis tags, email clients, etc. Center for Survey Research

  38. Survey Distribution • Method(s) of contact? Via individual email addresses, list-serv, on a website, postal mail, phone call, text message, something else? • Make it convenient for your target population: • Timing: Month, day, time? Holidays or special schedules? • Follow-up reminders: How many, time between them? Different contact method? • Field length Center for Survey Research

  39. Incentives • Do they work? How much of a boost can be expected? • What kinds of incentives are most effective? • Differential incentives? • See university or grant policies on prizes and lotteries Center for Survey Research

  40. Nonresponse What kind of nonresponse was it? • Delivery failure (noncontact) • Refusal to participate (busy, not interested) • Inability to participate (language, visual impairment) • Mistaken perceptions of ineligibility Compare to other data? Post-survey interviews? Center for Survey Research

  41. Selecting a Web Survey Tool Some Considerations and Available Sets of Tools

  42. What Are the Top 2-3 Features You Would Need or Want in a Web Survey Tool? Disclaimer: I do not endorse any web survey tool. This presentation is only one person’s view. Only a few tools are mentioned. This is not intended to be a comprehensive review.

  43. Some Considerations in Selecting a Web Survey Tool • Survey Complexity • Layout Customization • Multilanguage Capabilities • Ease of Development/Programming • Ability to Integrate with Other Modes of Data Collection • Data Output and Analysis • Security • Number of Users/Scalability • Integrated Email and Sample Management Capabilities • Cost • Help and Support
