Mechanical Turk: Online Sampling with Crowdsourcing
Human Intelligence Tasks (HITs) • Humans do some tasks better than machines • “Artificial artificial intelligence” (Amazon's tagline for the service) • A marketplace for HITs
Turk Workers & Requesters “Turkers” (the workers) and Requesters (who post the HITs) • 500,000 workers • 190 countries • 60% female* • 83.5% white* • 32.2 years old* • 14.9 years of education* *Berinsky, A. J., Huber, G. A., & Lenz, G. S. (2012). Evaluating Online Labor Markets for Experimental Research: Amazon.com's Mechanical Turk. Political Analysis, 20, 351–368.
Why MTurk? • Low-cost panel recruitment • Diverse samples • Convenient • Flexible • Easily managed
Turk Samples • “The MTurk sample does not perfectly match the demographic and attitudinal characteristics of the U.S. population but does not present a wildly distorted view of the U.S. population, either.”* • Numerous social science experiments have been replicated on MTurk** • Samples are slightly more demographically diverse than standard Internet samples*** • Significantly more diverse than typical American college samples*** • Data are at least as reliable as those obtained via traditional methods*** *Berinsky, A. J., Huber, G. A., & Lenz, G. S. (2012). Evaluating Online Labor Markets for Experimental Research: Amazon.com's Mechanical Turk. Political Analysis, 20, 351–368. **Mason, W., & Suri, S. (2012). Conducting behavioral research on Amazon's Mechanical Turk. Behavior Research Methods, 44(1), 1–23. ***Buhrmester, M., Kwang, T., & Gosling, S. D. (2011). Amazon's Mechanical Turk: A new source of inexpensive, yet high-quality, data? Perspectives on Psychological Science, 6(1), 3–5.
Creating a HIT • May request “Master” workers (general, photo moderation, or categorization Masters) • Location, HIT approval rate, and Masters status are the only built-in worker selection criteria (see the sketch below)
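As a concrete illustration of those criteria, here is a minimal sketch using the boto3 MTurk client. The locale and approval-rate qualification IDs are Amazon's documented system qualifications; the title, reward, thresholds, and durations are illustrative assumptions, not recommendations.

```python
# Minimal sketch: creating a HIT with the boto3 MTurk client.
# Locale and approval-rate IDs are MTurk's built-in system
# qualifications; all other values here are illustrative.
import boto3

mturk = boto3.client("mturk", region_name="us-east-1")

qualifications = [
    {   # Restrict to workers located in the United States
        "QualificationTypeId": "00000000000000000071",  # Worker_Locale
        "Comparator": "EqualTo",
        "LocaleValues": [{"Country": "US"}],
    },
    {   # Require a lifetime HIT approval rate of at least 95%
        "QualificationTypeId": "000000000000000000L0",  # PercentAssignmentsApproved
        "Comparator": "GreaterThanOrEqualTo",
        "IntegerValues": [95],
    },
    # Masters status can be required via its own (environment-specific)
    # system qualification ID, omitted here.
]

hit = mturk.create_hit(
    Title="Short academic survey (about 10 minutes)",
    Description="Complete a brief questionnaire hosted on an external site.",
    Keywords="survey, research, questionnaire",
    Reward="1.25",                        # USD, passed as a string
    MaxAssignments=100,                   # number of unique workers
    AssignmentDurationInSeconds=45 * 60,  # per-worker completion limit
    LifetimeInSeconds=3 * 24 * 60 * 60,   # collection window
    QualificationRequirements=qualifications,
    Question=open("question.xml").read(),  # e.g., the HTMLQuestion sketched below
)
print("HIT ID:", hit["HIT"]["HITId"])
```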
Linking to Third-Party Software (3PS) • 3PS examples • Qualtrics, SurveyMonkey, etc. • Why 3PS? • Between-subjects designs • Random assignment • Timing data • Diverse item types • Paging • No programming knowledge required • Data export • How to link 3PS and MTurk?
Linking to 3PS • Create the survey in the 3PS • Have the survey's last page display an “approval code” • Copy the survey URL into a HIT • The HIT asks a single question: “What is the approval code?” (see the sketch below)
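A hedged sketch of what that one-question HIT can look like when built through the API rather than the requester web UI: an HTMLQuestion that shows the survey link and collects the code, plus a helper for reading submitted codes back out. The HTMLQuestion schema, the externalSubmit action, and the Answer XML namespace follow MTurk's documented formats; the survey URL and the approval_code field name are placeholders.

```python
# Sketch of an HTMLQuestion that links out to a third-party survey and
# collects the completion code (pass it as Question= in create_hit).
# The externalSubmit action and hidden assignmentId field are required
# by MTurk for HTML questions; the survey URL and the approval_code
# field name are placeholders.
SURVEY_URL = "https://example.qualtrics.com/jfe/form/SV_XXXXXXXX"  # placeholder

html_question = f"""
<HTMLQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2011-11-11/HTMLQuestion.xsd">
  <HTMLContent><![CDATA[
    <!DOCTYPE html>
    <html><body>
      <p>1. Follow this link and complete the survey:
         <a href="{SURVEY_URL}" target="_blank">{SURVEY_URL}</a></p>
      <p>2. Enter the approval code shown on the survey's last page:</p>
      <form action="https://www.mturk.com/mturk/externalSubmit" method="post">
        <input type="hidden" id="assignmentId" name="assignmentId" value="">
        <input type="text" name="approval_code">
        <input type="submit" value="Submit">
      </form>
      <script>
        // MTurk passes assignmentId as a URL parameter; echo it back.
        document.getElementById("assignmentId").value =
          new URLSearchParams(window.location.search).get("assignmentId");
      </script>
    </body></html>
  ]]></HTMLContent>
  <FrameHeight>450</FrameHeight>
</HTMLQuestion>
"""

# After collection, each worker's code comes back in the assignment's
# Answer XML and can be matched against the codes the survey tool issued.
import xml.etree.ElementTree as ET

ANSWER_NS = {"mt": "http://mechanicalturk.amazonaws.com/"
                   "AWSMechanicalTurkDataSchemas/2005-10-01/QuestionFormAnswers.xsd"}

def submitted_code(answer_xml: str) -> str:
    """Extract the approval_code field from an assignment's Answer XML."""
    root = ET.fromstring(answer_xml)
    for ans in root.findall("mt:Answer", ANSWER_NS):
        if ans.findtext("mt:QuestionIdentifier", namespaces=ANSWER_NS) == "approval_code":
            return (ans.findtext("mt:FreeText", namespaces=ANSWER_NS) or "").strip()
    return ""
```

Assignments whose code matches can then be approved with approve_assignment, and mismatches rejected with reject_assignment.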
Pitfalls • Participation in multiple experimental groups • HITs completed slowly • Pay set too low • Distorted estimates of HIT completion time • Uninteresting description • Sample bias • Time of day / day of week • SES, education, work, and family status
Best Practices • One survey containing all conditions • Thoughtful description and tags • Estimate a fair wage • General formula: (# items + # sentences + stimulus exposure time in seconds) × 2 = estimated seconds to complete • Price at no less than the minimum wage rate (see the sketch after this list) • Limit HIT times • Completion time • Collection window • Be consistent when running multiple collections • (Dis)approve HITs within a day or two • Toss out multivariate outliers
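The timing formula above turns easily into a quick reward calculator. The sketch below uses the US federal minimum wage of $7.25/hour as the floor, which is an illustrative assumption rather than part of the original formula.

```python
# Sketch of the rule-of-thumb from the slide:
# (number of items + number of sentences of instructions
#  + stimulus exposure time in seconds) * 2 = estimated seconds,
# then pay no less than minimum wage for that time.
# The $7.25/hour floor (US federal minimum wage) is illustrative.

def estimate_reward(n_items: int, n_sentences: int,
                    exposure_seconds: float,
                    hourly_floor: float = 7.25) -> float:
    """Return a suggested per-HIT reward in USD."""
    est_seconds = (n_items + n_sentences + exposure_seconds) * 2
    return round(est_seconds / 3600 * hourly_floor, 2)

# Example: 40 items, 10 sentences of instructions, 60 s of stimuli
# -> (40 + 10 + 60) * 2 = 220 s, about $0.44 at $7.25/hour.
print(estimate_reward(40, 10, 60))  # 0.44
```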