Crisp Answers to Fuzzy Questions: Design lessons for crowdsourcing decision inputs Alex Quinn, Ben Bederson
R. L. Polk & Co. / Autotrader.com, Automotive Buyer Study, 2011
“Market research firm J.D. Power and Associates says […] more than 80% of buyers have already spent an average of 18 hours online researching models and prices, according to Google data.” (Wall Street Journal, 2/26/2013)
Building blocks for Mechanical Turk: HITs (human intelligence tasks)
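A HIT is the unit of work posted to the marketplace. As a minimal sketch (not from the talk) of how one might be created programmatically, assuming Python with boto3 and the MTurk sandbox endpoint; the title, reward, and durations are illustrative placeholders:

```python
import boto3

# Sandbox endpoint: test HITs posted here cost nothing. Swap in the
# production endpoint once the design is settled. (Illustrative sketch.)
MTURK_SANDBOX = 'https://mturk-requester-sandbox.us-east-1.amazonaws.com'
client = boto3.client('mturk', region_name='us-east-1',
                      endpoint_url=MTURK_SANDBOX)

# Wrapper that MTurk expects around HTML form markup.
QUESTION_XML = """<HTMLQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2011-11-11/HTMLQuestion.xsd">
  <HTMLContent><![CDATA[{html}]]></HTMLContent>
  <FrameHeight>450</FrameHeight>
</HTMLQuestion>"""

hit = client.create_hit(
    Title='Find a pediatrician near a given address',   # placeholder task
    Description='Check rating sites and report whether each requirement is met.',
    Keywords='search, verification, decision',
    Reward='0.10',                      # base price in US dollars, as a string
    MaxAssignments=3,                   # redundant workers for quality control
    LifetimeInSeconds=3 * 24 * 3600,    # how long the HIT stays listed
    AssignmentDurationInSeconds=15 * 60,
    Question=QUESTION_XML.format(html='<p>Form markup goes here</p>'),
)
print('HITId:', hit['HIT']['HITId'])
```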
Keep instructions short • Input labels should be unambiguous • HITs must be grouped by common templates • See the Mechanical Turk Requester Best Practices Guide
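As one way to make "common templates" and "unambiguous labels" concrete: a hedged sketch of an HTML form that could serve as the shared template for a group of HITs (anticipating Example #1 below). The field names and ${...} placeholders are illustrative; substitution is done in Python here rather than through MTurk's layout feature:

```python
from string import Template

# One template, many HITs: each row of data fills the ${...} slots.
# Labels name the exact value wanted (units, scale, source site) instead
# of a vague prompt like "Rating". Field names are illustrative.
TEMPLATE_HTML = Template("""<!DOCTYPE html>
<html><body>
  <p>Check the sites below for Dr. ${doctor_name} and answer each question.</p>
  <form action="https://www.mturk.com/mturk/externalSubmit" method="post">
    <input type="hidden" name="assignmentId" value="">
    <label>Does the doctor accept ${insurance_plan}? (yes/no)
      <input name="insurance_accepted"></label>
    <label>Star rating at RateMDs.com (1-5, e.g. 4.5)
      <input name="ratemds_stars"></label>
    <label>Percent positive reviews at HealthGrades.com (0-100)
      <input name="healthgrades_pct"></label>
    <label>URL of the page where you found the rating
      <input name="source_url"></label>
    <input type="submit" value="Submit">
  </form>
</body></html>""")

html = TEMPLATE_HTML.substitute(doctor_name='Jane Roe',
                                insurance_plan='Acme Health PPO')
```

Because every HIT generated this way shares the same instructions and labels, workers who learn one HIT in the group can do all of them.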
Example #1: Find a pediatrician • Requirements • Accepts my insurance • ≥4 stars at RateMDs.com • >80% positive at HealthGrades.com • ≤15 minutes drive from home
Effort should be proportional to the reward • HITs in a group share a common base price • Information sources should be traceable • See the Mechanical Turk Requester Best Practices Guide
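Grouping has an API-level counterpart: every HIT created under one HIT type shares the same title, instructions, and base reward, so the common base price falls out of the structure. A hedged sketch, reusing client, QUESTION_XML, and TEMPLATE_HTML from the sketches above; rewards and durations are placeholders:

```python
# HITs created under this HITTypeId share one base price and appear to
# workers as a single group. Values are illustrative placeholders.
hit_type = client.create_hit_type(
    Title='Find a pediatrician near a given address',
    Description='Check rating sites and report whether each requirement is met.',
    Keywords='search, verification, decision',
    Reward='0.10',                        # common base price for the group
    AssignmentDurationInSeconds=15 * 60,
    AutoApprovalDelayInSeconds=2 * 24 * 3600,
)

for doctor in ['Jane Roe', 'John Doe']:   # illustrative rows of data
    html = TEMPLATE_HTML.substitute(doctor_name=doctor,
                                    insurance_plan='Acme Health PPO')
    client.create_hit_with_hit_type(
        HITTypeId=hit_type['HITTypeId'],
        MaxAssignments=3,
        LifetimeInSeconds=3 * 24 * 3600,
        Question=QUESTION_XML.format(html=html),
    )
```

The source_url field in the template is one way to make information sources traceable without asking workers for extra bookkeeping.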
Example #2: Buy a stroller • Requirements • Fits a 30-pound baby • Reclines for sleeping • Medium/large-sized soft tires • Can purchase online in US
Find creative ways to track sources • Bonus offers allow reward to scale with effort
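Bonuses are granted per assignment after results come back, which is what lets the reward scale with observed effort. A hedged sketch; the IDs, the per-source rate, and the idea of counting distinct source URLs as an effort signal are all hypothetical:

```python
# Pay more when an answer shows more work, e.g. more distinct source
# URLs pasted into the source_url field. Rate and IDs are placeholders.
def pay_scaled_bonus(client, worker_id, assignment_id, n_sources):
    if n_sources > 1:
        bonus = 0.05 * (n_sources - 1)   # hypothetical rate per extra source
        client.send_bonus(
            WorkerId=worker_id,
            AssignmentId=assignment_id,
            BonusAmount=f'{bonus:.2f}',  # API expects the amount as a string
            Reason='Thanks for checking multiple sources.',
        )

# Usage with placeholder IDs:
# pay_scaled_bonus(client, 'A1EXAMPLE', '3EXAMPLEASSIGNMENT', n_sources=3)
```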
Design lessons • Consider effort-reward balance from the start • Look for implicit ways of capturing sources • Use word economy to conserve vertical space • Choose unambiguous input labels
Alex Quinn, aq@cs.umd.edu