Crowdsourcing Complexity: Lessons for Software Engineering
Lydia Chilton
2 June 2014, ICSE Crowdsourcing Workshop
Clarification: Human Computation Mechanical Turk Microtasks:
Evolution of Complexity in Human Computation Task Decomposition: Cascade & Frenzy
1. Collective Intelligence
1906: Galton's ox-weighing contest. 787 aggregated guesses averaged 1,197 lbs. Actual weight: 1,198 lbs.
1. Collective Intelligence
Principles:
• Small tasks
• Intuitive tasks
• Independent answers
• Simple aggregation
Application:
• ESP Game
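The pattern above can be sketched in a few lines: many small, independent answers combined by simple aggregation. The guesses below are illustrative, not Galton's actual data.

```python
def aggregate(guesses):
    """Simple aggregation: average the independent answers."""
    return sum(guesses) / len(guesses)

# Independent weight estimates in lbs (made-up example values).
guesses = [1150, 1210, 1190, 1240, 1175]
estimate = aggregate(guesses)
print(round(estimate))  # 1193
```

The key requirement is independence: if workers see each other's answers, errors correlate and the average stops converging on the truth.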
2. Iterative Workflows
work → improve → vote → improve → vote → …
(each vote is a small Collective Intelligence step)
2. Iterative Workflows
Principles:
• Use fresh eyes
• Vote to ensure improvement
Application:
• Bug finding: "given enough eyeballs, all bugs are shallow" (Linus's Law)
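A minimal sketch of the improve/vote loop, assuming hypothetical crowd primitives `improve` and `vote` (stand-ins, not a real crowd API): one worker proposes an improvement, fresh-eyed workers vote so the result never regresses.

```python
def improve(draft):
    # Stand-in for a worker improving the current draft.
    return draft + " (improved)"

def vote(current, candidate):
    # Stand-in for fresh-eyed workers comparing versions;
    # here we assume the candidate won the vote.
    return candidate

def iterate(draft, rounds=2):
    for _ in range(rounds):
        candidate = improve(draft)       # one worker improves
        draft = vote(draft, candidate)   # others vote to ensure improvement
    return draft

print(iterate("caption"))  # caption (improved) (improved)
```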
3. Psychological Boundaries Applications: • Manager / programmer • Writer / editor • Write code / test code • Addition / subtraction Principle: • Task switching is hard • Natural boundaries for tasks
4. Task Decomposition
Legion: Scribe (real-time audio captioning on MTurk)
4. Task Decomposition Principles: • Must be able to break apart tasks AND put them back together. • Complex aggregation • Hint: Solve backwards. Find what people can do, and build up from there.
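A toy decompose/reassemble sketch of the principle above (not the Legion: Scribe algorithm, which aligns overlapping real-time caption streams): split a task into chunks workers can do, then merge the pieces back together.

```python
def decompose(words, chunk_size=3):
    # Break the task into chunks small enough for one worker.
    return [words[i:i + chunk_size] for i in range(0, len(words), chunk_size)]

def worker(chunk):
    # Stand-in for a worker handling one short chunk.
    return [w.lower() for w in chunk]

def aggregate(chunks):
    # Reassembly; real systems need complex aggregation
    # (overlap handling, alignment, conflict resolution).
    return [w for chunk in chunks for w in chunk]

words = ["Real", "Time", "Audio", "Captioning", "On", "MTurk"]
print(aggregate(worker(c) for c in decompose(words)))
```

The hard part in practice is `aggregate`: decomposition is only useful if the pieces can actually be put back together.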
5. Worker Choice
Mobi: Trip Planning on MTurk with an open UI.
5. Worker Choice Applications: • Trip planning • Conference time table • Conference session-making Principles: • Giving workers freedom relieves requesters’ burden of task decomposition. • Workers feel more involved and empowered. • BUT complex interface that is difficult to scale.
6. Learning and Doing
Applications:
• Peer assessment
• Grade others' assignments before doing your own
• Task feedback
Principles:
• Teaching workers makes them better.
• How long will they stay?
Lessons for Software Engineering
• Propose and vote
• Find natural psychological boundaries between tasks (e.g., 221 + 473 vs. −221 + 473)
• Find the tasks people can do, then assemble them using complex aggregation techniques.
• Teach.
Evolution of Complexity in Human Computation Task Decomposition: Cascade & Frenzy
Task decomposition is the key to crowdsourcing software engineering
Cascade: Crowdsourcing Taxonomy Creation
Lydia Chilton (UW), Greg Little (oDesk), Darren Edge (MSR Asia), Dan Weld (UW), James Landay (UW)
• 1000 eGovernment suggestions
• 50 top product reviews
• 100 employee feedback comments
• 1000 answers to "Why did you decide to major in Computer Science?"
Machines can't analyze it. People don't have time to analyze it:
• time consuming
• overwhelming
• no right answer
Initial Approach 1: Iterative Improvement
Problems:
• The hierarchy grows and becomes overwhelming
• Workers have to decide what to do
Lesson:
• Break up the task more
Initial Approach 2: Category Comparison
Problem:
• Without context, it's hard to judge relationships
• flying vs. flights
• TSA liquids vs. removing liquids
• packing vs. what to bring
Lesson:
• Don't compare abstractions to abstractions
• Instead compare data to abstractions
Use Lesson #3: Find the tasks people can do; assemble them using complex aggregation techniques.
Tasks: Generate Labels → Select Best Labels → Categorize
Cascade Algorithm
For a subset of items:
• Generate Labels
• Select Best Labels → {good labels}
For all items, for all good labels:
• Categorize
Then recurse.
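The loop above can be sketched as code. This is a heavily simplified, hypothetical rendering: in real Cascade each step is performed by crowd workers, labels are voted on, and the taxonomy structure is inferred from category overlap, none of which appears here.

```python
def generate_labels(items):
    # Stand-in: workers propose a label per item; here, its first letter.
    return {item[0] for item in items}

def select_best_labels(labels, max_labels=2):
    # Stand-in for voting: keep a few "good" labels.
    return sorted(labels)[:max_labels]

def categorize(items, label):
    # Stand-in: workers mark which items fit the label.
    return [item for item in items if item.startswith(label)]

def cascade(items, depth=1):
    if depth == 0 or len(items) <= 1:
        return items
    taxonomy = {}
    # On a subset: generate labels, then vote to select the good ones.
    good = select_best_labels(generate_labels(items))
    # For all items, for all good labels: categorize, then recurse.
    for label in good:
        taxonomy[label] = cascade(categorize(items, label), depth - 1)
    return taxonomy

print(cascade(["apple", "avocado", "banana"]))
# {'a': ['apple', 'avocado'], 'b': ['banana']}
```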
Aggregate Data into Taxonomy
[Diagram: categorized items (Blue, Light Blue, Green, Other) illustrating redundant, nested, and singleton categories]
How can we get a global picture from workers who see only subsets of the data?
Propose, Vote, Test Workers have good heuristics. Let them propose categories. Vote on categories to weed out bad ones. Test the heuristics by verifying it on data. Propose Vote Test
Lesson Propose, Vote, Test.
Deploy Cascade to Real Needs • CHI 2013 Program Committee Organize 430 accepted papers to help session making • 40 CrowdCamp Hack-a-thon Participants Organize 100 hack-a-thon ideas to help organize teams
430 CHI Papers: Good Results, but…
Example taxonomy categories (paper counts):
• Visualization (19)
• evaluating infovis (9)
• text (2)
• video (6)
• visualizing time data (5)
• gaze (4)
• gaze tracking (3)
• user requirements (3)
• color schemes (2)
[Slide lists the individual paper titles grouped under these categories]
“Don’t treat me like a Turker.”
“I just want to see all the data.”
Lesson Authority and Responsibility should be aligned.
Frenzy: Collaborative Data Organization for Creating Conference Sessions Lydia Chilton (UW), Juho Kim (MIT), Paul Andre (CMU), Felicia Cordeiro (UW), James Landay (Cornell?), Dan Weld (UW), Steven Dow (CMU), Rob Miller (MIT), Haoqi Zhang (NW)
Groupware
Creating conference sessions is a social process.
Grudin: social processes are often guided by personalities, tradition, and convention.
Challenge: support the process without seeking to replace these behaviors.
Challenge: remain flexible; do not impose rigid structures.
Light-weight contributions Label Vote Categorize
2-Stage Workflow
Stage 1: Set-up (Collect Metadata)
• 60 PC members
• Low authority
• Low responsibility
Stage 2: Session Making
• 11 PC members
• High authority
• High responsibility
Goals Collect data: labels, votes Session-Making
Results Sessions created in record-setting 88 minutes.