
WG2 Task Force “Crowdsourcing”


Presentation Transcript


1. WG2 Task Force “Crowdsourcing”
8th Qualinet General Meeting, Delft, 2014, Tobias Hossfeld
https://www3.informatik.uni-wuerzburg.de/qoewiki/qualinet:crowd:berlinmeeting2014
WG2 Mechanisms and Models

2. Agenda (90 min)
• Summary of major activities and achievements in Qualinet (15 min, Tobias Hossfeld)
• Recent results since Berlin meeting
• Comparison between lab and crowdsourcing results on aesthetic appeal (15 min, Ernestasia Siahaan, Judith Redi)
• Crowdsourcing study on real streaming experience (15 min, Christian Timmerer)
• Experience reports from the crowdsourcing task force
• Lessons learned from the CS task force (10 min, Filippo Mazza)
• Best practices introduced by the Qualinet CS task force (15 min, Bruno Gardlo)
• Future activities and research topics beyond Qualinet
• Predicting Result Quality in Crowdsourcing Using Application Layer Monitoring (10 min, Matthias Hirth)
• Results from the summer school: eye-tracking and affective crowdsourcing (10 min, Matthias Hirth)
WG2 TF “Crowdsourcing” https://www3.informatik.uni-wuerzburg.de/qoewiki/qualinet:crowd

3. Major Activities and Achievements of CS TF (1/2)
• CS TF established in Feb 2012
• Joint Qualinet Crowdsourcing experiment (JOC): further results were achieved and additional crowdsourcing experiments conducted
• Crowdsourcing-based multimedia subjective evaluations: a case study on image recognizability and aesthetic appeal
• Crowdlib activity finished with a joint survey paper: Survey of Web-based Crowdsourcing Frameworks for Subjective Quality Assessment. MMSP 2014
• Best practices
• Best Practices for QoE Crowdtesting: QoE Assessment with Crowdsourcing. IEEE Transactions on Multimedia, 16(2), Feb 2014
• Qualinet Recommendation Whitepaper: “Lessons Learned from Crowdsourcing - Best practices and recommendations from the Qualinet WG2 TF Crowdsourcing”, Oct 2014

4. Workshops and Events (2/2)
• ACM Workshop on Crowdsourcing for Multimedia
• CrowdMM 2013, Organizer: Tobias Hossfeld
• CrowdMM 2014, Chairs: Judith Redi, Mathias Lux
• Summer school on Crowdsourcing, Patras, 2013
• Very nice event: lectures, group work
• Joint papers published
• Dagstuhl seminar “Crowdsourcing: From Theory to Practice and Long-Term Perspectives”, Sep 2013
• Special Session on Crowdsourcing and Crowdsourcing Applications, IEEE ICCE 2014
• ACM CrowdRec 2013: Workshop on Crowdsourcing and Human Computation for Recommender Systems

5. Short Term Scientific Mission (STSM)
• Tobias Hoßfeld (University of Würzburg, now at University of Duisburg-Essen) at FTW (Raimund Schatz): “Modeling YouTube QoE based on Crowdsourcing and Laboratory User Studies”, 2011
• Bruno Gardlo (University of Zilina, now FTW) at University of Würzburg (Tobias Hoßfeld): “Improving Reliability for Crowdsourcing-Based QoE Testing”, 2012
• Lea Skorin-Kapov (University of Zagreb) at VTT (Martin Varela): “Multidimensional modeling of Web QoE”, 2012
• Christian Sieber (University of Würzburg) at University of Klagenfurt (Christian Timmerer): “Holistic Evaluation of Novel Adaptation Logics for DASH and SVC”, 2012

6. Publications from the Qualinet CS TF
• In total, 35 different publications were reported in the wiki
• 17 joint publications (with at least 2 Qualinet institutes)
• Basic statistics for joint papers
• Mean number of authors: 4.5
• Median: 4 authors
• Mean number of institutes: 3.2
• For all papers
• Mean number of authors: 4.0
• Very strong collaboration
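As an illustration only, a minimal sketch of how such per-paper author and institute statistics can be computed; the joint_papers entries below are hypothetical placeholders, not the actual publication data from the wiki:

    from statistics import mean, median

    # Hypothetical placeholder data: (number of authors, number of institutes)
    # per joint paper. The real figures on the slide come from the 17 joint
    # publications reported in the task force wiki.
    joint_papers = [(5, 3), (4, 3), (6, 4), (3, 2), (4, 4)]

    author_counts = [authors for authors, _ in joint_papers]
    institute_counts = [institutes for _, institutes in joint_papers]

    print(f"Mean number of authors:    {mean(author_counts):.1f}")
    print(f"Median number of authors:  {median(author_counts)}")
    print(f"Mean number of institutes: {mean(institute_counts):.1f}")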

7. Paper Highlights (1/3)
• Output from summer school
• Pierre Lebreton, Evangelos Skodras, Toni Mäki, Isabelle Hupont Torres, and Matthias Hirth. Bridging the gap between eye tracking and crowdsourcing. Accepted for publication in: Proceedings of SPIE 9394, Human Vision and Electronic Imaging XX, San Francisco, USA, February 2015
• Isabelle Hupont, Pierre Lebreton, Toni Mäki, Evangelos Skodras, Matthias Hirth. Is affective crowdsourcing reliable? 5th International Conference on Communications and Electronics (ICCE 2014), Da Nang, Vietnam, July 2014
• Pavel Korshunov, Hiromi Nemoto, Athanassios Skodras, and Touradj Ebrahimi. The effect of HDR images on privacy: crowdsourcing evaluation. SPIE Photonics Europe 2014, Optics, Photonics and Digital Technologies for Multimedia Applications, Brussels, Belgium, April 2014

8. Paper Highlights (2/3)
• Output from joint Qualinet crowdsourcing (JOC) experiment
• Judith Redi and Isabel Povoa. Crowdsourcing for Rating Image Aesthetic Appeal: Better a Paid or a Volunteer Crowd? 3rd International ACM Workshop on Crowdsourcing for Multimedia (CrowdMM 2014), Orlando, USA, November 2014
• Judith Redi, Tobias Hossfeld, Pavel Korshunov, Filippo Mazza, Isabel Povoa and Christian Keimel. Crowdsourcing-based multimedia subjective evaluations: a case study on image recognizability and aesthetic appeal. ACM CrowdMM 2013, Barcelona, Spain, October 2013
• Output from Crowdlib activity on crowdsourcing frameworks
• Tobias Hoßfeld, Matthias Hirth, Pavel Korshunov, Philippe Hanhart, Bruno Gardlo, Christian Keimel, Christian Timmerer. Survey of Web-based Crowdsourcing Frameworks for Subjective Quality Assessment. 16th International Workshop on Multimedia Signal Processing (MMSP), Jakarta, Indonesia, September 2014

9. Paper Highlights (3/3)
• Best practices, recommendations, lessons learned
• Qualinet whitepaper: Best Practices and Recommendations for Crowdsourced QoE - Lessons learned from the Qualinet WG2 Task Force Crowdsourcing, whitepaper version: October 7, 2014, by Tobias Hoßfeld, Matthias Hirth, Judith Redi, Filippo Mazza, Pavel Korshunov, Babak Naderi, Michael Seufert, Bruno Gardlo, Sebastian Egger, Christian Keimel
• Tobias Hossfeld, Christian Keimel, Matthias Hirth, Bruno Gardlo, Julian Habigt, Klaus Diepold, Phuoc Tran-Gia. Best Practices for QoE Crowdtesting: QoE Assessment with Crowdsourcing. IEEE Transactions on Multimedia, 16(2), 2014
• Tobias Hoßfeld, Christian Timmerer. Quality of Experience Assessment using Crowdsourcing. IEEE COMSOC MMTC R-Letter, 5, 2014

10. Future work and activities
• Forum paper (in the form of an interview of experts)
• In a different form, own experiences and best practices will be shared to raise discussions, e.g. DSP Forum, IEEE Signal Processing Magazine
• Initiated by Patrick le Callet and Tobias Hossfeld
• Editorial board: Tobias Hossfeld, Judith Redi, Sebastian Egger, Patrick le Callet, Ian Burnett
• JOC: still some open questions
• Impact of the crowdsourcing environment on QoE assessment
• Impact of incentives on data quality of QoE assessments
• …
• Future activities and research topics beyond Qualinet
• Predicting Result Quality in Crowdsourcing Using Application Layer Monitoring (10 min, Matthias Hirth)
• Results from the summer school: eye-tracking and affective crowdsourcing (10 min, Matthias Hirth)

11. CfP: Elsevier Special Issue on Crowdsourcing
• Using crowdsourcing for testing and measurements, e.g., of networked applications
• System testing, performance measurements, e.g., flash crowds, realistic user behavior patterns
• Subjective tests on user-perceived quality
• Experience reports and studies of crowdsourcing (i.e., services, testing, measurements)
• Survey papers and reviews on different aspects of crowdsourcing
• Important dates
• Paper submission due: October 15, 2014
• First round notification: January 15, 2015
• Revised paper submission due: February 15, 2015
• Second round notification: April 15, 2015
• Tentative publication date: 3rd quarter, 2015
• CfP: http://www.journals.elsevier.com/computer-networks/call-for-papers/special-issue-on-crowdsourcing/
• Submission: http://ees.elsevier.com/comnet/

12. Status: CS Recommendation Paper – Lessons learned
• Consistent writing style
• Some definitions in the beginning → Judith?
• Using uniform terms throughout the paper
• Sections to be finalized by main editor
• Editor may ask another author of the section to help :)
• Deadline: 17 October
• Document ready for publication
• On Qualinet homepage, archived in HAL
• 27 October

13. Publications and Results in 2014 (1/2)
• Judith Redi and Isabel Povoa. Crowdsourcing for Rating Image Aesthetic Appeal: Better a Paid or a Volunteer Crowd? 3rd International ACM Workshop on Crowdsourcing for Multimedia (CrowdMM 2014), Orlando, USA, November 2014
• Babak Naderi, Ina Wechsung, Tim Polzehl and Sebastian Möller. Development and Validation of Extrinsic Motivation Scale for Crowdsourcing Micro-task Platforms. 3rd International ACM Workshop on Crowdsourcing for Multimedia (CrowdMM 2014), Orlando, USA, November 2014
• Tobias Hoßfeld, Matthias Hirth, Pavel Korshunov, Philippe Hanhart, Bruno Gardlo, Christian Keimel, Christian Timmerer. Survey of Web-based Crowdsourcing Frameworks for Subjective Quality Assessment. 16th International Workshop on Multimedia Signal Processing (MMSP), Jakarta, Indonesia, September 2014. Author version available.
• Tobias Hoßfeld, Michael Seufert, Christian Sieber, Thomas Zinner. Assessing Effect Sizes of Influence Factors Towards a QoE Model for HTTP Adaptive Streaming. 6th International Workshop on Quality of Multimedia Experience (QoMEX), Singapore, September 2014.
• Babak Naderi, Tim Polzehl, André Beyer, Tibor Pilz, Sebastian Möller. Crowdee: Mobile Crowdsourcing Micro-task Platform - for Celebrating the Diversity of Languages. Proc. 15th Annual Conference of the International Speech Communication Association (Interspeech 2014), IEEE, Singapore, September 2014.
• Simon Oechsner, Boris Bellalta, Desislava Dimitrova, Tobias Hoßfeld. Visions and Challenges for Sensor Network Collaboration in the Cloud. 8th International Conference on Innovative Mobile and Internet Services in Ubiquitous Computing (IMIS-2014), Birmingham, UK, July 2014. Author version available.
• Isabelle Hupont, Pierre Lebreton, Toni Mäki, Evangelos Skodras, Matthias Hirth. Is affective crowdsourcing reliable? 5th International Conference on Communications and Electronics (ICCE 2014), Da Nang, Vietnam, July 2014. Author version available.
• Matthias Hirth, Sven Scheuring, Tobias Hoßfeld, Christian Schwartz, Phuoc Tran-Gia. Predicting Result Quality in Crowdsourcing Using Application Layer Monitoring. 5th International Conference on Communications and Electronics (ICCE 2014), Da Nang, Vietnam, July 2014. Author version available.

14. Publications and Results in 2014 (2/2)
• Tobias Hoßfeld, Christian Timmerer. Quality of Experience Assessment using Crowdsourcing. IEEE COMSOC MMTC R-Letter, 5, June 2014.
• Tobias Hoßfeld. On Training the Crowd for Subjective Quality Studies. VQEG eLetter, 1, March 2014.
• Tobias Hossfeld, Christian Keimel, Matthias Hirth, Bruno Gardlo, Julian Habigt, Klaus Diepold, Phuoc Tran-Gia. Best Practices for QoE Crowdtesting: QoE Assessment with Crowdsourcing. IEEE Transactions on Multimedia, 16(2), 2014.
• Tobias Hoßfeld, Christian Keimel. Crowdsourcing in QoE Evaluation. In: Quality of Experience - Advanced Concepts, Applications and Methods, Editors: Sebastian Möller, Alexander Raake, T-Labs Series in Telecommunication Services, Springer, ISBN 978-3-319-02680-0, March 2014.
• Valentin Burger, Matthias Hirth, Christian Schwartz, Tobias Hoßfeld. Increasing the Coverage of Vantage Points in Distributed Active Network Measurements by Crowdsourcing. Measurement, Modelling and Evaluation of Computing Systems (MMB 2014), Bamberg, Germany, March 2014.
• Bruno Gardlo, Sebastian Egger, Michael Seufert, Raimund Schatz. Crowdsourcing 2.0: Enhancing Execution Speed and Reliability of Web-based QoE Testing. IEEE International Conference on Communications (ICC 2014), Sydney, Australia, June 2014.
• Filippo Mazza, Matthieu Perreira Da Silva, Patrick Le Callet. Would you hire me? Selfie portrait images perception in a recruitment context. SPIE HVEI XIX 2014, San Francisco, CA, USA, February 2014.
• Pavel Korshunov, Hiromi Nemoto, Athanassios Skodras, and Touradj Ebrahimi. The effect of HDR images on privacy: crowdsourcing evaluation. SPIE Photonics Europe 2014, Optics, Photonics and Digital Technologies for Multimedia Applications, Brussels, Belgium, April 2014.
