Data Collection for the CHIL CLEAR 2007 Evaluation Campaign N. Moreau (1), D. Mostefa (1), R. Stiefelhagen (2), S. Burger (3), K. Choukri (1) (1) ELDA, (2) UKA-ISL, (3) CMU E-mails: {moreau;mostefa;choukri}@elda.org, stiefel@ira.uka.de, sburger@cs.cmu.edu Evaluations and Language resources Distribution Agency (ELDA), www.elda.org
Plan • CHIL project • Evaluation campaigns • Data recordings • Annotations • Evaluation package • Conclusion
CHIL Project • CHIL: Computers in the Human Interaction Loop • Integrated project funded by the European Commission (FP6) • January 2004 – August 2007 • 15 partners, 9 countries (ELDA responsible for data collection and evaluations) • Multimodal and perceptual user interface technologies • Context: • Real-life meetings (small meeting rooms) • Activities and interactions of attendees
CHIL evaluation campaigns • June 2004: Dry run • January 2005: Internal evaluation campaign • February 2006: CLEAR 2006 campaign • February 2007: CLEAR 2007 campaign • CLEAR = Classification of Events, Activities and Relationships • Open to external participants • Supported by CHIL and NIST (VACE Program) • Co-organized with the NIST RT (Rich Transcription) Evaluation
CLEAR 2007 evaluation campaign • 9 technologies evaluated • Vision technologies • Face Detection and Tracking • Visual Person Tracking • Visual Person Identification • Head Pose Estimation • Acoustic technologies • Acoustic Person Tracking • Acoustic Speaker Identification • Acoustic Event Detection • Multimodal technologies • Multimodal Person Tracking • Multimodal Speaker Identification
CHIL Scenarios • Non-interactive lectures • Interactive seminars
CHIL Data Sets CLEAR 2007 Data Collection: • 25 highly interactive seminars • Attendees: between 3 and 7 • Events: several presenters, discussions, coffee breaks, people entering / leaving the room, ...
Recording setup • 5 recording rooms • Sensors (see the configuration sketch below): • Audio • 64-channel microphone array • 4-channel T-shaped microphones • Table-top microphones • Close-talking microphones • Video • 4 fixed corner cameras • 1 ceiling wide-angle camera • Pan-tilt-zoom (PTZ) cameras
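To make the sensor inventory concrete, here is a minimal sketch of how one recording room's setup could be described as plain data. It is illustrative only: the structure and names (e.g. CHIL_ROOM_SENSORS) are hypothetical, not part of any CHIL software.

```python
# Hypothetical description of one CHIL room's sensors, mirroring the
# inventory on the slide above. Field names are illustrative.
CHIL_ROOM_SENSORS = {
    "audio": {
        "microphone_array": {"channels": 64},
        "t_shaped_microphones": {"channels_each": 4},
        "tabletop_microphones": True,
        "close_talking_microphones": True,
    },
    "video": {
        "fixed_corner_cameras": 4,
        "ceiling_wide_angle_cameras": 1,
        "ptz_cameras": "varies per room",
    },
}

if __name__ == "__main__":
    print("Corner camera views per room:",
          CHIL_ROOM_SENSORS["video"]["fixed_corner_cameras"])
```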
Quality Standards • Recording of 25 seminars in 2007 (5 per CHIL room) • Audio-visual clap at beginning and end • Cameras (JPEG files at 15, 25 or 30 fps) • Max. desynchronisation = 200 ms • Microphone array • Max. desynchronisation = 200 ms • Other microphones (T-shape, table-top) • Max. desynchronisation = 50 ms • If desynchronisation exceeds the maximum, the recording must be remade (a check of this rule is sketched below)
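A minimal sketch of the synchronisation check implied by these thresholds, assuming desynchronisation is measured per sensor stream in milliseconds. The threshold values come from the slide; the function and dictionary names are hypothetical.

```python
# Maximum allowed desynchronisation per sensor type (ms), from the slide.
MAX_DESYNC_MS = {
    "camera": 200,            # JPEG streams at 15, 25 or 30 fps
    "microphone_array": 200,
    "t_shape_microphone": 50,
    "tabletop_microphone": 50,
}

def recording_is_valid(measured_desync_ms: dict) -> bool:
    """Return False if any sensor exceeds its limit, i.e. the
    recording must be remade. Unmeasured sensors are assumed in sync."""
    return all(
        measured_desync_ms.get(sensor, 0.0) <= limit
        for sensor, limit in MAX_DESYNC_MS.items()
    )

# Example: a seminar where the table-top microphone drifted by 80 ms.
print(recording_is_valid({"camera": 120, "tabletop_microphone": 80}))  # False
```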
Annotations CLEAR 2007 Annotations: • Audio: transcriptions, acoustic events • Video: facial features, head pose
Audio Annotations • Orthographic transcriptions • 2 channels • Based on near-field recordings (close-talking microphones) • Compared with one far-field recording • Speaker turns • Non-verbal events (laughs, pauses, ...) • See: S. Burger, “The CHIL RT07 Evaluation Data” • Acoustic events • Based on one microphone array channel • 15 categories of sounds (enumerated in the sketch below): • Speech, door slam, step, chair moving, cup jingle, applause, laugh, key jingle, cough, keyboard, phone, music, knock, paper wrapping, unknown
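The 15 acoustic event categories lend themselves to an enumeration, so annotation labels can be validated programmatically. A small sketch; the class name AcousticEvent and the parse_event helper are hypothetical, the labels follow the slide.

```python
from enum import Enum

class AcousticEvent(Enum):
    """The 15 acoustic event categories of the CLEAR 2007 annotations."""
    SPEECH = "speech"
    DOOR_SLAM = "door slam"
    STEP = "step"
    CHAIR_MOVING = "chair moving"
    CUP_JINGLE = "cup jingle"
    APPLAUSE = "applause"
    LAUGH = "laugh"
    KEY_JINGLE = "key jingle"
    COUGH = "cough"
    KEYBOARD = "keyboard"
    PHONE = "phone"
    MUSIC = "music"
    KNOCK = "knock"
    PAPER_WRAPPING = "paper wrapping"
    UNKNOWN = "unknown"

def parse_event(label: str) -> AcousticEvent:
    """Map a free-text annotation label to a known category."""
    try:
        return AcousticEvent(label.strip().lower())
    except ValueError:
        return AcousticEvent.UNKNOWN

print(parse_event("Door slam"))  # AcousticEvent.DOOR_SLAM
```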
Video Annotations • Facial Features (Face Detection, Person Tracking) • annotations every second • all attendees • 4 camera views • facial labels (see the record sketch below): • head centroid • left and right eyes • nose bridge • face bounding box • 2D head centroids combined into a 3D “ground truth” • Person Identification Database • 28 persons to identify • audio-visual excerpts for each person ID • video labels every 200 ms
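A sketch of what one facial-feature annotation record could look like, given the labels above (one record per attendee, per camera view, at 1 s intervals). All type and field names are hypothetical; the actual CHIL annotation file format is not shown here.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

Point2D = Tuple[float, float]

@dataclass
class FaceAnnotation:
    """One facial-feature annotation for one attendee in one camera view."""
    timestamp_s: float                 # annotation instant (1 s granularity)
    camera_id: int                     # one of the 4 corner camera views
    person_id: str
    head_centroid: Point2D             # 2D head centroid in this view
    left_eye: Optional[Point2D]        # None when occluded / not visible
    right_eye: Optional[Point2D]
    nose_bridge: Optional[Point2D]
    face_bbox: Tuple[float, float, float, float]  # x, y, width, height

# The 2D head centroids from the 4 views are what gets combined into a
# 3D "ground truth" head position for person tracking.
```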
Head Pose Data Set • Persons captured with different head orientations • standing in the middle of a CHIL room (ISL) • captured by the 4 corner cameras • Annotations: • Head bounding box • Head orientation: pan, tilt, roll (see the rotation sketch below) • 10 persons for development • 5 persons for evaluation
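Pan, tilt and roll angles are commonly turned into a 3x3 rotation matrix for scoring head pose estimates. A sketch, assuming a pan (about the vertical axis), then tilt, then roll composition order; that order is an assumption, not a documented CHIL convention.

```python
import numpy as np

def head_rotation_matrix(pan_deg: float, tilt_deg: float,
                         roll_deg: float) -> np.ndarray:
    """Compose a 3x3 rotation matrix from pan, tilt and roll (degrees)."""
    pan, tilt, roll = np.radians([pan_deg, tilt_deg, roll_deg])
    # Pan: rotation about the vertical (y) axis.
    Ry = np.array([[np.cos(pan), 0.0, np.sin(pan)],
                   [0.0, 1.0, 0.0],
                   [-np.sin(pan), 0.0, np.cos(pan)]])
    # Tilt: rotation about the lateral (x) axis.
    Rx = np.array([[1.0, 0.0, 0.0],
                   [0.0, np.cos(tilt), -np.sin(tilt)],
                   [0.0, np.sin(tilt), np.cos(tilt)]])
    # Roll: rotation about the viewing (z) axis.
    Rz = np.array([[np.cos(roll), -np.sin(roll), 0.0],
                   [np.sin(roll), np.cos(roll), 0.0],
                   [0.0, 0.0, 1.0]])
    return Ry @ Rx @ Rz

print(head_rotation_matrix(30.0, 0.0, 0.0).round(3))
```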
Evaluation package • The CLEAR 2007 evaluation package is publicly available through the ELRA catalog • Enables external players to evaluate their systems offline • For each of the evaluated technologies: • Data sets (development/evaluation) • Evaluation and scoring tools • Results of the official campaign
Conclusion • 9 technologies evaluated during the 3rd CHIL evaluation campaign • The CLEAR 2007 evaluation package is available through the ELRA catalog: http://catalog.elra.info/ • For more on the evaluations see: CLEAR 2007: http://www.clear-evaluation.org/ RT 2007: http://www.nist.gov/speech/tests/rt/