CHIL 2007+ Evaluation Package

Full Official Name: CHIL 2007+ Evaluation Package
Submission date: Jan. 24, 2014, 4:22 p.m.

The CHIL 2007+ Evaluation Package includes 1) the CHIL 2007 Evaluation Package (see ELRA-E0033) and 2) additional annotations created within the scope of the Metanet4u Project (ICT PSP No 270893), sponsored by the European Commission.

The CHIL 2007 Evaluation Package was produced within the CHIL Project (Computers in the Human Interaction Loop), an Integrated Project (IP 506909) under the European Commission's Sixth Framework Programme. The objective of the project was to create environments in which computers serve humans who focus on interacting with other humans, rather than having to attend to and be preoccupied with the machines themselves. Instead of computers operating in an isolated manner, with humans thrust into the loop of computers, the project aimed to put computers in the human interaction loop (CHIL).

In this context, the CHIL project produced the CHIL Seminars: scientific presentations given by students, faculty members, or invited speakers in the field of multimodal interfaces and speech processing. During the talks, videos of the speaker and the audience from 4 fixed cameras, frontal close-ups of the speaker, and close-talking and far-field microphone recordings of the speaker's voice and ambient sounds were captured.

The CHIL 2007 Evaluation Package consists of the following contents:
1) A set of audiovisual recordings of interactive seminars. The number of people present in each recording was between 3 and 7. The recordings were made between June and September 2006 according to the "CHIL Room Setup" specification.
2) Video annotations: the 3-D coordinates of each participant.
3) Orthographic transcriptions of speech, the identity of the speaker, and acoustic events.

The additional annotations were designed as a complementary extension of the CHIL 2007 Evaluation Package.
The list of annotation categories included in the database has been largely extended so that 1) the database can also be used with other speech technologies, and 2) it includes richer information about human activity. The set of additional annotations includes:
1) Movement
2) Individual focus of attention
3) Hand gestures
4) Head gestures
5) Spatial role labeling
6) Activity
7) Emotion
8) Named entities
9) Topics
10) Links between tiers, when more than one modality is required to resolve ambiguities.
The resulting extended database is called CHIL 2007+.
