DCASE is a data challenge whose purpose is to produce new scientific knowledge about methods for the detection and classification of sound scenes and events. It provides an open and reproducible evaluation platform, thus fostering collaborative research between participants. DCASE consists of independent tasks organized by teams who are responsible, within the schedule defined by the DCASE Steering Group, for producing the required data and instructions for participation, scoring the task submissions, providing a summary of the results, and presenting the results at the DCASE Workshop.
The DCASE Steering Group invites proposals to organize tasks in the DCASE 2024 Challenge. The goal of the DCASE 2024 Challenge is to foster gradual technology improvements through continuity of tasks, but also to bring novel research directions into the field, while keeping the number of tasks suitably low. The main evaluation criteria that the Steering Group will use to select tasks are 1) the scientific and technological importance of a task, 2) the quality, quantity, relevance, and readiness of the data to be used, and 3) the feasibility of the task proposal. In the interest of reducing overlap and fragmentation of tasks, organizers may be asked to amend their task proposals before final acceptance.

Collaborative tasks are strongly encouraged: task bidders from different labs who are preparing tasks on similar topics are encouraged to team up before the deadline to prepare stronger collaborative proposals. For that purpose, the DCASE mailing list or the DCASE Slack channel can be used to express interest in teaming up with other task bidders. Your email can be sent to the task coordinators (firstname.lastname@example.org, email@example.com), who will privately share the contact information of other task bidders interested in teaming up. The challenge coordinators can also provide initial feedback on early versions of a proposal if it is sent to them ahead of the proposal submission deadline (no feedback will be provided after November 1st).
Task bidders should submit a task proposal for review by the Steering Group. The proposal should be a maximum of two pages (PDF format) including the following information:
- Task name
- Keywords (e.g. tagging, classification, detection, timestamps, scenes, events, birds)
- Definition (one sentence, e.g. classify a short test audio clip into one of 40 predefined classes)
- Short description (incl. the research question the task is tackling; mention if it is a follow-up of a previous year's task)
- Dataset description: development, evaluation (short description, how much data is already available and prepared, how long it would take to prepare the rest; mention whether or not you allow external data/transfer learning)
- Evaluation method / metric
- Baseline system (2 sentences; describe the planned method if you do not have one from a previous challenge)
- Contact person (for main communication, website)
The proposal (both the full version and the public summary) should be sent to both DCASE Challenge coordinators, firstname.lastname@example.org and email@example.com, by 13 November 2023. The Steering Group will review the proposals and provide conditional decisions at the beginning of December. If task amendments are requested, the task acceptance decision will be finalized by the beginning of January 2024.
Pitching Session (optional)
The Steering Group will organize an optional pitching session during the workshop in Tampere. During this session, prospective organizers can present their proposal and get feedback from Steering Group members. If you are interested in participating in this session, please contact firstname.lastname@example.org before September 1st.
| Date | Milestone |
| --- | --- |
| 1 Sept 2023 | Registration to the pitching session |
| 1 Nov 2023 | Request for feedback from the coordinators |
| 13 Nov 2023 | Submission of task proposals |
| 1 Dec 2023 | Conditional decisions from the Steering Group |
| 15 Dec 2023 | Resubmission of conditionally accepted proposals |
| 4 Jan 2024 | Final notification of acceptance |
| 15 Jan 2024 | Task descriptions on website |
| 1 April 2024 | Challenge opening: datasets and baseline methods published |
| 1 June 2024 | Evaluation set release |
| 15 June 2024 | Challenge submission deadline |
| 30 June 2024 | Challenge results published |