DCASE is a data challenge whose purpose is to produce new scientific knowledge about methods for the detection and classification of sound scenes and events. It provides an open and reproducible evaluation platform, thus fostering collaborative research between participants. DCASE consists of independent tasks organized by teams who are responsible, within the schedule defined by the DCASE Steering Group, for producing the required data and instructions for participation, scoring the task submissions, providing a summary of the results, and presenting the results at the DCASE Workshop.

The DCASE Steering Group invites proposals to organize tasks in the DCASE 2025 Challenge. The goal of the DCASE 2025 Challenge is to foster gradual technology improvements through continuity of tasks, but also to bring novel research directions into the field, while keeping the number of tasks suitably low. The main evaluation criteria that the Steering Group will use to select tasks are 1) the scientific and technological importance of the task, 2) the quality, quantity, relevance, and readiness of the data to be used, and 3) the feasibility of the task proposal. To reduce overlap and fragmentation of tasks, organizers may be asked to amend their task proposals before final acceptance.

Collaborative tasks are strongly encouraged: task bidders from different labs preparing tasks on similar topics are encouraged to team up before the deadline to prepare stronger collaborative proposals. For that purpose, the DCASE mailing list or the DCASE Slack channel can be used to express interest in teaming up with other task bidders. You can also email the challenge coordinators (annamaria.mesaros@tuni.fi, romain.serizel@loria.fr), who will privately share the contact information of other task bidders interested in teaming up. The challenge coordinators can also provide initial feedback on early versions of a proposal if it is sent to them ahead of the proposal submission deadline (no feedback will be provided after 10 November).
Proposal
Task bidders should submit a task proposal for review by the Steering Group. The proposal should be a maximum of two pages (PDF format) and include the following information:
- Task name
- Coordinators
- Keywords (e.g., tagging, classification, detection, timestamps, scenes, events, birds)
- Definition (one sentence, e.g., classify a short test audio clip into one of 40 predefined classes)
- Short description (including the research question the task addresses; mention if it is a follow-up of a previous year's task)
- Dataset description: development and evaluation sets (short description; how much data is already available and prepared; how long it would take to prepare the rest; state whether external data/transfer learning is allowed)
- Evaluation method / metric
- Baseline system (two sentences; describe the planned method if you do not have one from a previous challenge)
- Contact person (for main communication, website)
Submission
Potential task organizers should declare their intention to submit a task by 13 November in an email including:
- Task name
- Task description in 2 sentences
- Task organization team
The proposal (both the full version and the public summary) should be sent to both DCASE Challenge coordinators, annamaria.mesaros@tuni.fi and romain.serizel@loria.fr, by 16 November 2024. The Steering Group will review the proposals and provide conditional decisions at the beginning of December. If task amendments are requested, the task acceptance decision will be finalized by the beginning of January 2025.
Note that we will not accept a task proposal if we have not received your intent to submit by 13 November.
Tentative schedule
| Date | Milestone |
| --- | --- |
| 10 Nov 2024 | Request for feedback from the coordinators |
| 13 Nov 2024 | Declaration of intent to submit (updated) |
| 16 Nov 2024 | Submission of task proposals (updated) |
| 1 Dec 2024 | Conditional decisions from the Steering Group |
| 15 Dec 2024 | Resubmission of conditionally accepted proposals |
| 6 Jan 2025 | Final notification of acceptance |
| 20 Jan 2025 | Task descriptions on website |
| 1 Apr 2025 | Challenge opening: datasets and baseline methods published |
| 1 Jun 2025 | Evaluation set release |
| 15 Jun 2025 | Challenge submission deadline |
| 30 Jun 2025 | Challenge results published |