DCASE is a data challenge whose purpose is to produce new scientific knowledge about methods for the detection and classification of sound scenes and events. It provides an open and reproducible evaluation platform, thus fostering collaborative research among participants. DCASE consists of independent tasks organized by teams who are responsible, within the schedule defined by the DCASE Steering Group, for producing the required data and instructions for participation, scoring the task submissions, summarizing the results, and presenting them at the DCASE Workshop.
The DCASE Steering Group invites proposals to organize tasks in the DCASE2023 Challenge. The goal of the DCASE2023 Challenge is to foster gradual technology improvements through continuity of tasks, but also to bring novel research directions into the field, while keeping the number of tasks suitably low. The main evaluation criteria that the Steering Group will use to select tasks are 1) the scientific and technological importance of a task, 2) the quality, quantity, relevance, and readiness of the data to be used, and 3) the feasibility of the task proposal. In the interest of reducing overlap and fragmentation of tasks, organizers may be requested to amend their task proposals before final acceptance. Collaborative tasks are strongly encouraged: task bidders from different labs who are preparing tasks on similar topics are encouraged to team up before the deadline to prepare stronger collaborative proposals. For that purpose, the DCASE mailing list (firstname.lastname@example.org) can be used to express interest in teaming up with other task bidders, or an email can be sent to the task coordinators (email@example.com, firstname.lastname@example.org), who will privately share the contact information of other task bidders interested in teaming up.
Task bidders should submit a task proposal for review by the Steering Group. The proposal should be a maximum of two pages (PDF format) including the following information:
- Task name
- Keywords (e.g., tagging, classification, detection, timestamps, scenes, events, birds)
- Definition (one sentence, e.g., classify a short test audio clip into one of 40 predefined classes)
- Short description (including the research question the task tackles; mention if it is a follow-up of a previous year's task)
- Dataset description: development and evaluation sets (a short description, how much data is already available and prepared, how long it would take to prepare the rest; mention whether or not you allow external data / transfer learning)
- Evaluation method / metric (see the illustrative sketch after this list)
- Baseline system (two sentences; describe the planned method if you do not have a system from a previous challenge)
- Contact person (for main communication, website)
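To make the expected level of detail for the evaluation method concrete, below is a minimal, purely hypothetical sketch of the kind of metric a proposal might specify, here macro-averaged accuracy over predefined classes. The function name, class labels, and clip identifiers are illustrative assumptions, not part of this call; task organizers define their own metrics and tooling.

```python
# Hypothetical illustration only: macro-averaged accuracy, one common choice
# of metric for a classification-style task. All names here are assumptions.
from collections import defaultdict


def macro_accuracy(reference: dict, predictions: dict) -> float:
    """Mean of per-class accuracies over all classes present in the reference.

    reference:   maps clip ID -> ground-truth label
    predictions: maps clip ID -> predicted label
    """
    correct = defaultdict(int)
    total = defaultdict(int)
    for clip_id, true_label in reference.items():
        total[true_label] += 1
        if predictions.get(clip_id) == true_label:
            correct[true_label] += 1
    per_class = [correct[label] / total[label] for label in total]
    return sum(per_class) / len(per_class)


# Toy example: three evaluation clips, two classes.
reference = {"clip1": "park", "clip2": "street", "clip3": "park"}
predictions = {"clip1": "park", "clip2": "park", "clip3": "park"}
print(f"Macro accuracy: {macro_accuracy(reference, predictions):.3f}")  # 0.500
```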
Public summary of the proposal
This year, the DCASE Steering Group will collect feedback from the DCASE community about the submitted proposals, which will be taken into account when evaluating them. To implement this, task bidders should prepare a half-page summary of their proposal, which will be made publicly available for collecting feedback. The public summary should be a shortened version of the full proposal, with the title “Public summary for evaluation”. The public summaries will be evaluated blindly, and therefore they should not include the names of the bidders. One of the main goals of the public feedback collection will be to estimate the interest of the community in each proposal.
The proposal (both the full version and the public summary, as separate PDF files) should be sent to both DCASE Challenge coordinators, email@example.com and firstname.lastname@example.org, by 13 November 2022. The Steering Group will review the proposals and provide conditional decisions at the beginning of December. In case task amendments are requested, the task acceptance decision will be finalized by the beginning of January 2023.
| Date | Milestone |
| --- | --- |
| 13 Nov 2022 | Submission of task proposals |
| 14-20 Nov 2022 | Public feedback collection |
| 9 Dec 2022 | Conditional decisions from the Steering Group |
| 21 Dec 2022 | Resubmission of conditionally accepted proposals |
| 4 Jan 2023 | Final notification of acceptance |
| 15 Jan 2023 | Task descriptions on website |
| 1 Mar 2023 | Challenge opening: datasets and baseline methods published |
| 1 May 2023 | Evaluation set release |
| 15 May 2023 | Challenge submission deadline |
| 31 May 2023 | Challenge results published |