Statement of interest
You do not need to register in order to participate. All challenge submissions made before the deadline will be accepted.
All participants are required to submit a technical report that describes the system in sufficient detail. The technical report should be in PDF format, with a maximum length of (4+1) pages. The report is submitted at the same time as the system output and metadata files, as part of the submission package.
In some tasks, a file list containing train/test folds is provided. To obtain fully comparable results between participants, please report development set results using the provided setup. Furthermore, if you are using this dataset for research, please also publish results using the given setup; this makes it possible to compare the performance of different methods and approaches fairly.
- Manipulation of provided training and development data is allowed. Task specific modifications of this rule are indicated in the individual task pages.
- Participants are not allowed to make subjective judgments of the evaluation data, nor to annotate it.
- The evaluation dataset cannot be used to train the submitted system. The use of statistics about the evaluation data in the decision making is also forbidden.
- Persons from the institutes of the DCASE organizing teams are eligible to participate in the challenge, provided they have an arrangement that prevents them from having any unfair advantage in the challenge. This arrangement should be briefly documented in the technical report describing the submission.
- By submitting their results, participants grant the organizers the right to evaluate their system output and publish the results on the challenge web page. In order to support open science and keep the challenge fair, withdrawing a submission is allowed only before the results are published. Once the evaluation results are published, withdrawal is not allowed.
- Other task specific rules may apply. These are indicated and highlighted in the individual task pages.
- Organizers reserve the right to make changes to the rules and schedule of the challenge.
- System outputs that do not respect the challenge rules will be evaluated on request, but they will not be officially included in the challenge rankings.
Use of external data
Information on the use of external data is available on each task page.
Where allowed, use of external data must obey the following conditions:
- The external resource is clearly referenced and freely accessible to any other research group in the world. External data refers to public datasets or trained models; the datasets/models must be public and freely available before the 1st of April 2020.
- Participants submit at least one system without external training data so that the contribution of such resources can be studied. This condition applies only when external audio datasets are used; it does not apply when the external data consists of pretrained models or embeddings. The list of external data sources used in training must be clearly indicated in the technical report.
- Participants inform the organizers in advance about such data sources, so that all competitors know about them and have an equal opportunity to use them. Please send an email to the task coordinators; we will update the list of external datasets on the webpage accordingly. Once the evaluation set is published, the list of allowed external data resources is locked (no further external sources are allowed).
Participants can choose to participate in a single task or in multiple tasks.
Official challenge submission consists of:
- System output file (*.csv)
- Technical report explaining in sufficient detail the method (*.pdf)
- Metadata file (*.yaml)
Participants are requested to submit their results in the required submission package. Participants may train their system on any subset, or the complete set, of the available development dataset.
Multiple system outputs can be submitted (maximum 4 per participant). If submitting multiple systems, the individual system output files should be packaged into the same submission package. Please follow the instructions carefully when forming the submission package.
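A submission package of this kind can be assembled with a few lines of Python. This is only a sketch: the file names below are placeholders, and the actual naming scheme for output, metadata, and report files is defined in the challenge submission instructions, not in these rules.

```python
import zipfile
from pathlib import Path

def build_submission_package(files, archive_path):
    """Bundle the required submission files into one zip archive."""
    archive_path = Path(archive_path)
    with zipfile.ZipFile(archive_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for f in map(Path, files):
            if not f.exists():
                raise FileNotFoundError(f"missing submission file: {f}")
            zf.write(f, arcname=f.name)  # store flat, without directories
    return archive_path

# Hypothetical file names -- replace with the names required by the
# official submission instructions.
submission_files = [
    "system_output.csv",     # system output file
    "submission.meta.yaml",  # metadata file
    "technical_report.pdf",  # technical report
]
for name in submission_files:
    Path(name).touch()       # placeholder files for this sketch

package = build_submission_package(submission_files, "submission_package.zip")
print(zipfile.ZipFile(package).namelist())
```

When submitting multiple systems, the same pattern applies: each system's output and metadata files simply go into the same `files` list before zipping.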
The evaluation of the submitted results will be done by the organizers. Each task has its own metrics; for details, please refer to the individual task pages.
The evaluation results will be announced online and discussed at the DCASE 2020 Workshop.