Rules


Challenge

Statement of interest

You do not need to register in order to participate. All challenge submissions made before the deadline will be accepted.

Technical report

All participants are required to submit a technical report that describes the system in sufficient detail. The technical report should be in PDF format, with a maximum length of (4+1) pages. The report is submitted at the same time as the system output and metadata files, as part of the submission package.

Terms and conditions

By submitting to the challenge, participants agree that the system output will be evaluated and published, together with the results and the technical report, on the DCASE Challenge website under a CC-BY license.

Cross-validation setup

In some tasks, a file list containing train/test folds is provided. To obtain fully comparable results across participants, please report development set results using the provided setup. Furthermore, if you use this dataset for research, please also publish results using the given setup; this makes it possible to compare the performance of different methods and approaches fairly.
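For illustration, the sketch below shows how such a fold list might be used to produce per-fold development results. It is a minimal sketch only: the fold file path, its column layout (filename, fold, split), and the train_fn/eval_fn callables are assumptions for illustration, not part of the official setup; the actual file format is described on each task page.

    import csv
    from collections import defaultdict

    # Hypothetical fold file; the real path and column layout are
    # task-specific (see the individual task pages).
    FOLD_FILE = "evaluation_setup/folds.csv"

    def load_folds(path):
        """Group file names by fold index and train/test split."""
        folds = defaultdict(lambda: {"train": [], "test": []})
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                folds[int(row["fold"])][row["split"]].append(row["filename"])
        return folds

    def cross_validate(folds, train_fn, eval_fn):
        """Train and evaluate once per fold; return per-fold scores."""
        return {fold: eval_fn(train_fn(split["train"]), split["test"])
                for fold, split in sorted(folds.items())}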

Challenge rules

  • Manipulation of the provided training and development data is allowed. Task-specific modifications of this rule are indicated on the individual task pages.
  • Participants are not allowed to make subjective judgments of the evaluation data, nor to annotate it.
  • The evaluation dataset cannot be used to train the submitted system. Using statistics about the evaluation data in decision making is also forbidden.
  • Organizers of a challenge task should not participate in the task. Participation of other researchers from the institute that is (co-)organizing the task is allowed, provided that there is a clear arrangement that prevents them from receiving an unfair advantage in the task. The arrangement should be documented in the technical report.
  • By submitting their results, participants grant the organizers the right to evaluate their system output and publish the results on the challenge web page. To support open science and keep the challenge fair, withdrawing a submission is allowed only before the results are published. Once the evaluation results are published, withdrawal is not allowed.
  • Other task-specific rules may apply. These are indicated and highlighted on the individual task pages.
  • Task organizers can exclude a submission if it violates the task rules.
  • System outputs that do not respect the challenge rules will be evaluated on request, but they will not be officially included in the challenge rankings.
  • Organizers reserve the right to make changes to the rules and schedule of the challenge.

Use of external data

Information on the use of external data is available on each task page.

Where allowed, use of external data must obey the following conditions:

  • Each external resource used is clearly referenced and freely accessible to any other research group in the world. External data refers to public datasets or trained models. The datasets/models must be public and freely available before 1 April 2020.
  • Participants submit at least one system trained without external audio data, so that the contribution of such resources can be studied. This condition applies only when external audio datasets are used; it does not apply when the external data consists of pretrained models or embeddings. The external data sources used in training must be clearly listed in the technical report.
  • Participants inform the organizers about such data sources in advance, so that all competitors know about them and have an equal opportunity to use them. Please send an email to the task coordinators; the list of external datasets on the webpage will be updated accordingly. Once the evaluation set is published, the list of allowed external data resources is locked, and no further external sources are allowed.

Submission

Participants can choose to participate in a single task or in multiple tasks.

An official challenge submission consists of:

  • System output file (*.csv)
  • Technical report explaining in sufficient detail the method (*.pdf)
  • Metadata file (*.yaml)

Participants are requested to submit results in the required submission package format. Participants may train their system on any subset, or the complete set, of the available development dataset.

Multiple system outputs can be submitted (maximum 4 per participant). If submitting multiple systems, the individual system output files should be packaged into the same submission package. Please follow the instructions carefully when forming the submission package; a minimal packaging sketch is shown below.
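As an illustration only, the following sketch forms such a package with Python's standard library. The file names below are hypothetical placeholders; the actual naming scheme and package layout are defined in the official submission instructions.

    import zipfile
    from pathlib import Path

    # Hypothetical file names; the real naming scheme is given in the
    # official submission instructions.
    SYSTEM_OUTPUTS = ["Lastname_Inst_task1_1.output.csv",
                      "Lastname_Inst_task1_2.output.csv"]   # up to 4 systems
    METADATA = ["Lastname_Inst_task1_1.meta.yaml",
                "Lastname_Inst_task1_2.meta.yaml"]           # one per system
    REPORT = "Lastname_Inst_task1.technical_report.pdf"

    def build_package(archive="Lastname_Inst_task1.zip"):
        """Bundle all system outputs, metadata, and the report into one zip."""
        with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as z:
            for path in [*SYSTEM_OUTPUTS, *METADATA, REPORT]:
                if not Path(path).exists():
                    raise FileNotFoundError(f"missing required file: {path}")
                z.write(path)

    if __name__ == "__main__":
        build_package()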

Evaluation

The evaluation of the submitted results will be done by the organizers. Each task has its own metrics; for details please refer to the individual task pages.

The evaluation results will be announced online and discussed at the DCASE 2020 Workshop.