Task description
The goal of this task is to identify whether a machine is normal or anomalous using only normal sound data under domain-shifted conditions. The main difference from DCASE 2021 Task 2 is that the domain (source or target) of the data is not given during evaluation. Participants are therefore expected to develop domain-generalization techniques whose output anomaly scores are not affected by domain shifts.
A more detailed task description can be found on the task description page.
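As a rough illustration of this setting (a minimal sketch, not the official baseline or any submitted system), the snippet below trains an unsupervised model on normal clips only and scores test clips by reconstruction error; the feature arrays and the PCA model are placeholder assumptions standing in for real acoustic features and a real detector.

```python
import numpy as np
from sklearn.decomposition import PCA

# Minimal sketch of the task setting (not the official baseline):
# learn a model of *normal* machine sounds only, then score test clips
# by reconstruction error. The feature arrays below are placeholders
# for pre-computed clip features such as averaged log-mel frames.
rng = np.random.default_rng(0)
normal_features = rng.normal(size=(1000, 128))  # training data: normal clips only
test_features = rng.normal(size=(10, 128))      # test clips; domain (source/target) unknown

# Fit a low-dimensional model of the normal data.
model = PCA(n_components=16).fit(normal_features)

# Anomaly score = reconstruction error; larger means "more anomalous".
reconstructed = model.inverse_transform(model.transform(test_features))
anomaly_scores = np.mean((test_features - reconstructed) ** 2, axis=1)
print(anomaly_scores)
```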
Teams ranking
The table includes only the best-performing system from each submitting team. AUC and pAUC values are given in percent, separately for the evaluation dataset (Eval) and the development dataset (Dev).
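The AUC and pAUC columns measure how well a system's anomaly scores rank anomalous clips above normal ones, and the official score used for ranking aggregates them with a harmonic mean. The sketch below, assuming synthetic labels and scores, shows how such values can be computed with scikit-learn; the exact per-section and per-domain aggregation follows the official evaluation scripts, not this simplified example.

```python
import numpy as np
from scipy.stats import hmean
from sklearn.metrics import roc_auc_score

# Illustrative sketch of the reported metrics (not the official evaluation scripts).
# y_true: 1 = anomalous, 0 = normal; y_score: anomaly scores produced by a system.
rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, size=200)
y_score = y_true + rng.normal(scale=1.0, size=200)

auc = roc_auc_score(y_true, y_score)                # AUC column
pauc = roc_auc_score(y_true, y_score, max_fpr=0.1)  # pAUC column (low-FPR region)

# The official score aggregates AUC and pAUC values over machine types,
# sections, and domains with a harmonic mean; a single pair is shown here.
print(f"AUC={100 * auc:.2f}  pAUC={100 * pauc:.2f}  harmonic mean={100 * hmean([auc, pauc]):.2f}")
```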
Submission Code | Technical Report | Official Rank | Official Score | Eval ToyCar (AUC) | Eval ToyCar (pAUC) | Eval ToyTrain (AUC) | Eval ToyTrain (pAUC) | Eval Fan (AUC) | Eval Fan (pAUC) | Eval Gearbox (AUC) | Eval Gearbox (pAUC) | Eval Bearing (AUC) | Eval Bearing (pAUC) | Eval Slider (AUC) | Eval Slider (pAUC) | Eval Valve (AUC) | Eval Valve (pAUC) | Dev ToyCar (AUC) | Dev ToyCar (pAUC) | Dev ToyTrain (AUC) | Dev ToyTrain (pAUC) | Dev Fan (AUC) | Dev Fan (pAUC) | Dev Gearbox (AUC) | Dev Gearbox (pAUC) | Dev Bearing (AUC) | Dev Bearing (pAUC) | Dev Slider (AUC) | Dev Slider (pAUC) | Dev Valve (AUC) | Dev Valve (pAUC) |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
DCASE2022_baseline_task2_MNV2 | DCASE2022baseline2022 | 68 | 54.01722421174122 ± 0.0002954806671840702 | 42.79 | 53.44 | 51.22 | 50.98 | 50.34 | 55.22 | 51.34 | 48.49 | 58.23 | 52.16 | 62.42 | 53.07 | 72.77 | 65.16 | 55.54 | 52.27 | 51.57 | 51.51 | 59.48 | 56.89 | 62.70 | 56.03 | 60.25 | 57.14 | 51.69 | 54.67 | 62.14 | 62.41 | |
Bai_JLESS_task2_1 | BaiJLESS2022 | 26 | 63.948583467398024 ± 0.0003628043584427405 | 92.17 | 76.25 | 51.78 | 51.22 | 46.52 | 50.73 | 71.80 | 62.91 | 64.02 | 55.80 | 77.31 | 65.19 | 83.15 | 68.73 | 82.50 | 64.68 | 64.52 | 55.60 | 67.28 | 67.25 | 86.77 | 72.67 | 79.82 | 61.04 | 86.73 | 71.41 | 91.63 | 80.01 | |
Kuroyanagi_NU-HDL_task2_2 | KuroyanagiNU-HDL2022 | 5 | 68.22312028564296 ± 0.0003608131407524257 | 81.67 | 72.16 | 55.02 | 51.78 | 57.19 | 53.02 | 84.51 | 65.10 | 72.30 | 57.90 | 80.62 | 58.09 | 91.01 | 83.23 | 79.50 | 63.49 | 69.20 | 51.90 | 68.27 | 65.16 | 81.25 | 67.65 | 77.81 | 63.87 | 87.41 | 73.64 | 91.33 | 90.05 | |
LEE_KNU_task2_1 | LEEKNU2022 | 82 | 49.26732013587548 ± 0.00023044907066044446 | 53.35 | 51.66 | 45.06 | 49.79 | 49.71 | 50.27 | 49.71 | 49.50 | 47.57 | 49.93 | 47.46 | 49.28 | 49.32 | 51.78 | 89.38 | 75.09 | 79.74 | 63.55 | 90.64 | 78.73 | 91.45 | 80.92 | 95.40 | 75.02 | 88.07 | 75.06 | 87.40 | 75.02 | |
Narita_AIT_task2_2 | NaritaAIT2022 | 69 | 53.84966232209122 ± 0.000292419115994451 | 60.99 | 56.58 | 51.03 | 50.39 | 58.25 | 54.56 | 53.37 | 54.03 | 56.33 | 50.51 | 49.68 | 51.77 | 52.31 | 53.47 | 81.44 | 61.58 | 62.88 | 60.51 | 34.90 | 61.65 | 81.74 | 61.67 | 77.33 | 72.39 | 80.39 | 68.05 | 74.19 | 60.73 | |
Du_NERCSLIP_task2_1 | DuNERCSLIP2022 | 33 | 63.30309472782353 ± 0.0002976991866654528 | 63.81 | 59.09 | 58.13 | 51.68 | 53.00 | 51.60 | 82.48 | 60.29 | 67.11 | 54.22 | 67.35 | 56.49 | 87.17 | 75.85 | 80.09 | 63.52 | 69.06 | 58.73 | 90.34 | 80.25 | 86.70 | 66.71 | 86.66 | 67.75 | 92.16 | 83.38 | 90.62 | 83.29 | |
Jinhyuk_SNU_task2_1 | JinhyukSNU2022 | 78 | 51.02701922567823 ± 0.00029370058470223036 | 51.58 | 53.99 | 42.74 | 49.03 | 41.33 | 50.88 | 58.43 | 51.27 | 59.48 | 53.77 | 58.51 | 53.66 | 49.23 | 50.97 | 49.15 | 45.50 | 44.26 | 43.59 | 55.14 | 54.69 | 56.00 | 56.00 | 47.30 | 47.28 | 45.05 | 43.94 | 47.34 | 47.91 | |
Hu_NJU_task2_1 | HuNJU2022 | 65 | 55.07461026550378 ± 0.0003040740154131298 | 69.31 | 56.58 | 50.23 | 50.99 | 49.11 | 51.18 | 63.70 | 54.36 | 62.48 | 56.76 | 59.69 | 52.65 | 46.49 | 49.66 | 76.28 | 55.98 | 65.55 | 53.96 | 59.98 | 56.22 | 70.49 | 60.50 | 57.48 | 55.01 | 81.11 | 64.49 | 62.77 | 57.83 | |
Wilkinghoff_FKIE_task2_1 | WilkinghoffFKIE2022 | 28 | 63.724437030531156 ± 0.000372854346756348 | 79.69 | 65.59 | 61.14 | 56.16 | 40.37 | 48.88 | 82.74 | 58.66 | 75.91 | 65.40 | 67.10 | 55.12 | 84.54 | 70.37 | 79.60 | 63.99 | 73.80 | 55.77 | 88.48 | 78.88 | 86.79 | 68.64 | 74.10 | 63.38 | 90.48 | 75.81 | 86.58 | 80.34 | |
Wei_HEU_task2_2 | WeiHEU2022 | 14 | 67.12235197698574 ± 0.0003768613481780468 | 97.17 | 88.16 | 57.08 | 53.13 | 49.46 | 51.72 | 85.17 | 60.84 | 70.46 | 58.32 | 74.39 | 60.99 | 84.47 | 67.11 | 82.75 | 60.14 | 69.05 | 59.37 | 69.70 | 59.59 | 81.02 | 65.29 | 69.47 | 54.25 | 89.71 | 76.08 | 93.28 | 72.92 | |
Guan_HEU_task2_4 | GuanHEU2022 | 6 | 68.0372804969291 ± 0.00039326172923566144 | 90.90 | 78.42 | 61.66 | 55.16 | 55.35 | 53.72 | 81.56 | 58.08 | 67.23 | 52.71 | 75.15 | 60.63 | 89.29 | 79.23 | 84.12 | 63.48 | 71.06 | 60.23 | 77.50 | 65.93 | 84.13 | 66.68 | 77.14 | 66.55 | 90.63 | 78.20 | 93.20 | 72.46 | |
Li_CTRI_task2_1 | LiCTRI2022 | 59 | 58.12785559862876 ± 0.0002695870626290191 | 75.84 | 59.32 | 48.76 | 49.93 | 40.15 | 50.15 | 69.07 | 53.18 | 63.30 | 51.01 | 64.95 | 51.74 | 81.71 | 73.11 | 80.35 | 57.21 | 72.38 | 52.78 | 65.49 | 61.14 | 77.96 | 64.19 | 66.88 | 56.04 | 84.48 | 69.36 | 73.16 | 60.35 | |
Morita_SECOM_task2_1 | MoritaSECOM2022 | 15 | 66.82507994516816 ± 0.0003343147179148136 | 92.52 | 74.39 | 58.30 | 54.78 | 50.27 | 52.52 | 79.90 | 59.71 | 74.06 | 63.18 | 68.54 | 62.22 | 84.96 | 69.42 | 89.55 | 71.91 | 68.58 | 50.36 | 78.26 | 63.83 | 85.07 | 75.64 | 64.44 | 58.21 | 90.82 | 72.99 | 94.05 | 90.64 | |
Yamashita_GU_task2_3 | YamashitaGU2022 | 39 | 62.387167480540676 ± 0.00028161700622855696 | 76.86 | 68.41 | 55.68 | 49.19 | 46.68 | 49.88 | 75.61 | 59.78 | 62.97 | 52.53 | 69.83 | 57.82 | 86.10 | 73.07 | 80.25 | 54.49 | 77.16 | 54.19 | 66.27 | 56.57 | 76.52 | 61.55 | 62.94 | 55.89 | 77.32 | 65.36 | 83.92 | 75.23 | |
CHO_SG_task2_1 | CHOSG2022 | 67 | 54.468778744613665 ± 0.0003074487298355346 | 72.27 | 56.57 | 54.74 | 51.63 | 49.55 | 50.78 | 63.57 | 55.41 | 52.65 | 51.27 | 47.70 | 50.74 | 53.74 | 54.42 | 46.56 | 49.26 | 56.16 | 52.77 | 47.20 | 53.47 | 59.80 | 55.65 | 56.00 | 52.87 | 74.58 | 63.16 | 61.77 | 60.39 | |
Li_JAIST_task2 | LiJAIST2022 | 83 | 48.93884539917759 ± 0.0002825946078999199 | 58.61 | 56.48 | 47.77 | 50.47 | 45.13 | 50.09 | 44.38 | 48.53 | 43.61 | 50.33 | 32.41 | 47.97 | 82.78 | 75.78 | 54.06 | 50.35 | 43.38 | 49.70 | 59.67 | 54.43 | 53.31 | 52.48 | 62.09 | 55.29 | 44.45 | 49.80 | 80.90 | 74.47 | |
Gou_UESTC_task2_4 | GouUESTC2022 | 51 | 59.16904508333896 ± 0.0002810411852165545 | 68.89 | 65.40 | 51.23 | 50.50 | 43.26 | 49.41 | 78.54 | 65.82 | 57.49 | 52.30 | 60.97 | 53.67 | 81.11 | 68.44 | 70.96 | 62.07 | 55.03 | 53.49 | 70.17 | 69.61 | 81.69 | 63.25 | 66.77 | 57.34 | 68.61 | 57.60 | 75.07 | 67.91 | |
PENG_NJUPT_task2_1 | PENGNJUPT2022 | 57 | 58.47341574821306 ± 0.0002990941526070191 | 57.29 | 57.37 | 48.52 | 51.03 | 51.04 | 51.16 | 58.64 | 51.13 | 62.47 | 54.62 | 75.49 | 58.51 | 78.03 | 71.20 | 55.55 | 51.02 | 56.84 | 51.95 | 59.26 | 56.68 | 56.38 | 54.52 | 67.42 | 59.08 | 84.36 | 63.45 | 80.88 | 72.00 | |
Nejjar_ETH_task2_1 | NejjarETH2022 | 32 | 63.34598205922427 ± 0.00033579696860218286 | 89.62 | 79.68 | 61.70 | 54.23 | 48.90 | 50.76 | 71.71 | 59.57 | 68.94 | 58.63 | 60.14 | 52.73 | 76.28 | 62.45 | 87.50 | 68.34 | 64.16 | 53.89 | 80.06 | 70.76 | 85.85 | 71.23 | 74.91 | 64.75 | 80.93 | 68.92 | 83.53 | 71.66 | |
Verbitskiy_DS_task2_1 | VerbitskiyDS2022 | 27 | 63.83127217235942 ± 0.0003519023225400419 | 71.62 | 65.38 | 52.16 | 51.01 | 44.81 | 50.95 | 86.45 | 74.88 | 65.44 | 57.07 | 72.53 | 57.33 | 92.60 | 80.80 | 75.01 | 59.82 | 61.47 | 52.03 | 64.37 | 66.88 | 69.97 | 59.86 | 80.91 | 69.41 | 81.78 | 63.58 | 89.62 | 79.29 | |
Cohen_Technion_task2_2 | CohenTechnion2022 | 63 | 56.763333657296656 ± 0.0002975695814466658 | 75.61 | 65.47 | 50.50 | 50.02 | 43.47 | 50.14 | 67.34 | 51.86 | 64.40 | 58.88 | 63.78 | 53.55 | 55.09 | 51.75 | 77.70 | 55.22 | 55.96 | 49.89 | 62.80 | 55.19 | 69.72 | 56.91 | 59.51 | 50.99 | 80.22 | 59.78 | 54.43 | 50.34 | |
Deng_THU_task2_4 | DengTHU2022 | 7 | 67.62358685887781 ± 0.00038777202086470904 | 88.48 | 70.07 | 60.00 | 53.70 | 44.48 | 50.01 | 85.33 | 66.41 | 71.30 | 60.33 | 81.55 | 67.04 | 89.68 | 84.10 | 87.09 | 66.28 | 81.47 | 69.14 | 88.07 | 77.00 | 89.11 | 75.58 | 87.21 | 75.10 | 91.94 | 84.87 | 98.17 | 94.83 | |
Liu_BUPT_task2_2 | LiuBUPT2022 | 49 | 59.767720255838206 ± 0.00027644210188000623 | 53.83 | 57.79 | 55.34 | 51.27 | 43.14 | 50.25 | 69.68 | 51.03 | 69.97 | 57.28 | 71.37 | 59.82 | 85.14 | 80.49 | 78.48 | 56.66 | 62.40 | 54.65 | 71.34 | 58.11 | 76.72 | 59.61 | 68.31 | 56.65 | 88.28 | 73.78 | 83.97 | 70.11 | |
Kazakova_ITMO_task2_1 | KazakovaITMO2022 | 80 | 49.89612147323122 ± 0.0002773039817690667 | 58.35 | 54.92 | 51.25 | 50.68 | 48.80 | 50.47 | 52.70 | 49.56 | 39.56 | 49.89 | 39.76 | 49.96 | 62.81 | 55.73 | 51.56 | 53.23 | 52.11 | 52.84 | 57.85 | 54.60 | 50.13 | 53.15 | 47.47 | 53.81 | 50.10 | 56.40 | 17.68 | 49.27 | |
Kodua_ITMO_task2_1 | KoduaITMO2022 | 74 | 52.96330674177412 ± 0.00029557613542138784 | 53.66 | 51.39 | 48.84 | 49.30 | 41.08 | 48.93 | 59.15 | 51.66 | 56.48 | 54.50 | 63.19 | 52.88 | 61.24 | 51.89 | 56.25 | 51.75 | 55.47 | 52.13 | 59.31 | 52.22 | 61.08 | 53.33 | 55.48 | 51.47 | 68.85 | 56.83 | 59.08 | 50.82 | |
Liu_CQUPT_task2_4 | LiuCQUPT2022 | 1 | 70.97462888079512 ± 0.0003427217026837889 | 88.45 | 81.83 | 70.46 | 61.14 | 57.34 | 57.33 | 86.04 | 64.22 | 68.85 | 54.45 | 78.26 | 66.39 | 83.87 | 75.22 | 81.50 | 67.12 | 76.22 | 64.27 | 70.11 | 65.86 | 89.28 | 76.04 | 81.96 | 70.89 | 90.08 | 79.80 | 93.84 | 86.80 | |
Siang_NTHU_task2_1 | SiangNTHU2022 | 79 | 50.59102827288451 ± 0.00027303186490778593 | 57.84 | 52.44 | 44.31 | 49.56 | 46.32 | 50.42 | 53.15 | 50.25 | 38.27 | 50.91 | 58.50 | 51.98 | 60.33 | 58.42 | 78.39 | 61.24 | 64.60 | 55.08 | 70.73 | 61.34 | 82.91 | 65.22 | 75.63 | 65.16 | 89.87 | 76.54 | 88.56 | 79.06 | |
Tozicka_NSW_task2_1 | TozickaNSW2022 | 38 | 62.50004793576084 ± 0.00034772455431082163 | 88.28 | 74.95 | 58.44 | 51.01 | 42.75 | 52.14 | 87.23 | 72.17 | 60.35 | 55.15 | 62.65 | 52.16 | 74.38 | 70.55 | 88.24 | 67.80 | 72.19 | 63.80 | 63.51 | 60.40 | 80.22 | 64.80 | 70.66 | 61.00 | 78.06 | 62.60 | 88.73 | 76.20 | |
Almudevar_UZ_task2_2 | AlmudevarUZ2022 | 45 | 60.60499062666268 ± 0.00031377932481308624 | 89.58 | 77.07 | 47.34 | 49.83 | 47.90 | 52.47 | 75.43 | 56.23 | 64.84 | 54.36 | 68.26 | 55.78 | 67.92 | 55.97 | 82.62 | 56.20 | 60.16 | 51.18 | 70.45 | 60.20 | 73.88 | 55.34 | 63.86 | 59.15 | 89.83 | 77.74 | 61.77 | 53.07 | |
Jalalia_AIT_task2_1 | JalaliaAIT2022 | 77 | 51.338525836051105 ± 0.00029471041821909645 | 51.35 | 55.59 | 44.78 | 49.55 | 41.06 | 50.21 | 58.20 | 51.69 | 60.13 | 55.52 | 58.51 | 53.13 | 48.68 | 51.11 | 63.47 | 52.75 | 52.36 | 50.68 | 59.38 | 57.85 | 63.91 | 57.90 | 54.11 | 51.72 | 58.28 | 55.23 | 48.69 | 50.48 | |
Zorin_AIRI_task2_1 | ZorinAIRI2022 | 84 | 43.16915219364365 ± 0.0002620788289881202 | 37.17 | 49.65 | 45.02 | 49.74 | 43.23 | 49.83 | 40.98 | 49.49 | 38.40 | 50.13 | 40.83 | 48.68 | 39.49 | 48.92 | 54.64 | 52.77 | 60.43 | 51.17 | 54.28 | 50.10 | 56.27 | 50.32 | 46.42 | 48.66 | 63.60 | 53.43 | 47.75 | 51.18 | |
Venkatesh_MERL_task2_3 | VenkateshMERL2022 | 8 | 67.56529995172603 ± 0.000353158822539437 | 93.88 | 78.67 | 58.23 | 54.73 | 48.17 | 50.34 | 86.76 | 79.43 | 72.54 | 61.86 | 73.64 | 60.70 | 83.72 | 62.93 | 83.66 | 65.19 | 66.38 | 57.27 | 64.72 | 61.61 | 87.47 | 70.70 | 73.16 | 57.59 | 90.16 | 77.60 | 97.35 | 90.76 |
Supplementary metrics (recall, precision, and F1 score)
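These metrics are computed from hard normal/anomalous decisions rather than raw anomaly scores. The sketch below, with synthetic labels and a hypothetical threshold of 0.5, shows how recall, precision, and F1 score can be obtained with scikit-learn; each submitted system chooses its own decision threshold. Rows of 0.00 in the table can occur when a system's threshold marks few or no evaluation clips as anomalous.

```python
import numpy as np
from sklearn.metrics import f1_score, precision_score, recall_score

# Sketch only: the supplementary metrics require hard normal/anomalous
# decisions, obtained by thresholding the anomaly scores. The threshold
# below is a hypothetical choice; each submitted system picks its own.
rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, size=200)              # 1 = anomalous, 0 = normal
scores = y_true + rng.normal(scale=1.0, size=200)  # synthetic anomaly scores
y_pred = (scores > 0.5).astype(int)                # hypothetical decision threshold

print("precision:", precision_score(y_true, y_pred))
print("recall:   ", recall_score(y_true, y_pred))
print("F1 score: ", f1_score(y_true, y_pred))
```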
All supplementary metrics are computed on the evaluation dataset (values in %).

Submission Code | Technical Report | Official Rank | ToyCar (F1 score) | ToyCar (Recall) | ToyCar (Precision) | ToyTrain (F1 score) | ToyTrain (Recall) | ToyTrain (Precision) | Fan (F1 score) | Fan (Recall) | Fan (Precision) | Gearbox (F1 score) | Gearbox (Recall) | Gearbox (Precision) | Bearing (F1 score) | Bearing (Recall) | Bearing (Precision) | Slider (F1 score) | Slider (Recall) | Slider (Precision) | Valve (F1 score) | Valve (Recall) | Valve (Precision) |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
DCASE2022_baseline_task2_MNV2 | DCASE2022baseline2022 | 68 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 55.14 | 52.94 | 57.52 | 43.31 | 34.77 | 57.41 | 42.52 | 31.43 | 65.69 | 0.00 | 0.00 | 0.00 | |
Bai_JLESS_task2_1 | BaiJLESS2022 | 26 | 85.28 | 94.80 | 77.50 | 0.00 | 0.00 | 0.00 | 25.73 | 16.50 | 58.35 | 66.70 | 62.01 | 72.15 | 64.14 | 71.76 | 57.98 | 71.97 | 80.66 | 64.97 | 61.21 | 49.20 | 80.99 | |
Kuroyanagi_NU-HDL_task2_2 | KuroyanagiNU-HDL2022 | 5 | 75.29 | 74.95 | 75.63 | 52.38 | 49.58 | 55.52 | 54.05 | 49.01 | 60.26 | 80.57 | 80.34 | 80.81 | 68.50 | 68.20 | 68.80 | 76.59 | 76.34 | 76.84 | 82.14 | 80.83 | 83.49 | |
LEE_KNU_task2_1 | LEEKNU2022 | 82 | 57.67 | 65.42 | 51.56 | 32.71 | 26.10 | 43.79 | 60.10 | 72.26 | 51.44 | 50.39 | 50.78 | 50.00 | 60.40 | 79.18 | 48.82 | 47.79 | 46.75 | 48.88 | 50.26 | 49.83 | 50.70 | |
Narita_AIT_task2_2 | NaritaAIT2022 | 69 | 66.67 | 100.00 | 50.00 | 66.67 | 100.00 | 50.00 | 66.67 | 100.00 | 50.00 | 66.67 | 100.00 | 50.00 | 66.67 | 100.00 | 50.00 | 66.67 | 100.00 | 50.00 | 66.67 | 100.00 | 50.00 | |
Du_NERCSLIP_task2_1 | DuNERCSLIP2022 | 33 | 64.19 | 86.32 | 51.10 | 66.74 | 99.32 | 50.25 | 63.40 | 83.11 | 51.25 | 77.05 | 87.94 | 68.56 | 69.12 | 97.30 | 53.60 | 50.17 | 39.31 | 69.29 | 0.00 | 0.00 | 0.00 | |
Jinhyuk_SNU_task2_1 | JinhyukSNU2022 | 78 | 41.62 | 32.89 | 56.66 | 66.67 | 100.00 | 50.00 | 0.00 | 0.00 | 0.00 | 69.04 | 91.91 | 55.29 | 62.33 | 75.94 | 52.85 | 47.11 | 39.15 | 59.14 | 21.80 | 13.76 | 52.41 | |
Hu_NJU_task2_1 | HuNJU2022 | 65 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 13.47 | 7.66 | 55.52 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 51.63 | 45.59 | 59.51 | 0.00 | 0.00 | 0.00 | |
Wilkinghoff_FKIE_task2_1 | WilkinghoffFKIE2022 | 28 | 67.19 | 100.00 | 50.59 | 67.57 | 98.18 | 51.51 | 56.01 | 63.98 | 49.81 | 66.82 | 100.00 | 50.17 | 67.11 | 100.00 | 50.51 | 68.61 | 99.31 | 52.41 | 74.10 | 98.95 | 59.22 | |
Wei_HEU_task2_2 | WeiHEU2022 | 14 | 85.41 | 78.80 | 93.24 | 50.75 | 46.23 | 56.26 | 36.26 | 26.45 | 57.65 | 78.80 | 76.60 | 81.13 | 61.32 | 53.04 | 72.66 | 67.36 | 63.28 | 71.99 | 70.96 | 67.71 | 74.54 | |
Guan_HEU_task2_4 | GuanHEU2022 | 6 | 78.27 | 93.46 | 67.33 | 55.12 | 47.75 | 65.18 | 54.13 | 56.84 | 51.67 | 72.66 | 95.58 | 58.60 | 16.21 | 9.34 | 61.31 | 64.39 | 65.53 | 63.29 | 78.75 | 92.22 | 68.71 | |
Li_CTRI_task2_1 | LiCTRI2022 | 59 | 69.87 | 74.12 | 66.09 | 39.57 | 34.10 | 47.12 | 0.00 | 0.00 | 0.00 | 54.27 | 43.07 | 73.34 | 67.26 | 80.16 | 57.94 | 29.58 | 19.25 | 63.87 | 0.00 | 0.00 | 0.00 | |
Morita_SECOM_task2_1 | MoritaSECOM2022 | 15 | 69.69 | 56.47 | 90.99 | 0.00 | 0.00 | 0.00 | 15.71 | 9.09 | 57.80 | 42.25 | 29.92 | 71.86 | 0.00 | 0.00 | 0.00 | 67.97 | 64.01 | 72.44 | 78.36 | 82.04 | 75.00 | |
Yamashita_GU_task2_3 | YamashitaGU2022 | 39 | 61.39 | 49.75 | 80.16 | 44.19 | 34.76 | 60.63 | 35.00 | 26.11 | 53.10 | 63.93 | 56.20 | 74.13 | 57.97 | 50.23 | 68.55 | 60.61 | 53.71 | 69.55 | 67.46 | 55.67 | 85.60 | |
CHO_SG_task2_1 | CHOSG2022 | 67 | 69.65 | 84.99 | 59.01 | 66.33 | 90.90 | 52.21 | 63.47 | 87.45 | 49.81 | 66.56 | 96.20 | 50.89 | 0.00 | 0.00 | 0.00 | 62.14 | 85.54 | 48.79 | 65.45 | 93.44 | 50.37 | |
Li_JAIST_task2 | LiJAIST2022 | 83 | 66.58 | 96.96 | 50.69 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 14.90 | 9.71 | 32.03 | 64.53 | 92.72 | 49.48 | 0.00 | 0.00 | 0.00 | 30.99 | 19.20 | 80.23 | |
Gou_UESTC_task2_4 | GouUESTC2022 | 51 | 63.10 | 62.91 | 63.29 | 46.35 | 42.30 | 51.26 | 45.61 | 43.84 | 47.53 | 72.48 | 72.02 | 72.96 | 54.16 | 52.31 | 56.14 | 65.07 | 72.26 | 59.18 | 71.51 | 70.75 | 72.29 | |
PENG_NJUPT_task2_1 | PENGNJUPT2022 | 57 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 54.22 | 51.13 | 57.70 | 67.33 | 78.30 | 59.06 | 76.81 | 90.14 | 66.92 | 65.46 | 57.15 | 76.59 | |
Nejjar_ETH_task2_1 | NejjarETH2022 | 32 | 72.84 | 98.95 | 57.63 | 40.59 | 32.74 | 53.40 | 0.00 | 0.00 | 0.00 | 64.47 | 81.18 | 53.47 | 66.74 | 100.00 | 50.08 | 66.67 | 100.00 | 50.00 | 37.11 | 26.87 | 60.00 | |
Verbitskiy_DS_task2_1 | VerbitskiyDS2022 | 27 | 66.08 | 72.32 | 60.83 | 56.54 | 64.15 | 50.55 | 21.91 | 13.70 | 54.72 | 78.58 | 79.34 | 77.84 | 60.34 | 57.99 | 62.89 | 65.83 | 59.64 | 73.45 | 83.13 | 82.92 | 83.34 | |
Cohen_Technion_task2_2 | CohenTechnion2022 | 63 | 62.28 | 58.49 | 66.58 | 49.46 | 46.88 | 52.33 | 45.49 | 41.23 | 50.74 | 64.84 | 64.57 | 65.11 | 59.83 | 58.20 | 61.55 | 60.24 | 58.87 | 61.68 | 54.21 | 54.14 | 54.27 | |
Deng_THU_task2_4 | DengTHU2022 | 7 | 67.64 | 100.00 | 51.11 | 37.00 | 28.94 | 51.28 | 43.49 | 38.91 | 49.30 | 70.70 | 95.25 | 56.21 | 66.55 | 98.57 | 50.24 | 68.75 | 98.24 | 52.88 | 77.22 | 94.72 | 65.17 | |
Liu_BUPT_task2_2 | LiuBUPT2022 | 49 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 14.99 | 8.57 | 59.60 | 8.52 | 4.55 | 66.09 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | |
Kazakova_ITMO_task2_1 | KazakovaITMO2022 | 80 | 64.79 | 92.43 | 49.87 | 63.01 | 79.09 | 52.37 | 55.94 | 63.85 | 49.77 | 66.70 | 84.90 | 54.93 | 53.50 | 68.94 | 43.71 | 49.85 | 60.32 | 42.48 | 63.49 | 75.63 | 54.71 | |
Kodua_ITMO_task2_1 | KoduaITMO2022 | 74 | 7.02 | 3.75 | 55.05 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 20.07 | 11.93 | 63.05 | 16.27 | 9.12 | 75.43 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | |
Liu_CQUPT_task2_4 | LiuCQUPT2022 | 1 | 77.69 | 75.86 | 79.62 | 64.69 | 71.15 | 59.30 | 45.11 | 37.68 | 56.19 | 79.35 | 78.82 | 79.89 | 65.25 | 63.81 | 66.75 | 70.52 | 68.76 | 72.38 | 71.03 | 68.47 | 73.80 | |
Siang_NTHU_task2_1 | SiangNTHU2022 | 79 | 66.74 | 100.00 | 50.08 | 65.15 | 95.93 | 49.33 | 56.77 | 67.61 | 48.93 | 57.32 | 59.96 | 54.90 | 60.10 | 77.17 | 49.22 | 58.21 | 65.32 | 52.50 | 66.03 | 93.61 | 51.00 | |
Tozicka_NSW_task2_1 | TozickaNSW2022 | 38 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | |
Almudevar_UZ_task2_2 | AlmudevarUZ2022 | 45 | 79.18 | 76.79 | 81.72 | 48.67 | 45.94 | 51.74 | 46.85 | 43.62 | 50.60 | 71.63 | 71.16 | 72.09 | 61.79 | 58.21 | 65.85 | 64.93 | 64.03 | 65.86 | 62.56 | 61.63 | 63.53 | |
Jalalia_AIT_task2_1 | JalaliaAIT2022 | 77 | 36.45 | 27.61 | 53.63 | 0.00 | 0.00 | 0.00 | 14.09 | 8.37 | 44.56 | 37.97 | 30.24 | 51.03 | 59.71 | 67.08 | 53.79 | 27.09 | 16.71 | 71.46 | 10.14 | 5.60 | 53.76 | |
Zorin_AIRI_task2_1 | ZorinAIRI2022 | 84 | 7.46 | 3.89 | 88.89 | 0.00 | 0.00 | 0.00 | 12.32 | 7.09 | 46.88 | 4.05 | 2.18 | 27.91 | 12.78 | 7.21 | 56.46 | 0.00 | 0.00 | 0.00 | 6.48 | 3.48 | 47.06 | |
Venkatesh_MERL_task2_3 | VenkateshMERL2022 | 8 | 81.19 | 100.00 | 68.34 | 42.50 | 31.38 | 65.84 | 58.09 | 69.75 | 49.78 | 0.00 | 0.00 | 0.00 | 67.10 | 96.92 | 51.31 | 67.88 | 77.42 | 60.43 | 75.78 | 82.49 | 70.08 |
Systems ranking
Submission Code | Technical Report | Official Rank | Official Score | Eval ToyCar (AUC) | Eval ToyCar (pAUC) | Eval ToyTrain (AUC) | Eval ToyTrain (pAUC) | Eval Fan (AUC) | Eval Fan (pAUC) | Eval Gearbox (AUC) | Eval Gearbox (pAUC) | Eval Bearing (AUC) | Eval Bearing (pAUC) | Eval Slider (AUC) | Eval Slider (pAUC) | Eval Valve (AUC) | Eval Valve (pAUC) | Dev ToyCar (AUC) | Dev ToyCar (pAUC) | Dev ToyTrain (AUC) | Dev ToyTrain (pAUC) | Dev Fan (AUC) | Dev Fan (pAUC) | Dev Gearbox (AUC) | Dev Gearbox (pAUC) | Dev Bearing (AUC) | Dev Bearing (pAUC) | Dev Slider (AUC) | Dev Slider (pAUC) | Dev Valve (AUC) | Dev Valve (pAUC) |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
DCASE2022_baseline_task2_AE | DCASE2022baseline2022 | 75 | 52.941609218413156 ± 0.0002952567128951358 | 61.18 | 60.21 | 43.14 | 49.36 | 41.16 | 50.12 | 61.92 | 51.95 | 59.93 | 53.95 | 58.95 | 54.16 | 54.26 | 51.30 | 62.61 | 52.74 | 49.83 | 50.48 | 62.89 | 57.52 | 65.78 | 58.49 | 56.40 | 51.98 | 62.81 | 55.78 | 50.73 | 50.36 | |
DCASE2022_baseline_task2_MNV2 | DCASE2022baseline2022 | 68 | 54.01722421174122 ± 0.0002954806671840702 | 42.79 | 53.44 | 51.22 | 50.98 | 50.34 | 55.22 | 51.34 | 48.49 | 58.23 | 52.16 | 62.42 | 53.07 | 72.77 | 65.16 | 55.54 | 52.27 | 51.57 | 51.51 | 59.48 | 56.89 | 62.70 | 56.03 | 60.25 | 57.14 | 51.69 | 54.67 | 62.14 | 62.41 | |
Bai_JLESS_task2_1 | BaiJLESS2022 | 26 | 63.948583467398024 ± 0.0003628043584427405 | 92.17 | 76.25 | 51.78 | 51.22 | 46.52 | 50.73 | 71.80 | 62.91 | 64.02 | 55.80 | 77.31 | 65.19 | 83.15 | 68.73 | 82.50 | 64.68 | 64.52 | 55.60 | 67.28 | 67.25 | 86.77 | 72.67 | 79.82 | 61.04 | 86.73 | 71.41 | 91.63 | 80.01 | |
Bai_JLESS_task2_2 | BaiJLESS2022 | 44 | 61.101918636073826 ± 0.0003308162422994587 | 53.39 | 53.29 | 50.72 | 52.00 | 52.82 | 55.74 | 72.85 | 66.55 | 64.93 | 55.63 | 77.31 | 65.19 | 78.15 | 66.94 | 73.40 | 62.27 | 62.12 | 51.43 | 60.61 | 65.86 | 85.53 | 71.33 | 72.70 | 56.38 | 86.73 | 71.41 | 87.33 | 79.32 | |
Bai_JLESS_task2_3 | BaiJLESS2022 | 42 | 62.207778577741045 ± 0.00032278441727734837 | 92.17 | 76.25 | 51.78 | 51.22 | 46.52 | 50.73 | 71.14 | 50.61 | 68.22 | 55.34 | 66.64 | 55.43 | 83.15 | 68.73 | 82.50 | 64.68 | 64.52 | 55.60 | 67.28 | 67.25 | 67.00 | 60.46 | 73.91 | 69.92 | 82.80 | 68.20 | 91.63 | 80.01 | |
Kuroyanagi_NU-HDL_task2_1 | KuroyanagiNU-HDL2022 | 20 | 66.5026110256853 ± 0.0003508467181733295 | 65.39 | 71.51 | 56.62 | 51.92 | 48.94 | 52.66 | 82.20 | 64.80 | 78.56 | 61.55 | 80.79 | 58.82 | 92.48 | 83.89 | 74.62 | 61.57 | 62.71 | 50.69 | 66.86 | 64.67 | 80.98 | 66.27 | 82.00 | 65.76 | 82.24 | 70.71 | 86.50 | 89.94 | |
Kuroyanagi_NU-HDL_task2_2 | KuroyanagiNU-HDL2022 | 5 | 68.22312028564296 ± 0.0003608131407524257 | 81.67 | 72.16 | 55.02 | 51.78 | 57.19 | 53.02 | 84.51 | 65.10 | 72.30 | 57.90 | 80.62 | 58.09 | 91.01 | 83.23 | 79.50 | 63.49 | 69.20 | 51.90 | 68.27 | 65.16 | 81.25 | 67.65 | 77.81 | 63.87 | 87.41 | 73.64 | 91.33 | 90.05 | |
Kuroyanagi_NU-HDL_task2_3 | KuroyanagiNU-HDL2022 | 52 | 59.034000365177555 ± 0.0003364721172887524 | 76.33 | 71.84 | 55.39 | 52.77 | 58.59 | 54.80 | 87.93 | 68.57 | 73.72 | 56.55 | 26.82 | 56.92 | 90.70 | 84.45 | 87.99 | 71.32 | 74.90 | 58.75 | 80.02 | 72.53 | 91.62 | 76.43 | 91.66 | 82.19 | 93.41 | 87.71 | 95.39 | 93.31 | |
Kuroyanagi_NU-HDL_task2_4 | KuroyanagiNU-HDL2022 | 12 | 67.14222981034025 ± 0.00036474217704024595 | 85.61 | 73.71 | 45.81 | 51.42 | 53.79 | 55.96 | 83.67 | 63.00 | 76.85 | 56.89 | 81.34 | 59.82 | 92.46 | 87.91 | 91.46 | 71.20 | 79.84 | 57.05 | 83.97 | 74.89 | 92.16 | 78.24 | 93.35 | 86.65 | 95.25 | 86.04 | 97.63 | 94.83 | |
LEE_KNU_task2_1 | LEEKNU2022 | 82 | 49.26732013587548 ± 0.00023044907066044446 | 53.35 | 51.66 | 45.06 | 49.79 | 49.71 | 50.27 | 49.71 | 49.50 | 47.57 | 49.93 | 47.46 | 49.28 | 49.32 | 51.78 | 89.38 | 75.09 | 79.74 | 63.55 | 90.64 | 78.73 | 91.45 | 80.92 | 95.40 | 75.02 | 88.07 | 75.06 | 87.40 | 75.02 | |
Narita_AIT_task2_1 | NaritaAIT2022 | 70 | 53.51232629097276 ± 0.0002912597694015261 | 58.99 | 53.44 | 47.73 | 51.05 | 59.56 | 54.92 | 56.28 | 55.62 | 55.42 | 49.76 | 48.62 | 51.04 | 51.32 | 58.32 | 48.59 | 53.21 | 55.64 | 50.69 | 35.43 | 61.81 | 86.03 | 63.17 | 74.29 | 70.92 | 81.00 | 69.67 | 85.19 | 75.78 | |
Narita_AIT_task2_2 | NaritaAIT2022 | 69 | 53.84966232209122 ± 0.000292419115994451 | 60.99 | 56.58 | 51.03 | 50.39 | 58.25 | 54.56 | 53.37 | 54.03 | 56.33 | 50.51 | 49.68 | 51.77 | 52.31 | 53.47 | 81.44 | 61.58 | 62.88 | 60.51 | 34.90 | 61.65 | 81.74 | 61.67 | 77.33 | 72.39 | 80.39 | 68.05 | 74.19 | 60.73 | |
Narita_AIT_task2_3 | NaritaAIT2022 | 76 | 52.358564643955404 ± 0.00028732610320381395 | 57.63 | 52.21 | 49.37 | 50.53 | 54.55 | 52.98 | 50.82 | 52.00 | 55.78 | 51.06 | 50.64 | 50.94 | 50.04 | 54.77 | 69.32 | 62.21 | 63.99 | 56.50 | 75.35 | 72.01 | 86.58 | 74.05 | 53.95 | 63.92 | 88.66 | 77.59 | 68.04 | 58.23 | |
Narita_AIT_task2_4 | NaritaAIT2022 | 72 | 53.27439248823605 ± 0.0002960338887133308 | 60.99 | 56.58 | 51.03 | 50.39 | 54.55 | 52.98 | 50.82 | 52.00 | 56.33 | 50.51 | 50.64 | 50.94 | 51.32 | 58.32 | 81.44 | 61.58 | 62.88 | 60.51 | 75.35 | 72.01 | 86.58 | 74.05 | 77.33 | 72.39 | 88.66 | 77.59 | 85.19 | 75.78 | |
Du_NERCSLIP_task2_1 | DuNERCSLIP2022 | 33 | 63.30309472782353 ± 0.0002976991866654528 | 63.81 | 59.09 | 58.13 | 51.68 | 53.00 | 51.60 | 82.48 | 60.29 | 67.11 | 54.22 | 67.35 | 56.49 | 87.17 | 75.85 | 80.09 | 63.52 | 69.06 | 58.73 | 90.34 | 80.25 | 86.70 | 66.71 | 86.66 | 67.75 | 92.16 | 83.38 | 90.62 | 83.29 | |
Du_NERCSLIP_task2_2 | DuNERCSLIP2022 | 37 | 62.579607866831424 ± 0.0003032188907543935 | 56.08 | 57.79 | 59.10 | 52.34 | 53.00 | 51.60 | 81.70 | 60.78 | 67.11 | 54.22 | 67.61 | 54.95 | 87.78 | 78.13 | 78.19 | 62.00 | 69.30 | 59.41 | 90.34 | 80.25 | 85.25 | 67.03 | 86.66 | 67.75 | 92.18 | 82.76 | 90.48 | 82.95 | |
Du_NERCSLIP_task2_3 | DuNERCSLIP2022 | 35 | 62.78654798621621 ± 0.00029667454336750667 | 56.08 | 57.79 | 59.36 | 52.93 | 53.73 | 51.49 | 82.02 | 60.27 | 68.15 | 53.81 | 67.61 | 54.95 | 87.78 | 78.13 | 78.19 | 62.00 | 68.70 | 58.52 | 90.25 | 79.99 | 85.25 | 67.03 | 86.66 | 67.75 | 92.18 | 82.76 | 90.48 | 82.95 | |
Du_NERCSLIP_task2_4 | DuNERCSLIP2022 | 36 | 62.74927690762567 ± 0.0003068930687963723 | 51.45 | 57.36 | 58.52 | 53.67 | 56.65 | 52.70 | 80.77 | 58.61 | 68.27 | 51.92 | 71.64 | 55.84 | 88.56 | 80.11 | 76.31 | 61.25 | 68.59 | 58.94 | 90.44 | 80.15 | 84.09 | 66.65 | 86.01 | 67.88 | 91.97 | 80.86 | 90.06 | 82.94 | |
Jinhyuk_SNU_task2_1 | JinhyukSNU2022 | 78 | 51.02701922567823 ± 0.00029370058470223036 | 51.58 | 53.99 | 42.74 | 49.03 | 41.33 | 50.88 | 58.43 | 51.27 | 59.48 | 53.77 | 58.51 | 53.66 | 49.23 | 50.97 | 49.15 | 45.50 | 44.26 | 43.59 | 55.14 | 54.69 | 56.00 | 56.00 | 47.30 | 47.28 | 45.05 | 43.94 | 47.34 | 47.91 | |
Hu_NJU_task2_1 | HuNJU2022 | 65 | 55.07461026550378 ± 0.0003040740154131298 | 69.31 | 56.58 | 50.23 | 50.99 | 49.11 | 51.18 | 63.70 | 54.36 | 62.48 | 56.76 | 59.69 | 52.65 | 46.49 | 49.66 | 76.28 | 55.98 | 65.55 | 53.96 | 59.98 | 56.22 | 70.49 | 60.50 | 57.48 | 55.01 | 81.11 | 64.49 | 62.77 | 57.83 | |
Wilkinghoff_FKIE_task2_1 | WilkinghoffFKIE2022 | 28 | 63.724437030531156 ± 0.000372854346756348 | 79.69 | 65.59 | 61.14 | 56.16 | 40.37 | 48.88 | 82.74 | 58.66 | 75.91 | 65.40 | 67.10 | 55.12 | 84.54 | 70.37 | 79.60 | 63.99 | 73.80 | 55.77 | 88.48 | 78.88 | 86.79 | 68.64 | 74.10 | 63.38 | 90.48 | 75.81 | 86.58 | 80.34 | |
Wilkinghoff_FKIE_task2_2 | WilkinghoffFKIE2022 | 30 | 63.55709677474374 ± 0.000361118136614277 | 78.05 | 65.44 | 60.99 | 56.58 | 41.85 | 48.82 | 80.53 | 58.69 | 74.77 | 64.43 | 67.68 | 54.78 | 82.31 | 68.87 | 79.14 | 61.97 | 74.46 | 54.54 | 88.22 | 76.90 | 87.21 | 70.09 | 73.00 | 62.40 | 89.94 | 74.48 | 85.38 | 78.03 | |
Wei_HEU_task2_1 | WeiHEU2022 | 29 | 63.603340574118775 ± 0.0003498881455140153 | 72.75 | 61.29 | 62.95 | 53.68 | 44.98 | 51.64 | 79.07 | 57.22 | 67.62 | 54.69 | 72.24 | 55.56 | 85.74 | 80.03 | 48.62 | 49.91 | 52.09 | 49.62 | 64.42 | 60.95 | 75.97 | 64.21 | 78.58 | 64.28 | 79.45 | 63.22 | 70.80 | 63.32 | |
Wei_HEU_task2_2 | WeiHEU2022 | 14 | 67.12235197698574 ± 0.0003768613481780468 | 97.17 | 88.16 | 57.08 | 53.13 | 49.46 | 51.72 | 85.17 | 60.84 | 70.46 | 58.32 | 74.39 | 60.99 | 84.47 | 67.11 | 82.75 | 60.14 | 69.05 | 59.37 | 69.70 | 59.59 | 81.02 | 65.29 | 69.47 | 54.25 | 89.71 | 76.08 | 93.28 | 72.92 | |
Wei_HEU_task2_3 | WeiHEU2022 | 17 | 66.65219954899092 ± 0.0003672678022080277 | 95.88 | 87.26 | 57.08 | 53.13 | 49.05 | 50.06 | 85.26 | 62.20 | 67.62 | 54.69 | 74.59 | 60.55 | 86.27 | 70.31 | 83.79 | 61.99 | 69.05 | 59.37 | 74.16 | 63.39 | 82.34 | 66.99 | 78.58 | 64.28 | 90.05 | 77.09 | 93.56 | 74.19 | |
Guan_HEU_task2_1 | GuanHEU2022 | 22 | 66.3946192973576 ± 0.0003745835334668857 | 89.11 | 78.69 | 58.18 | 53.09 | 49.20 | 51.73 | 83.53 | 63.01 | 71.70 | 58.95 | 74.30 | 60.35 | 82.27 | 64.62 | 80.65 | 58.12 | 67.81 | 59.07 | 71.44 | 62.13 | 80.25 | 64.69 | 69.41 | 54.74 | 87.96 | 73.33 | 92.61 | 70.23 | |
Guan_HEU_task2_2 | GuanHEU2022 | 19 | 66.53202926501359 ± 0.00038128464064027007 | 94.13 | 85.69 | 58.18 | 53.11 | 49.20 | 51.73 | 83.89 | 60.90 | 71.51 | 57.88 | 73.59 | 60.27 | 82.13 | 62.86 | 84.69 | 60.55 | 68.02 | 59.69 | 71.44 | 62.13 | 82.22 | 65.85 | 70.80 | 54.22 | 89.91 | 76.74 | 91.68 | 66.02 | |
Guan_HEU_task2_3 | GuanHEU2022 | 11 | 67.41857863156892 ± 0.00036908458449298254 | 85.05 | 70.92 | 61.82 | 55.11 | 55.35 | 53.72 | 81.40 | 58.39 | 67.20 | 52.99 | 75.19 | 61.29 | 88.22 | 78.67 | 82.12 | 63.07 | 71.03 | 59.99 | 77.50 | 65.93 | 83.43 | 64.82 | 77.08 | 67.03 | 89.50 | 74.94 | 93.43 | 73.48 | |
Guan_HEU_task2_4 | GuanHEU2022 | 6 | 68.0372804969291 ± 0.00039326172923566144 | 90.90 | 78.42 | 61.66 | 55.16 | 55.35 | 53.72 | 81.56 | 58.08 | 67.23 | 52.71 | 75.15 | 60.63 | 89.29 | 79.23 | 84.12 | 63.48 | 71.06 | 60.23 | 77.50 | 65.93 | 84.13 | 66.68 | 77.14 | 66.55 | 90.63 | 78.20 | 93.20 | 72.46 | |
Li_CTRI_task2_1 | LiCTRI2022 | 59 | 58.12785559862876 ± 0.0002695870626290191 | 75.84 | 59.32 | 48.76 | 49.93 | 40.15 | 50.15 | 69.07 | 53.18 | 63.30 | 51.01 | 64.95 | 51.74 | 81.71 | 73.11 | 80.35 | 57.21 | 72.38 | 52.78 | 65.49 | 61.14 | 77.96 | 64.19 | 66.88 | 56.04 | 84.48 | 69.36 | 73.16 | 60.35 | |
Morita_SECOM_task2_1 | MoritaSECOM2022 | 15 | 66.82507994516816 ± 0.0003343147179148136 | 92.52 | 74.39 | 58.30 | 54.78 | 50.27 | 52.52 | 79.90 | 59.71 | 74.06 | 63.18 | 68.54 | 62.22 | 84.96 | 69.42 | 89.55 | 71.91 | 68.58 | 50.36 | 78.26 | 63.83 | 85.07 | 75.64 | 64.44 | 58.21 | 90.82 | 72.99 | 94.05 | 90.64 | |
Morita_SECOM_task2_2 | MoritaSECOM2022 | 25 | 65.0882650554134 ± 0.00035810965057196366 | 93.71 | 77.55 | 61.29 | 53.34 | 45.76 | 50.34 | 88.80 | 72.63 | 67.22 | 53.43 | 66.19 | 62.21 | 73.01 | 66.63 | 88.80 | 71.00 | 77.29 | 57.24 | 77.94 | 65.34 | 88.26 | 72.98 | 68.45 | 55.99 | 91.73 | 74.07 | 93.48 | 85.61 | |
Morita_SECOM_task2_3 | MoritaSECOM2022 | 21 | 66.39941023196549 ± 0.00034462400453683404 | 92.52 | 74.39 | 61.29 | 53.34 | 45.76 | 50.34 | 88.80 | 72.63 | 72.14 | 60.08 | 66.19 | 62.21 | 80.49 | 69.16 | 89.55 | 71.91 | 77.29 | 57.24 | 77.94 | 65.34 | 88.26 | 72.98 | 69.40 | 59.98 | 91.73 | 74.07 | 94.06 | 91.62 | |
Morita_SECOM_task2_4 | MoritaSECOM2022 | 16 | 66.72449290579897 ± 0.00030788314474219496 | 92.52 | 74.39 | 61.29 | 53.34 | 46.72 | 51.65 | 88.80 | 72.63 | 72.14 | 60.08 | 66.19 | 62.21 | 82.14 | 67.44 | 89.55 | 71.91 | 77.29 | 57.24 | 80.13 | 64.80 | 88.26 | 72.98 | 69.40 | 59.98 | 91.73 | 74.07 | 97.61 | 88.66 | |
Yamashita_GU_task2_1 | YamashitaGU2022 | 53 | 58.94125019412099 ± 0.0003148043944963588 | 63.39 | 59.68 | 55.68 | 49.19 | 42.27 | 49.61 | 67.60 | 55.88 | 56.07 | 50.26 | 72.37 | 60.04 | 86.10 | 73.07 | 75.81 | 54.58 | 77.16 | 54.19 | 62.56 | 56.32 | 71.25 | 56.75 | 59.74 | 53.68 | 76.90 | 63.60 | 83.92 | 75.23 | |
Yamashita_GU_task2_2 | YamashitaGU2022 | 61 | 57.96512591514972 ± 0.00031675252816225114 | 76.86 | 68.41 | 52.99 | 49.70 | 43.57 | 50.34 | 75.61 | 59.78 | 57.07 | 51.10 | 69.83 | 57.82 | 57.13 | 51.88 | 80.25 | 54.49 | 75.04 | 52.53 | 54.49 | 55.26 | 76.52 | 61.55 | 48.48 | 54.42 | 77.32 | 65.36 | 60.12 | 50.54 | |
Yamashita_GU_task2_3 | YamashitaGU2022 | 39 | 62.387167480540676 ± 0.00028161700622855696 | 76.86 | 68.41 | 55.68 | 49.19 | 46.68 | 49.88 | 75.61 | 59.78 | 62.97 | 52.53 | 69.83 | 57.82 | 86.10 | 73.07 | 80.25 | 54.49 | 77.16 | 54.19 | 66.27 | 56.57 | 76.52 | 61.55 | 62.94 | 55.89 | 77.32 | 65.36 | 83.92 | 75.23 | |
Yamashita_GU_task2_4 | YamashitaGU2022 | 48 | 60.209216780036876 ± 0.000283227668146682 | 68.38 | 62.67 | 55.46 | 49.28 | 38.98 | 49.71 | 75.61 | 59.78 | 62.19 | 52.26 | 71.92 | 60.20 | 86.10 | 73.07 | 78.98 | 56.69 | 77.16 | 54.26 | 66.62 | 56.09 | 76.52 | 61.55 | 63.01 | 55.85 | 77.78 | 64.36 | 83.92 | 75.23 | |
CHO_SG_task2_1 | CHOSG2022 | 67 | 54.468778744613665 ± 0.0003074487298355346 | 72.27 | 56.57 | 54.74 | 51.63 | 49.55 | 50.78 | 63.57 | 55.41 | 52.65 | 51.27 | 47.70 | 50.74 | 53.74 | 54.42 | 46.56 | 49.26 | 56.16 | 52.77 | 47.20 | 53.47 | 59.80 | 55.65 | 56.00 | 52.87 | 74.58 | 63.16 | 61.77 | 60.39 | |
CHO_SG_task2_2 | CHOSG2022 | 81 | 49.28781187514513 ± 0.00027811585615677086 | 49.35 | 51.09 | 49.41 | 50.52 | 46.33 | 50.95 | 49.88 | 51.96 | 50.24 | 50.29 | 45.40 | 50.29 | 50.21 | 49.76 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | |
Li_JAIST_task2 | LiJAIST2022 | 83 | 48.93884539917759 ± 0.0002825946078999199 | 58.61 | 56.48 | 47.77 | 50.47 | 45.13 | 50.09 | 44.38 | 48.53 | 43.61 | 50.33 | 32.41 | 47.97 | 82.78 | 75.78 | 54.06 | 50.35 | 43.38 | 49.70 | 59.67 | 54.43 | 53.31 | 52.48 | 62.09 | 55.29 | 44.45 | 49.80 | 80.90 | 74.47 | |
Gou_UESTC_task2_1 | GouUESTC2022 | 56 | 58.722651329838406 ± 0.00030480919406067754 | 68.89 | 65.40 | 51.23 | 50.50 | 51.62 | 51.44 | 78.54 | 65.82 | 55.59 | 50.83 | 64.37 | 57.99 | 61.39 | 52.48 | 70.96 | 62.07 | 55.03 | 53.49 | 62.90 | 64.45 | 81.69 | 63.25 | 61.21 | 53.49 | 67.00 | 62.02 | 61.60 | 51.96 | |
Gou_UESTC_task2_2 | GouUESTC2022 | 58 | 58.434451862538516 ± 0.0003140236598222822 | 61.39 | 58.43 | 48.16 | 49.85 | 49.16 | 52.70 | 71.40 | 62.27 | 57.16 | 51.89 | 60.78 | 55.76 | 81.11 | 68.44 | 53.20 | 52.75 | 53.44 | 52.23 | 61.50 | 64.40 | 62.86 | 54.72 | 57.56 | 51.80 | 68.35 | 58.09 | 75.07 | 67.91 | |
Gou_UESTC_task2_3 | GouUESTC2022 | 60 | 58.0929101644006 ± 0.00030998986104600966 | 70.88 | 62.96 | 50.24 | 51.22 | 43.26 | 49.41 | 74.65 | 66.89 | 65.83 | 55.69 | 66.71 | 58.58 | 55.65 | 54.02 | 64.85 | 57.15 | 48.53 | 50.94 | 70.17 | 69.61 | 72.80 | 59.44 | 55.12 | 48.91 | 63.30 | 58.38 | 58.17 | 55.50 | |
Gou_UESTC_task2_4 | GouUESTC2022 | 51 | 59.16904508333896 ± 0.0002810411852165545 | 68.89 | 65.40 | 51.23 | 50.50 | 43.26 | 49.41 | 78.54 | 65.82 | 57.49 | 52.30 | 60.97 | 53.67 | 81.11 | 68.44 | 70.96 | 62.07 | 55.03 | 53.49 | 70.17 | 69.61 | 81.69 | 63.25 | 66.77 | 57.34 | 68.61 | 57.60 | 75.07 | 67.91 | |
PENG_NJUPT_task2_1 | PENGNJUPT2022 | 57 | 58.47341574821306 ± 0.0002990941526070191 | 57.29 | 57.37 | 48.52 | 51.03 | 51.04 | 51.16 | 58.64 | 51.13 | 62.47 | 54.62 | 75.49 | 58.51 | 78.03 | 71.20 | 55.55 | 51.02 | 56.84 | 51.95 | 59.26 | 56.68 | 56.38 | 54.52 | 67.42 | 59.08 | 84.36 | 63.45 | 80.88 | 72.00 | |
PENG_NJUPT_task2_2 | PENGNJUPT2022 | 62 | 57.37883429943842 ± 0.0002972201845669457 | 53.23 | 56.22 | 51.88 | 51.59 | 50.03 | 51.08 | 56.69 | 51.14 | 65.22 | 52.09 | 76.71 | 60.96 | 63.61 | 68.86 | 59.28 | 55.92 | 59.53 | 50.84 | 51.48 | 61.75 | 64.38 | 58.02 | 70.55 | 58.48 | 87.44 | 70.20 | 80.25 | 67.41 | |
Nejjar_ETH_task2_1 | NejjarETH2022 | 32 | 63.34598205922427 ± 0.00033579696860218286 | 89.62 | 79.68 | 61.70 | 54.23 | 48.90 | 50.76 | 71.71 | 59.57 | 68.94 | 58.63 | 60.14 | 52.73 | 76.28 | 62.45 | 87.50 | 68.34 | 64.16 | 53.89 | 80.06 | 70.76 | 85.85 | 71.23 | 74.91 | 64.75 | 80.93 | 68.92 | 83.53 | 71.66 | |
Nejjar_ETH_task2_2 | NejjarETH2022 | 54 | 58.84971221021993 ± 0.00033803435798248585 | 89.62 | 79.68 | 61.70 | 54.23 | 30.59 | 48.65 | 71.71 | 59.57 | 68.94 | 58.63 | 60.14 | 52.73 | 76.28 | 62.45 | 87.50 | 68.34 | 64.16 | 53.89 | 80.06 | 70.76 | 85.85 | 71.23 | 74.91 | 64.75 | 80.93 | 68.92 | 83.53 | 71.66 | |
Verbitskiy_DS_task2_1 | VerbitskiyDS2022 | 27 | 63.83127217235942 ± 0.0003519023225400419 | 71.62 | 65.38 | 52.16 | 51.01 | 44.81 | 50.95 | 86.45 | 74.88 | 65.44 | 57.07 | 72.53 | 57.33 | 92.60 | 80.80 | 75.01 | 59.82 | 61.47 | 52.03 | 64.37 | 66.88 | 69.97 | 59.86 | 80.91 | 69.41 | 81.78 | 63.58 | 89.62 | 79.29 | |
Verbitskiy_DS_task2_2 | VerbitskiyDS2022 | 40 | 62.38199546681628 ± 0.00033876186498202777 | 71.62 | 65.38 | 51.94 | 50.99 | 44.81 | 50.95 | 78.84 | 63.40 | 65.44 | 57.07 | 67.00 | 55.58 | 92.60 | 80.80 | 75.01 | 59.82 | 64.25 | 53.23 | 64.37 | 66.88 | 84.85 | 72.82 | 80.91 | 69.41 | 83.10 | 66.38 | 89.62 | 79.29 | |
Verbitskiy_DS_task2_3 | VerbitskiyDS2022 | 34 | 62.90263916891926 ± 0.00034513471368625894 | 76.93 | 64.53 | 53.45 | 52.63 | 43.61 | 50.53 | 78.84 | 63.40 | 65.44 | 57.07 | 67.00 | 55.58 | 93.88 | 83.97 | 77.64 | 62.00 | 62.03 | 53.34 | 70.81 | 63.73 | 84.85 | 72.82 | 81.02 | 69.31 | 83.17 | 66.33 | 90.91 | 80.59 | |
Verbitskiy_DS_task2_4 | VerbitskiyDS2022 | 31 | 63.392855035610566 ± 0.0003502721194045189 | 78.13 | 64.99 | 54.50 | 52.24 | 43.57 | 50.62 | 78.84 | 63.27 | 67.91 | 57.72 | 67.00 | 55.58 | 93.81 | 85.70 | 77.91 | 62.06 | 64.22 | 54.00 | 70.81 | 63.73 | 84.88 | 72.82 | 82.03 | 68.84 | 83.17 | 66.33 | 91.31 | 83.05 | |
Cohen_Technion_task2_1 | CohenTechnion2022 | 64 | 55.64021966150642 ± 0.0002945840301639103 | 71.76 | 58.89 | 52.66 | 51.12 | 44.49 | 49.54 | 62.52 | 50.25 | 58.24 | 55.89 | 54.40 | 49.33 | 66.19 | 54.01 | 64.31 | 54.80 | 55.67 | 49.64 | 55.64 | 49.98 | 66.61 | 55.99 | 57.96 | 53.69 | 81.36 | 63.37 | 54.34 | 50.48 | |
Cohen_Technion_task2_2 | CohenTechnion2022 | 63 | 56.763333657296656 ± 0.0002975695814466658 | 75.61 | 65.47 | 50.50 | 50.02 | 43.47 | 50.14 | 67.34 | 51.86 | 64.40 | 58.88 | 63.78 | 53.55 | 55.09 | 51.75 | 77.70 | 55.22 | 55.96 | 49.89 | 62.80 | 55.19 | 69.72 | 56.91 | 59.51 | 50.99 | 80.22 | 59.78 | 54.43 | 50.34 | |
Deng_THU_task2_1 | DengTHU2022 | 9 | 67.52738146990447 ± 0.00032810483165450423 | 79.52 | 67.45 | 62.91 | 55.52 | 43.93 | 53.34 | 87.78 | 69.90 | 66.54 | 58.18 | 81.75 | 66.45 | 93.08 | 86.30 | 90.13 | 72.91 | 83.44 | 71.47 | 88.69 | 78.80 | 91.58 | 80.48 | 90.62 | 82.57 | 92.02 | 85.03 | 98.81 | 96.74 | |
Deng_THU_task2_2 | DengTHU2022 | 18 | 66.53259340769795 ± 0.0003299058303031156 | 80.16 | 68.33 | 61.93 | 55.16 | 40.78 | 52.18 | 87.78 | 69.90 | 66.33 | 58.83 | 80.88 | 64.86 | 93.27 | 86.33 | 90.11 | 72.88 | 83.16 | 70.73 | 88.30 | 77.94 | 91.58 | 80.48 | 90.66 | 82.43 | 91.75 | 83.98 | 98.80 | 96.62 | |
Deng_THU_task2_3 | DengTHU2022 | 13 | 67.13366725064618 ± 0.0003089590683108787 | 78.34 | 67.08 | 59.49 | 54.77 | 43.89 | 52.95 | 88.99 | 71.15 | 67.01 | 58.24 | 81.99 | 66.54 | 93.08 | 86.30 | 88.67 | 72.02 | 82.63 | 70.52 | 88.61 | 78.26 | 91.66 | 80.31 | 90.54 | 82.38 | 91.90 | 85.18 | 98.81 | 96.74 | |
Deng_THU_task2_4 | DengTHU2022 | 7 | 67.62358685887781 ± 0.00038777202086470904 | 88.48 | 70.07 | 60.00 | 53.70 | 44.48 | 50.01 | 85.33 | 66.41 | 71.30 | 60.33 | 81.55 | 67.04 | 89.68 | 84.10 | 87.09 | 66.28 | 81.47 | 69.14 | 88.07 | 77.00 | 89.11 | 75.58 | 87.21 | 75.10 | 91.94 | 84.87 | 98.17 | 94.83 | |
Liu_BUPT_task2_1 | LiuBUPT2022 | 55 | 58.80715749913074 ± 0.0003208251356634862 | 56.43 | 52.25 | 45.74 | 49.79 | 45.69 | 50.15 | 69.68 | 51.03 | 69.97 | 57.28 | 71.37 | 59.82 | 85.14 | 80.49 | 61.34 | 52.58 | 51.82 | 51.37 | 61.33 | 56.72 | 76.72 | 59.61 | 68.31 | 56.65 | 88.28 | 73.78 | 83.97 | 70.11 | |
Liu_BUPT_task2_2 | LiuBUPT2022 | 49 | 59.767720255838206 ± 0.00027644210188000623 | 53.83 | 57.79 | 55.34 | 51.27 | 43.14 | 50.25 | 69.68 | 51.03 | 69.97 | 57.28 | 71.37 | 59.82 | 85.14 | 80.49 | 78.48 | 56.66 | 62.40 | 54.65 | 71.34 | 58.11 | 76.72 | 59.61 | 68.31 | 56.65 | 88.28 | 73.78 | 83.97 | 70.11 | |
Kazakova_ITMO_task2_1 | KazakovaITMO2022 | 80 | 49.89612147323122 ± 0.0002773039817690667 | 58.35 | 54.92 | 51.25 | 50.68 | 48.80 | 50.47 | 52.70 | 49.56 | 39.56 | 49.89 | 39.76 | 49.96 | 62.81 | 55.73 | 51.56 | 53.23 | 52.11 | 52.84 | 57.85 | 54.60 | 50.13 | 53.15 | 47.47 | 53.81 | 50.10 | 56.40 | 17.68 | 49.27 | |
Kazakova_ITMO_task2_2 | KazakovaITMO2022 | 86 | 37.56266598852081 ± 0.0002693154734263868 | 31.34 | 47.66 | 49.82 | 50.69 | 43.36 | 50.15 | 52.95 | 51.45 | 48.47 | 52.31 | 45.70 | 52.35 | 13.61 | 47.90 | 52.20 | 50.60 | 51.70 | 51.30 | 55.20 | 50.60 | 64.80 | 51.60 | 50.65 | 50.60 | 57.00 | 53.20 | 54.30 | 52.90 | |
Kazakova_ITMO_task2_3 | KazakovaITMO2022 | 85 | 39.578820200597065 ± 0.00024604960892591796 | 58.35 | 54.92 | 51.25 | 50.68 | 48.80 | 50.47 | 52.95 | 51.45 | 39.56 | 49.89 | 45.70 | 52.35 | 13.61 | 47.90 | 51.56 | 53.23 | 52.11 | 52.84 | 57.85 | 54.60 | 64.80 | 51.60 | 47.47 | 53.81 | 57.00 | 53.20 | 54.30 | 52.90 | |
Kodua_ITMO_task2_1 | KoduaITMO2022 | 74 | 52.96330674177412 ± 0.00029557613542138784 | 53.66 | 51.39 | 48.84 | 49.30 | 41.08 | 48.93 | 59.15 | 51.66 | 56.48 | 54.50 | 63.19 | 52.88 | 61.24 | 51.89 | 56.25 | 51.75 | 55.47 | 52.13 | 59.31 | 52.22 | 61.08 | 53.33 | 55.48 | 51.47 | 68.85 | 56.83 | 59.08 | 50.82 | |
Liu_CQUPT_task2_1 | LiuCQUPT2022 | 2 | 70.16666847945943 ± 0.00036313767268785537 | 88.45 | 81.83 | 70.46 | 61.14 | 54.05 | 55.61 | 86.04 | 64.22 | 68.85 | 54.45 | 75.40 | 64.40 | 86.31 | 75.35 | 81.50 | 67.12 | 76.22 | 64.27 | 71.40 | 68.42 | 89.28 | 76.04 | 81.96 | 70.89 | 89.98 | 79.53 | 94.95 | 88.25 | |
Liu_CQUPT_task2_2 | LiuCQUPT2022 | 4 | 69.7042789452307 ± 0.00038117558144412655 | 88.45 | 81.83 | 70.46 | 61.14 | 54.05 | 55.61 | 86.04 | 64.22 | 68.85 | 54.45 | 74.73 | 63.62 | 85.28 | 68.74 | 81.50 | 67.12 | 76.22 | 64.27 | 71.40 | 68.42 | 89.28 | 76.04 | 81.96 | 70.89 | 90.08 | 79.80 | 94.62 | 88.66 | |
Liu_CQUPT_task2_3 | LiuCQUPT2022 | 3 | 69.78709634825285 ± 0.0003603881386802032 | 88.45 | 81.83 | 70.46 | 61.14 | 54.05 | 55.61 | 86.04 | 64.22 | 68.85 | 54.45 | 78.26 | 66.39 | 77.04 | 73.84 | 81.50 | 67.12 | 76.22 | 64.27 | 71.40 | 68.42 | 89.28 | 76.04 | 81.96 | 70.89 | 90.08 | 79.80 | 84.14 | 84.76 | |
Liu_CQUPT_task2_4 | LiuCQUPT2022 | 1 | 70.97462888079512 ± 0.0003427217026837889 | 88.45 | 81.83 | 70.46 | 61.14 | 57.34 | 57.33 | 86.04 | 64.22 | 68.85 | 54.45 | 78.26 | 66.39 | 83.87 | 75.22 | 81.50 | 67.12 | 76.22 | 64.27 | 70.11 | 65.86 | 89.28 | 76.04 | 81.96 | 70.89 | 90.08 | 79.80 | 93.84 | 86.80 | |
Siang_NTHU_task2_1 | SiangNTHU2022 | 79 | 50.59102827288451 ± 0.00027303186490778593 | 57.84 | 52.44 | 44.31 | 49.56 | 46.32 | 50.42 | 53.15 | 50.25 | 38.27 | 50.91 | 58.50 | 51.98 | 60.33 | 58.42 | 78.39 | 61.24 | 64.60 | 55.08 | 70.73 | 61.34 | 82.91 | 65.22 | 75.63 | 65.16 | 89.87 | 76.54 | 88.56 | 79.06 | |
Tozicka_NSW_task2_1 | TozickaNSW2022 | 38 | 62.50004793576084 ± 0.00034772455431082163 | 88.28 | 74.95 | 58.44 | 51.01 | 42.75 | 52.14 | 87.23 | 72.17 | 60.35 | 55.15 | 62.65 | 52.16 | 74.38 | 70.55 | 88.24 | 67.80 | 72.19 | 63.80 | 63.51 | 60.40 | 80.22 | 64.80 | 70.66 | 61.00 | 78.06 | 62.60 | 88.73 | 76.20 | |
Tozicka_NSW_task2_2 | TozickaNSW2022 | 41 | 62.32235855220123 ± 0.00036151978830945957 | 90.16 | 73.71 | 55.90 | 50.17 | 43.55 | 52.91 | 83.62 | 72.21 | 64.06 | 55.97 | 60.47 | 50.81 | 74.38 | 70.55 | 89.69 | 71.14 | 71.31 | 61.76 | 63.03 | 58.87 | 78.36 | 54.68 | 68.73 | 61.00 | 81.89 | 68.44 | 88.73 | 76.21 | |
Tozicka_NSW_task2_3 | TozickaNSW2022 | 46 | 60.461558807688654 ± 0.00035101867648351854 | 77.78 | 73.76 | 52.98 | 49.95 | 42.86 | 51.22 | 83.61 | 68.49 | 60.02 | 56.72 | 58.44 | 51.70 | 74.38 | 70.55 | 86.72 | 65.40 | 71.34 | 60.54 | 68.24 | 60.60 | 72.75 | 59.56 | 65.45 | 53.53 | 78.44 | 58.70 | 88.73 | 76.20 | |
Tozicka_NSW_task2_4 | TozickaNSW2022 | 43 | 62.19684397021945 ± 0.00036574848023408654 | 90.16 | 73.71 | 58.44 | 51.01 | 42.86 | 51.22 | 87.23 | 72.17 | 60.35 | 55.15 | 60.47 | 50.81 | 74.38 | 70.55 | 89.69 | 71.14 | 72.19 | 63.81 | 68.24 | 60.60 | 80.22 | 64.82 | 70.66 | 60.95 | 81.89 | 68.44 | 88.73 | 76.21 | |
Almudevar_UZ_task2_1 | AlmudevarUZ2022 | 50 | 59.37719756976433 ± 0.0002989255880098096 | 82.76 | 68.46 | 51.58 | 49.51 | 47.83 | 49.35 | 73.35 | 52.35 | 68.07 | 55.00 | 63.13 | 53.36 | 64.57 | 54.00 | 79.14 | 54.10 | 56.41 | 51.29 | 68.86 | 56.94 | 79.44 | 61.48 | 71.27 | 59.34 | 85.43 | 68.20 | 59.95 | 53.33 | |
Almudevar_UZ_task2_2 | AlmudevarUZ2022 | 45 | 60.60499062666268 ± 0.00031377932481308624 | 89.58 | 77.07 | 47.34 | 49.83 | 47.90 | 52.47 | 75.43 | 56.23 | 64.84 | 54.36 | 68.26 | 55.78 | 67.92 | 55.97 | 82.62 | 56.20 | 60.16 | 51.18 | 70.45 | 60.20 | 73.88 | 55.34 | 63.86 | 59.15 | 89.83 | 77.74 | 61.77 | 53.07 | |
Almudevar_UZ_task2_3 | AlmudevarUZ2022 | 47 | 60.21894525601053 ± 0.00029997437080424505 | 89.58 | 77.07 | 47.10 | 49.94 | 48.72 | 51.62 | 70.87 | 52.02 | 68.07 | 55.00 | 65.30 | 54.61 | 70.05 | 55.52 | 82.62 | 56.20 | 60.45 | 52.08 | 71.93 | 60.55 | 80.99 | 60.56 | 71.27 | 59.34 | 90.04 | 77.19 | 62.64 | 57.05 | |
Jalalia_AIT_task2_1 | JalaliaAIT2022 | 77 | 51.338525836051105 ± 0.00029471041821909645 | 51.35 | 55.59 | 44.78 | 49.55 | 41.06 | 50.21 | 58.20 | 51.69 | 60.13 | 55.52 | 58.51 | 53.13 | 48.68 | 51.11 | 63.47 | 52.75 | 52.36 | 50.68 | 59.38 | 57.85 | 63.91 | 57.90 | 54.11 | 51.72 | 58.28 | 55.23 | 48.69 | 50.48 | |
Zorin_AIRI_task2_1 | ZorinAIRI2022 | 84 | 43.16915219364365 ± 0.0002620788289881202 | 37.17 | 49.65 | 45.02 | 49.74 | 43.23 | 49.83 | 40.98 | 49.49 | 38.40 | 50.13 | 40.83 | 48.68 | 39.49 | 48.92 | 54.64 | 52.77 | 60.43 | 51.17 | 54.28 | 50.10 | 56.27 | 50.32 | 46.42 | 48.66 | 63.60 | 53.43 | 47.75 | 51.18 | |
Venkatesh_MERL_task2_1 | VenkateshMERL2022 | 23 | 65.65983067326367 ± 0.0003545938092026855 | 93.88 | 78.67 | 55.53 | 54.33 | 44.50 | 50.84 | 86.47 | 68.54 | 69.94 | 61.64 | 73.64 | 60.70 | 78.51 | 66.08 | 83.66 | 65.19 | 66.33 | 58.03 | 71.27 | 66.13 | 90.57 | 71.18 | 78.26 | 66.73 | 90.16 | 77.60 | 97.50 | 92.18 | |
Venkatesh_MERL_task2_2 | VenkateshMERL2022 | 24 | 65.56948240685708 ± 0.0003696822207797042 | 93.88 | 78.67 | 54.92 | 54.22 | 44.29 | 50.97 | 82.37 | 70.76 | 69.94 | 61.64 | 75.96 | 62.40 | 77.69 | 65.39 | 83.66 | 65.19 | 68.31 | 57.21 | 71.34 | 66.16 | 91.20 | 75.71 | 78.26 | 66.73 | 90.00 | 79.55 | 97.52 | 92.72 | |
Venkatesh_MERL_task2_3 | VenkateshMERL2022 | 8 | 67.56529995172603 ± 0.000353158822539437 | 93.88 | 78.67 | 58.23 | 54.73 | 48.17 | 50.34 | 86.76 | 79.43 | 72.54 | 61.86 | 73.64 | 60.70 | 83.72 | 62.93 | 83.66 | 65.19 | 66.38 | 57.27 | 64.72 | 61.61 | 87.47 | 70.70 | 73.16 | 57.59 | 90.16 | 77.60 | 97.35 | 90.76 | |
Venkatesh_MERL_task2_4 | VenkateshMERL2022 | 10 | 67.49394662130544 ± 0.0003867088205398765 | 93.30 | 75.47 | 57.30 | 54.93 | 46.93 | 50.33 | 86.34 | 78.47 | 71.96 | 64.26 | 75.94 | 64.29 | 83.05 | 64.01 | 83.68 | 66.36 | 66.58 | 57.84 | 65.70 | 62.78 | 88.73 | 70.12 | 78.38 | 62.40 | 90.00 | 77.46 | 97.09 | 89.83 |
Supplementary metrics (recall, precision, and F1 score)
All supplementary metrics are computed on the evaluation dataset (values in %).

Submission Code | Technical Report | Official Rank | ToyCar (F1 score) | ToyCar (Recall) | ToyCar (Precision) | ToyTrain (F1 score) | ToyTrain (Recall) | ToyTrain (Precision) | Fan (F1 score) | Fan (Recall) | Fan (Precision) | Gearbox (F1 score) | Gearbox (Recall) | Gearbox (Precision) | Bearing (F1 score) | Bearing (Recall) | Bearing (Precision) | Slider (F1 score) | Slider (Recall) | Slider (Precision) | Valve (F1 score) | Valve (Recall) | Valve (Precision) |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
DCASE2022_baseline_task2_AE | DCASE2022baseline2022 | 75 | 51.71 | 44.04 | 62.60 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 67.93 | 98.27 | 51.91 | 58.33 | 61.81 | 55.22 | 55.30 | 53.97 | 56.70 | 29.78 | 20.49 | 54.46 | |
DCASE2022_baseline_task2_MNV2 | DCASE2022baseline2022 | 68 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 55.14 | 52.94 | 57.52 | 43.31 | 34.77 | 57.41 | 42.52 | 31.43 | 65.69 | 0.00 | 0.00 | 0.00 | |
Bai_JLESS_task2_1 | BaiJLESS2022 | 26 | 85.28 | 94.80 | 77.50 | 0.00 | 0.00 | 0.00 | 25.73 | 16.50 | 58.35 | 66.70 | 62.01 | 72.15 | 64.14 | 71.76 | 57.98 | 71.97 | 80.66 | 64.97 | 61.21 | 49.20 | 80.99 | |
Bai_JLESS_task2_2 | BaiJLESS2022 | 44 | 19.27 | 12.61 | 40.86 | 0.00 | 0.00 | 0.00 | 38.92 | 32.20 | 49.17 | 68.37 | 64.70 | 72.48 | 64.50 | 67.22 | 61.98 | 71.97 | 80.66 | 64.97 | 58.65 | 45.67 | 81.94 | |
Bai_JLESS_task2_3 | BaiJLESS2022 | 42 | 85.28 | 94.80 | 77.50 | 0.00 | 0.00 | 0.00 | 25.73 | 16.50 | 58.35 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 25.16 | 16.42 | 53.79 | 61.21 | 49.20 | 80.99 | |
Kuroyanagi_NU-HDL_task2_1 | KuroyanagiNU-HDL2022 | 20 | 67.62 | 67.98 | 67.26 | 52.40 | 50.68 | 54.24 | 31.77 | 22.69 | 52.97 | 80.93 | 80.74 | 81.13 | 73.37 | 72.57 | 74.18 | 72.58 | 71.16 | 74.05 | 83.96 | 82.07 | 85.94 | |
Kuroyanagi_NU-HDL_task2_2 | KuroyanagiNU-HDL2022 | 5 | 75.29 | 74.95 | 75.63 | 52.38 | 49.58 | 55.52 | 54.05 | 49.01 | 60.26 | 80.57 | 80.34 | 80.81 | 68.50 | 68.20 | 68.80 | 76.59 | 76.34 | 76.84 | 82.14 | 80.83 | 83.49 | |
Kuroyanagi_NU-HDL_task2_3 | KuroyanagiNU-HDL2022 | 52 | 74.50 | 73.93 | 75.09 | 55.19 | 55.27 | 55.12 | 49.37 | 43.82 | 56.54 | 81.83 | 81.65 | 82.00 | 67.15 | 66.81 | 67.50 | 57.13 | 56.72 | 57.55 | 83.21 | 82.87 | 83.55 | |
Kuroyanagi_NU-HDL_task2_4 | KuroyanagiNU-HDL2022 | 12 | 77.19 | 76.92 | 77.46 | 17.37 | 10.25 | 57.02 | 34.24 | 26.92 | 47.03 | 79.13 | 78.87 | 79.40 | 70.43 | 69.99 | 70.88 | 73.28 | 72.57 | 74.02 | 82.69 | 81.64 | 83.76 | |
LEE_KNU_task2_1 | LEEKNU2022 | 82 | 57.67 | 65.42 | 51.56 | 32.71 | 26.10 | 43.79 | 60.10 | 72.26 | 51.44 | 50.39 | 50.78 | 50.00 | 60.40 | 79.18 | 48.82 | 47.79 | 46.75 | 48.88 | 50.26 | 49.83 | 50.70 | |
Narita_AIT_task2_1 | NaritaAIT2022 | 70 | 66.67 | 100.00 | 50.00 | 66.67 | 100.00 | 50.00 | 66.89 | 99.66 | 50.33 | 66.67 | 100.00 | 50.00 | 66.67 | 100.00 | 50.00 | 66.86 | 97.61 | 50.84 | 64.68 | 90.86 | 50.21 | |
Narita_AIT_task2_2 | NaritaAIT2022 | 69 | 66.67 | 100.00 | 50.00 | 66.67 | 100.00 | 50.00 | 66.67 | 100.00 | 50.00 | 66.67 | 100.00 | 50.00 | 66.67 | 100.00 | 50.00 | 66.67 | 100.00 | 50.00 | 66.67 | 100.00 | 50.00 | |
Narita_AIT_task2_3 | NaritaAIT2022 | 76 | 60.16 | 65.30 | 55.77 | 0.00 | 0.00 | 0.00 | 52.47 | 48.64 | 56.96 | 0.00 | 0.00 | 0.00 | 60.95 | 71.53 | 53.10 | 33.02 | 25.11 | 48.19 | 9.20 | 5.20 | 40.00 | |
Narita_AIT_task2_4 | NaritaAIT2022 | 72 | 66.67 | 100.00 | 50.00 | 66.67 | 100.00 | 50.00 | 52.47 | 48.64 | 56.96 | 0.00 | 0.00 | 0.00 | 66.67 | 100.00 | 50.00 | 33.02 | 25.11 | 48.19 | 64.68 | 90.86 | 50.21 | |
Du_NERCSLIP_task2_1 | DuNERCSLIP2022 | 33 | 64.19 | 86.32 | 51.10 | 66.74 | 99.32 | 50.25 | 63.40 | 83.11 | 51.25 | 77.05 | 87.94 | 68.56 | 69.12 | 97.30 | 53.60 | 50.17 | 39.31 | 69.29 | 0.00 | 0.00 | 0.00 | |
Du_NERCSLIP_task2_2 | DuNERCSLIP2022 | 37 | 64.60 | 91.19 | 50.01 | 66.74 | 99.32 | 50.25 | 63.40 | 83.11 | 51.25 | 77.75 | 92.31 | 67.16 | 69.12 | 97.30 | 53.60 | 31.38 | 21.35 | 59.14 | 0.00 | 0.00 | 0.00 | |
Du_NERCSLIP_task2_3 | DuNERCSLIP2022 | 35 | 64.60 | 91.19 | 50.01 | 66.67 | 100.00 | 50.00 | 41.12 | 33.74 | 52.64 | 77.33 | 93.36 | 66.00 | 69.14 | 98.99 | 53.12 | 31.38 | 21.35 | 59.14 | 0.00 | 0.00 | 0.00 | |
Du_NERCSLIP_task2_4 | DuNERCSLIP2022 | 36 | 64.78 | 94.21 | 49.36 | 66.67 | 100.00 | 50.00 | 59.77 | 65.73 | 54.80 | 73.30 | 98.57 | 58.34 | 69.53 | 99.66 | 53.38 | 14.91 | 9.01 | 43.22 | 36.28 | 22.41 | 95.26 | |
Jinhyuk_SNU_task2_1 | JinhyukSNU2022 | 78 | 41.62 | 32.89 | 56.66 | 66.67 | 100.00 | 50.00 | 0.00 | 0.00 | 0.00 | 69.04 | 91.91 | 55.29 | 62.33 | 75.94 | 52.85 | 47.11 | 39.15 | 59.14 | 21.80 | 13.76 | 52.41 | |
Hu_NJU_task2_1 | HuNJU2022 | 65 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 13.47 | 7.66 | 55.52 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 51.63 | 45.59 | 59.51 | 0.00 | 0.00 | 0.00 | |
Wilkinghoff_FKIE_task2_1 | WilkinghoffFKIE2022 | 28 | 67.19 | 100.00 | 50.59 | 67.57 | 98.18 | 51.51 | 56.01 | 63.98 | 49.81 | 66.82 | 100.00 | 50.17 | 67.11 | 100.00 | 50.51 | 68.61 | 99.31 | 52.41 | 74.10 | 98.95 | 59.22 | |
Wilkinghoff_FKIE_task2_2 | WilkinghoffFKIE2022 | 30 | 66.67 | 100.00 | 50.00 | 67.57 | 100.00 | 51.02 | 58.50 | 71.37 | 49.56 | 66.67 | 100.00 | 50.00 | 66.82 | 100.00 | 50.17 | 67.76 | 99.31 | 51.42 | 70.09 | 100.00 | 53.96 | |
Wei_HEU_task2_1 | WeiHEU2022 | 29 | 26.91 | 17.55 | 57.63 | 45.60 | 36.81 | 59.89 | 0.00 | 0.00 | 0.00 | 62.64 | 51.12 | 80.86 | 36.95 | 24.98 | 70.95 | 21.59 | 13.45 | 54.66 | 49.24 | 33.19 | 95.39 | |
Wei_HEU_task2_2 | WeiHEU2022 | 14 | 85.41 | 78.80 | 93.24 | 50.75 | 46.23 | 56.26 | 36.26 | 26.45 | 57.65 | 78.80 | 76.60 | 81.13 | 61.32 | 53.04 | 72.66 | 67.36 | 63.28 | 71.99 | 70.96 | 67.71 | 74.54 | |
Wei_HEU_task2_3 | WeiHEU2022 | 17 | 83.83 | 77.33 | 91.53 | 50.75 | 46.23 | 56.26 | 25.16 | 16.23 | 55.97 | 79.34 | 75.92 | 83.08 | 36.95 | 24.98 | 70.95 | 66.31 | 62.21 | 70.99 | 73.76 | 69.71 | 78.30 | |
Guan_HEU_task2_1 | GuanHEU2022 | 22 | 0.00 | 0.00 | 0.00 | 59.58 | 64.75 | 55.17 | 48.79 | 47.55 | 50.08 | 73.33 | 95.65 | 59.45 | 67.47 | 96.70 | 51.80 | 0.00 | 0.00 | 0.00 | 76.52 | 95.87 | 63.67 | |
Guan_HEU_task2_2 | GuanHEU2022 | 19 | 73.71 | 100.00 | 58.37 | 0.00 | 0.00 | 0.00 | 48.79 | 47.55 | 50.08 | 75.65 | 93.16 | 63.67 | 68.22 | 94.30 | 53.44 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | |
Guan_HEU_task2_3 | GuanHEU2022 | 11 | 74.83 | 82.49 | 68.48 | 53.93 | 45.71 | 65.75 | 54.13 | 56.84 | 51.67 | 68.22 | 95.93 | 52.93 | 15.03 | 8.41 | 70.26 | 54.20 | 52.72 | 55.76 | 77.57 | 93.60 | 66.23 | |
Guan_HEU_task2_4 | GuanHEU2022 | 6 | 78.27 | 93.46 | 67.33 | 55.12 | 47.75 | 65.18 | 54.13 | 56.84 | 51.67 | 72.66 | 95.58 | 58.60 | 16.21 | 9.34 | 61.31 | 64.39 | 65.53 | 63.29 | 78.75 | 92.22 | 68.71 | |
Li_CTRI_task2_1 | LiCTRI2022 | 59 | 69.87 | 74.12 | 66.09 | 39.57 | 34.10 | 47.12 | 0.00 | 0.00 | 0.00 | 54.27 | 43.07 | 73.34 | 67.26 | 80.16 | 57.94 | 29.58 | 19.25 | 63.87 | 0.00 | 0.00 | 0.00 | |
Morita_SECOM_task2_1 | MoritaSECOM2022 | 15 | 69.69 | 56.47 | 90.99 | 0.00 | 0.00 | 0.00 | 15.71 | 9.09 | 57.80 | 42.25 | 29.92 | 71.86 | 0.00 | 0.00 | 0.00 | 67.97 | 64.01 | 72.44 | 78.36 | 82.04 | 75.00 | |
Morita_SECOM_task2_2 | MoritaSECOM2022 | 25 | 87.86 | 94.76 | 81.89 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 68.12 | 56.18 | 86.48 | 0.00 | 0.00 | 0.00 | 61.80 | 55.09 | 70.36 | 69.20 | 72.10 | 66.53 | |
Morita_SECOM_task2_3 | MoritaSECOM2022 | 21 | 69.69 | 56.47 | 90.99 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 68.12 | 56.18 | 86.48 | 0.00 | 0.00 | 0.00 | 61.80 | 55.09 | 70.36 | 76.67 | 79.12 | 74.38 | |
Morita_SECOM_task2_4 | MoritaSECOM2022 | 16 | 69.69 | 56.47 | 90.99 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 68.12 | 56.18 | 86.48 | 0.00 | 0.00 | 0.00 | 61.80 | 55.09 | 70.36 | 0.00 | 0.00 | 0.00 | |
Yamashita_GU_task2_1 | YamashitaGU2022 | 53 | 57.90 | 49.51 | 69.72 | 44.19 | 34.76 | 60.63 | 32.08 | 24.53 | 46.34 | 56.42 | 49.95 | 64.82 | 37.78 | 27.93 | 58.38 | 62.71 | 53.52 | 75.71 | 67.46 | 55.67 | 85.60 | |
Yamashita_GU_task2_2 | YamashitaGU2022 | 61 | 61.39 | 49.75 | 80.16 | 43.87 | 36.71 | 54.51 | 35.18 | 27.21 | 49.77 | 63.93 | 56.20 | 74.13 | 40.10 | 30.81 | 57.43 | 60.61 | 53.71 | 69.55 | 44.54 | 36.45 | 57.23 | |
Yamashita_GU_task2_3 | YamashitaGU2022 | 39 | 61.39 | 49.75 | 80.16 | 44.19 | 34.76 | 60.63 | 35.00 | 26.11 | 53.10 | 63.93 | 56.20 | 74.13 | 57.97 | 50.23 | 68.55 | 60.61 | 53.71 | 69.55 | 67.46 | 55.67 | 85.60 | |
Yamashita_GU_task2_4 | YamashitaGU2022 | 48 | 56.57 | 45.85 | 73.84 | 40.26 | 30.90 | 57.73 | 0.00 | 0.00 | 0.00 | 63.93 | 56.20 | 74.13 | 56.52 | 48.65 | 67.45 | 62.90 | 55.01 | 73.41 | 67.46 | 55.67 | 85.60 | |
CHO_SG_task2_1 | CHOSG2022 | 67 | 69.65 | 84.99 | 59.01 | 66.33 | 90.90 | 52.21 | 63.47 | 87.45 | 49.81 | 66.56 | 96.20 | 50.89 | 0.00 | 0.00 | 0.00 | 62.14 | 85.54 | 48.79 | 65.45 | 93.44 | 50.37 | |
CHO_SG_task2_2 | CHOSG2022 | 81 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | |
Li_JAIST_task2 | LiJAIST2022 | 83 | 66.58 | 96.96 | 50.69 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 14.90 | 9.71 | 32.03 | 64.53 | 92.72 | 49.48 | 0.00 | 0.00 | 0.00 | 30.99 | 19.20 | 80.23 | |
Gou_UESTC_task2_1 | GouUESTC2022 | 56 | 63.10 | 62.91 | 63.29 | 46.35 | 42.30 | 51.26 | 40.05 | 32.91 | 51.13 | 72.48 | 72.02 | 72.96 | 53.96 | 52.28 | 55.76 | 60.61 | 60.25 | 60.97 | 59.78 | 59.10 | 60.48 | |
Gou_UESTC_task2_2 | GouUESTC2022 | 58 | 54.55 | 54.04 | 55.08 | 46.99 | 43.58 | 50.97 | 46.50 | 41.88 | 52.26 | 66.16 | 66.09 | 66.22 | 54.79 | 54.04 | 55.57 | 59.96 | 59.61 | 60.31 | 71.51 | 70.75 | 72.29 | |
Gou_UESTC_task2_3 | GouUESTC2022 | 60 | 67.24 | 66.60 | 67.90 | 48.83 | 45.22 | 53.06 | 45.61 | 43.84 | 47.53 | 65.85 | 65.13 | 66.59 | 63.22 | 62.93 | 63.51 | 60.06 | 58.40 | 61.82 | 51.54 | 49.72 | 53.49 | |
Gou_UESTC_task2_4 | GouUESTC2022 | 51 | 63.10 | 62.91 | 63.29 | 46.35 | 42.30 | 51.26 | 45.61 | 43.84 | 47.53 | 72.48 | 72.02 | 72.96 | 54.16 | 52.31 | 56.14 | 65.07 | 72.26 | 59.18 | 71.51 | 70.75 | 72.29 | |
PENG_NJUPT_task2_1 | PENGNJUPT2022 | 57 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 54.22 | 51.13 | 57.70 | 67.33 | 78.30 | 59.06 | 76.81 | 90.14 | 66.92 | 65.46 | 57.15 | 76.59 | |
PENG_NJUPT_task2_2 | PENGNJUPT2022 | 62 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 59.45 | 62.59 | 56.61 | 45.07 | 36.98 | 57.69 | 63.66 | 68.06 | 59.79 | 79.67 | 92.28 | 70.09 | 54.23 | 49.16 | 60.45 | |
Nejjar_ETH_task2_1 | NejjarETH2022 | 32 | 72.84 | 98.95 | 57.63 | 40.59 | 32.74 | 53.40 | 0.00 | 0.00 | 0.00 | 64.47 | 81.18 | 53.47 | 66.74 | 100.00 | 50.08 | 66.67 | 100.00 | 50.00 | 37.11 | 26.87 | 60.00 | |
Nejjar_ETH_task2_2 | NejjarETH2022 | 54 | 72.84 | 98.95 | 57.63 | 40.59 | 32.74 | 53.40 | 0.00 | 0.00 | 0.00 | 64.47 | 81.18 | 53.47 | 66.74 | 100.00 | 50.08 | 66.67 | 100.00 | 50.00 | 37.11 | 26.87 | 60.00 | |
Verbitskiy_DS_task2_1 | VerbitskiyDS2022 | 27 | 66.08 | 72.32 | 60.83 | 56.54 | 64.15 | 50.55 | 21.91 | 13.70 | 54.72 | 78.58 | 79.34 | 77.84 | 60.34 | 57.99 | 62.89 | 65.83 | 59.64 | 73.45 | 83.13 | 82.92 | 83.34 | |
Verbitskiy_DS_task2_2 | VerbitskiyDS2022 | 40 | 66.08 | 72.32 | 60.83 | 46.83 | 43.54 | 50.65 | 21.91 | 13.70 | 54.72 | 72.63 | 69.00 | 76.66 | 60.34 | 57.99 | 62.89 | 61.77 | 53.13 | 73.75 | 83.13 | 82.92 | 83.34 | |
Verbitskiy_DS_task2_3 | VerbitskiyDS2022 | 34 | 72.73 | 80.20 | 66.54 | 55.17 | 59.16 | 51.69 | 27.32 | 18.27 | 54.08 | 72.63 | 69.00 | 76.66 | 60.34 | 57.99 | 62.89 | 61.77 | 53.13 | 73.75 | 83.37 | 81.88 | 84.90 | |
Verbitskiy_DS_task2_4 | VerbitskiyDS2022 | 31 | 73.33 | 80.77 | 67.14 | 54.58 | 55.20 | 53.98 | 27.22 | 18.31 | 53.02 | 72.83 | 69.38 | 76.64 | 60.81 | 57.70 | 64.26 | 61.77 | 53.13 | 73.75 | 83.89 | 82.54 | 85.29 | |
Cohen_Technion_task2_1 | CohenTechnion2022 | 64 | 62.99 | 60.04 | 66.25 | 50.78 | 50.56 | 51.00 | 41.77 | 36.70 | 48.47 | 62.74 | 62.26 | 63.22 | 56.95 | 53.40 | 61.01 | 52.88 | 49.10 | 57.31 | 61.00 | 60.03 | 62.00 | |
Cohen_Technion_task2_2 | CohenTechnion2022 | 63 | 62.28 | 58.49 | 66.58 | 49.46 | 46.88 | 52.33 | 45.49 | 41.23 | 50.74 | 64.84 | 64.57 | 65.11 | 59.83 | 58.20 | 61.55 | 60.24 | 58.87 | 61.68 | 54.21 | 54.14 | 54.27 | |
Deng_THU_task2_1 | DengTHU2022 | 9 | 66.67 | 100.00 | 50.00 | 29.52 | 19.25 | 63.29 | 40.77 | 37.08 | 45.27 | 68.13 | 97.86 | 52.26 | 61.40 | 61.45 | 61.34 | 69.01 | 98.62 | 53.07 | 79.86 | 99.66 | 66.62 | |
Deng_THU_task2_2 | DengTHU2022 | 18 | 66.67 | 100.00 | 50.00 | 48.03 | 40.70 | 58.57 | 45.50 | 44.71 | 46.33 | 68.13 | 97.86 | 52.26 | 60.53 | 60.09 | 60.99 | 68.27 | 98.95 | 52.11 | 78.23 | 100.00 | 64.24 | |
Deng_THU_task2_3 | DengTHU2022 | 13 | 66.67 | 100.00 | 50.00 | 55.17 | 57.24 | 53.25 | 41.69 | 39.29 | 44.40 | 69.25 | 97.55 | 53.68 | 56.72 | 50.67 | 64.42 | 68.93 | 98.24 | 53.09 | 79.86 | 99.66 | 66.62 | |
Deng_THU_task2_4 | DengTHU2022 | 7 | 67.64 | 100.00 | 51.11 | 37.00 | 28.94 | 51.28 | 43.49 | 38.91 | 49.30 | 70.70 | 95.25 | 56.21 | 66.55 | 98.57 | 50.24 | 68.75 | 98.24 | 52.88 | 77.22 | 94.72 | 65.17 | |
Liu_BUPT_task2_1 | LiuBUPT2022 | 55 | 12.24 | 6.97 | 50.13 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 14.99 | 8.57 | 59.60 | 8.52 | 4.55 | 66.09 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | |
Liu_BUPT_task2_2 | LiuBUPT2022 | 49 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 14.99 | 8.57 | 59.60 | 8.52 | 4.55 | 66.09 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | |
Kazakova_ITMO_task2_1 | KazakovaITMO2022 | 80 | 64.79 | 92.43 | 49.87 | 63.01 | 79.09 | 52.37 | 55.94 | 63.85 | 49.77 | 66.70 | 84.90 | 54.93 | 53.50 | 68.94 | 43.71 | 49.85 | 60.32 | 42.48 | 63.49 | 75.63 | 54.71 | |
Kazakova_ITMO_task2_2 | KazakovaITMO2022 | 86 | 27.48 | 19.11 | 48.91 | 28.93 | 21.75 | 43.20 | 25.86 | 17.13 | 52.67 | 37.28 | 31.66 | 45.34 | 12.75 | 7.85 | 33.78 | 25.50 | 16.76 | 53.25 | 0.00 | 0.00 | 0.00 | |
Kazakova_ITMO_task2_3 | KazakovaITMO2022 | 85 | 64.79 | 92.43 | 49.87 | 63.01 | 79.09 | 52.37 | 55.94 | 63.85 | 49.77 | 37.28 | 31.66 | 45.34 | 53.50 | 68.94 | 43.71 | 25.50 | 16.76 | 53.25 | 0.00 | 0.00 | 0.00 | |
Kodua_ITMO_task2_1 | KoduaITMO2022 | 74 | 7.02 | 3.75 | 55.05 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 20.07 | 11.93 | 63.05 | 16.27 | 9.12 | 75.43 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | |
Liu_CQUPT_task2_1 | LiuCQUPT2022 | 2 | 77.69 | 75.86 | 79.62 | 64.69 | 71.15 | 59.30 | 41.39 | 32.52 | 56.91 | 79.35 | 78.82 | 79.89 | 65.25 | 63.81 | 66.75 | 70.66 | 68.49 | 72.98 | 76.73 | 75.45 | 78.04 | |
Liu_CQUPT_task2_2 | LiuCQUPT2022 | 4 | 77.69 | 75.86 | 79.62 | 64.69 | 71.15 | 59.30 | 41.39 | 32.52 | 56.91 | 79.35 | 78.82 | 79.89 | 65.25 | 63.81 | 66.75 | 69.59 | 68.42 | 70.79 | 76.82 | 75.54 | 78.15 | |
Liu_CQUPT_task2_3 | LiuCQUPT2022 | 3 | 77.69 | 75.86 | 79.62 | 64.69 | 71.15 | 59.30 | 41.39 | 32.52 | 56.91 | 79.35 | 78.82 | 79.89 | 65.25 | 63.81 | 66.75 | 70.52 | 68.76 | 72.38 | 29.54 | 19.13 | 64.85 | |
Liu_CQUPT_task2_4 | LiuCQUPT2022 | 1 | 77.69 | 75.86 | 79.62 | 64.69 | 71.15 | 59.30 | 45.11 | 37.68 | 56.19 | 79.35 | 78.82 | 79.89 | 65.25 | 63.81 | 66.75 | 70.52 | 68.76 | 72.38 | 71.03 | 68.47 | 73.80 | |
Siang_NTHU_task2_1 | SiangNTHU2022 | 79 | 66.74 | 100.00 | 50.08 | 65.15 | 95.93 | 49.33 | 56.77 | 67.61 | 48.93 | 57.32 | 59.96 | 54.90 | 60.10 | 77.17 | 49.22 | 58.21 | 65.32 | 52.50 | 66.03 | 93.61 | 51.00 | |
Tozicka_NSW_task2_1 | TozickaNSW2022 | 38 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | |
Tozicka_NSW_task2_2 | TozickaNSW2022 | 41 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | |
Tozicka_NSW_task2_3 | TozickaNSW2022 | 46 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | |
Tozicka_NSW_task2_4 | TozickaNSW2022 | 43 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | |
Almudevar_UZ_task2_1 | AlmudevarUZ2022 | 50 | 72.41 | 68.56 | 76.72 | 49.88 | 48.76 | 51.06 | 48.70 | 44.02 | 54.49 | 69.17 | 68.87 | 69.47 | 64.26 | 60.94 | 67.97 | 62.46 | 59.29 | 65.99 | 61.14 | 60.38 | 61.91 | |
Almudevar_UZ_task2_2 | AlmudevarUZ2022 | 45 | 79.18 | 76.79 | 81.72 | 48.67 | 45.94 | 51.74 | 46.85 | 43.62 | 50.60 | 71.63 | 71.16 | 72.09 | 61.79 | 58.21 | 65.85 | 64.93 | 64.03 | 65.86 | 62.56 | 61.63 | 63.53 | |
Almudevar_UZ_task2_3 | AlmudevarUZ2022 | 47 | 79.18 | 76.79 | 81.72 | 48.86 | 44.39 | 54.32 | 49.36 | 45.19 | 54.39 | 66.22 | 65.52 | 66.94 | 64.26 | 60.94 | 67.97 | 64.13 | 62.93 | 65.38 | 61.60 | 59.98 | 63.31 | |
Jalalia_AIT_task2_1 | JalaliaAIT2022 | 77 | 36.45 | 27.61 | 53.63 | 0.00 | 0.00 | 0.00 | 14.09 | 8.37 | 44.56 | 37.97 | 30.24 | 51.03 | 59.71 | 67.08 | 53.79 | 27.09 | 16.71 | 71.46 | 10.14 | 5.60 | 53.76 | |
Zorin_AIRI_task2_1 | ZorinAIRI2022 | 84 | 7.46 | 3.89 | 88.89 | 0.00 | 0.00 | 0.00 | 12.32 | 7.09 | 46.88 | 4.05 | 2.18 | 27.91 | 12.78 | 7.21 | 56.46 | 0.00 | 0.00 | 0.00 | 6.48 | 3.48 | 47.06 | |
Venkatesh_MERL_task2_1 | VenkateshMERL2022 | 23 | 81.19 | 100.00 | 68.34 | 39.09 | 28.29 | 63.21 | 0.00 | 0.00 | 0.00 | 75.69 | 94.52 | 63.12 | 68.43 | 97.86 | 52.61 | 67.73 | 77.42 | 60.19 | 0.00 | 0.00 | 0.00 | |
Venkatesh_MERL_task2_2 | VenkateshMERL2022 | 24 | 78.33 | 100.00 | 64.38 | 42.04 | 32.52 | 59.44 | 59.27 | 72.85 | 49.96 | 71.05 | 92.11 | 57.83 | 67.49 | 97.86 | 51.51 | 70.12 | 84.80 | 59.77 | 74.94 | 83.65 | 67.87 | |
Venkatesh_MERL_task2_3 | VenkateshMERL2022 | 8 | 81.19 | 100.00 | 68.34 | 42.50 | 31.38 | 65.84 | 58.09 | 69.75 | 49.78 | 0.00 | 0.00 | 0.00 | 67.10 | 96.92 | 51.31 | 67.88 | 77.42 | 60.43 | 75.78 | 82.49 | 70.08 | |
Venkatesh_MERL_task2_4 | VenkateshMERL2022 | 10 | 79.58 | 100.00 | 66.08 | 38.29 | 27.31 | 64.07 | 59.51 | 73.47 | 50.01 | 71.33 | 90.61 | 58.82 | 66.87 | 96.00 | 51.30 | 66.22 | 76.25 | 58.52 | 76.31 | 83.99 | 69.92 |
Domain-wise performance
Rank | Submission Information | Ranking | Evaluation Dataset in Source Domain | Evaluation Dataset in Target Domain | ||||||||||||||||||||||||||||||
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
Submission Code |
Technical Report |
Official Rank |
Official Score |
Harmonic mean (AUC, source) |
ToyCar (AUC, source) |
ToyCar (pAUC, source) |
ToyTrain (AUC, source) |
ToyTrain (pAUC, source) |
Fan (AUC, source) |
Fan (pAUC, source) |
Gearbox (AUC, source) |
Gearbox (pAUC, source) |
Bearing (AUC, source) |
Bearing (pAUC, source) |
Slider (AUC, source) |
Slider (pAUC, source) |
Valve (AUC, source) |
Valve (pAUC, source) |
Harmonic mean (AUC, target) |
ToyCar (AUC, target) |
ToyCar (pAUC, target) |
ToyTrain (AUC, target) |
ToyTrain (pAUC, target) |
Fan (AUC, target) |
Fan (pAUC, target) |
Gearbox (AUC, target) |
Gearbox (pAUC, target) |
Bearing (AUC, target) |
Bearing (pAUC, target) |
Slider (AUC, target) |
Slider (pAUC, target) |
Valve (AUC, target) |
Valve (pAUC, target) |
|
DCASE2022_baseline_task2_AE | DCASE2022baseline2022 | 75 | 52.942 | 64.23 | 82.04 | 71.00 | 50.18 | 49.68 | 84.56 | 52.21 | 60.18 | 54.32 | 63.38 | 47.37 | 75.66 | 54.00 | 56.96 | 54.32 | 45.13 | 58.22 | 58.43 | 41.96 | 49.30 | 37.33 | 49.72 | 62.28 | 51.50 | 59.28 | 55.49 | 56.46 | 54.19 | 53.75 | 50.74 | |
DCASE2022_baseline_task2_MNV2 | DCASE2022baseline2022 | 68 | 54.017 | 58.48 | 35.76 | 48.58 | 40.98 | 48.89 | 79.78 | 65.95 | 78.92 | 48.84 | 57.22 | 50.74 | 94.60 | 52.21 | 72.28 | 69.79 | 50.49 | 44.54 | 54.53 | 53.92 | 51.42 | 46.88 | 53.48 | 47.99 | 48.43 | 58.43 | 52.45 | 58.44 | 53.25 | 72.87 | 64.31 | |
Bai_JLESS_task2_1 | BaiJLESS2022 | 26 | 63.949 | 74.16 | 86.32 | 68.84 | 42.48 | 49.26 | 80.40 | 52.32 | 62.42 | 50.16 | 65.04 | 51.63 | 96.10 | 57.95 | 92.60 | 65.16 | 59.35 | 93.43 | 77.92 | 54.15 | 51.63 | 42.91 | 50.43 | 74.03 | 66.28 | 63.82 | 56.71 | 74.40 | 66.87 | 81.49 | 69.49 | |
Bai_JLESS_task2_2 | BaiJLESS2022 | 44 | 61.102 | 68.53 | 62.56 | 60.11 | 56.89 | 51.42 | 80.08 | 62.74 | 67.12 | 51.89 | 67.54 | 51.42 | 96.10 | 57.95 | 68.66 | 60.32 | 57.23 | 51.87 | 52.11 | 49.65 | 52.12 | 49.46 | 54.53 | 74.11 | 70.53 | 64.43 | 56.56 | 74.40 | 66.87 | 80.38 | 68.44 | |
Bai_JLESS_task2_3 | BaiJLESS2022 | 42 | 62.208 | 73.27 | 86.32 | 68.84 | 42.48 | 49.26 | 80.40 | 52.32 | 68.52 | 47.47 | 57.74 | 48.63 | 93.20 | 61.79 | 92.60 | 65.16 | 58.69 | 93.43 | 77.92 | 54.15 | 51.63 | 42.91 | 50.43 | 71.69 | 51.29 | 70.79 | 56.91 | 63.04 | 54.31 | 81.49 | 69.49 | |
Kuroyanagi_NU-HDL_task2_1 | KuroyanagiNU-HDL2022 | 20 | 66.503 | 81.35 | 65.68 | 53.79 | 52.20 | 49.53 | 96.42 | 57.11 | 93.24 | 47.58 | 85.30 | 55.37 | 99.80 | 59.53 | 98.10 | 93.37 | 59.88 | 65.34 | 76.55 | 57.60 | 52.43 | 44.55 | 51.85 | 80.30 | 69.86 | 77.34 | 62.96 | 77.83 | 58.68 | 91.43 | 82.22 | |
Kuroyanagi_NU-HDL_task2_2 | KuroyanagiNU-HDL2022 | 5 | 68.223 | 79.36 | 75.46 | 56.11 | 54.18 | 50.84 | 87.90 | 60.16 | 94.74 | 47.37 | 77.96 | 48.11 | 90.54 | 52.11 | 95.78 | 92.26 | 66.21 | 83.04 | 76.55 | 55.19 | 51.98 | 53.46 | 51.79 | 82.72 | 70.37 | 71.27 | 60.36 | 78.90 | 59.45 | 90.11 | 81.63 | |
Kuroyanagi_NU-HDL_task2_3 | KuroyanagiNU-HDL2022 | 52 | 59.034 | 50.86 | 72.36 | 54.26 | 54.04 | 54.95 | 91.54 | 56.79 | 95.14 | 49.21 | 68.78 | 48.00 | 6.22 | 47.37 | 99.54 | 98.21 | 66.47 | 77.17 | 76.82 | 55.67 | 52.36 | 54.65 | 54.42 | 86.61 | 74.43 | 74.79 | 58.64 | 79.49 | 59.31 | 89.12 | 82.14 | |
Kuroyanagi_NU-HDL_task2_4 | KuroyanagiNU-HDL2022 | 12 | 67.142 | 82.00 | 72.04 | 56.00 | 54.96 | 49.47 | 89.72 | 59.95 | 89.78 | 47.53 | 71.84 | 51.53 | 88.18 | 51.53 | 99.04 | 96.58 | 60.88 | 88.96 | 78.69 | 44.34 | 51.83 | 49.80 | 55.23 | 82.54 | 67.39 | 77.94 | 58.10 | 80.10 | 61.81 | 91.25 | 86.36 | |
LEE_KNU_task2_1 | LEEKNU2022 | 82 | 49.267 | 49.88 | 58.05 | 50.35 | 43.80 | 50.28 | 59.96 | 49.49 | 43.56 | 48.99 | 58.56 | 50.40 | 49.61 | 49.39 | 46.26 | 48.98 | 47.71 | 52.49 | 51.93 | 45.32 | 49.70 | 48.07 | 50.43 | 51.15 | 49.60 | 45.84 | 49.83 | 47.06 | 49.25 | 49.98 | 52.38 | |
Narita_AIT_task2_1 | NaritaAIT2022 | 70 | 53.512 | 55.48 | 84.74 | 62.89 | 60.00 | 53.37 | 97.80 | 65.68 | 84.46 | 71.74 | 69.92 | 49.42 | 99.66 | 56.16 | 100.00 | 93.95 | 51.87 | 55.61 | 51.89 | 45.86 | 50.62 | 55.24 | 53.18 | 52.76 | 53.23 | 53.21 | 49.83 | 44.11 | 50.13 | 46.77 | 54.21 | |
Narita_AIT_task2_2 | NaritaAIT2022 | 69 | 53.850 | 56.58 | 84.04 | 73.63 | 61.46 | 52.21 | 97.44 | 64.84 | 73.36 | 62.47 | 69.30 | 50.95 | 99.76 | 58.74 | 90.74 | 66.47 | 52.20 | 57.82 | 54.07 | 49.35 | 50.04 | 53.91 | 52.88 | 50.61 | 52.60 | 54.30 | 50.42 | 45.14 | 50.57 | 48.22 | 51.46 | |
Narita_AIT_task2_3 | NaritaAIT2022 | 76 | 52.359 | 54.77 | 84.44 | 56.63 | 58.12 | 52.42 | 90.36 | 56.63 | 65.14 | 57.95 | 71.58 | 54.11 | 78.64 | 55.63 | 85.90 | 74.47 | 50.45 | 54.19 | 51.41 | 47.93 | 50.16 | 50.54 | 52.31 | 48.68 | 50.95 | 53.42 | 50.49 | 47.27 | 50.10 | 46.18 | 52.01 | |
Narita_AIT_task2_4 | NaritaAIT2022 | 72 | 53.274 | 55.57 | 84.04 | 73.63 | 61.46 | 52.21 | 90.36 | 56.63 | 65.14 | 57.95 | 69.30 | 50.95 | 78.64 | 55.63 | 100.00 | 93.95 | 51.46 | 57.82 | 54.07 | 49.35 | 50.04 | 50.54 | 52.31 | 48.68 | 50.95 | 54.30 | 50.42 | 47.27 | 50.10 | 46.77 | 54.21 | |
Du_NERCSLIP_task2_1 | DuNERCSLIP2022 | 33 | 63.303 | 76.37 | 64.12 | 49.58 | 55.92 | 48.32 | 74.18 | 50.32 | 96.30 | 51.00 | 60.56 | 49.47 | 96.70 | 50.84 | 94.42 | 92.84 | 59.07 | 63.75 | 61.45 | 58.60 | 52.41 | 50.14 | 51.86 | 80.18 | 62.57 | 68.60 | 55.28 | 63.50 | 57.78 | 85.85 | 73.17 | |
Du_NERCSLIP_task2_2 | DuNERCSLIP2022 | 37 | 62.580 | 75.77 | 53.54 | 48.89 | 56.46 | 49.21 | 74.18 | 50.32 | 96.52 | 50.42 | 60.56 | 49.47 | 97.90 | 51.00 | 94.40 | 92.84 | 57.61 | 56.62 | 59.97 | 59.65 | 53.01 | 50.14 | 51.86 | 79.27 | 63.38 | 68.60 | 55.28 | 63.67 | 55.82 | 86.57 | 75.73 | |
Du_NERCSLIP_task2_3 | DuNERCSLIP2022 | 35 | 62.787 | 76.82 | 53.54 | 48.89 | 56.46 | 48.89 | 81.10 | 50.05 | 96.88 | 50.47 | 64.18 | 49.37 | 97.90 | 51.00 | 94.40 | 92.84 | 57.59 | 56.62 | 59.97 | 59.97 | 53.81 | 50.33 | 51.78 | 79.58 | 62.71 | 69.01 | 54.79 | 63.67 | 55.82 | 86.57 | 75.73 | |
Du_NERCSLIP_task2_4 | DuNERCSLIP2022 | 36 | 62.749 | 75.12 | 48.20 | 49.11 | 51.82 | 50.95 | 85.74 | 50.58 | 96.40 | 49.53 | 66.06 | 49.47 | 100.00 | 53.74 | 94.58 | 92.95 | 58.47 | 52.15 | 59.35 | 60.07 | 54.25 | 53.05 | 53.15 | 78.23 | 60.84 | 68.73 | 52.44 | 67.79 | 56.28 | 87.45 | 77.96 | |
Jinhyuk_SNU_task2_1 | JinhyukSNU2022 | 78 | 51.027 | 62.46 | 78.94 | 59.42 | 49.94 | 49.11 | 83.32 | 55.89 | 54.60 | 51.68 | 63.42 | 47.47 | 72.58 | 52.63 | 49.56 | 51.89 | 42.54 | 48.24 | 53.02 | 41.54 | 49.02 | 37.54 | 49.99 | 59.26 | 51.19 | 58.75 | 55.24 | 56.33 | 53.87 | 49.17 | 50.79 | |
Hu_NJU_task2_1 | HuNJU2022 | 65 | 55.075 | 61.16 | 73.22 | 51.32 | 42.83 | 49.47 | 61.70 | 52.47 | 62.46 | 53.47 | 66.43 | 51.74 | 62.44 | 52.00 | 55.78 | 50.84 | 51.90 | 68.57 | 57.76 | 52.03 | 51.30 | 47.19 | 50.92 | 63.95 | 54.54 | 61.74 | 57.88 | 59.17 | 52.78 | 44.99 | 49.43 | |
Wilkinghoff_FKIE_task2_1 | WilkinghoffFKIE2022 | 28 | 63.724 | 85.37 | 97.18 | 68.42 | 77.68 | 52.21 | 77.30 | 49.05 | 96.24 | 58.37 | 82.56 | 61.53 | 95.26 | 52.53 | 98.36 | 90.47 | 54.12 | 76.92 | 65.05 | 58.65 | 57.02 | 36.85 | 48.84 | 80.48 | 58.72 | 74.70 | 66.23 | 63.35 | 55.67 | 82.23 | 67.37 | |
Wilkinghoff_FKIE_task2_2 | WilkinghoffFKIE2022 | 30 | 63.557 | 85.03 | 97.26 | 70.53 | 81.82 | 52.47 | 72.44 | 50.21 | 92.74 | 60.84 | 80.54 | 60.26 | 93.40 | 52.16 | 98.84 | 89.11 | 54.13 | 75.08 | 64.51 | 58.04 | 57.48 | 38.59 | 48.55 | 78.47 | 58.27 | 73.72 | 65.33 | 64.15 | 55.33 | 79.64 | 65.87 | |
Wei_HEU_task2_1 | WeiHEU2022 | 29 | 63.603 | 77.89 | 67.98 | 54.79 | 67.75 | 54.47 | 76.27 | 50.26 | 93.58 | 51.11 | 71.10 | 52.74 | 95.84 | 56.00 | 91.68 | 92.42 | 58.46 | 73.79 | 62.78 | 62.06 | 53.53 | 41.57 | 51.93 | 76.70 | 58.63 | 66.97 | 55.09 | 68.85 | 55.47 | 84.64 | 77.94 | |
Wei_HEU_task2_2 | WeiHEU2022 | 14 | 67.122 | 73.31 | 97.22 | 92.16 | 62.38 | 48.95 | 93.66 | 52.47 | 93.66 | 66.42 | 72.40 | 49.00 | 76.40 | 52.47 | 93.74 | 79.58 | 67.96 | 97.16 | 87.40 | 56.13 | 54.05 | 45.19 | 51.58 | 83.66 | 59.84 | 70.09 | 60.63 | 74.01 | 63.04 | 82.83 | 65.07 | |
Wei_HEU_task2_3 | WeiHEU2022 | 17 | 66.652 | 76.76 | 96.16 | 90.26 | 62.38 | 48.95 | 93.82 | 52.58 | 94.66 | 61.05 | 71.10 | 52.74 | 82.04 | 52.89 | 94.90 | 85.95 | 64.51 | 95.82 | 86.69 | 56.13 | 54.05 | 44.77 | 49.59 | 83.59 | 62.43 | 66.97 | 55.09 | 73.25 | 62.35 | 84.73 | 67.84 | |
Guan_HEU_task2_1 | GuanHEU2022 | 22 | 66.395 | 77.50 | 95.16 | 84.95 | 64.80 | 48.68 | 98.14 | 56.05 | 95.86 | 72.79 | 77.88 | 49.89 | 84.10 | 53.21 | 94.28 | 77.84 | 63.51 | 87.99 | 77.55 | 57.01 | 54.07 | 44.74 | 50.95 | 81.44 | 61.36 | 70.58 | 61.17 | 72.61 | 62.01 | 80.23 | 62.49 | |
Guan_HEU_task2_2 | GuanHEU2022 | 19 | 66.532 | 75.96 | 94.82 | 90.21 | 64.72 | 48.74 | 98.14 | 56.05 | 94.92 | 67.47 | 71.70 | 48.89 | 79.94 | 51.84 | 93.64 | 74.79 | 65.15 | 93.99 | 84.84 | 57.03 | 54.08 | 44.74 | 50.95 | 81.99 | 59.74 | 71.47 | 60.09 | 72.43 | 62.29 | 80.16 | 60.91 | |
Guan_HEU_task2_3 | GuanHEU2022 | 11 | 67.419 | 81.36 | 88.12 | 63.05 | 70.70 | 49.74 | 98.00 | 56.21 | 95.68 | 58.63 | 69.24 | 50.00 | 93.86 | 56.26 | 97.00 | 91.79 | 63.89 | 84.46 | 72.73 | 60.31 | 56.32 | 50.92 | 53.25 | 79.04 | 58.34 | 66.81 | 53.64 | 72.31 | 62.40 | 86.65 | 76.48 | |
Guan_HEU_task2_4 | GuanHEU2022 | 6 | 68.037 | 81.43 | 92.62 | 77.16 | 70.24 | 49.63 | 98.00 | 56.21 | 94.92 | 56.00 | 68.54 | 49.74 | 91.24 | 52.79 | 97.20 | 92.58 | 64.88 | 90.56 | 78.67 | 60.18 | 56.41 | 50.92 | 53.25 | 79.32 | 58.52 | 66.98 | 53.35 | 72.59 | 62.49 | 87.87 | 77.01 | |
Li_CTRI_task2_1 | LiCTRI2022 | 59 | 58.128 | 66.11 | 88.76 | 64.53 | 50.86 | 49.53 | 81.94 | 52.84 | 73.54 | 48.74 | 56.36 | 47.79 | 94.64 | 55.89 | 71.32 | 68.95 | 55.06 | 73.69 | 58.38 | 48.36 | 50.01 | 36.43 | 49.65 | 68.24 | 54.17 | 64.90 | 51.70 | 61.11 | 50.98 | 84.17 | 74.01 | |
Morita_SECOM_task2_1 | MoritaSECOM2022 | 15 | 66.825 | 77.14 | 78.18 | 65.95 | 58.38 | 49.74 | 86.84 | 57.53 | 82.70 | 56.21 | 75.26 | 57.00 | 99.26 | 49.42 | 98.94 | 93.42 | 63.80 | 96.04 | 76.34 | 58.29 | 55.91 | 46.37 | 51.62 | 79.37 | 60.46 | 73.82 | 64.58 | 64.55 | 65.62 | 82.62 | 66.03 | |
Morita_SECOM_task2_2 | MoritaSECOM2022 | 25 | 65.088 | 80.14 | 89.42 | 75.68 | 60.98 | 50.84 | 87.16 | 52.11 | 90.70 | 67.95 | 75.88 | 50.00 | 98.84 | 53.05 | 97.74 | 89.53 | 58.23 | 94.62 | 77.94 | 61.36 | 53.87 | 41.79 | 50.00 | 88.43 | 73.64 | 65.73 | 54.17 | 62.09 | 64.44 | 69.49 | 63.39 | |
Morita_SECOM_task2_3 | MoritaSECOM2022 | 21 | 66.399 | 80.90 | 78.18 | 65.95 | 60.98 | 50.84 | 87.16 | 52.11 | 90.70 | 67.95 | 74.82 | 51.53 | 98.84 | 53.05 | 98.20 | 92.16 | 59.96 | 96.04 | 76.34 | 61.36 | 53.87 | 41.79 | 50.00 | 88.43 | 73.64 | 71.63 | 62.15 | 62.09 | 64.44 | 77.68 | 65.88 | |
Morita_SECOM_task2_4 | MoritaSECOM2022 | 16 | 66.724 | 78.14 | 78.18 | 65.95 | 60.98 | 50.84 | 88.40 | 56.16 | 90.70 | 67.95 | 74.82 | 51.53 | 98.84 | 53.05 | 96.86 | 86.68 | 62.35 | 96.04 | 76.34 | 61.36 | 53.87 | 42.70 | 50.83 | 88.43 | 73.64 | 71.63 | 62.15 | 62.09 | 64.44 | 79.72 | 64.57 | |
Yamashita_GU_task2_1 | YamashitaGU2022 | 53 | 58.941 | 69.82 | 83.64 | 75.11 | 65.86 | 47.79 | 72.36 | 52.16 | 59.82 | 62.95 | 57.86 | 47.37 | 78.50 | 53.00 | 95.52 | 84.26 | 53.58 | 60.46 | 57.33 | 54.01 | 49.47 | 39.02 | 49.13 | 69.41 | 54.65 | 55.72 | 50.88 | 71.26 | 61.68 | 84.44 | 71.18 | |
Yamashita_GU_task2_2 | YamashitaGU2022 | 61 | 57.965 | 69.08 | 92.04 | 81.00 | 66.56 | 49.42 | 81.56 | 54.16 | 74.82 | 63.84 | 64.68 | 47.37 | 76.84 | 53.47 | 58.04 | 54.95 | 52.44 | 74.41 | 66.35 | 50.92 | 49.75 | 39.85 | 49.64 | 75.78 | 59.03 | 55.76 | 51.92 | 68.58 | 58.77 | 56.95 | 51.31 | |
Yamashita_GU_task2_3 | YamashitaGU2022 | 39 | 62.387 | 73.21 | 92.04 | 81.00 | 65.86 | 47.79 | 83.10 | 51.89 | 74.82 | 63.84 | 64.54 | 47.47 | 76.84 | 53.47 | 95.52 | 84.26 | 58.70 | 74.41 | 66.35 | 54.01 | 49.47 | 42.92 | 49.49 | 75.78 | 59.03 | 62.67 | 53.67 | 68.58 | 58.77 | 84.44 | 71.18 | |
Yamashita_GU_task2_4 | YamashitaGU2022 | 48 | 60.209 | 73.22 | 89.46 | 78.74 | 66.18 | 47.89 | 83.00 | 51.84 | 74.82 | 63.84 | 64.06 | 47.37 | 78.88 | 53.47 | 95.52 | 84.26 | 53.57 | 65.30 | 60.22 | 53.72 | 49.56 | 35.25 | 49.30 | 75.78 | 59.03 | 61.83 | 53.36 | 70.68 | 61.76 | 84.44 | 71.18 | |
CHO_SG_task2_1 | CHOSG2022 | 67 | 54.469 | 54.69 | 65.78 | 57.95 | 51.42 | 51.26 | 57.86 | 50.26 | 62.26 | 50.63 | 54.34 | 54.84 | 46.18 | 49.68 | 34.82 | 49.32 | 55.93 | 73.73 | 56.30 | 55.46 | 51.70 | 48.17 | 50.89 | 63.84 | 56.48 | 52.32 | 50.61 | 48.02 | 50.96 | 60.29 | 55.57 | |
CHO_SG_task2_2 | CHOSG2022 | 81 | 49.288 | 47.39 | 36.10 | 49.53 | 46.46 | 49.95 | 38.80 | 49.16 | 54.80 | 51.89 | 45.10 | 48.53 | 54.34 | 52.37 | 47.66 | 49.47 | 49.91 | 53.26 | 51.42 | 50.04 | 50.64 | 48.20 | 51.32 | 49.00 | 51.97 | 51.42 | 50.66 | 43.95 | 49.90 | 50.75 | 49.82 | |
Li_JAIST_task2 | LiJAIST2022 | 83 | 48.939 | 47.60 | 69.62 | 57.16 | 50.66 | 52.26 | 69.06 | 49.63 | 45.36 | 47.63 | 63.54 | 51.89 | 28.64 | 47.37 | 91.76 | 89.21 | 46.65 | 56.81 | 56.35 | 47.23 | 50.12 | 42.20 | 50.18 | 44.19 | 48.71 | 41.04 | 50.03 | 33.28 | 48.10 | 81.19 | 73.56 | |
Gou_UESTC_task2_1 | GouUESTC2022 | 56 | 58.723 | 64.02 | 59.78 | 53.84 | 56.22 | 53.05 | 79.76 | 53.42 | 79.74 | 58.42 | 75.88 | 51.11 | 93.48 | 58.79 | 64.92 | 55.47 | 57.11 | 71.05 | 68.33 | 50.33 | 50.02 | 48.22 | 51.06 | 78.30 | 67.54 | 52.77 | 50.78 | 60.60 | 57.83 | 60.73 | 51.92 | |
Gou_UESTC_task2_2 | GouUESTC2022 | 58 | 58.434 | 61.07 | 50.28 | 48.74 | 51.40 | 49.32 | 70.84 | 52.47 | 71.10 | 62.32 | 73.56 | 55.68 | 86.42 | 59.79 | 79.64 | 68.16 | 57.98 | 64.23 | 60.85 | 47.56 | 49.96 | 46.33 | 52.74 | 71.46 | 62.26 | 54.72 | 51.20 | 57.37 | 55.02 | 81.41 | 68.49 | |
Gou_UESTC_task2_3 | GouUESTC2022 | 60 | 58.093 | 61.64 | 74.34 | 48.63 | 46.98 | 49.53 | 76.68 | 51.89 | 65.44 | 63.95 | 73.64 | 51.68 | 87.96 | 59.21 | 50.40 | 48.68 | 56.55 | 70.23 | 66.90 | 50.94 | 51.57 | 39.79 | 48.94 | 76.81 | 67.51 | 64.47 | 56.57 | 63.64 | 58.45 | 56.83 | 55.23 | |
Gou_UESTC_task2_4 | GouUESTC2022 | 51 | 59.169 | 65.32 | 59.78 | 53.84 | 56.22 | 53.05 | 76.68 | 51.89 | 79.74 | 58.42 | 44.14 | 48.47 | 76.72 | 52.89 | 79.64 | 68.16 | 56.06 | 71.05 | 68.33 | 50.33 | 50.02 | 39.79 | 48.94 | 78.30 | 67.54 | 61.19 | 53.14 | 58.57 | 53.83 | 81.41 | 68.49 | |
PENG_NJUPT_task2_1 | PENGNJUPT2022 | 57 | 58.473 | 66.63 | 49.90 | 51.42 | 41.24 | 48.00 | 77.20 | 52.63 | 85.36 | 51.63 | 72.08 | 48.95 | 94.44 | 57.68 | 85.62 | 86.21 | 54.47 | 59.04 | 58.73 | 50.29 | 51.68 | 47.80 | 50.87 | 55.18 | 51.03 | 60.85 | 55.92 | 72.58 | 58.68 | 76.67 | 68.80 | |
PENG_NJUPT_task2_2 | PENGNJUPT2022 | 62 | 57.379 | 64.33 | 47.50 | 48.21 | 39.04 | 48.26 | 80.66 | 52.00 | 89.26 | 58.05 | 62.70 | 50.42 | 98.50 | 56.32 | 92.66 | 91.42 | 53.53 | 54.55 | 58.16 | 55.53 | 52.31 | 46.50 | 50.90 | 52.83 | 49.96 | 65.74 | 52.44 | 73.46 | 61.98 | 59.86 | 65.62 | |
Nejjar_ETH_task2_1 | NejjarETH2022 | 32 | 63.346 | 76.52 | 91.24 | 78.37 | 51.64 | 51.21 | 87.80 | 55.89 | 73.74 | 57.37 | 74.62 | 57.68 | 98.50 | 54.32 | 76.64 | 57.79 | 58.08 | 89.31 | 79.95 | 64.20 | 54.87 | 44.92 | 49.84 | 71.32 | 60.03 | 67.91 | 58.82 | 55.80 | 52.42 | 76.21 | 63.47 | |
Nejjar_ETH_task2_2 | NejjarETH2022 | 54 | 58.850 | 60.87 | 91.24 | 78.37 | 51.64 | 51.21 | 12.20 | 48.89 | 73.74 | 57.37 | 74.62 | 57.68 | 98.50 | 54.32 | 76.64 | 57.79 | 57.63 | 89.31 | 79.95 | 64.20 | 54.87 | 43.80 | 48.60 | 71.32 | 60.03 | 67.91 | 58.82 | 55.80 | 52.42 | 76.21 | 63.47 | |
Verbitskiy_DS_task2_1 | VerbitskiyDS2022 | 27 | 63.831 | 74.27 | 72.54 | 53.63 | 48.32 | 47.89 | 88.80 | 54.05 | 85.18 | 67.89 | 69.80 | 60.16 | 95.26 | 55.11 | 98.58 | 94.42 | 58.57 | 71.44 | 68.38 | 53.01 | 51.68 | 40.77 | 50.37 | 86.71 | 76.45 | 64.63 | 56.49 | 69.23 | 57.79 | 91.49 | 78.54 | |
Verbitskiy_DS_task2_2 | VerbitskiyDS2022 | 40 | 62.382 | 75.58 | 72.54 | 53.63 | 41.62 | 50.05 | 88.80 | 54.05 | 89.32 | 63.58 | 69.80 | 60.16 | 94.18 | 54.47 | 98.58 | 94.42 | 55.63 | 71.44 | 68.38 | 54.66 | 51.18 | 40.77 | 50.37 | 77.04 | 63.37 | 64.63 | 56.49 | 63.34 | 55.80 | 91.49 | 78.54 | |
Verbitskiy_DS_task2_3 | VerbitskiyDS2022 | 34 | 62.903 | 74.41 | 70.38 | 48.63 | 46.94 | 48.63 | 80.30 | 52.95 | 89.32 | 63.58 | 69.80 | 60.16 | 94.18 | 54.47 | 98.22 | 94.21 | 57.24 | 78.39 | 69.04 | 54.97 | 53.51 | 39.96 | 50.07 | 77.04 | 63.37 | 64.63 | 56.49 | 63.34 | 55.80 | 93.05 | 82.18 | |
Verbitskiy_DS_task2_4 | VerbitskiyDS2022 | 31 | 63.393 | 75.74 | 72.32 | 48.63 | 42.28 | 49.74 | 80.78 | 53.32 | 89.32 | 63.26 | 70.54 | 61.68 | 94.18 | 54.47 | 98.08 | 94.11 | 57.49 | 79.40 | 69.67 | 57.84 | 52.78 | 39.89 | 50.11 | 77.03 | 63.27 | 67.41 | 56.99 | 63.34 | 55.80 | 93.00 | 84.19 | |
Cohen_Technion_task2_1 | CohenTechnion2022 | 64 | 55.640 | 69.03 | 81.38 | 60.21 | 50.48 | 48.26 | 67.22 | 49.63 | 78.92 | 52.53 | 74.37 | 49.52 | 84.96 | 49.37 | 74.00 | 56.95 | 49.05 | 70.10 | 58.63 | 53.12 | 51.74 | 41.67 | 49.53 | 60.03 | 49.82 | 55.82 | 57.37 | 50.75 | 49.33 | 64.82 | 53.45 | |
Cohen_Technion_task2_2 | CohenTechnion2022 | 63 | 56.763 | 66.63 | 89.24 | 68.37 | 45.92 | 49.63 | 80.92 | 52.84 | 67.60 | 52.21 | 64.02 | 53.95 | 86.58 | 55.84 | 58.90 | 55.11 | 51.69 | 73.37 | 64.92 | 51.53 | 50.10 | 39.78 | 49.64 | 67.29 | 51.79 | 64.48 | 59.98 | 60.59 | 53.11 | 54.38 | 51.13 | |
Deng_THU_task2_1 | DengTHU2022 | 9 | 67.527 | 75.30 | 77.08 | 57.84 | 59.64 | 52.95 | 79.24 | 64.11 | 98.18 | 52.42 | 66.62 | 53.21 | 96.56 | 51.32 | 97.66 | 92.95 | 64.63 | 80.03 | 69.77 | 63.60 | 56.06 | 40.33 | 51.61 | 85.96 | 74.89 | 66.52 | 59.29 | 79.31 | 70.61 | 92.22 | 85.08 | |
Deng_THU_task2_2 | DengTHU2022 | 18 | 66.533 | 77.65 | 77.62 | 58.42 | 56.98 | 52.74 | 84.02 | 59.58 | 98.18 | 52.42 | 72.28 | 54.68 | 97.28 | 51.42 | 98.84 | 92.74 | 60.74 | 80.69 | 70.73 | 63.03 | 55.68 | 36.97 | 50.92 | 85.96 | 74.89 | 65.25 | 59.73 | 78.24 | 68.43 | 92.23 | 85.15 | |
Deng_THU_task2_3 | DengTHU2022 | 13 | 67.134 | 74.45 | 76.22 | 57.26 | 53.26 | 52.95 | 77.10 | 61.63 | 98.34 | 52.47 | 67.52 | 53.53 | 96.28 | 51.32 | 97.66 | 92.95 | 64.28 | 78.78 | 69.46 | 60.92 | 55.14 | 40.41 | 51.50 | 87.32 | 76.61 | 66.91 | 59.29 | 79.62 | 70.74 | 92.22 | 85.08 | |
Deng_THU_task2_4 | DengTHU2022 | 7 | 67.624 | 77.16 | 81.74 | 59.00 | 47.74 | 51.21 | 86.20 | 51.47 | 95.14 | 57.32 | 72.20 | 52.00 | 95.96 | 51.26 | 96.86 | 92.58 | 64.52 | 89.96 | 72.81 | 63.25 | 54.23 | 40.56 | 49.73 | 83.61 | 68.59 | 71.12 | 62.32 | 79.17 | 71.43 | 88.37 | 82.59 | |
Liu_BUPT_task2_1 | LiuBUPT2022 | 55 | 58.807 | 72.12 | 50.12 | 51.21 | 46.58 | 48.42 | 78.46 | 53.37 | 92.94 | 48.11 | 83.68 | 52.53 | 98.44 | 58.42 | 90.92 | 92.32 | 51.98 | 57.89 | 52.46 | 45.58 | 50.07 | 42.17 | 49.55 | 66.36 | 51.65 | 67.75 | 58.33 | 67.64 | 60.11 | 84.07 | 78.48 | |
Liu_BUPT_task2_2 | LiuBUPT2022 | 49 | 59.768 | 74.74 | 81.20 | 68.37 | 52.68 | 49.63 | 77.84 | 53.05 | 92.94 | 48.11 | 83.68 | 52.53 | 98.44 | 58.42 | 90.92 | 92.32 | 51.93 | 50.43 | 56.06 | 55.91 | 51.61 | 39.61 | 49.73 | 66.36 | 51.65 | 67.75 | 58.33 | 67.64 | 60.11 | 84.07 | 78.48 | |
Kazakova_ITMO_task2_1 | KazakovaITMO2022 | 80 | 49.896 | 47.71 | 61.86 | 54.79 | 57.76 | 50.95 | 35.72 | 48.89 | 60.82 | 50.00 | 43.08 | 51.95 | 62.30 | 51.58 | 65.88 | 55.74 | 50.64 | 57.69 | 54.94 | 50.12 | 50.63 | 52.66 | 50.79 | 51.33 | 49.47 | 38.92 | 49.50 | 37.07 | 49.64 | 62.23 | 55.73 | |
Kazakova_ITMO_task2_2 | KazakovaITMO2022 | 86 | 37.563 | 32.78 | 55.58 | 48.21 | 59.02 | 52.47 | 24.84 | 48.58 | 40.30 | 47.68 | 46.78 | 52.47 | 26.02 | 49.21 | 7.18 | 47.58 | 33.92 | 28.82 | 47.56 | 48.31 | 50.34 | 50.96 | 50.47 | 56.50 | 52.28 | 48.82 | 52.28 | 53.84 | 53.03 | 16.57 | 47.97 | |
Kazakova_ITMO_task2_3 | KazakovaITMO2022 | 85 | 39.579 | 33.94 | 61.86 | 54.79 | 57.76 | 50.95 | 35.72 | 48.89 | 40.30 | 47.68 | 43.08 | 51.95 | 26.02 | 49.21 | 7.18 | 47.58 | 37.40 | 57.69 | 54.94 | 50.12 | 50.63 | 52.66 | 50.79 | 56.50 | 52.28 | 38.92 | 49.50 | 53.84 | 53.03 | 16.57 | 47.97 | |
Kodua_ITMO_task2_1 | KoduaITMO2022 | 74 | 52.963 | 61.04 | 59.42 | 52.00 | 48.78 | 48.00 | 74.18 | 49.47 | 71.56 | 54.63 | 71.68 | 62.32 | 85.06 | 48.84 | 67.00 | 53.16 | 48.03 | 52.64 | 51.28 | 48.85 | 49.57 | 37.72 | 48.82 | 57.17 | 51.10 | 54.18 | 53.17 | 60.10 | 53.77 | 60.20 | 51.65 | |
Liu_CQUPT_task2_1 | LiuCQUPT2022 | 2 | 70.167 | 81.06 | 88.18 | 80.05 | 61.76 | 57.95 | 90.00 | 65.05 | 96.68 | 68.26 | 71.90 | 54.21 | 95.36 | 68.21 | 94.38 | 82.26 | 67.55 | 88.51 | 82.19 | 72.50 | 61.82 | 50.05 | 54.04 | 84.19 | 63.47 | 68.27 | 54.50 | 72.37 | 63.69 | 84.85 | 74.11 | |
Liu_CQUPT_task2_2 | LiuCQUPT2022 | 4 | 69.704 | 80.71 | 88.18 | 80.05 | 61.76 | 57.95 | 90.00 | 65.05 | 96.68 | 68.26 | 71.90 | 54.21 | 96.92 | 66.47 | 95.40 | 75.74 | 67.45 | 88.51 | 82.19 | 72.50 | 61.82 | 50.05 | 54.04 | 84.19 | 63.47 | 68.27 | 54.50 | 71.45 | 63.07 | 83.51 | 67.50 | |
Liu_CQUPT_task2_3 | LiuCQUPT2022 | 3 | 69.787 | 79.83 | 88.18 | 80.05 | 61.76 | 57.95 | 90.00 | 65.05 | 96.68 | 68.26 | 71.90 | 54.21 | 95.16 | 68.37 | 87.60 | 80.05 | 67.22 | 88.51 | 82.19 | 72.50 | 61.82 | 50.05 | 54.04 | 84.19 | 63.47 | 68.27 | 54.50 | 75.57 | 66.00 | 75.22 | 72.71 | |
Liu_CQUPT_task2_4 | LiuCQUPT2022 | 1 | 70.975 | 77.34 | 88.18 | 80.05 | 61.76 | 57.95 | 74.35 | 72.26 | 96.68 | 68.26 | 71.90 | 54.21 | 95.16 | 68.37 | 93.80 | 81.68 | 72.12 | 88.51 | 82.19 | 72.50 | 61.82 | 54.84 | 55.06 | 84.19 | 63.47 | 68.27 | 54.50 | 75.57 | 66.00 | 82.13 | 74.04 | |
Siang_NTHU_task2_1 | SiangNTHU2022 | 79 | 50.591 | 51.47 | 63.22 | 55.79 | 50.38 | 50.84 | 49.12 | 52.21 | 57.00 | 49.21 | 49.62 | 51.84 | 52.56 | 50.61 | 81.94 | 62.95 | 48.58 | 56.88 | 51.82 | 43.27 | 49.31 | 45.80 | 50.08 | 52.45 | 50.46 | 36.59 | 50.73 | 59.85 | 52.26 | 57.31 | 57.59 | |
Tozicka_NSW_task2_1 | TozickaNSW2022 | 38 | 62.500 | 71.02 | 90.20 | 81.74 | 61.58 | 50.89 | 86.34 | 62.42 | 91.82 | 81.47 | 59.08 | 50.92 | 83.04 | 49.66 | 86.72 | 86.74 | 58.34 | 87.90 | 73.72 | 57.85 | 51.03 | 38.83 | 50.48 | 86.36 | 70.56 | 60.61 | 56.09 | 59.71 | 52.69 | 72.33 | 68.01 | |
Tozicka_NSW_task2_2 | TozickaNSW2022 | 41 | 62.322 | 70.96 | 90.76 | 79.84 | 60.22 | 50.21 | 89.86 | 65.16 | 84.96 | 73.89 | 58.86 | 51.56 | 83.82 | 48.42 | 86.72 | 86.74 | 58.17 | 90.04 | 72.60 | 55.10 | 50.16 | 39.48 | 50.99 | 83.36 | 71.89 | 65.22 | 56.95 | 57.28 | 51.32 | 72.33 | 68.01 | |
Tozicka_NSW_task2_3 | TozickaNSW2022 | 46 | 60.462 | 69.73 | 89.28 | 77.37 | 56.54 | 49.79 | 88.06 | 56.26 | 84.94 | 78.89 | 60.06 | 52.92 | 81.74 | 49.32 | 86.72 | 86.74 | 54.65 | 75.83 | 73.08 | 52.32 | 49.98 | 38.87 | 50.32 | 83.35 | 66.73 | 60.01 | 57.55 | 55.29 | 52.21 | 72.33 | 68.01 | |
Tozicka_NSW_task2_4 | TozickaNSW2022 | 43 | 62.197 | 72.55 | 90.76 | 79.84 | 61.58 | 50.89 | 88.06 | 56.26 | 91.82 | 81.47 | 59.08 | 50.92 | 83.82 | 48.42 | 86.72 | 86.74 | 57.09 | 90.04 | 72.60 | 57.85 | 51.03 | 38.87 | 50.32 | 86.36 | 70.56 | 60.61 | 56.09 | 57.28 | 51.32 | 72.33 | 68.01 | |
Almudevar_UZ_task2_1 | AlmudevarUZ2022 | 50 | 59.377 | 71.63 | 92.68 | 70.47 | 53.64 | 47.63 | 89.36 | 50.32 | 83.72 | 47.89 | 78.62 | 55.63 | 94.92 | 48.05 | 73.62 | 61.05 | 55.41 | 81.02 | 68.07 | 51.19 | 49.90 | 43.76 | 49.16 | 71.58 | 53.34 | 66.29 | 54.87 | 59.17 | 54.56 | 63.02 | 52.78 | |
Almudevar_UZ_task2_2 | AlmudevarUZ2022 | 45 | 60.605 | 72.09 | 93.90 | 79.68 | 52.52 | 49.74 | 91.00 | 60.89 | 87.82 | 49.05 | 73.42 | 51.21 | 95.06 | 49.79 | 79.98 | 62.74 | 55.86 | 88.76 | 76.56 | 46.42 | 49.85 | 43.76 | 51.06 | 73.36 | 57.92 | 63.36 | 55.04 | 64.62 | 57.16 | 65.93 | 54.79 | |
Almudevar_UZ_task2_3 | AlmudevarUZ2022 | 47 | 60.219 | 73.43 | 93.90 | 79.68 | 52.84 | 50.47 | 90.36 | 58.74 | 82.22 | 47.58 | 78.62 | 55.63 | 94.46 | 49.05 | 68.60 | 55.63 | 54.99 | 88.76 | 76.56 | 46.10 | 49.84 | 44.61 | 50.40 | 68.96 | 53.00 | 66.29 | 54.87 | 61.50 | 55.88 | 70.34 | 55.49 | |
Jalalia_AIT_task2_1 | JalaliaAIT2022 | 77 | 51.339 | 61.03 | 79.60 | 67.00 | 48.52 | 49.68 | 84.08 | 52.89 | 59.10 | 54.47 | 62.92 | 47.74 | 75.94 | 52.53 | 52.02 | 52.89 | 43.61 | 47.95 | 53.76 | 44.09 | 49.52 | 37.25 | 49.70 | 58.03 | 51.17 | 59.60 | 57.39 | 55.95 | 53.26 | 48.06 | 50.76 | |
Zorin_AIRI_task2_1 | ZorinAIRI2022 | 84 | 43.169 | 36.40 | 34.54 | 47.95 | 50.80 | 49.79 | 20.68 | 48.47 | 25.56 | 47.58 | 33.32 | 50.00 | 45.60 | 48.68 | 42.58 | 49.68 | 45.84 | 37.74 | 50.01 | 44.02 | 49.73 | 55.28 | 50.12 | 46.60 | 49.89 | 39.61 | 50.16 | 40.00 | 48.68 | 38.93 | 48.77 | |
Venkatesh_MERL_task2_1 | VenkateshMERL2022 | 23 | 65.660 | 78.99 | 87.40 | 74.21 | 63.48 | 50.63 | 72.96 | 52.42 | 88.16 | 55.74 | 67.50 | 52.58 | 99.88 | 56.58 | 98.08 | 73.95 | 59.32 | 95.30 | 79.63 | 54.17 | 55.13 | 41.28 | 50.53 | 86.13 | 71.84 | 70.45 | 63.84 | 69.96 | 61.60 | 75.50 | 64.70 | |
Venkatesh_MERL_task2_2 | VenkateshMERL2022 | 24 | 65.569 | 78.79 | 87.40 | 74.21 | 62.34 | 49.95 | 74.08 | 52.58 | 77.28 | 67.47 | 67.50 | 52.58 | 99.88 | 54.00 | 97.40 | 71.47 | 58.83 | 95.30 | 79.63 | 53.65 | 55.16 | 40.99 | 50.66 | 83.48 | 71.45 | 70.45 | 63.84 | 72.48 | 64.41 | 74.66 | 64.29 | |
Venkatesh_MERL_task2_3 | VenkateshMERL2022 | 8 | 67.565 | 80.30 | 87.40 | 74.21 | 66.68 | 51.26 | 73.38 | 51.32 | 80.94 | 68.74 | 75.50 | 48.95 | 99.88 | 56.58 | 94.90 | 59.95 | 62.69 | 95.30 | 79.63 | 56.79 | 55.48 | 45.07 | 50.15 | 88.02 | 81.98 | 71.97 | 65.30 | 69.96 | 61.60 | 81.80 | 63.56 | |
Venkatesh_MERL_task2_4 | VenkateshMERL2022 | 10 | 67.494 | 79.90 | 83.04 | 64.21 | 66.46 | 50.84 | 71.98 | 51.68 | 82.52 | 67.05 | 75.10 | 52.58 | 99.88 | 62.53 | 93.60 | 64.26 | 62.10 | 95.67 | 78.22 | 55.76 | 55.83 | 43.88 | 50.07 | 87.15 | 81.24 | 71.36 | 67.24 | 72.46 | 64.66 | 81.22 | 63.96 |
System characteristics
Summary of the submitted system characteristics.
Rank |
Submission Code |
Technical Report |
Classifier |
System Complexity (parameters) |
Acoustic Feature |
Data Augmentation |
Decision Making |
System Embeddings |
Subsystem Count |
External Data Usage |
Front End System |
---|---|---|---|---|---|---|---|---|---|---|---|
75 | DCASE2022_baseline_task2_AE | DCASE2022baseline2022 | AE | 269992 | log-mel energies | ||||||
68 | DCASE2022_baseline_task2_MNV2 | DCASE2022baseline2022 | MNV2 | 269992 | log-mel energies | ||||||
26 | Bai_JLESS_task2_1 | BaiJLESS2022 | CNN, Transformer, ensemble, LOF | 567110 | spectrogram, log-mel energies | fmix, mixup | vote | 3 | |||
44 | Bai_JLESS_task2_2 | BaiJLESS2022 | CNN, ensemble, LOF | 995590 | log-mel energies | fmix, mixup | vote | 3 | |||
42 | Bai_JLESS_task2_3 | BaiJLESS2022 | CNN, ensemble, LOF | 567110 | spectrogram | fmix, mixup | vote | 3 | |||
20 | Kuroyanagi_NU-HDL_task2_1 | KuroyanagiNU-HDL2022 | GMM, LOF, KNN, CNN, Transformer, Conformer | 48789168 | spectrogram | mixup, gaussian noise | average | PyTorch Image Models | 20 | pre-trained model, AudioSet |
5 | Kuroyanagi_NU-HDL_task2_2 | KuroyanagiNU-HDL2022 | GMM, LOF, KNN, CNN, Transformer, Conformer | 48789168 | spectrogram | mixup, gaussian noise | average | PyTorch Image Models | 20 | pre-trained model, AudioSet |
52 | Kuroyanagi_NU-HDL_task2_3 | KuroyanagiNU-HDL2022 | GMM, LOF, KNN, CNN, Transformer, Conformer | 48789168 | spectrogram | mixup, gaussian noise | average | PyTorch Image Models | 20 | pre-trained model, AudioSet |
12 | Kuroyanagi_NU-HDL_task2_4 | KuroyanagiNU-HDL2022 | GMM, LOF, KNN, CNN, Transformer, Conformer | 48789168 | spectrogram | mixup, gaussian noise | average | PyTorch Image Models | 20 | pre-trained model, AudioSet |
82 | LEE_KNU_task2_1 | LEEKNU2022 | Contrastive learning, k-NN | 11496000 | log-mel energies | Harmonics modification, Temporal masking, F0 masking, | |||||
70 | Narita_AIT_task2_1 | NaritaAIT2022 | EfficientNet-B1, Mahalanobis | 7794184 | log-mel energies | Mixup, Time Masking, Frequency Masking | PyTorch Image Models (EfficientNet-B1) | pre-trained model | |||
69 | Narita_AIT_task2_2 | NaritaAIT2022 | EfficientNet-B1, Mahalanobis | 7794184 | log-mel energies | Mixup, Frequency Masking | PyTorch Image Models (EfficientNet-B1) | pre-trained model | |||
76 | Narita_AIT_task2_3 | NaritaAIT2022 | EfficientNet-B1, Mahalanobis | 7794184 | log-mel energies | Mixup, Time Masking, Frequency Masking, SevenBandParametricEQ | PyTorch Image Models (EfficientNet-B1) | pre-trained model | |||
72 | Narita_AIT_task2_4 | NaritaAIT2022 | EfficientNet-B1, Mahalanobis | 23382552 | log-mel energies | Mixup, Time Masking, Frequency Masking, SevenBandParametricEQ | PyTorch Image Models (EfficientNet-B1) | 3 | pre-trained model | ||
33 | Du_NERCSLIP_task2_1 | DuNERCSLIP2022 | AE, Section ID classification, binary classification, cosine distance, ensemble | 182M | log-mel energies, STFT, raw waveform | mixup, TFmask, volume perturbation | weighted average | SSAST trained on AudioSet data | 4 | simulation of anomalous samples, pre-trained model |
37 | Du_NERCSLIP_task2_2 | DuNERCSLIP2022 | AE, Section ID classification, binary classification, cosine distance, ensemble | 183M | log-mel energies, STFT, raw waveform | mixup, TFmask, volume perturbation | weighted average | SSAST trained on AudioSet data | 4 | simulation of anomalous samples, pre-trained model |
35 | Du_NERCSLIP_task2_3 | DuNERCSLIP2022 | AE, Section ID classification, binary classification, cosine distance, ensemble | 184M | log-mel energies, STFT, raw waveform | mixup, TFmask, volume perturbation | weighted average | SSAST trained on AudioSet data | 4 | simulation of anomalous samples, pre-trained model |
36 | Du_NERCSLIP_task2_4 | DuNERCSLIP2022 | AE, Section ID classification, binary classification, cosine distance, ensemble | 185M | log-mel energies, STFT, raw waveform | mixup, TFmask, volume perturbation | weighted average | SSAST trained on AudioSet data | 4 | simulation of anomalous samples, pre-trained model |
78 | Jinhyuk_SNU_task2_1 | JinhyukSNU2022 | AE | 585648 | log-mel energies | ||||||
65 | Hu_NJU_task2_1 | HuNJU2022 | VAE, CNN, ensemble, KNN | 2489877 | log-mel energies, spectrogram, raw waveform | mixup | weighted average | 2 | |||
28 | Wilkinghoff_FKIE_task2_1 | WilkinghoffFKIE2022 | CNN, GMM, ensemble | 977811200 | log-mel energies, magnitude spectrum | mixup | sum | 40 | |||
30 | Wilkinghoff_FKIE_task2_2 | WilkinghoffFKIE2022 | CNN, GMM, ensemble | 977811200 | log-mel energies, magnitude spectrum | mixup | sum | 40 | |||
29 | Wei_HEU_task2_1 | WeiHEU2022 | CNN, ArcFace | 1347392 | log-mel spectrogram, raw waveform | ||||||
14 | Wei_HEU_task2_2 | WeiHEU2022 | clustering, ensemble | 0 | log-mel spectrogram | average | 2 | ||||
17 | Wei_HEU_task2_3 | WeiHEU2022 | CNN, ArcFace, clustering, ensemble | 1347392 | log-mel spectrogram, raw waveform | average | 2 | ||||
22 | Guan_HEU_task2_1 | GuanHEU2022 | GMM | 33024 | log-mel spectrogram | ||||||
19 | Guan_HEU_task2_2 | GuanHEU2022 | GMM | 33024 | log-mel spectrogram | SMOTE | |||||
11 | Guan_HEU_task2_3 | GuanHEU2022 | GMM, CNN, ensemble | 4204083 | log-mel spectrogram, raw waveform | weighted | 4 | ||||
6 | Guan_HEU_task2_4 | GuanHEU2022 | GMM, CNN, ensemble | 4204083 | log-mel spectrogram, raw waveform | SMOTE | weighted | 4 | |||
59 | Li_CTRI_task2_1 | LiCTRI2022 | AE, K-NN | 74779233 | log-mel energies | maximum | ResNet38, MobileNetV2 | 3 | pre-trained model | ||
15 | Morita_SECOM_task2_1 | MoritaSECOM2022 | CNN, k-NN | 994048 | spectrogram | ||||||
25 | Morita_SECOM_task2_2 | MoritaSECOM2022 | CNN, k-NN | 994048 | PCEN | ||||||
21 | Morita_SECOM_task2_3 | MoritaSECOM2022 | CNN, k-NN | 994048 | spectrogram, PCEN, HPSS | ||||||
16 | Morita_SECOM_task2_4 | MoritaSECOM2022 | CNN, k-NN, LOF, GMM | 994048 | spectrogram, PCEN, HPSS | ||||||
53 | Yamashita_GU_task2_1 | YamashitaGU2022 | IDNN, CNN | 5602001 | log spectrogram | ||||||
61 | Yamashita_GU_task2_2 | YamashitaGU2022 | U-Net | 2164433 | log spectrogram | ||||||
39 | Yamashita_GU_task2_3 | YamashitaGU2022 | AE, IDNN, CNN, U-Net, ensemble | 11201923 | log spectrogram | 3 |
48 | Yamashita_GU_task2_4 | YamashitaGU2022 | AE, IDNN, CNN, U-Net, ensemble | 13930196 | log spectrogram, log-mel energies | 4 |
67 | CHO_SG_task2_1 | CHOSG2022 | CNN, ArcFace | 907520 | log-mel energies, raw waveform | mixup | |||||
81 | CHO_SG_task2_2 | CHOSG2022 | CNN, ArcFace | 907520 | log-mel energies, raw waveform | mixup | |||||
83 | Li_JAIST_task2 | LiJAIST2022 | AE | 269992 | temporal modulation features on the gammatone auditory filterbank |
56 | Gou_UESTC_task2_1 | GouUESTC2022 | CNN, LOF | 28579432 | spectrogram | ||||||
58 | Gou_UESTC_task2_2 | GouUESTC2022 | CNN, LOF | 28579432 | spectrogram | ||||||
60 | Gou_UESTC_task2_3 | GouUESTC2022 | CNN, LOF | 28579432 | spectrogram | ||||||
51 | Gou_UESTC_task2_4 | GouUESTC2022 | AE, CNN, LOF | 28847360 | spectrogram, log-mel energies | ||||||
57 | PENG_NJUPT_task2_1 | PENGNJUPT2022 | MobileNetV2 | 713910 | Fast spectral coherence | ||||||
62 | PENG_NJUPT_task2_2 | PENGNJUPT2022 | MobileNetV2 | 713910 | Fast spectral coherence, Wavelet packet energy, log-mel |
32 | Nejjar_ETH_task2_1 | NejjarETH2022 | CNN, k-NN | 1453440 | log-mel energies | mixup | |||||
54 | Nejjar_ETH_task2_2 | NejjarETH2022 | CNN, k-NN | 1453440 | log-mel energies | mixup | |||||
27 | Verbitskiy_DS_task2_1 | VerbitskiyDS2022 | CNN, ArcFace, k-NN | 6009730 | log-mel energies | temporal cropping, SpecAugment |
40 | Verbitskiy_DS_task2_2 | VerbitskiyDS2022 | CNN, ArcFace, k-NN | 6009730 | log-mel energies, MFCC, GFCC | temporal cropping, SpecAugment |
34 | Verbitskiy_DS_task2_3 | VerbitskiyDS2022 | CNN, ArcFace, k-NN, ensemble | 12019460 | log-mel energies, MFCC | temporal cropping, SpecAugment | average | 2 |
31 | Verbitskiy_DS_task2_4 | VerbitskiyDS2022 | CNN, ArcFace, k-NN, ensemble | 18029190 | log-mel energies, MFCC | temporal cropping, SpecAugment | average | 3 |
64 | Cohen_Technion_task2_1 | CohenTechnion2022 | AE, Spectral Clustering, OCSVM | 269992 | log-mel energies |
63 | Cohen_Technion_task2_2 | CohenTechnion2022 | Spectral Clustering, OCSVM | 861 | MFCC | ||||||
9 | Deng_THU_task2_1 | DengTHU2022 | ensemble | 192720000 | spectrogram, log-mel energies | average | 6 | AudioSet | |||
18 | Deng_THU_task2_2 | DengTHU2022 | ensemble | 192720000 | spectrogram, log-mel energies | average | 6 | ||||
13 | Deng_THU_task2_3 | DengTHU2022 | ensemble | 97130000 | spectrogram, log-mel energies | average | 5 | ||||
7 | Deng_THU_task2_4 | DengTHU2022 | ensemble | 1320000 | spectrogram, log-mel energies | average | 2 | ||||
55 | Liu_BUPT_task2_1 | LiuBUPT2022 | CNN | 1316042 | log-mel energies | mixup | average | ||||
49 | Liu_BUPT_task2_2 | LiuBUPT2022 | CNN, normalizing flow | 1316042 | log-mel energies | mixup | average | OpenL3 | |||
80 | Kazakova_ITMO_task2_1 | KazakovaITMO2022 | AE | 101270 | mel spectrogram | ||||||
86 | Kazakova_ITMO_task2_2 | KazakovaITMO2022 | MobileNetV2 | 590000 | STFT | TimeStretch, PitchShift | pre-trained model | ||||
85 | Kazakova_ITMO_task2_3 | KazakovaITMO2022 | MobileNetV2, AE | 691270 | STFT, mel spectrogram | TimeStretch, PitchShift | pre-trained model | ||||
74 | Kodua_ITMO_task2_1 | KoduaITMO2022 | CNN | 4796303 | raw waveform | PANNS MobileNetV1 | pre-trained model | ||||
2 | Liu_CQUPT_task2_1 | LiuCQUPT2022 | CNN | 1215430 | log-mel energies | HPSS | |||||
4 | Liu_CQUPT_task2_2 | LiuCQUPT2022 | CNN | 1215430 | log-mel energies | HPSS | |||||
3 | Liu_CQUPT_task2_3 | LiuCQUPT2022 | CNN | 1215430 | log-mel energies | HPSS | |||||
1 | Liu_CQUPT_task2_4 | LiuCQUPT2022 | CNN | 1215430 | log-mel energies | HPSS | |||||
79 | Siang_NTHU_task2_1 | SiangNTHU2022 | Time-dilated CNN, self-attention | 653860 | log-mel spectrogram | frequency masking | |||||
38 | Tozicka_NSW_task2_1 | TozickaNSW2022 | AE, energy, LOF | 7946361 | raw waveform, spectrogram | OpenL3 | 2 | pre-trained OpenL3 | |||
41 | Tozicka_NSW_task2_2 | TozickaNSW2022 | AE, energy, LOF | 8042361 | raw waveform, spectrogram | OpenL3 | 2 | pre-trained OpenL3 | |||
46 | Tozicka_NSW_task2_3 | TozickaNSW2022 | AE, energy, KNN | 7946361 | raw waveform, spectrogram | OpenL3 | 2 | pre-trained OpenL3 | |||
43 | Tozicka_NSW_task2_4 | TozickaNSW2022 | AE, energy, LOF, KNN | 24420841 | raw waveform, spectrogram | OpenL3 | 4 | pre-trained OpenL3 | |||
50 | Almudevar_UZ_task2_1 | AlmudevarUZ2022 | Transformer, ArcFace, k-NN | 21797760 | log-mel energies | mixup | |||||
45 | Almudevar_UZ_task2_2 | AlmudevarUZ2022 | Transformer, ArcFace, k-NN | 21797760 | log-mel energies | mixup | |||||
47 | Almudevar_UZ_task2_3 | AlmudevarUZ2022 | Transformer, ArcFace, k-NN | 21797760 | log-mel energies | mixup | |||||
77 | Jalalia_AIT_task2_1 | JalaliaAIT2022 | AE | 188680 | log-mel energies | ||||||
84 | Zorin_AIRI_task2_1 | ZorinAIRI2022 | LOF | log-mel energies | mixup, random crop, resize | ||||||
23 | Venkatesh_MERL_task2_1 | VenkateshMERL2022 | CNN, k-NN | 1062938 | spectrogram | ||||||
24 | Venkatesh_MERL_task2_2 | VenkateshMERL2022 | CNN, AE, k-NN | 1609938 | spectrogram | weighted average | 2 | ||||
8 | Venkatesh_MERL_task2_3 | VenkateshMERL2022 | CNN, k-NN | 1062938 | spectrogram | ||||||
10 | Venkatesh_MERL_task2_4 | VenkateshMERL2022 | CNN, k-NN | 1062938 | spectrogram |
Technical reports
Vision Transformer based embeddings extractor for Unsupervised Anomalous Sound Detection under Domain Generalization
Antonio Almudevar, Alfonso Ortega, Luis Vicente, Antonio Miguel, Eduardo Lleida
University of Zaragoza, Zaragoza, Spain
Almudevar_UZ_task2_1 Almudevar_UZ_task2_2 Almudevar_UZ_task2_3
Abstract
Anomalous sound detection (ASD) is the task of identifying if a sound is normal or anomalous with respect to a given reference. In most scenarios, we have a large amount of normal data to design our model, but little or no anomalous data. When this situation occurs, the problem can be approached in an unsupervised manner, i.e., only normal data is used for design. In this report we present a solution for the DCASE2022 task 2 (Unsupervised Anomalous Sound Detection for Machine Condition Monitoring Applying Domain Generalization Techniques), which aims to address the ASD problem under domain generalization. This means that the data to develop the system belongs to the source domain, while the test data can belong to this domain or to a different one (target domain). The presented solution proposes an embeddings extractor based on a Vision Transformer (ViT) and makes use of the k-Nearest-Neighbor (k-NN) algorithm to obtain the anomaly score.
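As a brief illustration of the k-NN scoring stage described in this abstract, the sketch below computes an anomaly score as the mean distance to the k nearest normal training embeddings. The ViT embedding extractor itself is not reproduced; the embedding matrices and the value of k are placeholder assumptions, not the authors' configuration.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def knn_anomaly_scores(train_emb, test_emb, k=2):
    """Mean distance to the k nearest normal training embeddings (higher = more anomalous)."""
    nn = NearestNeighbors(n_neighbors=k).fit(train_emb)
    distances, _ = nn.kneighbors(test_emb)      # shape: (n_test, k)
    return distances.mean(axis=1)

# Toy usage with random stand-in embeddings; a real system would use ViT embeddings.
rng = np.random.default_rng(0)
train_emb = rng.normal(size=(200, 128))         # embeddings of normal clips only
test_emb = rng.normal(size=(10, 128))
print(knn_anomaly_scores(train_emb, test_emb))
```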
System characteristics
Classifier | ArcFace, Transformer, k-NN |
System complexity | 21797760 parameters |
Acoustic features | log-mel energies |
Data augmentation | mixup |
JLESS SUBMISSION TO DCASE2022 TASK2: BATCH MIXING STRATEGY BASED METHOD WITH ANOMALY DETECTOR FOR ANOMALOUS SOUND DETECTION
Jisheng Bai, Yafei Jia, Siwei Huang, Mou Wang, Jianfeng Chen
Joint Laboratory of Environmental Sound Sensing, School of Marine Science and Technology, Northwestern Polytechnical University, Xi'an, China
Bai_JLESS_task2_1 Bai_JLESS_task2_2 Bai_JLESS_task2_3
Abstract
Anomaly detection has a wide range of applications, such as finding fraud cases in industry or indicating network intrusion in network security. Anomalous sound detection (ASD) for machine condition monitoring can detect anomalies in advance and prevent damage. However, the operational conditions of machines often change, leading to different acoustic characteristics between training and test data, so domain generalization techniques are required to adapt the model to different conditions. In this paper, we present a self-supervised method for ASD using a batch mixing strategy with a margin loss and an anomaly detector. The proposed batch mixing strategy randomly mixes data from the source and target domains in a mini-batch to adapt the model across domains. Moreover, we adopt a self-supervised method using machine IDs with an additive angular margin loss to extract acoustic representations. Finally, we use the acoustic representations to train anomaly detectors to detect anomalous sounds. Experimental results on the development dataset of DCASE2022 task 2 show that our method outperforms the baseline systems.
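To make the batch mixing idea concrete, the sketch below assembles each mini-batch from both the source-domain and target-domain pools so that every training step sees both domains. This is a minimal illustration under assumed pool sizes and batch composition, not the authors' training code.

```python
import numpy as np

def mixed_batch(source_x, target_x, batch_size=32, target_ratio=0.25, rng=None):
    """Draw one mini-batch that mixes source- and target-domain clips."""
    if rng is None:
        rng = np.random.default_rng()
    n_target = max(1, int(batch_size * target_ratio))    # target data is scarce, so oversample it
    n_source = batch_size - n_target
    src_idx = rng.choice(len(source_x), size=n_source, replace=False)
    tgt_idx = rng.choice(len(target_x), size=n_target, replace=True)
    batch = np.concatenate([source_x[src_idx], target_x[tgt_idx]], axis=0)
    domains = np.array([0] * n_source + [1] * n_target)  # 0 = source, 1 = target
    perm = rng.permutation(len(batch))
    return batch[perm], domains[perm]

# Toy usage with random stand-in log-mel patches.
rng = np.random.default_rng(0)
source_x = rng.normal(size=(990, 128, 64))   # many source-domain clips
target_x = rng.normal(size=(10, 128, 64))    # few target-domain clips
x, d = mixed_batch(source_x, target_x, rng=rng)
print(x.shape, int(d.sum()), "target clips in the batch")
```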
System characteristics
Classifier | CNN, LOF, Transformer, ensemble |
System complexity | 567110, 995590 parameters |
Acoustic features | log-mel energies, spectrogram |
Data augmentation | fmix, mixup |
Decision making | vote |
Subsystem count | 3 |
Self-Supervised Learning Methods using ST-gram for Anomaly Machine Sound Detection
Wonki Cho
Dept of Computer Engineering, Sogang University, Seoul, Republic of Korea
CHO_SG_task2_1 CHO_SG_task2_2
Abstract
It is difficult to apply supervised learning to anomaly detection due to the absence of abnormal data. We therefore use unsupervised anomaly detection, which assumes that most of the data are normal samples and learns without labels. In this paper, a self-supervised learning method is proposed for unsupervised anomaly detection. The network performs self-supervised classification using metadata associated with the audio files and compares the results against labeled normal and abnormal data. To better extract the characteristics of the machine sounds, we adopt ST-gram for spectral-temporal feature fusion and compare its performance with several CNN networks on DCASE 2022 Challenge task 2: Unsupervised Anomaly Sound Detection for Machine Condition Monitoring Applying Domain Generalization Techniques. As a result, our method did not perform well except on the 'Slider' machine sounds.
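To illustrate how such a self-supervised classifier can yield an anomaly score, the sketch below trains a classifier to predict each clip's metadata label (e.g., a section ID) and scores test clips by the negative log-probability of their own label. A logistic-regression classifier and random features stand in for the ST-gram network, so this is only an illustration of the scoring idea.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X_train = rng.normal(size=(300, 64))      # stand-in clip features (normal data only)
y_train = rng.integers(0, 3, size=300)    # metadata labels, e.g. section IDs 00/01/02

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

def anomaly_scores(X, y_meta):
    """Negative log-probability of the clip's own metadata label (higher = more anomalous).
    Assumes labels are integers 0..K-1 so they index predict_proba columns directly."""
    proba = clf.predict_proba(X)
    p_true = proba[np.arange(len(X)), y_meta]
    return -np.log(p_true + 1e-12)

X_test = rng.normal(size=(5, 64))
y_test_meta = rng.integers(0, 3, size=5)
print(anomaly_scores(X_test, y_test_meta))
```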
System characteristics
Classifier | ArcFace, CNN |
System complexity | 907520 parameters |
Acoustic features | log-mel energies, raw waveform |
Data augmentation | mixup |
UNSUPERVISED ANOMALOUS DETECTION BASED ON RIEMANNIAN GEOMETRY
Or Cohen, Yahav Vinokur, Asaf Arad, Dolev Vaknin, Shahaf-Yaron Peleg, Alon Amar
The Andrew and Erna Viterbi Faculty of Electrical and Computer Engineering, Technion, Israel Institute of Technology., Haifa, Israel and Acoustics Research Center, Haifa, Israel
Cohen_Technion_task2_1 Cohen_Technion_task2_2
Abstract
This technical report presents our proposed algorithms for task 2 of the DCASE2022 challenge, which is unsupervised anomalous sound detection for machine condition monitoring by applying domain generalization techniques. We suggest two methods for feature extraction. The first method is based on extracting features using the latent space of an autoencoder, and the second method is based on using the Mel-frequency cepstral coefficients (MFCC) to represent the signal. We represent the features using symmetric positive-definite (SPD) matrices. As there may be a domain shift between the training data and the test data, we first perform spectral clustering given the Riemannian distances between the SPD matrices. A one-class SVM is then trained on each of the cluster centers and is used to detect the anomalies in the data.
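As a sketch of the SPD representation and the Riemannian distance it relies on, the snippet below builds a covariance (SPD) matrix from the MFCC frames of a clip and computes the affine-invariant Riemannian distance between two such matrices. The MFCC matrices are random stand-ins and the regularization constant is an assumption; the clustering and one-class SVM stages are omitted.

```python
import numpy as np
from scipy.linalg import sqrtm, logm

def spd_from_frames(mfcc_frames, eps=1e-6):
    """Covariance matrix of MFCC frames (n_frames x n_coeffs), regularized to stay SPD."""
    cov = np.cov(mfcc_frames, rowvar=False)
    return cov + eps * np.eye(cov.shape[0])

def riemannian_distance(A, B):
    """Affine-invariant distance: || logm(A^{-1/2} B A^{-1/2}) ||_F."""
    A_inv_sqrt = np.linalg.inv(np.real(sqrtm(A)))
    M = A_inv_sqrt @ B @ A_inv_sqrt
    return float(np.linalg.norm(np.real(logm(M)), "fro"))

# Toy usage with random stand-in MFCC frames for two clips.
rng = np.random.default_rng(0)
A = spd_from_frames(rng.normal(size=(200, 13)))
B = spd_from_frames(rng.normal(size=(200, 13)))
print(riemannian_distance(A, B))
```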
System characteristics
Classifier | AE, OCSVM, Spectral Clustering |
System complexity | 269992, 861 parameters |
Acoustic features | MFCC, log-mel energies |
Description and Discussion on DCASE 2022 Challenge Task 2: Unsupervised Anomalous Sound Detection for Machine Condition Monitoring Applying Domain Generalization Techniques
Kota Dohi, Keisuke Imoto, Noboru Harada, Daisuke Niizumi, Yuma Koizumi, Tomoya Nishida, Harsh Purohit, Takashi Endo, Masaaki Yamamoto, Yohei Kawaguchi
Research and Development Group, Hitachi, Ltd., Tokyo, Japan and Doshisha University, Kyoto, Japan and Google LLC, Tokyo, Japan and NTT Communication Science Labs, Kanagawa, Japan
DCASE2022_baseline_task2_AE DCASE2022_baseline_task2_MNV2
Abstract
We present the task description of the Detection and Classification of Acoustic Scenes and Events (DCASE) 2022 Challenge Task 2: "Unsupervised anomalous sound detection (ASD) for machine condition monitoring applying domain generalization techniques". Domain shifts are a critical problem for the application of ASD systems. Because domain shifts can change the acoustic characteristics of data, a model trained in a source domain performs poorly for a target domain. In DCASE 2021 Challenge Task 2, we organized an ASD task for handling domain shifts. In that task, it was assumed that the occurrences of domain shifts are known. However, in practice, the domain of each sample may not be given, and domain shifts can occur implicitly. In 2022 Task 2, we focus on domain generalization techniques that detect anomalies regardless of domain shifts. Specifically, the domain of each sample is not given in the test data and only one threshold is allowed for all domains.
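A minimal sketch of autoencoder-style scoring under the constraint stated above (one threshold shared by both domains): the anomaly score is the reconstruction error, and the threshold is estimated from normal training scores only. The network size, feature dimensionality, and 90th-percentile threshold are assumptions for illustration, not the official baseline configuration.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
dim = 640                                   # e.g. 5 stacked frames x 128 log-mel bins
model = nn.Sequential(nn.Linear(dim, 128), nn.ReLU(),
                      nn.Linear(128, 8), nn.ReLU(),
                      nn.Linear(8, 128), nn.ReLU(),
                      nn.Linear(128, dim))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

x_train = torch.randn(512, dim)             # stand-in feature vectors from normal clips
for _ in range(50):                         # short toy training loop
    opt.zero_grad()
    loss = ((model(x_train) - x_train) ** 2).mean()
    loss.backward()
    opt.step()

def anomaly_score(x):
    """Per-clip reconstruction error (higher = more anomalous)."""
    with torch.no_grad():
        return ((model(x) - x) ** 2).mean(dim=1)

# One shared decision threshold for source and target domains, set on normal data only.
threshold = anomaly_score(x_train).quantile(0.90)
x_test = torch.randn(4, dim)
print(anomaly_score(x_test) > threshold)
```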
System characteristics
Classifier | AE, MNV2 |
System complexity | 269992 parameters |
Acoustic features | log-mel energies |
AITHU SYSTEM FOR UNSUPERVISED ANOMALOUS DETECTION OF MACHINE WORKING STATUS VIA SOUNDING
Jia Liu, Yufeng Deng, Anbai Jiang, Yuchen Duan, Jitao Ma, Xuchu Chen, Pingyi Fan, Cheng Lu, Wei-Qiang Zhang
Department of Electronic Engineering, Tsinghua University, Beijing, China and Tsinghua University, Beijing, China
Deng_THU_task2_1 Deng_THU_task2_2 Deng_THU_task2_3 Deng_THU_task2_4
Abstract
This report describes the AITHU system for the DCASE 2022 Challenge Task 2, which aims to detect anomalous machine status via sounding by using machine learning methods, where the training dataset itself does not contain any examples of anomalies. We build six subsystems, including three self-supervised classification methods, two probabilistic methods and one generative adversarial network (GAN) based method. Our final submissions are four ensemble systems, which are different combinations of the six subsystems. The best official score of the ensemble systems reaches 86.81% on the development dataset, whereas the corresponding Autoencoder-based baseline and the MobileNetV2-based baseline achieve scores of 52.61% and 56.01%, respectively.
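As an illustration of ensembling subsystem outputs, the sketch below standardizes each subsystem's anomaly scores with statistics computed on normal training clips and then averages them. The z-score normalization and the number of subsystems are assumptions for the sketch, not the authors' exact recipe.

```python
import numpy as np

def ensemble_scores(train_scores, test_scores):
    """Average per-subsystem scores after z-scoring with normal-training statistics.

    Both arrays have shape (n_subsystems, n_clips)."""
    mu = train_scores.mean(axis=1, keepdims=True)
    sigma = train_scores.std(axis=1, keepdims=True) + 1e-12
    return ((test_scores - mu) / sigma).mean(axis=0)

# Toy usage with six stand-in subsystems.
rng = np.random.default_rng(0)
train_scores = rng.normal(size=(6, 1000))   # subsystem scores on normal training clips
test_scores = rng.normal(size=(6, 200))     # subsystem scores on test clips
print(ensemble_scores(train_scores, test_scores)[:5])
```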
System characteristics
Classifier | ensemble |
System complexity | 1320000, 192720000, 97130000 parameters |
Acoustic features | log-mel energies, spectrogram |
Decision making | average |
Subsystem count | 2, 5, 6 |
External data usage | AudioSet |
ENSEMBLE OF MULTIPLE ANOMALY DETECTORS UNDER DOMAIN GENERALIZATION CONDITIONS
Shuxian Wang, Yajian Wang, Diyuan Liu, Fan Chu, Yunqing Li, Jia Pan, Jun Du, Tian Gao, Qing Wang
Department of Electronic Engineering and Information Science, University of Science and Technology of China, Hefei, China and iFLYTEK, Hefei, China and University of Science and Technology of China, Hefei, China
Du_NERCSLIP_task2_1 Du_NERCSLIP_task2_2 Du_NERCSLIP_task2_3 Du_NERCSLIP_task2_4
Abstract
This technical report outlines our solution to DCASE 2022 Challenge Task 2, Unsupervised Anomalous Sound Detection for Machine Condition Monitoring Applying Domain Generalization Techniques. The goal is to detect recordings that contain anomalous machine sounds in the test set using only normal sound data in the training set. Our approaches are based on an ensemble of a self-supervised classifier model, an autoencoder, a binary classification model that utilizes task-irrelevant outliers as pseudo-anomalous data, and a distance-metric-based model.
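One of the components listed above, the binary classifier trained with task-irrelevant outliers as pseudo-anomalous data, can be sketched as follows. A gradient-boosting classifier and random feature vectors stand in for the real front end, so this illustrates the idea rather than the submitted model.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
normal_feats = rng.normal(loc=0.0, size=(500, 32))    # features of normal machine clips
outlier_feats = rng.normal(loc=1.5, size=(500, 32))   # task-irrelevant audio as pseudo-anomalies

X = np.vstack([normal_feats, outlier_feats])
y = np.array([0] * len(normal_feats) + [1] * len(outlier_feats))
clf = GradientBoostingClassifier().fit(X, y)

def anomaly_score(features):
    """Predicted probability of the pseudo-anomalous class (higher = more anomalous)."""
    return clf.predict_proba(features)[:, 1]

print(anomaly_score(rng.normal(loc=0.2, size=(5, 32))))
```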
System characteristics
Classifier | AE, Section ID classification, binary classification, cosine distance, ensemble |
System complexity | 182M, 183M, 184M, 185M parameters |
Acoustic features | STFT, log-mel energies, raw waveform |
Data augmentation | mixup, TFmask, volume perturbation |
Decision making | weighted average |
System embeddings | SSAST trained on AudioSet data |
Subsystem count | 4 |
External data usage | simulation of anomalous samples, pre-trained model |
UNSUPERVISED ANOMALOUS SOUND DETECTION USING FEATURE EXTRACTOR AND ANOMALY DETECTOR
Jiacheng Gou, Chuang Shi, Huiyong Li
School of Information and Communication Engineering, University of Electronic Science and Technology of China, Chengdu, China and University of Electronic Science and Technology of China, Chengdu, China
Gou_UESTC_task2_1 Gou_UESTC_task2_2 Gou_UESTC_task2_3 Gou_UESTC_task2_4
Abstract
This report proposes an anomalous sound detection method based on feature extraction and anomaly detection for DCASE 2022 Task 2. To recognize anomalous sounds when only normal sounds are available as training data, we use spectrogram clips and the corresponding section names to train a feature extractor that generates features of the normal sounds. An anomaly detector then measures the degree of anomaly between the test sound features and the normal sound features to provide the anomaly score of the test sound. For domain generalization, different shifts are selected when clipping the spectrogram for the source and target domains, and different anomaly detectors are selected depending on whether the sound belongs to the source or target domain.
System characteristics
Classifier | AE, CNN, LOF |
System complexity | 28579432, 28847360 parameters |
Acoustic features | log-mel energies, spectrogram |
The DCASE2022 Challenge Task 2 System: Anomalous Sound Detection with Self-supervised Attribute Classification and GMM-based Clustering
Feiyang Xiao, Youde Liu, Jian Guan, Yuming Wei, Qiaoxi Zhu, Tieran Zheng, Jiqing Han
College of Computer Science and Technology, Harbin Engineering University, Harbin, China and Harbin Institute of Technology, Harbin, China and Harbin Engineering University, Harbin, China and University of Technology Sydney, Ultimo, Australia
Guan_HEU_task2_1 Guan_HEU_task2_2 Guan_HEU_task2_3 Guan_HEU_task2_4
Abstract
This report describes our submission for DCASE2022 Challenge Task 2, an ensemble system for unsupervised anomalous sound detection (ASD) under domain shifts. It integrates two domain generalization methods, a self-supervised attribute classification and a GMM-based clustering, for unsupervised ASD. Experiments were conducted on the development dataset of DCASE2022 Challenge Task 2. The results show that our ensemble system achieves 88.5% in average AUC under the source domain, 78.5% in average AUC under the target domain, and 68.8% in average pAUC.
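A minimal sketch of the GMM-based scoring idea named in the abstract, not the team's code: a Gaussian mixture is fitted on features of normal clips only, and a test clip is scored by its negative log-likelihood under that mixture. The feature arrays and the number of components are illustrative assumptions.

```python
# Sketch only: GMM fitted on features of normal clips; the anomaly score of a
# test clip is its negative average log-likelihood under the mixture.
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_gmm(normal_features, n_components=2):
    # normal_features: (N, D) feature vectors from normal data only
    return GaussianMixture(n_components=n_components, covariance_type="full").fit(normal_features)

def gmm_anomaly_score(gmm, test_features):
    # score_samples returns per-frame log-likelihoods; negate so higher = more anomalous
    return float(-np.mean(gmm.score_samples(test_features)))
```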
System characteristics
Classifier | CNN, GMM, ensemble |
System complexity | 33024, 4204083 parameters |
Acoustic features | log-mel spectrogram, raw waveform |
Data augmentation | SMOTE |
Decision making | weighted |
Subsystem count | 4 |
AN ENSEMBLE METHOD FOR UNSUPERVISED ANOMALOUS SOUND DETECTION
Qinwen Hu, Kai Chen, Jing Lu
Key Laboratory of Modern Acoustics, Nanjing University, Nanjing, China and Nanjing University, Nanjing, China
Hu_NJU_task2_1
Abstract
This report describes our submitted system for DCASE2022 Challenge Task 2 (Unsupervised Anomalous Sound Detection for Machine Condition Monitoring Applying Domain Generalization Techniques) [1] in detail. The system is composed of two modules, a hierarchical recurrent variational autoencoder and a self-supervised classifier, and the final score is a weighted average over the normalized results of the two systems. All anomaly scores are calculated in the latent/embedding space.
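A minimal sketch of the score-combination step described above; the z-score normalization and the equal default weight are assumptions, since the report only states that the results of the two systems are normalized and then averaged with weights.

```python
# Sketch only: z-score normalization of each detector's score using statistics
# from normal training data, followed by a weighted average.
import numpy as np

def normalize(score, train_scores):
    mu, sigma = np.mean(train_scores), np.std(train_scores) + 1e-12
    return (score - mu) / sigma

def ensemble_score(score_a, score_b, train_scores_a, train_scores_b, weight=0.5):
    s_a = normalize(score_a, train_scores_a)
    s_b = normalize(score_b, train_scores_b)
    return weight * s_a + (1.0 - weight) * s_b
```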
System characteristics
Classifier | CNN, KNN, VAE, ensemble |
System complexity | 2489877 parameters |
Acoustic features | log-mel energies, raw waveform, spectrogram |
Data augmentation | mixup |
Decision making | weighted average |
Subsystem count | 2 |
DCASE CHALLENGE 2022, TASK 2: VARIATIONAL DENSE AUTOENCODER FOR UNSUPERVISED ANOMALOUS SOUND DETECTION OF MACHINERY
Anahid Jalali, Lam Pham, Clemens Heistracher, Denis Katic, Alexander Schindler
Data Science and Artificial Intelligence, Austrian Institute of Technology (AIT), Vienna, Austria
Jalalia_AIT_task2_1
Abstract
In this study, we present an unsupervised anomalous sound detection framework trained on the DCASE2022 audio dataset. We use a variational dense autoencoder to reconstruct the machine's healthy (normal) state and threshold the reconstruction loss to detect anomalies in an unsupervised manner. Our framework outperforms the DCASE2021 benchmarks in the target domains. The dense autoencoder has a harmonic mean AUC of 72.10% (source), 45.49% (target), and a pAUC of 54.09%. Our framework achieved a harmonic mean AUC of 68.78% and a pAUC of 53.96% over all machines. Our target-domain arithmetic average, however, reached 47.77% (baseline: 45.49%), a slight improvement over the dense autoencoder.
System characteristics
Classifier | AE |
System complexity | 188680 parameters |
Acoustic features | log-mel energies |
ENSEMBLE METHOD FOR UNSUPERVISED ANOMALOUS SOUND DETECTION
Jinhyuk Park, Sangwoong Yoon, Yonghyeon Lee, Minjun Son, Frank C. Park
Robotics Laboratory, Seoul National University, Seoul, South Korea
Jinhyuk_SNU_task2_1
Abstract
We propose an anomalous sound detection method for DCASE2022 Challenge Task 2. The method is an ensemble of multiple autoencoder-based approaches. The model reconstructs the input mel spectrogram and declares an anomaly if the reconstruction error is higher than a threshold. The area under the curve (AUC) achieved by the proposed approach is 53.35% on the source domain and 43.48% on the target domain, and the partial AUC is 48.00%.
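A minimal sketch of the scoring rule described above (reconstruction error compared against a threshold); the autoencoder is left as an abstract callable and the threshold is assumed to be tuned on normal data.

```python
# Sketch only: anomaly scoring by autoencoder reconstruction error on a
# log-mel spectrogram; `autoencoder` is any trained callable and `threshold`
# is assumed to have been tuned on normal data.
import numpy as np

def anomaly_score(autoencoder, log_mel, threshold):
    recon = autoencoder(log_mel)                    # reconstruct the (frames, mels) input
    score = float(np.mean((log_mel - recon) ** 2))  # frame-averaged reconstruction MSE
    return score, score > threshold                 # True -> flagged as anomalous
```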
System characteristics
Classifier | AE |
System complexity | 585648 parameters |
Acoustic features | log-mel energies |
ANOMALOUS SOUND DETECTION WITH AUTOENCODER AND IMAGE MOBILENET USING OUTLIER EXPOSURE APPROACH
Sophia Kazakova, Andrey Semenov, Andrey Surkov, Sergei Astapov
ITMO University, St. Petersburg, Russia
Abstract
This technical report describes the autoencoder and MobileNetV2-based approach for the DCASE 2022 Unsupervised Anomalous Sound Detection for Machine Condition Monitoring Applying Domain Generalization Techniques task [1]. Firstly, a basic Autoencoder-based architecture was developed. Then the outlier exposure approach was tested on the DCASE 2022 Challenge Task 2 Development Dataset [2]. Having proven effective, it was then used as part of an image-based MobileNetV2 system. To tackle the challenge of domain shift and to balance the dataset in terms of source/target classes, we used data augmentation with TimeStretch and PitchShift. Audio files were then transformed with the STFT and saved as images. The MobileNetV2-based architecture was fine-tuned on those spectrograms and used for anomaly detection.
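A minimal sketch of the TimeStretch and PitchShift augmentation mentioned above, written with librosa; the report may have used a different library, and the stretch rate, pitch step, and file name are illustrative assumptions.

```python
# Sketch only: waveform-level TimeStretch / PitchShift with librosa; the
# file name, stretch rate, and pitch step are illustrative assumptions.
import librosa

def augment(y, sr, rate=1.1, n_steps=2):
    stretched = librosa.effects.time_stretch(y, rate=rate)            # change speed without pitch
    shifted = librosa.effects.pitch_shift(y, sr=sr, n_steps=n_steps)  # shift pitch by semitones
    return stretched, shifted

y, sr = librosa.load("normal_clip.wav", sr=16000)  # hypothetical file name
y_stretched, y_shifted = augment(y, sr)
```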
System characteristics
Classifier | AE, MobileNetV2 |
System complexity | 101270, 590000, 691270 parameters |
Acoustic features | STFT, mel spectrogram |
Data augmentation | TimeStretch, PitchShift |
External data usage | pre-trained model |
ANOMALOUS SOUND DETECTION WITH PANNS MOBILENETV1 EMBEDDINGS
Ilya Kodua, Sophia Kazakova, Andrey Semenov
ITMO University, St. Petersburg, Russia
Abstract
This technical report describes the PANNs MobileNetV1-based approach for the DCASE 2022 Unsupervised Anomalous Sound Detection for Machine Condition Monitoring Applying Domain Generalization Techniques task [1]. The objective of this task is to determine whether the sound emitted from the target machine class is normal or anomalous while having only normal data for training purposes. We extract embeddings using an external pre-trained PANNs MobileNetV1 model [2]. For anomaly score assignment, we concatenate the obtained embeddings and then calculate the cosine distance to the first nearest neighbor in the embedding space for the sample's class and section. The GitHub code for this report is available: PANNs embedding extraction [3] and anomaly score calculation [4].
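A minimal sketch of the scoring stage described above, not the linked code: the anomaly score is the cosine distance from a test embedding to its nearest neighbor among the normal training embeddings of the same class and section.

```python
# Sketch only: cosine distance to the first nearest neighbor in the
# embedding space of normal training clips for one class/section.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def fit_reference(train_embeddings):
    # train_embeddings: (N, D) embeddings extracted from normal clips
    return NearestNeighbors(n_neighbors=1, metric="cosine").fit(train_embeddings)

def cosine_nn_score(nn, test_embedding):
    dist, _ = nn.kneighbors(test_embedding.reshape(1, -1))
    return float(dist[0, 0])  # larger distance -> more anomalous
```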
System characteristics
Classifier | CNN |
System complexity | 4796303 parameters |
Acoustic features | raw waveform |
System embeddings | PANNS MobileNetV1 |
External data usage | pre-trained model |
Two-stage anomalous sound detection systems using domain generalization and specialization techniques
Ibuki Kuroyanagi, Tomoki Hayashi, Kazuya Takeda, Tomoki Toda
Nagoya University and Human Dataware Lab. Co., Ltd., Nagoya, Japan and Human Dataware Lab. Co., Ltd. and Nagoya University, Nagoya, Japan and Nagoya University, Nagoya, Japan
Kuroyanagi_NU-HDL_task2_1 Kuroyanagi_NU-HDL_task2_2 Kuroyanagi_NU-HDL_task2_3 Kuroyanagi_NU-HDL_task2_4
Judges’ award
Abstract
This report proposes anomalous sound detection (ASD) methods using domain generalization and specialization techniques for the DCASE 2022 Challenge Task 2. We propose two-stage ASD systems consisting of an outlier exposure-based feature extractor and an inlier modeling-based anomaly detector in series. We further employ two approaches to deal with domain shift: a domain generalization approach and a domain specialization approach. Each approach improves performance significantly by adding several techniques to the two-stage ASD systems, such as generating pseudo-target-domain data with Mixup and utilizing pseudo-anomalous data from AudioSet. Our final systems are obtained by ensembling several systems with several hyperparameter settings for each approach. The proposed systems achieve 81.15% in the harmonic mean of the area under the curve (AUC) and partial AUC (p = 0.1) over all machine types, sections, and domains on the development set.
Awards: Judges’ award
System characteristics
Classifier | CNN, Conformer, GMM, KNN, LOF, Transformer |
System complexity | 48789168 parameters |
Acoustic features | spectrogram |
Data augmentation | mixup, Gaussian noise |
Decision making | average |
System embeddings | PyTorch Image Models |
Subsystem count | 20 |
External data usage | pre-trained model, AudioSet |
ANOMALOUS SOUND DETECTION USING CONTRASTIVE LEARNING
Seunghyeon Shin, Seokjin Lee
School of Electronic and Electrical Engineering, Kyungpook National University, Daegu, Republic of Korea
LEE_KNU_task2_1
Abstract
We propose an unsupervised anomalous sound detection system for DCASE 2022 Task 2. We use self-supervised contrastive learning with data augmentation as a feature extractor network, with three kinds of data augmentation methods used for contrastive learning. k-nearest neighbors is then used to compute anomaly scores from the extracted feature vectors. As a result, we show a detection performance of 88.58% in area under the curve (AUC) and 74.40% in partial AUC (pAUC) with fixed hyperparameters.
System characteristics
Classifier | Contrastive learning, k-NN |
System complexity | 11496000 parameters |
Acoustic features | log-mel energies |
Data augmentation | Harmonics modification, Temporal masking, F0 masking |
ANOMALOUS SOUND DETECTION WITH ENSEMBLE OF CNN-BASED FEATURES AND AUTOENCODER APPROACHES
Xiaoyu Li, Jie Yang, Hao Shen
Department of Big Data and Artificial Intelligence, China Telecom Corporation Research Institute, Beijing, China
Li_CTRI_task2_1
Abstract
This paper introduces a solution with an ensemble of three anomalous sound detection (ASD) methods for the DCASE2022 Challenge Task 2 [1][2][3]. The task requires detecting unknown anomalous sounds based only on normal sound data. The first ASD method uses normal audio clips of the machine and their section indices to train a Convolutional Neural Network (CNN); anomalous sounds are then detected using feature vectors extracted from the CNN. The second ASD method is an OE-based detector that uses MobileNetV2. The third ASD method is an IM-based detector that uses an autoencoder (AE). As a result, our method achieves a harmonic mean of 72.70% in area under the curve (AUC) and 60.35% in partial AUC (pAUC).
System characteristics
Classifier | AE, K-NN |
System complexity | 74779233 parameters |
Acoustic features | log-mel energies |
Decision making | maximum |
System embeddings | ResNet38, MobileNetV2 |
Subsystem count | 3 |
External data usage | pre-trained model |
Unsupervised Anomalous Sound Detection for Machine Condition Monitoring Using Temporal Modulation Features on Gammatone Auditory Filterbank
Kai Li, Quoc-Huy Nguyen, Yasuji Ota, Masashi Unoki
School of Information Science, Japan Advanced Institute of Science and Technology, 1-1 Asahidai, Nomi, Ishikawa, 923-1292, Japan and Japan Advanced Institute of Science and Technology, 1-1 Asahidai, Nomi, Ishikawa, 923-1292, Japan
Li_JAIST_task2
Abstract
Anomalous sound detection (ASD) is a task to identify whether the sound emitted from a target machine is normal or not. Subjectively, timbral attributes, such as sharpness and roughness, are crucial for human beings to distinguish anomalous and normal sounds. However, the feature frequently used in existing methods for ASD is the log-mel-spectrogram, which cannot capture information in the time domain. This paper proposes an ASD method using temporal modulation features on the gammatone auditory filterbank (TMGF) to provide temporal characteristics for machine-learning-based methods. We evaluated the proposed method using the area under the ROC curve (AUC) and the partial area under the ROC curve (pAUC) with sounds recorded from seven kinds of machines. Compared with the baseline method of the DCASE2022 challenge, the proposed method provides a better ability for domain generalization, especially for machine sounds recorded from the valve.
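A rough sketch of the kind of temporal modulation feature discussed above; a Butterworth band-pass filter stands in for a true gammatone channel, so this is only an approximation of the TMGF front end, with illustrative band edges.

```python
# Sketch only: modulation spectrum of a sub-band temporal envelope; a
# Butterworth band-pass approximates one gammatone channel.
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def band_envelope(y, sr, lo, hi, order=4):
    sos = butter(order, [lo, hi], btype="bandpass", fs=sr, output="sos")
    band = sosfiltfilt(sos, y)
    return np.abs(hilbert(band))                    # temporal envelope of the sub-band

def modulation_spectrum(y, sr, lo=500.0, hi=1000.0):
    env = band_envelope(y, sr, lo, hi)
    env = env - np.mean(env)
    spec = np.abs(np.fft.rfft(env))                 # modulation spectrum of the envelope
    freqs = np.fft.rfftfreq(env.size, d=1.0 / sr)
    return freqs, spec
```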
System characteristics
Classifier | AE |
System complexity | 269992 parameters |
Acoustic features | temporal modulation features on the gammatone auditory filterbank |
Unsupervised Anomalous Sound Detection Under Domain Shift Conditions Based on MobileFaceNets and Masked Autoregressive Flow
Gang Liu, Yi Liu, Shifang Cai, Minghang Chen
School of Artificial Intelligence, Beijing University of Posts and Telecommunications, Beijing, China and Beijing University of Posts and Telecommunications, Beijing, China
Liu_BUPT_task2_1 Liu_BUPT_task2_2
Abstract
We present our submission to the DCASE2022 Challenge Task 2, which aims to promote research in unsupervised anomalous sound detection under domain shift conditions. We propose two architectures to solve this problem: one is a self-supervised model adopting MobileFaceNets, and the other is a density estimation model based on Masked Autoregressive Flow.
System characteristics
Classifier | CNN, normalizing flow |
System complexity | 1316042 parameters |
Acoustic features | log-mel energies |
Data augmentation | mixup |
Decision making | average |
System embeddings | OpenL3 |
ROBUST ANOMALY SOUND DETECTION FRAMEWORK FOR MACHINE CONDITION MONITORING
Ying Zeng, Hongqing Liu, Lihua Xue, Yi Zhou, Lu Gan
Chongqing University of Posts and Telecommunications and AI Lab, Xiaomi Corporation, Chongqing and Beijing, China and Chongqing University of Posts and Telecommunications, Chongqing, China and Brunel University, London UB8 3PH, U.K.
Liu_CQUPT_task2_1 Liu_CQUPT_task2_2 Liu_CQUPT_task2_3 Liu_CQUPT_task2_4
Abstract
This technical report describes our team’s submission to DCASE 2022 Task 2. In this report, we propose a robust training framework for anomalous sound detection, which includes feature preprocessing, model pretraining, joint loss, and anomaly score selection. The experimental results show that our anomalous sound detection model outperforms the official model, with an average performance improvement of 22.08% based on the official scoring method.
System characteristics
Classifier | CNN |
System complexity | 1215430 parameters |
Acoustic features | log-mel energies |
Front end system | HPSS |
COMPARATIVE EXPERIMENTS ON SPECTROGRAM REPRESENTATION FOR ANOMALOUS SOUND DETECTION
Kazuki Morita, Tomohiko Yano, Khai Q. Tran
Intelligent Systems Laboratory, SECOM CO.,LTD., Tokyo, Japan
Morita_SECOM_task2_1 Morita_SECOM_task2_2 Morita_SECOM_task2_3 Morita_SECOM_task2_4
Abstract
In this paper, we propose an anomalous sound detection method for DCASE 2022 Task 2. This is a machine condition monitoring task in which unknown anomalous sounds must be detected using only normal sound data. Our system is based on our submission to DCASE 2021 Task 2, and we newly evaluated variations of the time-frequency representation used for anomalous sound detection. As a result, the proposed method achieved a detection performance of 84.80% for the source domain and 82.26% for the target domain in area under the curve (AUC) and 68.65% in partial AUC (pAUC).
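A minimal sketch of how the alternative time-frequency representations listed below (spectrogram, HPSS, PCEN) can be computed with librosa; FFT size, hop length, and the scaling before PCEN are illustrative, and the file name is hypothetical.

```python
# Sketch only: spectrogram, HPSS, and PCEN representations with librosa;
# parameter values and the file name are illustrative.
import numpy as np
import librosa

y, sr = librosa.load("machine_clip.wav", sr=16000)           # hypothetical file name
S = np.abs(librosa.stft(y, n_fft=1024, hop_length=512))      # magnitude spectrogram
harmonic, percussive = librosa.decompose.hpss(S)              # HPSS components
mel = librosa.feature.melspectrogram(S=S ** 2, sr=sr, n_mels=128)
pcen = librosa.pcen(mel * (2 ** 31), sr=sr, hop_length=512)   # PCEN of the mel spectrogram
```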
System characteristics
Classifier | CNN, GMM, LOF, k-NN |
System complexity | 994048 parameters |
Acoustic features | HPSS, PCEN, spectrogram |
CNN-BASED ANOMALOUS SOUND DETECTION SYSTEM FOR DOMAIN GENERALIZATION
Hiroki Narita, Akira Tamamori
Aichi Institute of Technology, Aichi, Japan
Narita_AIT_task2_1 Narita_AIT_task2_2 Narita_AIT_task2_3 Narita_AIT_task2_4
Abstract
This paper is a technical report for DCASE Challenge 2022 Task 2. Our submitted model consists of a self-supervised CNN model that predicts attribute information. We ensembled three models without changing the architecture, achieving performance improvements only by changing the data augmentation, training method, and anomaly detection method. Self-supervised learning with label information has been a powerful method in previous anomaly detection competitions, and we argue that it is equally powerful in this competition.
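A minimal sketch of a Mahalanobis-distance detector of the type listed in the system characteristics below; the embeddings are assumed to come from the trained classifier, and the covariance regularization term is an illustrative choice.

```python
# Sketch only: Mahalanobis distance of a test embedding to the Gaussian fitted
# on normal-training embeddings; eps regularizes the covariance.
import numpy as np

def fit_gaussian(normal_embeddings, eps=1e-6):
    mu = normal_embeddings.mean(axis=0)
    cov = np.cov(normal_embeddings, rowvar=False) + eps * np.eye(normal_embeddings.shape[1])
    return mu, np.linalg.inv(cov)

def mahalanobis_score(embedding, mu, cov_inv):
    d = embedding - mu
    return float(np.sqrt(d @ cov_inv @ d))          # larger distance -> more anomalous
```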
System characteristics
Classifier | EfficientNet-B1, Mahalanobis |
System complexity | 23382552, 7794184 parameters |
Acoustic features | log-mel energies |
Data augmentation | Mixup, Frequency Masking, Mixup, Time Masking, Frequency Masking, Mixup, Time Masking, Frequency Masking, SevenBandParametricEQ |
System embeddings | PyTorch Image Models (EfficientNet-B1) |
Subsystem count | 3 |
External data usage | pre-trained model |
DCASE CHALLENGE 2022: SELF-SUPERVISED LEARNING PRE-TRAINING, TRAINING FOR UNSUPERVISED ANOMALOUS SOUND DETECTION
Ismail Nejjar, Jean Meunier-Pion, Gaetan Frusque, Olga Fink
IMOS, ETHZ & EPFL, Switzerland and EPFL, Lausanne, Switzerland
Nejjar_ETH_task2_1 Nejjar_ETH_task2_2
Abstract
This technical report presents our proposed approaches for Task 2 of the DCASE 2022 Challenge, Unsupervised Anomalous Sound Detection (ASD) for Machine Condition Monitoring Applying Domain Generalization Techniques. The main objective of this challenge is to detect anomalous machine sounds regardless of domain shifts. Our approach introduces a two-step learning process, where normal sounds of each specific machine type are used to pretrain a Convolutional Neural Network (CNN) in a self-supervised way. Three objectives are thereby pursued: (1) reveal the impact of attributes on the data by enforcing embeddings in the same batch to be different, (2) obtain uncorrelated embedding features containing specific information, and (3) respect defined geometrical constraints between the different domains. The model trained in an unsupervised way is then fine-tuned on the labels of the section indices. Ultimately, anomalous sounds are detected by extracting feature vectors from the CNN and applying k-NN to them. As a result, for the development set, it is shown that the presented framework significantly outperforms both baselines.
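A minimal numpy sketch of a redundancy-reduction term in the spirit of objective (2) above (decorrelating embedding dimensions); this illustrates the idea only and is not the authors' training loss.

```python
# Sketch only: penalize off-diagonal entries of the empirical correlation
# matrix of a batch of embeddings, which pushes dimensions to be uncorrelated.
import numpy as np

def decorrelation_loss(embeddings, eps=1e-9):
    z = (embeddings - embeddings.mean(axis=0)) / (embeddings.std(axis=0) + eps)  # (batch, dim)
    c = z.T @ z / z.shape[0]                         # empirical correlation matrix
    off_diag = c - np.diag(np.diag(c))
    return float(np.sum(off_diag ** 2))              # zero when dimensions are uncorrelated
```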
System characteristics
Classifier | CNN, k-NN |
System complexity | 1453440 parameters |
Acoustic features | log-mel energies |
Data augmentation | mixup |
UNSUPERVISED ABNORMAL SOUND DETECTION BASED ON SPECTRAL COHERENCE AND FEATURE FUSION IN DOMAIN DISPLACEMENT CONDITION
Tao Peng, Rui Qiu, Junyi Zhu, Yao Xiao, Su Wang, Yipeng Zhang, Chenyang Zhu, Shengchen Li, Xi Shao
Telecommunications & Information Engineering, Nanjing, China and School of Advanced Technology, Suzhou, China
PENG_NJUPT_task2_1 PENG_NJUPT_task2_2
Abstract
The DCASE2022 Challenge Task 2 requires developing an unsupervised anomalous sound detection system for seven types of machines under domain-shifted conditions. In this paper, two systems are proposed: one uses only spectral coherence as the input feature, and the other combines spectral coherence, wavelet, and log-mel features. The results show that the three-feature fusion significantly improves on the baseline in general, although spectral coherence alone sometimes yields better results. We therefore suggest using both methods to obtain stable results.
System characteristics
Classifier | MobileNetV2 |
System complexity | 713910 parameters |
Acoustic features | Fast spectral coherence, Wavelet packet energy, log-Mel |
OUTLIER-AUGMENTED CONTRASTIVE CLUSTERING FOR ANOMALY SOUND DETECTION WITH UNBALANCED DOMAIN
You-Siang Chen, Mingsian R. Bai
Power Mechanical Engineering, National Tsing Hua University, Hsinchu, Taiwan and National Tsing Hua University, Hsinchu, Taiwan
Siang_NTHU_task2_1
Abstract
In this report, we developed a deep neural network (DNN) that performs deep clustering of the embedding vectors of machine sounds. A time-dilated convolutional neural network (TDCN) with an attention mechanism was exploited to extract important features related to the time sequence. In addition, frequency masking was applied to the non-target sections of the machine sound to further increase the amount of outlier data. The results show that applying this data augmentation to the outliers improves AUC performance. Furthermore, deep clustering is able to contrastively attract and separate machine sounds with unbalanced domains.
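A minimal sketch of frequency masking applied to a log-mel spectrogram, as used above to augment the outlier data; mask width, mask count, and the fill value are illustrative assumptions.

```python
# Sketch only: frequency masking of a (mels, frames) log-mel spectrogram;
# mask width, count, and fill value are illustrative.
import numpy as np

def frequency_mask(log_mel, max_width=8, n_masks=2, rng=None):
    rng = rng if rng is not None else np.random.default_rng()
    out = log_mel.copy()
    for _ in range(n_masks):
        width = int(rng.integers(1, max_width + 1))
        start = int(rng.integers(0, max(1, out.shape[0] - width)))
        out[start:start + width, :] = out.min()      # fill the masked band with the floor value
    return out
```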
System characteristics
Classifier | Time-dilated CNN, self-attention |
System complexity | 653860 parameters |
Acoustic features | log-mel spectrogram |
Data augmentation | frequency masking |
DADAED - Double Anomaly Detector with AEDiff
Jan Tozicka, Marek Bezusek, Karel Durkota, Michal Linda
NeuronSW SE, Prague, Czech Republic
Tozicka_NSW_task2_1 Tozicka_NSW_task2_2 Tozicka_NSW_task2_3 Tozicka_NSW_task2_4
Abstract
This report describes our submissions to the DCASE 2022 challenge Task 2, "Unsupervised Detection of Anomalous Sounds for Machine Condition Monitoring under Domain Shifted Conditions." Acoustic-based machine condition monitoring is a challenging task with a very unbalanced training dataset. Moreover, due to domain shift, testing data may come from a different distribution than the training data, which makes the task even more difficult. In this submission, we propose two novel extensions of anomaly detection based on the reconstruction of an auto-encoder (AE) network. The first approach uses the raw difference between the AE input and its reconstructed output (instead of a typical reconstruction-error-based anomaly detector). The second approach extends the first with an additional anomaly score computed on the autoencoder's latent vectors. The combination of these two anomaly scores determines the final anomaly score.
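A minimal sketch of one way to realize the two scores described above, not the submitted DADAED code: one detector is fitted on the raw input-minus-reconstruction difference and another on the latent vectors, and their scores are combined; the use of LOF and the equal weighting are assumptions.

```python
# Sketch only: two detectors, one on the raw input-minus-reconstruction
# difference and one on the AE latent vector; `encoder`/`decoder` are trained
# callables, and the use of LOF plus equal weights is an assumption.
import numpy as np
from sklearn.neighbors import LocalOutlierFactor

def fit_detectors(encoder, decoder, normal_specs, n_neighbors=20):
    latents = np.stack([encoder(s) for s in normal_specs])
    diffs = np.stack([(s - decoder(encoder(s))).reshape(-1) for s in normal_specs])
    lof_latent = LocalOutlierFactor(n_neighbors=n_neighbors, novelty=True).fit(latents)
    lof_diff = LocalOutlierFactor(n_neighbors=n_neighbors, novelty=True).fit(diffs)
    return lof_latent, lof_diff

def combined_score(encoder, decoder, spec, lof_latent, lof_diff, weight=0.5):
    z = encoder(spec)
    d = (spec - decoder(z)).reshape(-1)
    s_latent = -lof_latent.score_samples(z.reshape(1, -1))[0]   # higher = more anomalous
    s_diff = -lof_diff.score_samples(d.reshape(1, -1))[0]
    return weight * s_latent + (1.0 - weight) * s_diff
```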
System characteristics
Classifier | AE, KNN, LOF, energy |
System complexity | 24420841, 7946361, 8042361 parameters |
Acoustic features | raw waveform, spectrogram |
System embeddings | OpenL3 |
Subsystem count | 2, 4 |
External data usage | pre-trained OpenL3 |
Disentangled surrogate task learning for improved domain generalization in unsupervised anomalous sound detection
Satvik Venkatesh, Gordon Wichern, Aswin Subramanian, Jonathan Le Roux
Speech & Audio Team, Mitsubishi Electric Research Laboratories (MERL), Cambridge, MA, USA and Mitsubishi Electric Research Laboratories (MERL), Cambridge, MA, USA
Venkatesh_MERL_task2_1 Venkatesh_MERL_task2_2 Venkatesh_MERL_task2_3 Venkatesh_MERL_task2_4
Abstract
We present our submission to the DCASE2022 Challenge Task 2, which focuses on domain generalization for anomalous sound detection. We investigated a novel multi-task learning framework that disentangles domain-shared features and domain-specific features. Disentanglement leads to better latent features and also increases flexibility in post-processing due to the availability of multiple embedding spaces. Our disentangled model obtains an overall harmonic mean of 74.57% on the development set, surpassing the MobileNetV2 baseline, which obtains 56.01%. Lastly, we explore the use of machine-specific loss functions and domain generalization methods, which improves our overall performance to 76.42%.
System characteristics
Classifier | AE, CNN, k-NN |
System complexity | 1062938, 1609938 parameters |
Acoustic features | spectrogram |
Decision making | weighted average |
Subsystem count | 2 |
Unsupervised Anomalous Sound Detection Using Multiple Time-Frequency Representations
Sergey Verbitskiy, Milana Shkhanukova, Viacheslav Vyshegorodtsev
Deepsound, Novosibirsk, Russia
Verbitskiy_DS_task2_1 Verbitskiy_DS_task2_2 Verbitskiy_DS_task2_3 Verbitskiy_DS_task2_4
Abstract
This technical report describes our approach for the DCASE2022 Challenge Task 2. This task aims to continue research on unsupervised anomalous sound detection and develop new high-performing systems for monitoring the condition of machines. In contrast to the DCASE2021 Challenge Task 2, the 2022 task primarily focuses on domain generalization. First and foremost, we propose the idea of using ensembles of 2D CNN-based systems that utilize different time-frequency representations as input features. We use normal sound clips and their section indices to train our anomalous sound detection (ASD) systems for each machine type, and embedding vectors extracted from our CNNs, cosine similarity, and the k-nearest neighbors algorithm (k-NN) to calculate the anomaly scores of test clips. As a result, our method achieves the official score of 0.725 on the development dataset and significantly outperforms the baseline systems.
System characteristics
Classifier | ArcFace, CNN, ensemble, k-NN |
System complexity | 12019460, 18029190, 6009730 parameters |
Acoustic features | GFCC, MFCC, log-mel energies |
Data augmentation | temporal cropping, SpecAugment |
Decision making | average |
Subsystem count | 2, 3 |
Anomalous Sound Detection System with Self-challenge and Metric Evaluation for DCASE2022 Challenge Task 2
Yuming Wei, Jian Guan, Haiyan Lan, Wenwu Wang
College of Computer Science and Technology, Harbin Engineering University, Harbin, China and Centre for Vision, Speech and Signal Processing, University of Surrey, Guildford, UK
Wei_HEU_task2_1 Wei_HEU_task2_2 Wei_HEU_task2_3
Abstract
This technical report describes our submission for DCASE2022 Challenge Task 2. To solve the domain generalization problem in anomalous sound detection (ASD), we present an ensemble system with two proposed unsupervised anomalous sound detection methods, i.e., a self-supervised classifier with a self-challenge strategy and a distance metric evaluation based method. Experiments show that our ensemble system achieves an average of 87.07% in harmonic mean AUC score under the source domain (h-mean AUC-s), an average of 76.22% in harmonic mean AUC score under the target domain (h-mean AUC-t), and an average of 66.76% in harmonic mean pAUC (h-mean pAUC) score.
System characteristics
Classifier | ArcFace, CNN, clustering, ensemble |
System complexity | 0, 1347392 parameters |
Acoustic features | log-mel spectrogram, raw waveform |
Decision making | average |
Subsystem count | 2 |
An Outlier Exposed Anomalous Sound Detection System for Domain Generalization in Machine Condition Monitoring
Kevin Wilkinghoff
Communication Systems, Fraunhofer Institute for Communication, Information Processing and Ergonomics FKIE, Wachtberg, Germany
Wilkinghoff_FKIE_task2_1 Wilkinghoff_FKIE_task2_2
Abstract
Emitted machine sounds can change drastically due to a change in the settings of machines or due to varying noise conditions. This is a problem when monitoring the condition of these machines with a trained anomalous sound detection system because, after the acoustic conditions change, normal sounds are often falsely marked as anomalous. The goal of task 2 "Unsupervised Anomalous Sound Detection for Machine Condition Monitoring Applying Domain Generalization Techniques" of the DCASE 2022 challenge is to develop systems that reliably detect anomalous sounds regardless of whether the characteristics of machine sounds are changing or not. In this work, a conceptually simple outlier exposed anomalous sound detection system is presented that is specifically designed for domain generalization. To this end, multiple feature representations and carefully designed sub-system architectures are utilized inside a single neural network. Furthermore, a technique called domain mixup is presented to further improve the domain generalization capabilities.
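A minimal sketch of a domain-mixup style augmentation as one reading of the technique named above, not the author's implementation: a source-domain and a target-domain example (and their labels) are interpolated with a Beta-distributed coefficient.

```python
# Sketch only: interpolate a source-domain and a target-domain example and
# their labels with a Beta-distributed mixing coefficient.
import numpy as np

def domain_mixup(x_source, x_target, y_source, y_target, alpha=0.2, rng=None):
    rng = rng if rng is not None else np.random.default_rng()
    lam = rng.beta(alpha, alpha)
    x = lam * x_source + (1.0 - lam) * x_target
    y = lam * y_source + (1.0 - lam) * y_target
    return x, y
```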
System characteristics
Classifier | CNN, GMM, ensemble |
System complexity | 977811200 parameters |
Acoustic features | log-mel energies, magnitude spectrum |
Data augmentation | mixup |
Decision making | sum |
Subsystem count | 40 |
ANOMALY DETECTION USING AUTOENCODER, IDNN AND U-NET USING ENSEMBLE
Jun’ya Yamashita, Ryosuke Tanaka, Keisuke Ikeda, Shiiya Aoyama, Satoru Hayamizu, Satoshi Tamura
Gifu University, Gifu, Japan
Yamashita_GU_task2_1 Yamashita_GU_task2_2 Yamashita_GU_task2_3 Yamashita_GU_task2_4
Abstract
This paper presents our efforts for DCASE 2022 Challenge Task 2. We built several anomaly detectors based on an AutoEncoder (AE), an Interpolation Deep Neural Network (IDNN) with acoustic noise, and a U-Net with mask patches. Through experiments with these detection schemes on the training and development data sets, we found that the best model differs for each machine type. We further integrated the anomaly scores obtained from all detectors using an ensemble technique. Our results show improved Area Under the Curve (AUC) scores, particularly for target domains.
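A minimal sketch of an IDNN-style anomaly score based on the general idea of interpolation DNNs (predict the removed center frame of a context window and score by the prediction error); the model is an abstract callable and the context length is illustrative.

```python
# Sketch only: remove the center frame of each context window, predict it from
# the surrounding frames, and average the squared prediction error.
import numpy as np

def idnn_score(model, log_mel, context=5):
    half = context // 2
    errors = []
    for t in range(half, log_mel.shape[1] - half):
        window = log_mel[:, t - half:t + half + 1]            # (mels, context)
        target = window[:, half]                              # center frame to predict
        inputs = np.delete(window, half, axis=1).reshape(-1)  # surrounding frames, flattened
        pred = model(inputs)                                  # any trained regressor callable
        errors.append(np.mean((pred - target) ** 2))
    return float(np.mean(errors))
```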
System characteristics
Classifier | AE, CNN, IDNN, U-Net, ensemble |
System complexity | 11201923, 13930196, 2164433, 5602001 parameters |
Acoustic features | log spectrogram, log-mel energies |
Subsystem count | 3, 4 |
ANOMALY DETECTION WITH SELF-SUPERVISED AUDIO EMBEDDINGS
Ivan Zorin, Ilya Makarov
Industrial AI, Artificial Intelligence Research Institute, Moscow, Russia
Zorin_AIRI_task2_1
Abstract
The majority of approaches to machine condition monitoring via anomalous sound detection are based on supervised learning, with dataset metadata used as labels for training supervised models. However, data labeling is expensive and often impossible for industries with a significant amount of equipment. In this case, self-supervised methods can solve the problem since they do not require labeled data. In this work, we applied BYOL-A, a recent self-supervised approach for computing embeddings of audio signals, together with the classical machine learning method Local Outlier Factor (LOF) to compute outlier scores for anomalous sounds. The main focus of this work is to avoid using any labels from the dataset metadata and to explore a self-supervised learning approach.
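A minimal sketch of the scoring stage described above, assuming BYOL-A embeddings have already been extracted: a Local Outlier Factor model in novelty mode is fitted on normal-clip embeddings, and its negated score is used as the anomaly score.

```python
# Sketch only: LOF in novelty mode fitted on embeddings of normal clips; the
# negated score_samples output serves as the anomaly score.
import numpy as np
from sklearn.neighbors import LocalOutlierFactor

def fit_lof(normal_embeddings, n_neighbors=20):
    # normal_embeddings: (N, D) BYOL-A embeddings of normal clips
    return LocalOutlierFactor(n_neighbors=n_neighbors, novelty=True).fit(normal_embeddings)

def lof_anomaly_score(lof, test_embedding):
    return float(-lof.score_samples(test_embedding.reshape(1, -1))[0])  # higher = more anomalous
```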
System characteristics
Classifier | LOF |
Acoustic features | log-mel energies |
Data augmentation | mixup, random crop, resize |