…As a result,

F1 scores (in %) for AU detection on BP4D. Column labels are inferred from the standard 12-AU BP4D protocol; bracketed values reproduce the entries highlighted in the source table:

| Method | AU1 | AU2 | AU4 | AU6 | AU7 | AU10 | AU12 | AU14 | AU15 | AU17 | AU23 | AU24 | Avg |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| [Li et al, 2018] | 39.0 | 35.2 | 48.6 | 76.1 | 72.9 | 81.9 | 86.2 | 58.8 | 37.5 | 59.1 | 35.9 | 35.8 | 55.9 |
| JAA-Net [Shao et al, 2018] | 47.2 | 44.0 | 54.9 | 77.5 | 74.6 | 84.0 | 86.9 | 61.9 | 43.6 | 60.3 | 42.7 | 41.9 | 60.0 |
| LP-Net [Niu et al, 2019] | 43.4 | 38.0 | 54.2 | 77.1 | 76.7 | 83.8 | 87.2 | 63.3 | 45.3 | 60.5 | 48.1 | 54.2 | 61.0 |
| ARL [Shao et al, 2019] | 45.8 | 39.8 | 55.1 | 75.7 | 77.2 | 82.3 | 86.6 | 58.8 | 47.6 | 62.1 | 47.4 | [55.4] | 61.1 |
| SEV-Net [Yang et al, 2021] | [58.2] | [50.4] | 58.3 | [81.9] | 73.9 | [87.8] | 87.5 | 61.6 | [52.6] | 62.2 | 44.6 | 47.6 | 63.9 |
| FAUDT [Jacob and Stenger, 2021] | 51.7 | [49.3] | [61.0] | 77.8 | 79.5 | 82.9 | 86.3 | [67.6] | 51.9 | 63.0 | 43.7 | [56.3] | 64.2 |

The SRERL [Li et al, 2019a] row is truncated in the source after “46.9 45.”.

F1 scores (in %) for AU detection on DISFA. Column labels are inferred from the standard 8-AU DISFA protocol:

| Method | AU1 | AU2 | AU4 | AU6 | AU9 | AU12 | AU25 | AU26 | Avg |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| [Li et al, 2018] | 41.5 | 26.4 | 66.4 | 50.7 | [80.5] | [89.3] | 88.9 | 15.6 | 48.5 |
| JAA-Net [Shao et al, 2018] | 43.7 | 46.2 | 56.0 | 41.4 | 44.7 | 69.6 | 88.3 | 58.4 | 56.0 |
| LP-Net [Niu et al, 2019] | 29.9 | 24.7 | 72.7 | 46.8 | 49.6 | 72.9 | 93.8 | 65.0 | 56.9 |

The ARL [Shao et al, 2019] row is truncated in the source after “43.9”.

…the categorical cross-entropy loss is introduced as:
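The equation that followed this sentence did not survive extraction. As a minimal sketch, assuming the standard multi-class form with a one-hot ground-truth label $y$ and predicted class probabilities $\hat{y}$ over $C$ classes, the categorical cross-entropy is:

```latex
\mathcal{L}_{\mathrm{CE}} = -\sum_{c=1}^{C} y_c \log \hat{y}_c
```

Here $y_c \in \{0, 1\}$ selects the true class and $\hat{y}_c$ is the softmax-normalized score for class $c$; the exact weighting or multi-label variant used by the paper cannot be recovered from this excerpt.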