2022
DOI: 10.48550/arxiv.2203.13072
Preprint

Multitask Emotion Recognition Model with Knowledge Distillation and Task Discriminator

Cited by 1 publication (1 citation statement)
References: 0 publications
“…Unlike the experimental setup in ABAW and ABAW2, where the three tasks were completed independently, ABAW3 presents an integrated metric and evaluates the performance of all three tasks simultaneously. Deng [3] employs psychological prior knowledge for multi-task estimation, using local features for AU recognition and merging information from different regions for EXPR and VA. Jeong et al. [8] apply knowledge distillation for better generalization and domain adaptation techniques to improve accuracy in target domains. Savchenko et al. [10] use a lightweight EfficientNet model to build a real-time framework and improve performance by pre-training on additional data.…”
Section: Related Work
Confidence: 99%
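
For context on the knowledge distillation mentioned in the citation statement: the standard formulation minimizes a KL divergence between temperature-softened teacher and student outputs. The sketch below shows that generic loss in PyTorch; the function name, temperature value, and tensor shapes are illustrative assumptions and do not reproduce the specific implementation of Jeong et al. [8].

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      temperature: float = 4.0) -> torch.Tensor:
    """Generic soft-target knowledge distillation loss.

    Both logit tensors have shape (batch, num_classes); the
    temperature of 4.0 is a hypothetical choice, not a value
    taken from the cited paper.
    """
    # Soften both distributions with the temperature.
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    log_soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    # KL divergence between teacher and student, scaled by T^2 so the
    # gradient magnitude stays comparable to a hard-label loss term.
    return F.kl_div(log_soft_student, soft_teacher,
                    reduction="batchmean") * temperature ** 2

# Purely illustrative usage with random logits
# (7 classes, e.g. the basic expression categories):
student = torch.randn(8, 7)
teacher = torch.randn(8, 7)
loss = distillation_loss(student, teacher)
```

In practice this term is typically combined with the ordinary supervised loss on ground-truth labels, weighted by a mixing coefficient chosen on validation data.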