2023
DOI: 10.1109/access.2023.3244390
Multi-Label Multimodal Emotion Recognition With Transformer-Based Fusion and Emotion-Level Representation Learning

Cited by 23 publications (8 citation statements)
References 35 publications
“…Transformers are one of the techniques used in the deep learning framework. Textual data can be processed by transformers for recognizing emotions [31], [32]. Transformers are used in sequence transduction.…”
Section: Deep Learning Techniques Used For Emotion Detection
confidence: 99%
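As a concrete illustration of the quoted claim (not taken from the citing paper), a minimal sketch of transformer-based text emotion recognition using the Hugging Face transformers library; the checkpoint name is illustrative, not prescribed by the source:

from transformers import pipeline

# Checkpoint name is illustrative; any emotion-tuned text classifier would do.
emotion_clf = pipeline("text-classification",
                       model="j-hartmann/emotion-english-distilroberta-base")

print(emotion_clf("I can't believe we finally won the match!"))
# e.g. [{'label': 'joy', 'score': 0.97}]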
“…Transformers: [76] showed emotion recognition from videos using transformer models. [32] presented a transformer-based fusion and representation learning method to fuse and enrich multimodal features from raw videos for the task of multi-label video emotion recognition. [77] presented a bi-modal transformer.…”
Section: Machine Learning Techniques Used For Emotion Detection
confidence: 99%
“…where $y_s$ represents the true labels and $\hat{y}_s$ represents the labels predicted by the model. The F-score is a composite metric, a weighted harmonic average of precision and recall [30]. The F1 score is the harmonic average of precision and recall, calculated by:…”
Section: Evaluation Of The Proposed System
confidence: 99%
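The quoted passage is cut off before the formula itself; for reference, the standard F1 definition it appears to refer to is:

F_1 = 2 \cdot \frac{\text{precision} \cdot \text{recall}}{\text{precision} + \text{recall}} = \frac{2\,TP}{2\,TP + FP + FN}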
“…Attention mechanisms and transformer architectures have since emerged as powerful techniques for achieving state-of-the-art results in various natural language processing (NLP) tasks, including multi-label emotion classification [12], [20]. Transformers operate by employing self-attention mechanisms, capturing contextual relationships between tokens (words) in a sequence.…”
Section: A Multi-Label Emotion Classification
confidence: 99%
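As a concrete illustration of the self-attention mechanism the quote describes, a minimal NumPy sketch of scaled dot-product self-attention (dimensions and weights are illustrative, not taken from either cited paper):

import numpy as np

def self_attention(X, Wq, Wk, Wv):
    # Project each token embedding to query, key, and value vectors.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    # Pairwise token-to-token similarity, scaled by sqrt of the key dimension.
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    # Softmax over each row turns similarities into attention weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output token is a context-aware mixture of all value vectors.
    return weights @ V

# Toy example: a sequence of 4 tokens with embedding size 8.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)  # (4, 8)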