2021
DOI: 10.3390/s21227530

The Impact of Attention Mechanisms on Speech Emotion Recognition

Abstract: Speech emotion recognition (SER) plays an important role in real-time applications of human-machine interaction. Attention mechanisms are widely used to improve the performance of SER, but the rules governing their applicability have not been discussed in depth. This paper discusses the difference between Global-Attention and Self-Attention and explores the rules for applying each to the construction of SER classifiers. The experimental results show that Global-Attention can improve the accuracy of the sequentia…
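To make the abstract's distinction concrete, here is a minimal sketch of the two mechanisms it contrasts. This is an illustration, not the paper's exact formulation: the dot-product scoring, the tensor shapes, and the use of the mean hidden state as the global query are all assumptions.

```python
import torch
import torch.nn.functional as F

def global_attention(hidden, query):
    # hidden: (batch, time, dim) encoder states; query: (batch, dim).
    # One query attends over the whole sequence -> one context vector.
    scores = torch.bmm(hidden, query.unsqueeze(2)).squeeze(2)   # (batch, time)
    weights = F.softmax(scores, dim=1)
    return torch.bmm(weights.unsqueeze(1), hidden).squeeze(1)   # (batch, dim)

def self_attention(hidden):
    # Every time step attends to every other -> one vector per step.
    d = hidden.size(-1)
    scores = torch.bmm(hidden, hidden.transpose(1, 2)) / d ** 0.5
    weights = F.softmax(scores, dim=2)                          # (batch, time, time)
    return torch.bmm(weights, hidden)                           # (batch, time, dim)

x = torch.randn(4, 50, 128)            # e.g. 50 frames of acoustic features (sizes assumed)
ctx = global_attention(x, x.mean(1))   # single utterance-level summary, as used for SER
seq = self_attention(x)                # contextualised frame-by-frame sequence
```

The practical difference the paper studies falls out of the output shapes: global attention yields one summary vector per utterance, while self-attention preserves the sequence.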

Cited by 18 publications (12 citation statements)
References 21 publications
“…Attention mechanisms enable the model to consider the context and significance of a given input feature in relation to others in the sequence. If attention mechanisms are implemented, they improve the performance of deep learning-based speech emotion recognition systems [25]. Among the attention mechanisms mentioned earlier, multi-head attention used in [13] considers global contextual relationships among features in a parallel manner.…”
Section: Related Work
confidence: 99%
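As a minimal sketch of the multi-head idea this snippet mentions, PyTorch's built-in nn.MultiheadAttention runs several attention heads over the full sequence at once; the embedding size, head count, and input shape below are assumptions, not values from [13].

```python
import torch
import torch.nn as nn

# Each of the 8 heads attends over the whole sequence independently, so
# global relationships among features are modelled in parallel.
mha = nn.MultiheadAttention(embed_dim=128, num_heads=8, batch_first=True)

x = torch.randn(4, 50, 128)        # (batch, time, features), sizes assumed
out, attn = mha(x, x, x)           # self-attention: query = key = value = x
print(out.shape, attn.shape)       # torch.Size([4, 50, 128]) torch.Size([4, 50, 50])
```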
“…We are motivated by the observation that the careful use of attention mechanisms in combination with other deep learning techniques allows the model to take advantage of the merits of each. Moreover, it is stated in [23] that global attention mechanisms are suitable for SER.…”
Section: Related Work
confidence: 99%
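A sketch of the kind of combination this snippet describes: a recurrent encoder whose outputs are pooled by a global attention layer before classification. The BiLSTM encoder, the learned query vector, and all dimensions here are assumptions for illustration, not the architecture of [23].

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentiveSER(nn.Module):
    """Sketch: BiLSTM encoder + global-attention pooling + classifier.
    Layer choices and sizes are assumed, not taken from the cited work."""
    def __init__(self, n_features=40, hidden=64, n_emotions=4):
        super().__init__()
        self.encoder = nn.LSTM(n_features, hidden, batch_first=True,
                               bidirectional=True)
        self.query = nn.Parameter(torch.randn(2 * hidden))  # learned global query
        self.classifier = nn.Linear(2 * hidden, n_emotions)

    def forward(self, x):                       # x: (batch, time, n_features)
        h, _ = self.encoder(x)                  # (batch, time, 2*hidden)
        scores = h @ self.query                 # (batch, time)
        weights = F.softmax(scores, dim=1)
        utterance = (weights.unsqueeze(2) * h).sum(1)  # attention-pooled summary
        return self.classifier(utterance)       # emotion logits

logits = AttentiveSER()(torch.randn(4, 50, 40))
print(logits.shape)                             # torch.Size([4, 4])
```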
“…It takes the feature map as input to obtain a feature vector containing semantic correlation. The attention vector is obtained by (7), and the output …”
Section: Multiscale Feature Fusion Attention Mechanism
confidence: 99%
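Equation (7) and the truncated output symbol are not reproduced in the snippet, so the following is only an assumed stand-in: a generic channel-attention sketch in the squeeze-and-excitation style, showing one common way a feature map is reduced to an attention vector that reweights the map.

```python
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Generic channel attention over a feature map. An assumed stand-in:
    the cited paper's equation (7) is not shown, so the global average
    pooling, MLP, and sigmoid below are purely illustrative."""
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, fmap):                    # fmap: (batch, C, H, W)
        squeezed = fmap.mean(dim=(2, 3))        # global average pool -> (batch, C)
        attn = self.mlp(squeezed)               # attention vector, one weight per channel
        return fmap * attn[:, :, None, None]    # reweighted feature map

out = ChannelAttention(32)(torch.randn(2, 32, 16, 16))
print(out.shape)                                # torch.Size([2, 32, 16, 16])
```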
“…Although high-level features contain rich semantic information, they cannot capture long-term relationships well. Therefore, global pooling [4], dilated convolution [5], pyramid pooling [6], and attention mechanisms [7] are used to better aggregate context. Deeplabv3+ [20] fuses features of different scales to refine the object boundaries of the segmentation results.…”
Section: Introduction
confidence: 99%
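To make the context-aggregation point concrete, here is a small sketch of one of the listed techniques, dilated convolution, which enlarges the receptive field without adding parameters; the channel counts and dilation rate are assumptions.

```python
import torch
import torch.nn as nn

# Two 3x3 convolutions with identical parameter counts; dilation widens the
# receptive field (3x3 -> effectively 5x5 here), aggregating more context.
standard = nn.Conv2d(16, 16, kernel_size=3, padding=1)
dilated  = nn.Conv2d(16, 16, kernel_size=3, padding=2, dilation=2)

x = torch.randn(1, 16, 64, 64)
print(standard(x).shape, dilated(x).shape)  # both torch.Size([1, 16, 64, 64])
```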