2022
DOI: 10.32604/csse.2022.021635
Emotion Recognition with Capsule Neural Network

Cited by 6 publications (6 citation statements)
References 47 publications
“…It can be said that the initial successes in speech synthesis are based on hidden Markov models [5][6][7]. Associated with the development of artificial intelligence, speech synthesis models using neural networks have been proposed, such as DeepVoice [15] and DeepVoice 2 [16].…”
Section: Related Work
confidence: 99%
“…We also statistically analyze the differences in emotions according to those characteristic parameters. We used the BKEmo data set to perform speech emotion recognition based on CapsNet [6].…”
Section: Related Work
confidence: 99%
“…From a computational perspective, emotional analysis has been approached from different areas to try to understand how emotions are constituted. This can include using text analysis (Alswaidan & Menai, 2020; Batbaatar, Li, & Ryu, 2019; Gupta, Roy, Batra, & Dubey, 2021; Sundaram, Ahmed, Muqtadeer, & Reddy, 2021), speech (Kumar, Jain, Raman, Roy, & Iwamura, 2021; Li & Xu, 2020; Van, Nguyen, & Le, 2022; Zhou, Liang, Gu, Yin, & Yao, 2022; Zhang & Xue, 2021a, 2021b), music (Du, Li, & Gao, 2020; Putkinen et al, 2021; Xu, Xu, & Zhang, 2021), or a combination of multiple media (Shen, Zheng, & Wang, 2021), among others.…”
Section: Introduction
confidence: 99%