Wiley Encyclopedia of Electrical and Electronics Engineering 2015
DOI: 10.1002/047134608x.w8271
Human Affect Recognition: Audio‐Based Methods

Abstract: Automatic human affect recognition aims to predict affect-related information from humans observed over a certain time span. This article describes such computer assessment of human emotion for audio-based methods. This includes, first, acoustic analysis with suitable features and segmentation of the speech signal. Then follows linguistic analysis, including speech-to-text processing and representation of the text in a feature space, as well as joint acoustic–linguistic processing. Subsequently, data collection …
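The abstract's first stage (segmenting the speech signal and extracting acoustic features) can be sketched minimally as follows. The specific features here (short-time energy and zero-crossing rate) and the frame parameters are illustrative assumptions, not the article's actual feature set:

```python
# Sketch of the acoustic-analysis step: segment a speech signal into
# overlapping frames and extract a simple feature vector per frame.
# Feature choice (energy, zero-crossing rate) is an assumption for
# illustration; real systems use far richer descriptor sets.
import numpy as np

def frame_signal(x, frame_len=400, hop=160):
    """Split a 1-D signal into overlapping frames (segmentation step)."""
    n_frames = 1 + (len(x) - frame_len) // hop
    return np.stack([x[i * hop : i * hop + frame_len] for i in range(n_frames)])

def acoustic_features(frames):
    """Per-frame short-time energy and zero-crossing rate."""
    energy = np.mean(frames ** 2, axis=1)
    zcr = np.mean(np.abs(np.diff(np.sign(frames), axis=1)) > 0, axis=1)
    return np.column_stack([energy, zcr])

# Synthetic "speech" at 16 kHz: a quiet tonal segment followed by a
# louder, noisier segment, standing in for calm vs. agitated speech.
rng = np.random.default_rng(0)
t = np.linspace(0, 0.5, 8000)
signal = np.concatenate([
    0.1 * np.sin(2 * np.pi * 120 * t),   # calm segment
    0.8 * rng.standard_normal(8000),     # agitated segment
])

feats = acoustic_features(frame_signal(signal))
print(feats.shape)  # one 2-dimensional feature vector per frame
```

The resulting per-frame feature matrix would then be passed to a classifier or regressor; the article's later linguistic stage would add text-derived features alongside these acoustic ones.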

Cited by 1 publication (1 citation statement). References 67 publications.
“…In fact, they are expected to play a major role that will likely make 'that difference' in future intelligent User Interfaces (IUIs), as they bear the promise to lend them emotional intelligence: Interfaces that 'know' and can react appropriately to, e. g., the satisfaction or anger of their users can lead us away from the often prevailing connotation of 'cold' and 'mechanical' that the interfaces of the current generation are still partially faced with. The information is thereby increasingly accessed from multiple modalities [15,17] — both in affect recognition and sentiment analysis — and 'in the wild' [33,9], thanks to the availability of increasingly large and realistic resources [24] and improved algorithms [21], including deep learning [26], long short-term memory architectures [14], and weakly supervised learning methods [25]. Besides a certain focus on analysis in research to date, a system that emulates emotion is likely to be perceived as even more 'emotionally intelligent', given that it manages to overcome the uncanny valley.…”
Section: Introduction
Confidence: 99%