2016
DOI: 10.1007/s11042-016-3405-3
Emotional head motion predicting from prosodic and linguistic features

Cited by 5 publications (3 citation statements)
References 32 publications
“…These features would allow the preparation of parameters that can improve the classification efficiency of viseme groups consisting of phonemes with a strong involvement of the tongue during the articulation of a given speech fragment. In this context, one could consider using data from a specialized electromagnetic articulography device with adequate parametrization or, as shown in a recent paper by Yang et al. [43], one can also employ emotional head motion predicting from prosodic and linguistic features, or data acquired from a face motion capture device [24].…”
Section: Conclusion and Directions for Further Research
confidence: 99%
“…Three-dimensional head and face motion and the acoustics of a talker producing Japanese sentences were recorded and analyzed previously in this context (Munhall et al 2004). The inherent relationship between head motions and Chinese speech prosody and linguistic features has also been studied more recently by building a mapping model between them (Minghao et al 2016).…”
Section: Historical View
confidence: 99%
“…Non-verbal cues, e.g., hand gestures and head motions, are used to express feelings, give feedback, and engage in immersive human-human communication. In [14], Yang et al. study the relations between speech and head motion and learn a bimodal mapping from speech to head motions. Specifically, they present an interesting investigation into which prosodic and linguistic features have the most significant influence on emotional head motions.…”
Section: Recognizing Humans and Understanding Their Behaviors
confidence: 99%