2005
DOI: 10.1121/1.4809086
Some articulatory details of emotional speech

Abstract: Differences in speech articulation among four emotion types (neutral, anger, sadness, and happiness) are investigated by analyzing tongue-tip, jaw, and lip movement data collected from one male and one female speaker of American English. The data were collected with an electromagnetic articulography (EMA) system while the subjects produced simulated emotional speech. Pitch, root-mean-square (rms) energy, and the first three formants were estimated for vowel segments. For both speakers, angry speech exhibited the lar…
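The abstract mentions frame-wise estimation of rms energy for vowel segments. As a minimal sketch of that measurement (the frame length and hop below are illustrative values for 16 kHz audio, not the paper's actual analysis parameters):

```python
import numpy as np

def rms_energy(samples: np.ndarray, frame_len: int = 400, hop: int = 160) -> np.ndarray:
    """Frame-wise root-mean-square energy of a mono signal.

    frame_len/hop correspond to 25 ms / 10 ms at 16 kHz; these are
    illustrative defaults, not the values used in the study.
    """
    n_frames = 1 + max(0, (len(samples) - frame_len) // hop)
    rms = np.empty(n_frames)
    for i in range(n_frames):
        frame = samples[i * hop : i * hop + frame_len]
        rms[i] = np.sqrt(np.mean(frame ** 2))
    return rms
```

Averaging these frame values over each vowel segment would give one energy figure per segment, comparable across emotion conditions.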


Cited by 18 publications (31 citation statements)
References 0 publications
“…These emotional databases were chosen to span different emotional categories, speakers, genders, and even languages, with the purpose of including, to some extent, the variability found in the pitch. The first database was collected at the University of Southern California (USC) using an electromagnetic articulography (EMA) system [21]. In this database, which will be referred to from here on as EMA, one male and two female subjects (two of them with formal theatrical vocal training) read ten sentences five times portraying the emotions sadness, anger, and happiness, in addition to a neutral state.…”
Section: B Databasesmentioning
confidence: 99%
“…Lee et al. [7] conducted a study of the averaged tongue-tip movement velocity for each of four peripheral vowel sounds (/IY/, /AE/, /AA/, /UW/) in American English as a function of four emotions. Results indicated that angry speech was characterized by greater ranges of displacement and velocity, while sad speech showed the opposite pattern.…”
Section: Previous Studies In Speech Emotion Recognitionmentioning
confidence: 99%
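The statement above refers to averaged tongue-tip movement velocity computed from EMA trajectories. A hedged sketch of how such a figure can be derived from position samples (the 200 Hz sampling rate and 2-D mm coordinates are assumptions typical of EMA data, not details from the cited work):

```python
import numpy as np

def mean_speed(positions: np.ndarray, fs: float = 200.0) -> float:
    """Average tangential speed (mm/s) of an articulator trajectory.

    positions: (N, 2) array of x/y sensor coordinates in mm, sampled
    at fs Hz. 200 Hz is a common EMA rate; illustrative only.
    """
    vel = np.diff(positions, axis=0) * fs    # per-axis velocity, mm/s
    speed = np.linalg.norm(vel, axis=1)      # tangential speed per step
    return float(speed.mean())
```

Computing this per vowel token and averaging within each emotion category would yield the kind of per-emotion velocity comparison the statement describes.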
“…Similar to the experimental setup used in our previous work [6], three emotional databases are used: USC-EMA [8], EMO-DB [9], and EPSAT [10]. Table 1 gives details about these corpora.…”
Section: Databases and Emotion Detection Classifiersmentioning
confidence: 99%