2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp.2014.6853742
Analysis of laughter and speech-laugh signals using excitation source information

Cited by 17 publications (11 citation statements)
References 7 publications
“…No clear answer has been given concerning this aspect of amused speech, although Trouvain rejected the hypothesis of a smile-laughter continuum [43]. Dumpala et al found that f0 was higher in laughter than in speech-laughs and higher in speech-laughs than in neutral speech [9].…”
Section: Smiled Speech (mentioning)
Confidence: 99%
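To make the reported f0 ordering concrete (higher in laughter than in speech-laughs, and higher in speech-laughs than in neutral speech), a minimal sketch follows that estimates mean f0 per pre-segmented recording. It is an illustration under stated assumptions, not the method of the cited paper: the file names, the use of librosa's pyin tracker, and the 60–500 Hz search range are all placeholders.

```python
import numpy as np
import librosa

def mean_f0(path, sr=16000, fmin=60.0, fmax=500.0):
    """Mean f0 (Hz) over the voiced frames of one pre-segmented recording."""
    y, _ = librosa.load(path, sr=sr)
    f0, voiced_flag, _ = librosa.pyin(y, fmin=fmin, fmax=fmax, sr=sr)
    return float(np.nanmean(f0))  # pyin marks unvoiced frames as NaN

# Hypothetical segment files; the ordering reported in the citation would be
# mean_f0("laughter.wav") > mean_f0("speech_laugh.wav") > mean_f0("neutral.wav").
for name in ("laughter.wav", "speech_laugh.wav", "neutral.wav"):
    print(name, round(mean_f0(name), 1), "Hz")
```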
“…There has been very little work on synthesis and recognition of speech-laughs and smiled speech. Dumpala et al [9] present a speech-laugh/laughter discrimination system. Oh and Wang [35] tried real-time modulation of neutral speech to make it closer to speech-laughs, based on the variation of characteristics such as pitch, rhythm and tempo.…”
Section: Smiled Speech (mentioning)
Confidence: 99%
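The discrimination and modulation systems mentioned in this statement are described only at a high level. As a hedged illustration of how pitch-, voicing- and energy-based cues of the kind cited (pitch, rhythm, tempo) can separate laughter from speech-laughs, the sketch below extracts a small prosodic feature vector per segment and trains an SVM. The feature set, the classifier choice and the file lists are assumptions for illustration, not the setup of Dumpala et al. [9] or Oh and Wang [35].

```python
import numpy as np
import librosa
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def prosodic_features(path, sr=16000):
    """Pitch-, voicing- and energy-based summary features for one segment."""
    y, _ = librosa.load(path, sr=sr)
    f0, voiced_flag, _ = librosa.pyin(y, fmin=60.0, fmax=500.0, sr=sr)
    rms = librosa.feature.rms(y=y)[0]
    feats = np.array([
        np.nanmean(f0), np.nanstd(f0),   # pitch level and variability
        np.mean(voiced_flag),            # voiced-frame rate (rough rhythm cue)
        np.mean(rms), np.std(rms),       # energy level and variability
    ])
    return np.nan_to_num(feats)          # guard against all-unvoiced segments

# Hypothetical training data: laughter vs. speech-laugh segments.
laughter_files = ["laugh_01.wav", "laugh_02.wav"]
speech_laugh_files = ["sl_01.wav", "sl_02.wav"]
X = np.vstack([prosodic_features(p) for p in laughter_files + speech_laugh_files])
y = np.array([0] * len(laughter_files) + [1] * len(speech_laugh_files))

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, y)
```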
“…Speech-laugh (SL) refers to the segments of speech, where laughter co-occurs with neutral speech [16], [17]. Speech-laugh exhibits characteristics of both laughter and neutral speech, but it is very distinct from both, laughter and neutral speech [12], [17], [18]. Hence, speech-laugh forms a separate class, which carries both, linguistic and non-linguistic information [18].…”
Section: Background and Proposed Approach (mentioning)
Confidence: 99%
“…Studies were also made comparing speech-laughs and isolated laughter. Dumpala proposed a feature extraction and comparison successfully discriminating between laughter and speech-laughs [3]. In [4], Menezes exposed an acoustic comparison (formant frequency values and pitch) between neutral speech, speech-laughs and laughter.…”
Section: Introduction (mentioning)
Confidence: 99%
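Where this statement mentions comparing formant frequency values and pitch across neutral speech, speech-laughs and laughter, the sketch below shows one common way to obtain rough formant estimates from the roots of an LPC fit. The LPC order, pre-emphasis step and 90 Hz cutoff are illustrative assumptions, not the analysis of Menezes [4].

```python
import numpy as np
import librosa

def rough_formants(path, sr=16000, order=12, n_formants=2):
    """Approximate F1/F2 (Hz) of a segment from the roots of an LPC polynomial."""
    y, _ = librosa.load(path, sr=sr)
    y = librosa.effects.preemphasis(y)            # boost high frequencies
    a = librosa.lpc(y, order=order)               # LPC coefficients [1, a1, ..., ap]
    roots = [r for r in np.roots(a) if np.imag(r) >= 0]
    freqs = sorted(np.angle(roots) * sr / (2.0 * np.pi))
    freqs = [f for f in freqs if f > 90.0]        # drop near-DC roots
    return freqs[:n_formants]

# Hypothetical comparison across the three speech modes.
for name in ("neutral.wav", "speech_laugh.wav", "laughter.wav"):
    print(name, [round(f) for f in rough_formants(name)])
```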