2021
DOI: 10.1101/2021.09.01.458536
Preprint

A Generalizable Speech Emotion Recognition Model Reveals Depression and Remission

Abstract: Objective: Affective disorders have long been associated with atypical voice patterns; however, current work on automated voice analysis often suffers from small sample sizes and untested generalizability. This study investigated a generalizable approach to aid clinical evaluation of depression and remission from voice. Methods: A Mixture-of-Experts machine learning model was trained to infer happy/sad emotional state using three publicly available emotional speech corpora. We examined the model's predictive …


Cited by 8 publications (21 citation statements)
References 59 publications
“…The 5 SVM models were used to assess the test sets, and their predictions were combined into a single voting ensemble (Brownlee, 2020; Hansen et al, 2021; Sechidis et al, 2021). Each model made a prediction for each voice recording in the test set, and the ensemble model gave a final predicted class based on the majority of these model votes.…”
Section: Methods
confidence: 99%
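The statement above describes a majority-vote ensemble of five SVMs: each model predicts a class for every test recording, and the class receiving most votes wins. A minimal sketch of that scheme, assuming synthetic data and bootstrap resamples for training the five models (the cited papers' exact training splits are not given in this excerpt):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=200, n_features=10, random_state=0)
X_train, y_train = X[:150], y[:150]
X_test = X[150:]

# Train 5 SVMs, each on a bootstrap resample of the training set
# (resampling is an illustrative assumption, not the papers' method).
models = []
for _ in range(5):
    idx = rng.integers(0, len(X_train), len(X_train))
    models.append(SVC(kernel="rbf").fit(X_train[idx], y_train[idx]))

# Each model votes on each test recording; majority of the 5 votes wins.
votes = np.stack([m.predict(X_test) for m in models])  # shape (5, n_test)
majority = (votes.sum(axis=0) >= 3).astype(int)
```

With binary labels in {0, 1}, summing the five votes and thresholding at 3 is equivalent to taking the mode of the predictions.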
“…Each model made a prediction for each voice recording in the test set, and the ensemble model gave a final predicted class based on the majority of these model votes. Note that other systems beyond majority rules have been developed, e.g., Mixture of Experts with weights based on similarity between test and training data (Hansen et al, 2021; Sechidis et al, 2021). Combining or utilizing multiple models within a single model – such as an ensemble model – benefits performance and generalizability, since no two models are likely to overfit in the same way and different models can compensate for each other's biases (Buracas & Albright, 1993; Hong & Page, 2004; Tang et al, 2005).…”
Section: Methods
confidence: 99%
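The Mixture-of-Experts variant mentioned above replaces equal votes with weights based on how similar the test sample is to each expert's training data. A minimal sketch under stated assumptions: similarity is taken as negative Euclidean distance to each expert's training centroid, softmax-normalized — an illustrative choice, not the cited papers' exact weighting.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)

# Three "experts", each trained on its own data pool; keep each pool's
# centroid so we can later measure similarity to a test sample.
experts = []
for _ in range(3):
    X = rng.normal(loc=rng.normal(size=4), size=(60, 4))
    y = (X[:, 0] > X[:, 0].mean()).astype(int)
    clf = SVC(probability=True).fit(X, y)
    experts.append((clf, X.mean(axis=0)))

def moe_predict(x):
    """Weight each expert's class-1 probability by similarity of x to
    that expert's training centroid (softmax over negative distances)."""
    sims = np.array([-np.linalg.norm(x - c) for _, c in experts])
    w = np.exp(sims - sims.max())
    w /= w.sum()
    probs = np.array([clf.predict_proba(x[None])[0, 1] for clf, _ in experts])
    return int(w @ probs >= 0.5)

pred = moe_predict(rng.normal(size=4))
```

Experts whose training data lies closer to the test sample get larger weights, which is one way an ensemble can compensate for individual models' biases on unfamiliar inputs.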