Proceedings of the 18th ACM International Conference on Multimodal Interaction 2016
DOI: 10.1145/2993148.2993184

Multiscale kernel locally penalised discriminant analysis exemplified by emotion recognition in speech

Abstract: We propose a novel method to learn multiscale kernels with locally penalised discriminant analysis, namely Multiscale-Kernel Locally Penalised Discriminant Analysis (MS-KLPDA). As an exemplary use case, we apply it to recognise emotions in speech. Specifically, we employ locally penalised discriminant analysis by controlling the weights of marginal sample pairs, while the method learns kernels at multiple scales. Evaluated in a series of experiments on emotional speech corpora, our proposed MS-K…
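The abstract only outlines the approach, so the following is a rough, hypothetical sketch of the general idea rather than the authors' actual MS-KLPDA algorithm: a multiscale kernel can be illustrated as a weighted combination of RBF kernels at several bandwidths, fed into a plain kernel Fisher discriminant. The function names, the fixed per-scale weights, and the use of standard KFD (instead of the locally penalised weighting of marginal sample pairs described in the paper) are all assumptions made for illustration.

```python
import numpy as np

def multiscale_rbf_kernel(X, Y=None, scales=(0.5, 1.0, 2.0), weights=None):
    """Combine RBF kernels computed at several bandwidths (scales).

    Illustrative stand-in only: the per-scale weights are fixed here,
    whereas a multiscale kernel learning method would learn them.
    """
    Y = X if Y is None else Y
    if weights is None:
        weights = np.ones(len(scales)) / len(scales)
    # Pairwise squared Euclidean distances between rows of X and Y
    sq = (np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :]
          - 2.0 * X @ Y.T)
    K = np.zeros((X.shape[0], Y.shape[0]))
    for w, s in zip(weights, scales):
        K += w * np.exp(-sq / (2.0 * s**2))
    return K

def kernel_fisher_directions(K, y, reg=1e-3):
    """Regularised kernel Fisher discriminant on a precomputed kernel.

    Plain KFD stand-in; the locally penalised weighting of marginal
    sample pairs used by LPDA / MS-KLPDA is not reproduced here.
    """
    n = K.shape[0]
    M = np.zeros((n, n))   # between-class-like scatter in kernel space
    N = np.zeros((n, n))   # within-class-like scatter in kernel space
    m_all = K.mean(axis=1)
    for c in np.unique(y):
        idx = np.where(np.asarray(y) == c)[0]
        Kc = K[:, idx]
        m_c = Kc.mean(axis=1)
        d = (m_c - m_all)[:, None]
        M += len(idx) * (d @ d.T)
        centering = np.eye(len(idx)) - np.full((len(idx), len(idx)), 1.0 / len(idx))
        N += Kc @ centering @ Kc.T
    N += reg * np.eye(n)
    # Leading generalised eigenvectors give the kernel expansion coefficients
    evals, evecs = np.linalg.eig(np.linalg.solve(N, M))
    order = np.argsort(-evals.real)
    return evecs[:, order].real
```

As a usage sketch, one might compute K = multiscale_rbf_kernel(X_train) and then project training data with A = kernel_fisher_directions(K, y_train) before classification.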

Cited by 2 publications (3 citation statements)
References 21 publications
“…Specifically, the conventional linear subspace learning methods PCA, LPP [15], LDA, and LDE [16] were considered. The recently proposed linear subspace learning methods LDP [14], GbFA [17], and LPDA [8] were also used for comparison purposes. Furthermore, we tested three kernel subspace learning methods [10] using LDE, GbFA, and LPDA embedding graphs, respectively.…”
Section: RGSR vs. Conventional Subspace Learning Methods and Spectral Regression
confidence: 99%
“…It should be noted that we only consider the fully supervised case, where each training sample is labelled with a single emotional class. The embedding graphs (G^(I) and G^(P)) are obtained using pre-existing GE-based algorithms: FDA [6], [10], LDP [14], LDE [16], GbFA [17], and Locally Penalised Discriminant Analysis (LPDA; [8]). The description of the FDA and LDP embedding graphs can be found in [6], [10], [14] (also introduced in Section II-B).…”
Section: System Setup
confidence: 99%
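The snippet above refers to intrinsic and penalty embedding graphs G^(I) and G^(P) from the graph-embedding (GE) framework. As a hedged illustration (not the cited papers' exact construction), the FDA case of GE can be encoded as two adjacency matrices: an intrinsic graph connecting same-class samples and a uniform penalty graph over all samples; other GE algorithms such as LDE, GbFA, or LPDA define these graphs differently.

```python
import numpy as np

def fda_embedding_graphs(y):
    """Build FDA-style intrinsic (W_I) and penalty (W_P) adjacency matrices.

    Illustrative only: in the GE view of FDA, same-class pairs receive
    weight 1/n_c in the intrinsic graph, and every pair receives weight
    1/n in the penalty graph.
    """
    y = np.asarray(y)
    n = len(y)
    W_I = np.zeros((n, n))
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        W_I[np.ix_(idx, idx)] = 1.0 / len(idx)   # intra-class connections
    W_P = np.full((n, n), 1.0 / n)               # uniform penalty graph
    return W_I, W_P
```

The graph Laplacians L = D − W of these two matrices then define the criteria that GE-based subspace learning methods minimise and maximise, respectively.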