IberSPEECH 2018
DOI: 10.21437/iberspeech.2018-36

Bottleneck and Embedding Representation of Speech for DNN-based Language and Speaker Recognition

Abstract: In this manuscript, we summarize the findings presented in Alicia Lozano Diez's Ph.D. Thesis, defended on the 22nd of June, 2018 at Universidad Autonoma de Madrid (Spain). In particular, this Ph.D. Thesis explores different approaches to the tasks of language and speaker recognition, focusing on systems where deep neural networks (DNNs) become part of traditional pipelines, replacing some stages or the whole system itself. First, we present a DNN as classifier for the task of language recognition. Second, we a…
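As a rough illustration of the two roles mentioned in the abstract, the sketch below shows a frame-level DNN that can act either as a direct language classifier or as a bottleneck-feature extractor feeding a traditional pipeline. This is a minimal sketch, not code from the thesis: the PyTorch framing, the layer sizes, and the `return_bottleneck` switch are illustrative assumptions.

```python
# Hedged sketch: frame-level DNN with a narrow "bottleneck" layer.
# The softmax head gives direct per-frame language classification; the
# bottleneck activations can instead be taken as frame-level features
# for a conventional (e.g. i-vector) system. Sizes are illustrative.
import torch
import torch.nn as nn

class BottleneckDNN(nn.Module):
    def __init__(self, feat_dim=39, context=10, bottleneck_dim=80, num_langs=8):
        super().__init__()
        in_dim = feat_dim * (2 * context + 1)      # stacked acoustic frames
        self.hidden = nn.Sequential(
            nn.Linear(in_dim, 1024), nn.ReLU(),
            nn.Linear(1024, 1024), nn.ReLU(),
        )
        self.bottleneck = nn.Linear(1024, bottleneck_dim)   # narrow layer
        self.classifier = nn.Sequential(
            nn.ReLU(),
            nn.Linear(bottleneck_dim, num_langs),           # per-frame language logits
        )

    def forward(self, x, return_bottleneck=False):
        h = self.bottleneck(self.hidden(x))
        if return_bottleneck:
            return h                    # bottleneck features for a downstream pipeline
        return self.classifier(h)       # logits for direct DNN classification

frames = torch.randn(16, 39 * 21)       # batch of stacked frames
logits = BottleneckDNN()(frames)        # direct-classification path
```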

Cited by 1 publication (1 citation statement) · References 13 publications
“…Our main contribution is to perform this adaptation for the segment-level embedding extractor. The segment-level embedding extractor of a state-of-the-art language identification system is a neural network [11], which can be used for direct classification [12], or just to extract representations [4] which are then processed by a classifier. Different model-based domain adaptation methods have been introduced in the neural network training literature.…”
Section: Introduction (citation type: mentioning)
confidence: 99%
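The pattern the citing authors describe, one network usable either for direct classification or purely as a representation extractor whose output is scored by a separate classifier, can be sketched as follows. This is a hedged, x-vector-style approximation, not the exact systems of [11], [12], or [4]; the layer sizes, the class count, and the `return_embedding` flag are illustrative assumptions.

```python
# Hedged sketch: segment-level embedding extractor with statistics pooling.
# With return_embedding=False it classifies languages directly; with
# return_embedding=True it only emits the segment embedding, which a
# separate back-end classifier would then process.
import torch
import torch.nn as nn

class SegmentEmbeddingNet(nn.Module):
    def __init__(self, feat_dim=40, emb_dim=256, num_langs=14):
        super().__init__()
        self.frame_layers = nn.Sequential(      # frame-level (TDNN-like) layers
            nn.Conv1d(feat_dim, 512, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(512, 512, kernel_size=3, padding=1), nn.ReLU(),
        )
        self.embedding = nn.Linear(2 * 512, emb_dim)   # after mean+std pooling
        self.head = nn.Linear(emb_dim, num_langs)

    def forward(self, x, return_embedding=False):
        # x: (batch, feat_dim, num_frames)
        h = self.frame_layers(x)
        stats = torch.cat([h.mean(dim=2), h.std(dim=2)], dim=1)  # statistics pooling
        emb = self.embedding(stats)
        if return_embedding:
            return emb                  # representation for an external classifier
        return self.head(emb)           # direct classification

utts = torch.randn(4, 40, 300)          # 4 utterances, 300 frames each
embeddings = SegmentEmbeddingNet()(utts, return_embedding=True)
```

In the extraction mode, the returned embeddings correspond to the "extract representations which are then processed by a classifier" path in the citation above.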