2017
DOI: 10.1371/journal.pone.0182580

An analysis of the influence of deep neural network (DNN) topology in bottleneck feature based language recognition

Abstract: Language recognition systems based on bottleneck features have recently become the state-of-the-art in this research field, showing their success in the last Language Recognition Evaluation (LRE 2015) organized by NIST (U.S. National Institute of Standards and Technology). This type of system is based on a deep neural network (DNN) trained to discriminate between phonetic units, i.e. trained for the task of automatic speech recognition (ASR). This DNN aims to compress information in one of its layers, known as b…
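To make the bottleneck idea concrete, the following is a minimal sketch (written here in PyTorch, which is an assumption, not necessarily the toolkit used in the paper) of a feed-forward DNN trained on phonetic targets whose narrow middle layer is later read out as frame-level bottleneck features. All layer sizes, the bottleneck position, the amount of frame context, and the number of senone targets are illustrative placeholders; the paper's subject is precisely how such topology choices affect the resulting features.

```python
import torch
import torch.nn as nn

# Hypothetical dimensions, for illustration only: 39-dim acoustic features
# stacked with +/- 10 frames of context, 1024-unit hidden layers, a 64-unit
# bottleneck, and 3000 phonetic (senone) targets.
INPUT_DIM = 39 * 21
HIDDEN_DIM = 1024
BOTTLENECK_DIM = 64
NUM_SENONES = 3000

class BottleneckDNN(nn.Module):
    """Feed-forward DNN trained on ASR (phonetic) targets; the narrow
    bottleneck layer is later used as a frame-level feature extractor."""

    def __init__(self):
        super().__init__()
        self.front = nn.Sequential(
            nn.Linear(INPUT_DIM, HIDDEN_DIM), nn.ReLU(),
            nn.Linear(HIDDEN_DIM, HIDDEN_DIM), nn.ReLU(),
        )
        # Narrow layer that compresses the information needed for the task.
        self.bottleneck = nn.Linear(HIDDEN_DIM, BOTTLENECK_DIM)
        self.back = nn.Sequential(
            nn.ReLU(),
            nn.Linear(BOTTLENECK_DIM, HIDDEN_DIM), nn.ReLU(),
            nn.Linear(HIDDEN_DIM, NUM_SENONES),   # logits over phonetic units
        )

    def forward(self, x):
        return self.back(self.bottleneck(self.front(x)))

    def extract_bnf(self, x):
        # Bottleneck features (BNFs): activations of the narrow layer.
        return self.bottleneck(self.front(x))

model = BottleneckDNN()
frames = torch.randn(8, INPUT_DIM)        # a batch of 8 stacked frames
print(model(frames).shape)                # torch.Size([8, 3000])
print(model.extract_bnf(frames).shape)    # torch.Size([8, 64])
```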

Cited by 51 publications (29 citation statements)
References 13 publications
“…Finally the output layer O produces the output of the DNN for the target task (for the case of classification, the posterior probability of an input vector to belong to each of the C classes). Reprinted from [19] under a CC BY license, with permission from Alicia Lozano et al., original copyright 2017.…”
Section: Deep Neural Network and Speech Processing
confidence: 99%
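As a short illustration of the output layer O described in this quote, the sketch below (dimensions and class count are arbitrary assumptions) shows a final linear layer followed by a softmax, which turns the last hidden representation into posterior probabilities over the C classes.

```python
import torch
import torch.nn as nn

# Illustrative only: the output layer O maps the last hidden activation h to
# one logit per class; softmax converts the logits into class posteriors.
C = 5                               # assumed number of target classes
h = torch.randn(1, 32)              # assumed 32-dim last hidden activation
output_layer = nn.Linear(32, C)     # the output layer O
posteriors = torch.softmax(output_layer(h), dim=-1)
print(posteriors)                   # one probability per class
print(posteriors.sum().item())      # ~1.0, as posteriors sum to one
```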
“…to the weighted sum and sends the result out of the neuron. The parameters W_j (weight matrices) and b_j (bias vectors) control the behavior of the DNN models by being adjusted repeatedly until the cost function reaches its minimum value [11].…”
Section: Deep Neural Network (DNN)
confidence: 99%
“…Despite the success of BNFs for SID [13,14,15] and LID [16,17,18,19,20,21], the variable length of this frame-wise representation poses a challenge for subsequent modeling. The classical i-vector compacts the utterance representation into a fixed-length vector.…”
Section: Utterance Level Representation: DNN-based Embeddings
confidence: 99%
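The point of this quote is that BNFs are frame-wise, so every utterance produces a different number of vectors, while back-end classifiers usually expect one fixed-length vector per utterance. The sketch below is not the i-vector model itself (which relies on a factor-analysis front end) but a simpler mean/standard-deviation pooling used here only to illustrate collapsing T frames of D-dimensional BNFs into one fixed-length vector; shapes and data are invented.

```python
import numpy as np

def pool_utterance(bnf_frames: np.ndarray) -> np.ndarray:
    """bnf_frames: (T, D) frame-wise bottleneck features.
    Returns a single (2*D,) utterance vector: per-dimension mean and std."""
    return np.concatenate([bnf_frames.mean(axis=0), bnf_frames.std(axis=0)])

utt_a = np.random.randn(312, 64)   # 312 frames of 64-dim BNFs
utt_b = np.random.randn(987, 64)   # a longer utterance, same feature dim
print(pool_utterance(utt_a).shape, pool_utterance(utt_b).shape)  # (128,) (128,)
```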