2010
DOI: 10.1109/lsp.2009.2033967
Automatic Optimization of Speech Decoder Parameters

Cited by 13 publications (14 citation statements)
References 14 publications
“…As grid search for such parameters is very costly, we have made use of gradient-based optimization [32] to find the best possible configurations. As we found a clear dependence of such parameters on the acoustic and language models used, optimal operating curves were generated for each acoustic model/language model combination.…”
Section: Decoding
confidence: 99%
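As an illustration of the gradient-based tuning described in the statement above, here is a minimal sketch using finite differences on a word-error-rate objective. The function name `evaluate_wer`, the step sizes, and the iteration count are assumptions for illustration, not details taken from the cited work.

```python
import numpy as np

def finite_difference_gradient(evaluate_wer, params, eps=1e-2):
    """Approximate the gradient of WER w.r.t. each decoder parameter.

    `evaluate_wer` is a hypothetical stand-in for decoding a dev set with
    the given parameters (e.g. LM weight, word-insertion penalty) and
    measuring word error rate.
    """
    grad = np.zeros_like(params)
    base = evaluate_wer(params)
    for i in range(len(params)):
        bumped = params.copy()
        bumped[i] += eps
        grad[i] = (evaluate_wer(bumped) - base) / eps
    return grad

def tune(evaluate_wer, params, lr=0.5, iters=20):
    """Plain gradient descent on the (noisy) WER objective."""
    for _ in range(iters):
        params = params - lr * finite_difference_gradient(evaluate_wer, params)
    return params
```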
“…The proposed method significantly reduces computational costs compared to [2], and the reduction is even greater compared to grid search. In contrast to [3] and [4], Simplified SPSA takes into account the real-time factor, which is of vital importance for the design of an ASR system.…”
Section: Simplified Simultaneous Perturbation Stochastic Approximation
confidence: 99%
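For reference, a single SPSA-style update looks like the sketch below: all parameters are perturbed simultaneously with a random ±1 direction, so the gradient estimate needs only two decoder runs per iteration regardless of dimensionality. The gain `a` and perturbation size `c` are illustrative assumptions; how the cited work folds in the real-time factor is not shown here.

```python
import numpy as np

def spsa_step(loss, theta, a=0.1, c=0.05, rng=np.random.default_rng()):
    """One SPSA iteration: two loss evaluations give a full gradient estimate."""
    delta = rng.choice([-1.0, 1.0], size=theta.shape)        # Rademacher perturbation
    g_hat = (loss(theta + c * delta) - loss(theta - c * delta)) / (2 * c * delta)
    return theta - a * g_hat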
“…If the perturbed estimate causes a deterioration of the objective value, the optimal solution must stay at the current estimate, and at the next iteration the estimation of the loss function is obtained with a new perturbation according to (2). Without an appropriate step size, the optimal solution will stay at the same point forever, which significantly slows down the rate of convergence of the algorithm [8].…”
Section: Simplified SPSA
confidence: 99%
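The statement above describes keeping the current estimate when a candidate update worsens the objective and retrying with a new perturbation. A hedged sketch of that behaviour follows; shrinking the gain `a` after a rejection is an assumed heuristic to avoid the stagnation the passage warns about, not a rule taken from the cited paper.

```python
import numpy as np

def guarded_spsa(loss, theta, a=0.1, c=0.05, iters=50, shrink=0.5,
                 rng=np.random.default_rng()):
    """SPSA with rejection: keep the old estimate when a step hurts the loss."""
    best = loss(theta)
    for _ in range(iters):
        delta = rng.choice([-1.0, 1.0], size=theta.shape)
        g_hat = (loss(theta + c * delta) - loss(theta - c * delta)) / (2 * c * delta)
        candidate = theta - a * g_hat
        cand_loss = loss(candidate)
        if cand_loss < best:       # accept the step
            theta, best = candidate, cand_loss
        else:                      # reject: stay at the current estimate, try a smaller step
            a *= shrink
    return theta
```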
“…For time-constrained optimization, we present our own loss function strategies that are real-time factor (RTF) aware and compare the results to the Gradient Descent [4] method.…”
Section: Introduction
confidence: 99%
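As a minimal example of what an RTF-aware objective can look like, the sketch below combines word error rate with a hinge penalty on the real-time factor above a budget. The hinge form, the budget of 1.0, and the penalty weight are illustrative assumptions, not the loss strategies proposed in the citing work.

```python
def rtf_aware_loss(wer, rtf, rtf_budget=1.0, penalty=10.0):
    """Combine word error rate with a hinge penalty on the real-time factor.

    Configurations that decode slower than `rtf_budget` are penalized in
    proportion to how far they exceed it; faster configurations pay nothing.
    """
    return wer + penalty * max(0.0, rtf - rtf_budget)
```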