2016 IEEE Spoken Language Technology Workshop (SLT)
DOI: 10.1109/slt.2016.7846303
Automated optimization of decoder hyper-parameters for online LVCSR

Cited by 5 publications (3 citation statements: 0 supporting, 3 mentioning, 0 contrasting). References 9 publications.
“…As a result, optimization using both algorithm performance and hardware cost should be considered, especially for edge devices. Hardware-related costs can be measured in different ways; e.g., through energy consumption (Hernández-Lobato et al. 2016) or memory utilization (Chandra and Lane 2016). In many cases, these measures are estimated as a function of the hyperparameters.…”
Section: Multi-objective HPO: Typical Objectives
confidence: 99%
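
To make the hardware-aware objective concrete, here is a minimal sketch of scalarized multi-objective hyper-parameter search in which the hardware cost is modeled as a function of the hyper-parameters, as the passage describes. The search space, the WER surrogate, and the memory model are all hypothetical stand-ins, not values from the cited works.

```python
import random

# Hypothetical search space for an edge-deployed decoder; parameter
# names and ranges are illustrative assumptions.
SPACE = {"beam": (8.0, 16.0), "max_active": (2000, 10000)}

def word_error_rate(cfg):
    """Stand-in for an expensive evaluation run; wider beams help accuracy."""
    return 0.20 - 0.004 * cfg["beam"] + 2e-6 * (10000 - cfg["max_active"])

def memory_cost(cfg):
    """Hardware cost estimated as a function of the hyperparameters,
    as the quoted passage describes (here: proportional to the number
    of active decoding states)."""
    return cfg["max_active"] / 10000.0

def sample():
    return {name: random.uniform(lo, hi) for name, (lo, hi) in SPACE.items()}

# Weighted scalarization of the two objectives; a Pareto-based method
# would retain the whole front instead of a single trade-off point.
ALPHA = 0.05
best = min((sample() for _ in range(200)),
           key=lambda c: word_error_rate(c) + ALPHA * memory_cost(c))
print(best)
```

The weight ALPHA sets the accuracy/hardware trade-off; sweeping it recovers several points on the Pareto front.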
“…However, it can be inferred that this comparison is not made for an equal budget of candidate-solution evaluations and is therefore difficult to generalize from. Chandrashekaran et al. [16] report the number of candidate evaluations in a comparative optimization of a speech recognition model. However, the use of a very small population size again limits the generalizability of the conclusions.…”
Section: Applications of Bayesian Optimization
confidence: 99%
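
The fairness concern above is that optimizers should be compared under an identical candidate-evaluation budget. A minimal sketch of such a budget-matched harness, with a toy objective and two illustrative optimizers standing in for the real comparison:

```python
import random

BUDGET = 50  # identical number of candidate evaluations for every optimizer

def objective(x):
    """Toy stand-in for an expensive model evaluation on [0, 1]."""
    return (x - 0.3) ** 2

def random_search(budget):
    return min(objective(random.random()) for _ in range(budget))

def local_search(budget, step=0.25):
    """Simple stochastic hill-climbing, capped at the same budget."""
    x = 0.5
    fx = objective(x)
    for _ in range(budget - 1):
        cand = min(max(x + random.choice((-step, step)), 0.0), 1.0)
        fc = objective(cand)
        if fc < fx:
            x, fx = cand, fc
        else:
            step *= 0.9  # shrink the step on failure
    return fx

print("random search:", random_search(BUDGET))
print("local search :", local_search(BUDGET))
```

Because both methods consume exactly BUDGET evaluations, any difference in the returned optima reflects the algorithms rather than unequal search effort.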
“…For the decoder-CBO experiments (CBO-D), we used the manually selected model hyper-parameter configuration and ran 20 iterations of constrained Bayesian optimization. The choice of 20 rounds of decoder hyper-parameter optimization follows the methodology of [18].…”
Section: Optimization Pipeline
confidence: 99%
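
Below is a minimal sketch of the kind of constrained Bayesian optimization loop the passage describes, run for 20 rounds: expected improvement is weighted by the modeled probability that a constraint is satisfied. The 1-D beam space, the WER and real-time-factor (RTF) surrogates, and the RTF limit are illustrative assumptions, not taken from [18] or the citing paper.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor

# Hypothetical stand-ins for real decoding runs: minimize WER subject
# to a real-time-factor constraint RTF <= 1.0.
def wer(beam):
    return 0.20 - 0.006 * beam + 0.0002 * beam ** 2

def rtf(beam):
    return 0.3 + 0.05 * beam  # feasible only while RTF <= 1.0

rng = np.random.default_rng(0)
X = rng.uniform(5.0, 20.0, size=(3, 1))            # small initial design
y_obj = np.array([wer(b) for b in X[:, 0]])
y_con = np.array([rtf(b) for b in X[:, 0]])
grid = np.linspace(5.0, 20.0, 200).reshape(-1, 1)  # candidate beams

for _ in range(20):  # 20 rounds, matching the quoted setup
    # Independent GP surrogates for the objective and the constraint;
    # alpha adds jitter for numerical stability.
    gp_obj = GaussianProcessRegressor(alpha=1e-6, normalize_y=True).fit(X, y_obj)
    gp_con = GaussianProcessRegressor(alpha=1e-6, normalize_y=True).fit(X, y_con)
    mu, sd = gp_obj.predict(grid, return_std=True)
    mu_c, sd_c = gp_con.predict(grid, return_std=True)

    feas = y_con <= 1.0
    best = y_obj[feas].min() if feas.any() else y_obj.min()

    # Expected improvement weighted by the probability of feasibility.
    z = (best - mu) / np.maximum(sd, 1e-9)
    ei = (best - mu) * norm.cdf(z) + sd * norm.pdf(z)
    p_feas = norm.cdf((1.0 - mu_c) / np.maximum(sd_c, 1e-9))
    x_next = grid[np.argmax(ei * p_feas)]

    X = np.vstack([X, x_next])
    y_obj = np.append(y_obj, wer(x_next[0]))
    y_con = np.append(y_con, rtf(x_next[0]))

feas = y_con <= 1.0
print("best feasible beam:", float(X[feas][np.argmin(y_obj[feas])][0]))
```

Weighting expected improvement by the probability of feasibility is one standard constrained-BO acquisition; other formulations (e.g., augmented Lagrangians) would slot into the same 20-round loop.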