2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp.2014.6854535

Efficient lattice rescoring using recurrent neural network language models

Abstract: Recurrent neural network language models (RNNLM) have become an increasingly popular choice for state-of-the-art speech recognition systems due to their inherently strong generalization performance. As these models use a vector representation of complete history contexts, RNNLMs are normally used to rescore N-best lists. Motivated by their intrinsic characteristics, two novel lattice rescoring methods for RNNLMs are investigated in this paper. The first uses an n-gram style clustering of history contexts. The …
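
As a rough illustration of the first rescoring method named in the abstract, the Python sketch below clusters RNNLM history contexts by their last n-1 words, so lattice paths that share that suffix reuse a single cached evaluation. This is a minimal sketch under assumed interfaces: toy_rnnlm_logprob is a deterministic stand-in for a real RNNLM forward pass, not the paper's implementation.

# Minimal sketch of n-gram style history clustering (illustrative only).
def toy_rnnlm_logprob(history, word):
    # Stand-in for running the recurrent network over the full history.
    return -(abs(hash((history, word))) % 7) - 1.0

def make_clustered_scorer(n):
    # Histories that agree in their last n-1 words fall in one cluster
    # and share one cached RNNLM evaluation (assumes n >= 2).
    cache = {}
    def score(history, word):
        key = (tuple(history)[-(n - 1):], word)
        if key not in cache:
            cache[key] = toy_rnnlm_logprob(key[0], word)
        return cache[key]
    return score

score = make_clustered_scorer(n=3)
p1 = score(("the", "cat", "sat", "on"), "the")
p2 = score(("a", "dog", "sat", "on"), "the")  # same last 2 words: cache hit
assert p1 == p2

The cache is what keeps the expanded lattice compact: the number of distinct RNNLM evaluations grows with the number of (n-1)-word contexts rather than with the number of full paths through the lattice.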

Cited by 74 publications (67 citation statements). References 19 publications.
“…The hybrid input representation can considerably save both training and decoding time while still achieving slightly better recognition accuracy. In the future, we plan to apply more complex lattice re-scoring algorithms, such as the one described in [20], to the hybrid RNNLM to further improve recognition performance. A cache-based RNNLM which can be used straightaway in the first-pass decoding [21] will also be considered.…”
Section: Discussion
confidence: 99%
“…As RNNLMs use a vector representation of full histories, they are mostly used for N-best list rescoring. For more efficient lattice rescoring using RNNLMs, approximation schemes, for example based on clustering among complete histories [14], can be used.…”
Section: Recurrent Neural Network LMs
confidence: 99%
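
To make the N-best rescoring setup mentioned in the statement above concrete, here is a minimal sketch that re-ranks decoder hypotheses with a linear interpolation of n-gram and RNNLM probabilities; the interpolation weight lam, the lm_scale factor, and the placeholder score functions are illustrative assumptions, not details taken from the cited papers.

import math

def interpolate(lp_rnn, lp_ng, lam=0.5):
    # Linear interpolation of the two LMs' probabilities (common practice).
    return math.log(lam * math.exp(lp_rnn) + (1.0 - lam) * math.exp(lp_ng))

def rescore_nbest(nbest, rnnlm_lp, ngram_lp, lam=0.5, lm_scale=12.0):
    # nbest: list of (word_list, acoustic_logprob) pairs from the decoder.
    def total(hyp):
        words, acoustic = hyp
        lm = sum(interpolate(rnnlm_lp(words[:i], w),
                             ngram_lp(words[:i], w), lam)
                 for i, w in enumerate(words))
        return acoustic + lm_scale * lm
    return sorted(nbest, key=total, reverse=True)

# Toy usage with placeholder LM scores (assumptions, not real models):
hyps = [(["the", "cat"], -10.0), (["a", "cat"], -9.5)]
best = rescore_nbest(hyps, lambda h, w: -1.0, lambda h, w: -1.5)[0]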
“…In recent years, the use of RNNLMs has shown significant improvements over traditional n-gram models (Sundermeyer et al., 2013). Mikolov et al. (2010) and Liu et al. (2014) have shown significant improvements in speech recognition accuracy using RNNLMs. Shi (2012) also showed the benefits of using RNNLMs with contextual and linguistic features.…”
Section: Related Work
confidence: 99%
“…We have integrated the approach of re-scoring N-best output using an RNNLM, which has also been shown to be helpful by Liu et al. (2014). Shi (2012) also showed the benefits of using RNNLMs with contextual and linguistic features.…”
Section: Re-ranking Experiments
confidence: 99%