2012 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp.2012.6289044
Performance analysis of Neural Networks in combination with n-gram language models

Abstract: Neural Network language models (NNLMs) have recently become an important complement to conventional n-gram language models (LMs) in speech-to-text systems. However, little is known about the behavior of NNLMs. The analysis presented in this paper aims to understand which types of events are better modeled by NNLMs as compared to n-gram LMs, in what cases improvements are most substantial and why this is the case. Such an analysis is important to take further benefit from NNLMs used in combination with conventi…

Cited by 22 publications (13 citation statements)
References 14 publications
“…A back-off to lower order distributions is still required. Techniques that can represent n-gram probabilities in a continuous space, such as NNLMs, can alleviate this problem [23,20]. Hence, in order to leverage the strengths of both models, the combination between paraphrastic LMs and NNLMs is investigated in this paper.…”
Section: Introduction
confidence: 99%
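The back-off mentioned in the statement above can be illustrated with a minimal sketch. The toy counts and the fixed "stupid backoff"-style discount below are my assumptions for illustration, not the exact smoothing scheme of the cited papers:

```python
# Minimal back-off sketch: use the bigram relative frequency when the
# bigram was seen, otherwise fall back to a discounted unigram estimate.
from collections import Counter

tokens = "the cat sat on the mat".split()
unigrams = Counter(tokens)
bigrams = Counter(zip(tokens, tokens[1:]))
total = len(tokens)

def score(prev, word, alpha=0.4):
    """Bigram score, backing off to alpha-discounted unigram frequency."""
    if bigrams[(prev, word)] > 0:
        return bigrams[(prev, word)] / unigrams[prev]
    return alpha * unigrams[word] / total
```

A seen pair such as ("the", "cat") scores 1/2 here, while an unseen pair like ("cat", "mat") backs off to 0.4 times the unigram frequency of "mat". Continuous-space models such as NNLMs avoid this hard back-off by sharing parameters across similar histories.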
“…A fair comparison is however possible on smaller setups. The reader may find a discussion of which types of events NNLMs model better than n-gram LMs, and vice versa, in [25].…”
Section: NNLM Configurations
confidence: 99%
“…n-gram LMs have been the dominant language models during the last several decades [25], [26]. Uni-RNNLMs were shown to present different and complementary modeling ability to n-gram LMs [27], [28]. Improved performance can be obtained by interpolating n-gram and uni-RNNLMs [27].…”
Section: A Language Model Interpolation
confidence: 99%
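The interpolation of n-gram and NNLM/RNNLM scores referenced in the statement above is conventionally a linear mixture of the two models' word probabilities. A minimal sketch, with toy per-word probabilities and a mixture weight that are my assumptions, not numbers from the paper:

```python
import math

def interpolate(p_ngram, p_nnlm, lam=0.5):
    """Linearly mix two language-model probabilities for the same word:
    P(w|h) = lam * P_ngram(w|h) + (1 - lam) * P_nnlm(w|h)."""
    return lam * p_ngram + (1.0 - lam) * p_nnlm

# Hypothetical per-word probabilities over a three-word test sentence.
ngram_probs = [0.20, 0.05, 0.10]
nnlm_probs = [0.15, 0.12, 0.08]

mixed = [interpolate(p, q, lam=0.6) for p, q in zip(ngram_probs, nnlm_probs)]

# Perplexity of the mixture over the toy sentence (lower is better).
ppl = math.exp(-sum(math.log(p) for p in mixed) / len(mixed))
print(mixed, round(ppl, 2))
```

The mixture weight lam is typically tuned on held-out data; the complementary errors of the two model families are what make the interpolated perplexity lower than either model alone.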