2013
DOI: 10.1016/j.csl.2012.06.004

Use of contexts in language model interpolation and adaptation

Cited by 30 publications (28 citation statements)
References 36 publications
“…These NNLMs are interpolated with the comparable back-off n-gram LMs based on their respective modelling unit, before the two interpolated LMs are finally log-linearly combined. The resulting multi-level NNLM can be used to further improve discrimination [15,16,17]. This requires word-level lattices to be first converted to phrase-level lattices before the log-linear combination is performed.…”
Section: Paraphrastic Feedforward NNLMs
confidence: 99%
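The multi-level combination this excerpt describes (linear interpolation within each modelling unit, then log-linear combination across units) can be sketched compactly. The sketch below is illustrative only: the function names, weights, and probabilities are hypothetical, and the log-linear score is left unnormalized, since normalization is typically folded into lattice rescoring.

```python
import math

def interpolate(p_nnlm, p_ngram, lam):
    # Linear interpolation of an NNLM with its comparable back-off
    # n-gram LM; lam is the weight placed on the NNLM.
    return lam * p_nnlm + (1.0 - lam) * p_ngram

def log_linear_combine(p_word, p_phrase, alpha):
    # Log-linear combination of the word-level and phrase-level
    # interpolated LM scores; returns an unnormalized log score.
    return alpha * math.log(p_word) + (1.0 - alpha) * math.log(p_phrase)

# Hypothetical probabilities for one hypothesis, scored after the
# word-level lattice has been converted to a phrase-level lattice:
p_word = interpolate(p_nnlm=0.020, p_ngram=0.010, lam=0.5)
p_phrase = interpolate(p_nnlm=0.004, p_ngram=0.006, lam=0.5)
score = log_linear_combine(p_word, p_phrase, alpha=0.5)
```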
“…For example, if an NNLM performs better for events with a particular history count c(h), we could use it to implement context-dependent interpolation weights λ(h) instead of a single λ for each LM, similar to the approach proposed in [11] for combining multiple n-gram models built from different sources. However, most cases observed so far in which an NNLM does better or worse than an n-gram model are conditioned on both the predicted word and its n-gram history.…”
Section: N(h) < θ
confidence: 99%
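The context-dependent weighting this excerpt proposes replaces a single interpolation weight λ with a function λ(h) of the history. A minimal sketch, assuming the weight is chosen by bucketing the raw history count c(h); the bucket thresholds and weights here are hypothetical, and in practice λ(h) would be estimated on held-out data.

```python
def history_bucket(count, thresholds=(1, 5, 20)):
    # Map a raw history count c(h) to a small bucket index.
    for i, t in enumerate(thresholds):
        if count < t:
            return i
    return len(thresholds)

def interpolate_context_dependent(p_nnlm, p_ngram, count, lambdas):
    # Choose the interpolation weight lambda(h) from the bucket of the
    # history count instead of using one global lambda per LM.
    lam = lambdas[history_bucket(count)]
    return lam * p_nnlm + (1.0 - lam) * p_ngram

# Hypothetical weights: lean on the NNLM for rarely observed histories,
# where the back-off n-gram model is least reliable.
lambdas = [0.8, 0.6, 0.5, 0.4]
p = interpolate_context_dependent(p_nnlm=0.03, p_ngram=0.01,
                                  count=2, lambdas=lambdas)
```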
“…All the lattice rescoring experiments in this paper used an on-the-fly lattice expansion algorithm [12] suitable for a wide range of language models, including back-off n-grams, feedforward NNLMs, recurrent NNLMs and their interpolated forms [11]. A central part of the algorithm is the LM state representation required for the underlying model.…”
Section: Lattice Rescoring Using RNNLMs
confidence: 99%
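The LM state representation this excerpt refers to determines when two lattice paths may share a node during on-the-fly expansion. A minimal sketch of the idea, with hypothetical class names: an n-gram-style model only needs the truncated (n-1)-word context, while a recurrent NNLM's state depends on the entire history, so far fewer paths can be merged.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass(frozen=True)
class NGramState:
    # State for a back-off n-gram or feedforward NNLM: the truncated
    # (n-1)-word context suffices to score any continuation.
    context: Tuple[str, ...]

    def advance(self, word: str, order: int = 3) -> "NGramState":
        return NGramState((self.context + (word,))[-(order - 1):])

@dataclass(frozen=True)
class RNNState:
    # State for a recurrent NNLM: the full word history (equivalently,
    # the hidden vector it induces) determines future probabilities.
    history: Tuple[str, ...]

    def advance(self, word: str) -> "RNNState":
        return RNNState(self.history + (word,))

# During on-the-fly expansion a node is split when incoming paths carry
# different LM states, and shared when the states agree:
s1 = NGramState(("a", "the")).advance("cat")
s2 = NGramState(("saw", "the")).advance("cat")
assert s1 == s2   # same trigram context -> node can be shared

r1 = RNNState(("a", "the")).advance("cat")
r2 = RNNState(("saw", "the")).advance("cat")
assert r1 != r2   # full histories differ -> nodes stay split
```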