On Combining Language Models: Oracle Approach
2001
DOI: 10.21236/ada460991
Abstract: In this paper, we address the problem of combining several language models (LMs). We find that simple interpolation methods, like log-linear and linear interpolation, improve the performance but fall short of the performance of an oracle. The oracle knows the reference word string and selects the word string with the best performance (typically, word or semantic error rate) from a list of word strings, where each word string has been obtained by using a different LM. Actually, the oracle acts like a dynamic co…
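The oracle selection described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes word error rate (WER) as the selection metric, and the function names (`word_error_rate`, `oracle_select`) are hypothetical.

```python
def word_error_rate(reference, hypothesis):
    """WER via word-level Levenshtein distance, normalized by reference length."""
    r, h = reference.split(), hypothesis.split()
    # d[i][j] = edit distance between first i reference words and first j hypothesis words
    d = [[0] * (len(h) + 1) for _ in range(len(r) + 1)]
    for i in range(len(r) + 1):
        d[i][0] = i
    for j in range(len(h) + 1):
        d[0][j] = j
    for i in range(1, len(r) + 1):
        for j in range(1, len(h) + 1):
            sub = 0 if r[i - 1] == h[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + sub)  # substitution / match
    return d[len(r)][len(h)] / max(len(r), 1)

def oracle_select(reference, hypotheses):
    """The oracle knows the reference and picks the hypothesis with the lowest WER.

    Each hypothesis would come from decoding with a different LM.
    """
    return min(hypotheses, key=lambda hyp: word_error_rate(reference, hyp))
```

Because the oracle consults the reference, its error rate lower-bounds what any combination scheme over the same hypothesis lists could achieve.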

Cited by 3 publications (3 citation statements)
References 6 publications (7 reference statements)
“…More recently, we have developed a dialog context dependent language model (LM) combining stochastic context free grammars (SCFGs) and n-grams [6,7]. Based on a spoken language production model in which a user picks a set of concepts with respective values and constructs word sequences using phrase generators associated with each concept in accordance with the dialog context, this LM computes the probability of a word, P(W), as…”
Section: Figure 2 Examples Of Class-based and Grammar-based Language Modeling (mentioning)
confidence: 99%
“…We have also confirmed, as others have done [18,19,20,21], that semantic knowledge extracted by a parser can be applied to rescore N-best word hypotheses from the speech recognizer to improve both WER and overall end-to-end performance.…”
Section: Results (supporting)
confidence: 76%
“…First, it is easier to implement, since the actual implementation deals with the logarithm of probabilities. Second, its performance has been found to be better than that of linear interpolation [11,12]. …”
Section: [Date] [Yes]) Each Concept (Except Degenerate Single Word… (mentioning)
confidence: 99%
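The two interpolation schemes contrasted in the citation above can be sketched directly. This is an illustrative fragment, assuming per-model probabilities for a single word and interpolation weights that sum to one; the function names are hypothetical, and the log-linear score is left unnormalized (in practice it must be renormalized over the vocabulary).

```python
import math

def linear_interp(probs, weights):
    """Linear interpolation: P(w) = sum_i lambda_i * P_i(w)."""
    return sum(lam * p for lam, p in zip(weights, probs))

def log_linear_interp(probs, weights):
    """Log-linear interpolation: log P(w) proportional to sum_i lambda_i * log P_i(w).

    Returned value is an unnormalized score; a true distribution requires
    dividing by the sum of these scores over the whole vocabulary.
    """
    return math.exp(sum(lam * math.log(p) for lam, p in zip(weights, probs)))
```

For example, with two LMs assigning probabilities 0.2 and 0.4 and equal weights, linear interpolation gives 0.3 (the arithmetic mean), while the log-linear score is the geometric mean, sqrt(0.2 * 0.4) ≈ 0.283. Working in log space is what makes the log-linear form easy to implement on top of log-probability outputs.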