[Proceedings] ICASSP 91: 1991 International Conference on Acoustics, Speech, and Signal Processing
DOI: 10.1109/icassp.1991.150439
Integration of speech recognition and natural language processing in the MIT VOYAGER system

Cited by 26 publications (16 citation statements)
References 8 publications
“…How these constraints are incorporated varies from estimating n-gram probabilities from grammar-generated data [70] to computing a linear interpolation of the two models [43]. Most recently, syntactic information has been used specifically to determine equivalence classes on the n-gram history, resulting in so-called dependency language models [19], [56], sometimes also referred to as structured language models [20], [42], [66].…”
Section: B. Syntactically Driven Span Extension (mentioning)
Confidence: 99%
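The linear-interpolation scheme this quote attributes to [43] combines an n-gram estimate with a grammar-derived estimate through a convex weight. A minimal sketch in Python; the probability tables, the weight LAMBDA, and the floor value are toy illustrations, not values from the cited papers:

```python
# Hedged sketch: linear interpolation of an n-gram model with a
# grammar-derived model. All tables and LAMBDA are invented toys.

LAMBDA = 0.7  # weight on the n-gram model; tuned on held-out data in practice

# Toy conditional probabilities P(word | previous word).
P_NGRAM = {("show", "me"): 0.4, ("tell", "me"): 0.3}
P_GRAMMAR = {("show", "me"): 0.2, ("tell", "me"): 0.5}

def p_interpolated(word: str, prev: str) -> float:
    """P(word | prev) as a convex combination of the two component models."""
    p_n = P_NGRAM.get((word, prev), 1e-6)   # small floor avoids zero probabilities
    p_g = P_GRAMMAR.get((word, prev), 1e-6)
    return LAMBDA * p_n + (1.0 - LAMBDA) * p_g

if __name__ == "__main__":
    print(p_interpolated("show", "me"))  # 0.7*0.4 + 0.3*0.2 = 0.34
```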
“…The first attempts are based on using context-free grammars (CFGs) [27,137,71]. The main line of work on structured LMs began with Chelba et al. [24], in which a dependency-grammar framework with maximum-entropy models is used to constrain word prediction by the linguistically related words in the past.…”
Section: Structured LMs (mentioning)
Confidence: 99%
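The structured-LM idea in this quote conditions the next word on linguistically related earlier words (exposed syntactic heads) rather than on the last n-1 words. A rough sketch under that reading; the head-extraction stub and probability table below are hypothetical stand-ins, not Chelba et al.'s actual maximum-entropy model:

```python
# Rough sketch of a dependency/structured LM: condition the next word
# on exposed head words instead of the immediately preceding words.
# The parser stub and probability table are invented for illustration.

from typing import List, Tuple

P_GIVEN_HEADS = {  # toy P(word | two exposed heads)
    (("book", "flight"), "to"): 0.5,
}

def exposed_heads(prefix: List[str]) -> Tuple[str, ...]:
    """Stand-in for a real incremental parser: crudely treat the last
    two non-function words as the exposed heads."""
    content = [w for w in prefix if w not in {"a", "the", "me", "please"}]
    return tuple(content[-2:])

def p_next(prefix: List[str], word: str, floor: float = 1e-6) -> float:
    heads = exposed_heads(prefix)
    return P_GIVEN_HEADS.get((heads, word), floor)

print(p_next(["book", "me", "a", "flight"], "to"))  # conditions on ('book', 'flight')
```

The point of the class mapping is that "book me a flight" and "book a cheap morning flight" share the same conditioning context, which a fixed n-gram history would not give.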
“…In [137], LMs are trained on syntactically plausible sentences generated by a natural-language component. Jurafsky et al. [71] use stochastic CFGs (SCFGs) to extend the training corpus and interpolate SCFG probabilities with bigram probabilities.…”
Section: Language Model (mentioning)
Confidence: 99%
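The grammar-generated-data approach mentioned here ([137], [70]) trains an n-gram model on sentences sampled from a grammar. A self-contained sketch; the toy CFG and sampler are invented for illustration and do not reproduce the cited systems:

```python
# Sketch: sample sentences from a small CFG, then estimate bigram
# counts from the synthetic corpus. Grammar and sampler are toys.

import random
from collections import Counter

GRAMMAR = {  # hypothetical CFG: nonterminal -> list of expansions
    "S": [["show", "me", "NP"], ["how", "do", "i", "get", "to", "NP"]],
    "NP": [["the", "airport"], ["harvard", "square"]],
}

def sample(symbol: str = "S") -> list:
    if symbol not in GRAMMAR:          # terminal word
        return [symbol]
    expansion = random.choice(GRAMMAR[symbol])
    return [w for sym in expansion for w in sample(sym)]

# Build a synthetic corpus and count bigrams for n-gram estimation.
bigrams = Counter()
for _ in range(1000):
    sent = ["<s>"] + sample() + ["</s>"]
    bigrams.update(zip(sent, sent[1:]))

print(bigrams.most_common(5))
```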
“…In addition, the use of combined knowledge sources can improve computational efficiency by applying language constraints to predict possible next words, thus achieving significant pruning of the recognition search space. In this paper, we report primarily on experiments that change the shape of the search space; this work has been done on our loosely coupled system using the N-best interface [11]. However, we also report briefly on the status of our experiments on the tight coupling of speech recognition and language understanding.…”
Section: Introduction (mentioning)
Confidence: 99%
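The N-best interface named in this quote couples the recognizer and the natural-language component loosely: the recognizer emits its N highest-scoring hypotheses, and the NL component rescores or filters them. A minimal sketch; the scores and the nl_score stub are invented and merely stand in for a real parser such as VOYAGER's:

```python
# Minimal sketch of N-best rescoring: the recognizer proposes N
# hypotheses with acoustic scores, and a natural-language component
# rescores them; hypotheses it cannot parse are pruned.

def nl_score(hypothesis: str) -> float:
    """Stand-in for the NL component: return a log-probability-like
    score, or -inf for word strings it cannot parse."""
    parsable = {"show me the nearest bank", "show me the nearest tank"}
    return 0.0 if hypothesis in parsable else float("-inf")

# (acoustic log score, hypothesis) pairs from the recognizer, best first.
n_best = [
    (-12.3, "show me the near rest bank"),
    (-12.9, "show me the nearest bank"),
    (-13.4, "show me the nearest tank"),
]

# Combine scores and pick the best hypothesis the NL component accepts.
rescored = [(acoustic + nl_score(hyp), hyp) for acoustic, hyp in n_best]
best_score, best_hyp = max(rescored)
print(best_hyp)  # -> "show me the nearest bank"
```

Tight coupling, by contrast, would apply the language constraints inside the recognizer's search rather than after it, which is the pruning effect the first sentence of the quote describes.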