1996
DOI: 10.1006/csla.1996.0014
Stochastic automata for language modeling

Cited by 73 publications (53 citation statements)
References 22 publications (18 reference statements)
“…This problem could be solved by setting N to a larger value, but this would increase computation cost substantially. Another approach is to use probabilistic finite-state grammars for the ASR language models [4]. Although this would improve ASR accuracy, some utterances might be misrecognized due to their structural constraints.…”
Section: Concept
confidence: 99%
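The excerpt above contrasts large-N n-gram models with probabilistic finite-state grammars, which assign probabilities only to sentences the grammar accepts. A minimal sketch of that idea follows; the toy grammar, its states, and its probabilities are illustrative assumptions, not content from the cited work.

```python
# Toy probabilistic finite-state grammar: transitions[(state, word)]
# gives (next_state, probability). Grammar and weights are assumed.
transitions = {
    ("S", "show"): ("V", 0.6),
    ("S", "list"): ("V", 0.4),
    ("V", "flights"): ("N", 1.0),
    ("N", "</s>"): ("F", 1.0),
}

def sentence_prob(words, start="S", final="F"):
    """Multiply transition probabilities along the accepting path;
    return 0.0 if the grammar rejects the word sequence."""
    state, prob = start, 1.0
    for w in words:
        if (state, w) not in transitions:
            return 0.0          # structural constraint: no such transition
        state, p = transitions[(state, w)]
        prob *= p
    return prob if state == final else 0.0

print(sentence_prob(["show", "flights", "</s>"]))  # 0.6
print(sentence_prob(["show", "cat"]))              # 0.0 (rejected)
```

The second call shows the structural constraint mentioned in the excerpt: utterances outside the grammar receive zero probability, which is exactly why such grammars can cause misrecognitions on out-of-grammar speech.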
“…The word trigram model and phone 5-gram language models in these recognizers were constructed using the stochastic language modeling techniques described by Riccardi, Pieraccini, and Bocchieri (1996).…”
Section: Utterance Features
confidence: 99%
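The word trigram model mentioned above is, at its core, a conditional probability estimated from counts. A hedged sketch of the plain maximum-likelihood estimate on a toy corpus follows; the corpus is invented and no smoothing is applied, so this is not the stochastic modeling technique of Riccardi, Pieraccini, and Bocchieri, only the baseline estimate it refines.

```python
from collections import Counter

# Assumed toy corpus for illustration only.
corpus = "the cat sat on the mat the cat ran".split()

# Count trigrams and their bigram histories.
tri = Counter(zip(corpus, corpus[1:], corpus[2:]))
bi = Counter(zip(corpus, corpus[1:]))

def p_trigram(w1, w2, w3):
    """Maximum-likelihood P(w3 | w1, w2) = c(w1,w2,w3) / c(w1,w2)."""
    return tri[(w1, w2, w3)] / bi[(w1, w2)] if bi[(w1, w2)] else 0.0

# "the cat" occurs twice, once followed by "sat":
print(p_trigram("the", "cat", "sat"))  # 0.5
```

Unseen trigrams get probability zero here, which is the sparsity problem that motivates the smoothed, automaton-based language models discussed in the surrounding excerpts.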
“…The most successful parsing algorithm for finding the most likely sequence of spoken words is Viterbi decoding along with beam search [14]. A unified formalism for building a wide class of language models are the variable multi-gram stochastic automata [12].…”
Section: Introduction
confidence: 99%
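The last excerpt names Viterbi decoding with beam search as the standard way to find the most likely spoken-word sequence. The sketch below shows the pruning idea on a two-step toy lattice; the lattice, the bigram scores, and the beam width are all illustrative assumptions, not code from the cited work [14].

```python
# lattice[t] = candidate (word, acoustic log-prob) pairs at step t. Assumed.
lattice = [
    [("show", -0.2), ("so", -1.5)],
    [("flights", -0.3), ("lights", -1.0)],
]
# Assumed bigram log-probabilities; unseen pairs get a floor penalty.
bigram = {("<s>", "show"): -0.1, ("show", "flights"): -0.2,
          ("so", "lights"): -0.4}
FLOOR = -3.0

def viterbi_beam(lattice, beam=2):
    """Extend every surviving hypothesis by every candidate word, then
    keep only the `beam` highest-scoring partial hypotheses per step."""
    hyps = [(0.0, ["<s>"])]  # (total log score, word sequence so far)
    for frame in lattice:
        expanded = []
        for score, seq in hyps:
            for word, acoustic in frame:
                lm = bigram.get((seq[-1], word), FLOOR)
                expanded.append((score + acoustic + lm, seq + [word]))
        hyps = sorted(expanded, reverse=True)[:beam]  # beam pruning
    return hyps[0]

score, words = viterbi_beam(lattice)
print(words[1:])  # ['show', 'flights']
```

Beam pruning trades exactness for speed: a hypothesis dropped early can never be recovered, which is why the beam width in real recognizers is tuned rather than fixed as in this toy.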