2006
DOI: 10.1145/1147954.1147959

Probabilistic parsing strategies

Abstract: We present new results on the relation between purely symbolic context-free parsing strategies and their probabilistic counterparts. Such parsing strategies are seen as constructions of push-down devices from grammars. We show that preservation of probability distribution is possible under two conditions, viz. the correct-prefix property and the property of strong predictiveness. These results generalize existing results in the literature that were obtained by considering parsing strategies in isolation. From …
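To illustrate what "parsing strategies as constructions of push-down devices" can look like in the probabilistic setting, below is a minimal sketch of the classical top-down strategy, not the paper's own construction: each predict move that expands a nonterminal carries the probability of the chosen rule, each scan move carries probability 1, and the probability of the resulting PDA computation equals the probability of the corresponding PCFG derivation. The toy grammar and helper names are assumptions made for this example.

```python
# Sketch: the classical top-down parsing strategy as a probabilistic push-down
# device.  Predict transitions carry rule probabilities; scan transitions carry
# probability 1.  Grammar and names are illustrative assumptions only.
PCFG = {
    "S": [(0.6, ("a", "S", "b")), (0.4, ("a", "b"))],
}

def topdown_run(pcfg, start, tokens, rule_choices):
    """Simulate one PDA computation; rule_choices picks a rule index per predict."""
    stack, prob, i, choice = [start], 1.0, 0, iter(rule_choices)
    while stack:
        top = stack.pop()
        if top in pcfg:                      # predict: expand by a chosen rule
            p, rhs = pcfg[top][next(choice)]
            prob *= p                        # transition weight = rule probability
            stack.extend(reversed(rhs))
        else:                                # scan: match the next input token
            if i >= len(tokens) or tokens[i] != top:
                return None
            i += 1                           # weight 1, so prob is unchanged
    return prob if i == len(tokens) else None

# Derivation S -> a S b -> a a b b uses rules 0 then 1; its PDA computation
# has probability 0.6 * 0.4, the same as the PCFG derivation probability.
print(topdown_run(PCFG, "S", list("aabb"), [0, 1]))  # 0.24
```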

Cited by 13 publications (15 citation statements)
References 46 publications (70 reference statements)
“…Lang (1974) showed that the cubic tabular method of Earley can be naturally applied to PDAs; others give the weighted generalizations (Stolcke 1995; Nederhof and Satta 2006). Earley's algorithm has its analogs in the algorithm in Figure 9: the scan step corresponds to taking a non-parenthesis transition at line 10, the predict step to taking an open parenthesis at lines 14-15, and the complete step to taking the closed parentheses at lines 16-18.…”
Section: Figure
confidence: 99%
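The scan/predict/complete interplay mentioned in the citation above can be made concrete with a small Earley recognizer. The sketch below is an illustrative Python implementation under assumed names (Item, earley_recognize, a toy grammar); it is not the cited authors' code and it omits the extra bookkeeping needed for epsilon rules.

```python
# Minimal Earley recognizer sketch: predict ("open parenthesis"), scan
# (non-parenthesis transition), and complete ("closed parenthesis") steps.
from collections import namedtuple

Item = namedtuple("Item", "lhs rhs dot start")  # dotted rule started at position `start`

def earley_recognize(grammar, start_symbol, tokens):
    """grammar: dict mapping nonterminal -> list of right-hand-side tuples."""
    chart = [set() for _ in range(len(tokens) + 1)]
    chart[0].add(Item("_S'", (start_symbol,), 0, 0))
    for i in range(len(tokens) + 1):
        agenda = list(chart[i])
        while agenda:
            item = agenda.pop()
            if item.dot < len(item.rhs):
                sym = item.rhs[item.dot]
                if sym in grammar:
                    # predict: introduce every rule expanding the nonterminal
                    for rhs in grammar[sym]:
                        new = Item(sym, tuple(rhs), 0, i)
                        if new not in chart[i]:
                            chart[i].add(new); agenda.append(new)
                elif i < len(tokens) and tokens[i] == sym:
                    # scan: consume one input symbol, advance the dot
                    chart[i + 1].add(Item(item.lhs, item.rhs, item.dot + 1, item.start))
            else:
                # complete: a finished constituent advances the items waiting for it
                for waiting in list(chart[item.start]):
                    if (waiting.dot < len(waiting.rhs)
                            and waiting.rhs[waiting.dot] == item.lhs):
                        new = Item(waiting.lhs, waiting.rhs, waiting.dot + 1, waiting.start)
                        if new not in chart[i]:
                            chart[i].add(new); agenda.append(new)
    return any(it.lhs == "_S'" and it.dot == 1 for it in chart[len(tokens)])

grammar = {"S": [("a", "S", "b"), ("a", "b")]}
print(earley_recognize(grammar, "S", list("aabb")))  # True
```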
“…The results cited above were obtained with more or less explicitly stated assumptions on the grammar; sometimes chain and epsilon rules are not allowed, or the grammar must be unambiguous. In [6,24,25] these restrictions are lifted. The authors prove that relative frequency estimation, expectation maximization, and a new cross-entropy minimization approach each yield a consistent grammar without restrictions on the grammar.…”
Section: Training and Consistency Of Stochastic Context-free Grammars
confidence: 99%
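For the first of the three estimators mentioned in this citation, relative frequency estimation, a minimal sketch is easy to give: count how often each rule is used in a treebank and normalize per left-hand-side nonterminal. The toy treebank format and function names below are assumptions for illustration, not taken from the cited works [6,24,25].

```python
# Sketch of relative-frequency estimation for a stochastic CFG:
# p(A -> alpha) = count(A -> alpha) / count(A), computed from a toy treebank.
from collections import Counter, defaultdict

def count_rules(tree, counts):
    """tree = (label, [children]) for nonterminals; a plain str is a terminal."""
    label, children = tree
    rhs = tuple(c if isinstance(c, str) else c[0] for c in children)
    counts[(label, rhs)] += 1
    for c in children:
        if not isinstance(c, str):
            count_rules(c, counts)

def relative_frequency(treebank):
    counts = Counter()
    for tree in treebank:
        count_rules(tree, counts)
    totals = defaultdict(int)
    for (lhs, _), c in counts.items():
        totals[lhs] += c
    # Normalizing per nonterminal makes the estimated grammar proper; the cited
    # results state that this estimator is consistent without grammar restrictions.
    return {rule: c / totals[rule[0]] for rule, c in counts.items()}

toy_treebank = [
    ("S", [("NP", ["she"]), ("VP", ["runs"])]),
    ("S", [("NP", ["she"]), ("VP", [("V", ["sees"]), ("NP", ["him"])])]),
]
for rule, p in sorted(relative_frequency(toy_treebank).items()):
    print(rule, round(p, 3))
```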
“…The placeholders of the probabilities $p_{11}, p_{16}, p_{21}, p_{22}, p_{24}$ are now replaced by the functions
$$\hat{p}_{d,i}(n) := \frac{c_{d,i}(n)}{c_{d,i}(n) + \bar{c}_{d,i}(n)}, \qquad i \in \{11, 16, 21, 22, 24\},$$
with the asymptotic behaviour
$$c_{d,11}(n) \sim a_{11}\, n \ln n + b_{11}\, n + c_{11}, \qquad \bar{c}_{d,11}(n) \sim \bar{a}_{11}\, n \ln n + \bar{b}_{11}\, n + \bar{c}_{11},$$
$$c_{d,16}(n) \sim a_{16}\, n \ln n + b_{16}\, n + c_{16}, \qquad \bar{c}_{d,16}(n) \sim \bar{a}_{16}\, n + \bar{b}_{16},$$
$$c_{d,21}(n) \sim a_{21}\, n + b_{21} \ln n + c_{21}, \qquad \bar{c}_{d,21}(n) \sim \bar{a}_{21}\, n \ln n + \bar{b}_{21}\, n + \bar{c}_{21} \ln n + \bar{d}_{21},$$
$$c_{d,24}(n) \sim a_{24}\, n + b_{24}, \qquad \bar{c}_{d,24}(n) \sim \bar{a}_{24}\, n + \bar{b}_{24}.\;…”$$
Section: Fig. 14
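To make the reconstructed formulas above concrete, the following sketch evaluates an estimate of the form p̂(n) = c(n)/(c(n) + c̄(n)) with counts that grow like a·n·ln(n) + b·n + c. All coefficient values are invented for illustration; only the functional form follows the quoted text.

```python
# Sketch of a length-dependent probability estimate p_hat(n) = c(n) / (c(n) + cbar(n)),
# with counts following the asymptotic form a*n*ln(n) + b*n + c.  The coefficient
# values are made-up placeholders, not fitted values from the cited work.
import math

def asymptotic(a, b, c):
    """Return a count function n -> a*n*ln(n) + b*n + c."""
    return lambda n: a * n * math.log(n) + b * n + c

def p_hat(count, count_bar):
    """Probability estimate as the ratio of one count to the total of both counts."""
    return lambda n: count(n) / (count(n) + count_bar(n))

# Illustrative coefficients (assumptions only):
c_d11 = asymptotic(1.0, 0.5, 2.0)
cbar_d11 = asymptotic(0.4, 0.3, 1.0)
p_d11 = p_hat(c_d11, cbar_d11)

for n in (10, 100, 1000, 10000):
    print(n, round(p_d11(n), 4))  # the estimate drifts slowly with n
```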